Right to Information Recognised in New European Court Rulings

Image: Group of scholars studying books. Text: A Right to Information: Finding a Good Balance with the Right to Be Forgotten

Two much-anticipated rulings have come from the Court of Justice of the European Union. Both are ‘preliminary rulings’, effectively requests to the Court to clarify how EU law – in this case the ‘right to be forgotten’ doctrine, created by the Court in 2014 and enshrined in the General Data Protection Regulation of 2016 – should be applied.

As a reminder, the right to be forgotten refers to the right of individuals to ask that particular stories not be included in search results for their name. The idea is to ensure that there is a way of preventing search engines from automatically giving prominence to information that is unduly invasive of privacy.

IFLA has released a statement on the subject, underlining that the right to remove search results risks undermining access to information for internet users. While the IFLA statement notes that in some situations, a right to be forgotten may make sense, it argues strongly that this should be the exception, not the norm, and stresses concern about the impacts of leaving this choice to private actors.

The two cases in question come from France, and its Commission nationale de l’informatique et des libertés (CNIL) – the national data protection authority. In the first (C-507/17), the CNIL itself was in dispute with Google about whether, once there had been a decision to award the right to be forgotten, this should only be applied within Europe, or whether Google should be obliged to apply it on all versions of its search engine, around the world.

The second (C-136/17) asked whether the ban on ‘processing’ (doing things with) certain types of personal data, such as that about religious beliefs or politics, should also apply to search engines.


The Right to Information

In the first case, the Court decided that there was no obligation to remove the relevant links from all versions of a search engine around the world (‘global delisting’), rather than just those serving France or the EU. This is an important decision, and one that IFLA itself supported, given our own statement on the subject.

Significantly, the Court explores the question of the costs of global delisting: ‘However, it states that numerous third States do not recognise the right to dereferencing or have a different approach to that right. The Court adds that the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality. In addition, the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world.’

This is definitely a welcome point for libraries, and one that underpins the final decision of the European Court, given its explicit recognition of a right to information of internet users around the world.

In the second case, the Court does note that the bar on processing highly personal information applies also to search engines to the extent that they process it.

However, it also argues that the exceptions to this bar apply too – where including a link in search results is essential to striking a balance between the rights of the individual concerned and those of information seekers, the link can be maintained.

Therefore, in cases where the subject of the information has a prominent role in public life, it may well be acceptable to maintain search results, in order to ‘protect[…] the freedom of information of internet users potentially interested in accessing that web page by means of such a search’.


But No Resolution Yet

In both cases, the final decision rests with the French courts. The European Court has given guidance on how to take these decisions, but leaves a margin of appreciation to the judges in Paris. As a result, in the case of global delisting, despite all of the arguments suggesting that this is a questionable move, the judgement still leaves open the possibility that it could be ordered.

Similarly, the judgement on highly personal data suggests that it is for the French judges to determine whether Google has taken sufficient care in working out whether it was necessary to include the relevant links in its search results. As a result, we will not know the final results for a while yet.

Clearly Google itself is a lightning rod. Its size and reputation make it a bogeyman for many. However, it is worth noting that the judgements apply not just to Google, but also to any other company or information service offering search functionality.

As seen in the Le Soir judgement in Belgium in 2016, the idea of the right to be forgotten can also be applied to a service offering search into digitised old newspapers.

Crucially, while Google may be in a position to apply the rules set out, it may be harder for others to do the same. For example, in the judgement on highly sensitive data, the Court argues that a search engine should be able to rearrange results about court judgements in order to ensure that the most recent information comes first.
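To make concrete the kind of re-ordering the Court has in mind, here is a minimal sketch – with entirely hypothetical data, and Python chosen purely for illustration – that re-sorts results about stages of a single legal proceeding so that the most recent development comes first. Even this toy version assumes clean publication-date metadata for every indexed page, something smaller services cannot always count on.

```python
# Hypothetical search results about stages of one legal proceeding.
from datetime import date

results = [
    {"title": "Local figure charged with fraud", "published": date(2012, 3, 1)},
    {"title": "Fraud trial opens", "published": date(2013, 6, 15)},
    {"title": "Defendant acquitted on appeal", "published": date(2016, 9, 30)},
]

# Newest first, so the current legal situation tops the result list.
for item in sorted(results, key=lambda r: r["published"], reverse=True):
    print(item["published"].isoformat(), item["title"])
```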

If the rules around offering search services become more complicated, the risk is that it’s the smaller players who will fall foul of the rules, not Google, reducing the choice of information seeking tools available to users around the world.


Why Privacy Matters, For Everyone: Choose Privacy Week 2019

Choose Privacy Week was initiated by the American Library Association to draw attention to the importance of privacy, and what people can do about it. It is a great opportunity to learn about the important role librarians play in achieving this.

The theme of this year’s Choose Privacy Week, “Inclusive Privacy: Closing the Gap”, raises awareness of the privacy inequities imposed on vulnerable and historically underrepresented groups, and highlights how libraries can close the privacy gap for those who need it most.

Why Privacy Matters

Privacy is of course a right. As set out in Article 12 of the Universal Declaration of Human Rights, people should be able to live free of arbitrary interference with their private life.

There is a good reason for this. The possibility to have a private life is central to much of what makes us human. In particular, it gives us the freedom to think, speak and access information freely.

IFLA’s submission to the UN Special Rapporteur on Privacy stresses this point, underlining that without privacy, there can be a powerful chilling effect on creativity and innovation.

Privacy has traditionally been seen as a means of protecting the individual against efforts by states to impose control. However, increasingly, it is privacy in the face of companies that is coming to the fore.

Data collection has never been easier, and the companies whose services we use are increasingly able to draw conclusions about us on the basis of what they see. Indeed, many of these conclusions may reveal traits and preferences of which we are not necessarily conscious ourselves.

Clearly advertising has done this for years, but the possibility to do so in such a targeted, individual manner is new.

If this were only about advertising, it would not necessarily be so important, although it clearly still has a certain ‘creepiness’ factor. However, more is at stake. It can also shape the content we see online – which stories, posts or search results are promoted.

Ironically, perhaps, the effort to personalise services comes at the cost of individuality and privacy, as a coded version of your personality is constructed, held on a server somewhere, and then used.

This is not just an issue on social media, but also in the research space. With efforts to move from institutional to personal log-ins for access to academic articles, the possibility arises for publishers and platforms to monitor use and to make their own efforts to tailor results and experiences.

This is a problem, because it means that we cannot assume that the person next to us is seeing the same thing as we would. Moreover, given that the algorithmic version of your personality can only work on the basis of past data, it does not allow for you to change in the future, potentially locking you into a particular set of preferences and interests.
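To illustrate the lock-in point, here is a toy sketch – hypothetical data, and not any real platform’s algorithm – of purely history-based personalisation: candidates are scored only against recorded past behaviour, so a topic the profile has never seen can never rise to the top.

```python
# A toy, history-only personalisation model (entirely hypothetical).
past_clicks = {"gardening": 12, "cooking": 8, "local news": 3}
candidates = ["gardening", "cooking", "philosophy", "local news", "poetry"]

def score(topic, profile):
    # The score comes entirely from recorded past behaviour; a topic
    # the profile has never seen always scores zero.
    return profile.get(topic, 0)

ranked = sorted(candidates, key=lambda t: score(t, past_clicks), reverse=True)
print(ranked)  # ['gardening', 'cooking', 'local news', 'philosophy', 'poetry']
```

However the user’s interests evolve, anything outside the recorded profile stays at the bottom of the ranking.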


Privacy Can’t Be A Luxury

Yet privacy – and the need for privacy – may not be equally distributed or equally shared.

A first challenge is for people who belong to a vulnerable or marginalised group. In many cases, they may feel the need to hide what it is that makes them unique, given political, cultural or social pressures in the society around them.

The internet has been a major source of support for many in this position, given the possibility to connect to those in a similar situation elsewhere, without having to use what may be a hostile public space.

To have these characteristics and interests coded and used to shape advertising and online experiences (and potentially even inform governments) takes these gains away.

There may also be challenges for people on lower incomes, who may, for example, be more reliant on smartphones to access the internet (devices which pose a number of privacy concerns).

They can also be obliged to share more personal information online in order to apply for government services or other programmes. A 2017 study on privacy, poverty and big data by Data & Society reveals some key trends.

Add to this stories of internet subscribers being asked to pay more for a privacy-friendly connection, or the fact that more expensive phone brands are using privacy as a selling point, and the potential connection between income and the right to a private life becomes clear.

Finally, there is often a disconnect between the risks people face and their ability to do something about them.

Recent privacy legislation, such as the General Data Protection Regulation in the European Union, gives important new rights to individuals. Its success, however, depends on people being sufficiently skilled and motivated to choose privacy.

Yet it seems clear that even where there is awareness, there may not be the skills – or even the attitude – necessary to act on it. As the Data & Society study shows, while there is demand, people with less money, less time and less education may feel helpless in the face of companies and government agencies.

This is just as true of the right to be forgotten. While there is certainly a place for such rules in protecting people against unfair, irrelevant or incorrect information about them being found through search results, the risk is that it becomes a tool for those in positions of power to ‘edit’ the historical record.


How Libraries Can Help

A year ago, IFLA and the FAIFE Committee used the momentum of Choose Privacy Week to raise awareness of how personal data ownership affects libraries and library users, and offered practical steps that individuals can take to keep their private lives private under the General Data Protection Regulation.

A year on, there is still work to do to ensure that everyone really is aware of their choice of privacy, and skilled and motivated enough to use it.

Libraries have expertise in information management, and a responsibility to help others develop their own information literacy skills. With more and more library resources found online, libraries can not only offer a means of accessing information and expressing yourself in as private a way as possible, but can also encourage privacy-friendly behaviours in their users’ own lives.

In short, the library is not only a trusted source of information but also a source of community support, and can “close the privacy gap” for its users by providing a safe space, training and resources to help them take control of their private lives and data.

Here are a few steps that you can take to help protect your users’ privacy:

  • Make use of the privacy guidelines for libraries. In 2016, IFLA published the IFLA Statement on Privacy in the Library Environment. The Statement is intended to give guidance to libraries and information services in an environment that includes mass surveillance by governments and routine user data collection by commercial interests that provide content or services through the Internet.
  • Reduce data traces online. Greater care in choosing privacy settings, and simply better data hygiene can all help. And there are great tools such as the Data Detox Kit already available.
  • Apply tools to protect user privacy. ALA has created a list of resources on relevant tools, which you can find here, while Scottish PEN has a Libraries for Privacy Toolkit. For a simple illustration of the kind of hands-on exercise such training can include, see the sketch after this list.
  • Watch presentations and webinars on the subject. You can learn a lot by watching webinars such as the IFLA webinar on the GDPR, or the ALA video on raising privacy awareness in your library.
  • Help raise awareness throughout Choose Privacy Week!
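As one concrete example of the kind of hands-on exercise such training can include, below is a minimal sketch – standard-library Python only, with a placeholder URL, offered as an illustration rather than a vetted audit tool – that lists the third-party hosts a web page pulls scripts, images and iframes from, often a first clue to the trackers it carries.

```python
# Minimal sketch: list the third-party hosts a page loads resources
# from. Illustrative only; replace the placeholder URL with a page
# you have permission to inspect.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class ThirdPartyFinder(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.first_party = urlparse(base_url).hostname
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        # Scripts, images and iframes are common tracker vectors.
        if tag in ("script", "img", "iframe"):
            src = dict(attrs).get("src")
            if src:
                host = urlparse(urljoin(self.base_url, src)).hostname
                if host and host != self.first_party:
                    self.hosts.add(host)

url = "https://example.org/"  # placeholder
finder = ThirdPartyFinder(url)
with urlopen(url) as response:
    finder.feed(response.read().decode("utf-8", errors="replace"))

for host in sorted(finder.hosts):
    print("third-party host:", host)
```

Walking users through output like this can make the abstract idea of ‘data traces’ suddenly very tangible.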