02 June 2014

The “Right to be Forgotten” in the digital age – the ECJ decision in the case of Google Spain

“The Internet never forgets” – this truism is no longer accurate, at least not in the European Union. On 13 May 2014 the European Court of Justice (ECJ) ruled that persons may, under certain circumstances, request that search engine providers remove from their list of results links to web pages containing personal data (case C-131/12).

The ECJ decision

The facts

The judgment was based on the following circumstances: a Spanish national lodged a complaint with the Spanish data protection agency (AEPD) against a Spanish daily newspaper and against Google Spain and Google Inc. When an internet user entered the complainant’s name in the search engine of the Google group, he would obtain links to two newspaper articles referring to a real-estate auction connected with attachment proceedings for the recovery of social security debts 16 years earlier.

The AEPD rejected the complaint in so far as it related to the daily newspaper, taking the view that the newspaper’s publication of the information in question was legally justified. The complaint was, however, upheld in so far as it was directed against Google Spain and Google Inc. The two companies challenged that decision before the Spanish National High Court, which subsequently referred several questions to the ECJ for a preliminary ruling.

Accountability of the search engine operators

The ECJ decided that the search engine activities which consist in finding information, indexing it automatically, storing it temporarily and, finally, making it available to internet users according to a particular order of preference must be classified as ‘processing of personal data’ under the terms of the European Data Protection Directive 95/46 (Directive 95/46). According to the ECJ this finding is not affected by the circumstance that such data has already been published on the internet and has not been altered by the search engine.

Furthermore, the ECJ ruled that the search engine operator must be regarded as the ‘controller’ in respect of such processing of personal data, since it is the search engine operator who determines the purposes and means of the processing (independently of, and in addition to, the processing of personal data carried out by the publishers of the websites that are then listed as results of the web search).

Territorial applicability of European Data Protection Law

The ECJ further stated that European Data Protection Law also applies to organizations located outside the European Union (EU), e.g. in the USA, if the processing of personal data by the controller is ‘carried out in the context of the activities’ of an establishment of the controller on the territory of a Member State. In the case at hand, the data processing, i.e. the indexing and storing of third-party websites, is carried out by Google Inc., the parent company of the Google group (inter alia, of Google Spain), which is based in the United States. In the judgment of the ECJ, however, the European Data Protection Directive is applicable if the controller has an establishment in a Member State whose purpose is to promote and sell, in that Member State, advertising space offered by the search engine. The two activities are inextricably linked: the activities relating to the advertising space constitute the means of rendering the search engine economically viable, and that engine is, at the same time, the means enabling those activities to be performed.

Legitimacy of data processing or the “Right to be Forgotten”?

According to European data protection law, the processing of personal data is legitimate if the processing is necessary for the purposes of the legitimate interests pursued by the controller, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject. When balancing the opposing interests, the ECJ pointed out, particular attention should be paid to the fact that the processing of personal data by a search engine is liable to significantly compromise an individual’s fundamental rights to privacy and to the protection of personal data when a search by means of that engine is carried out on the basis of the individual’s name. Such processing enables any internet user to obtain, through the list of results, a structured overview of the information relating to that individual that can be found on the internet (information potentially touching on a vast number of aspects of his private life which, without the search engine, could have been interconnected only with great difficulty or not at all) and thereby to establish a more or less detailed profile of the individual in question. The interference with the individual’s rights is further heightened by the important role played by the internet and search engines in modern society and by the way in which they render the information contained in such result lists ubiquitous.

Due to the potential gravity of such interference, the ECJ ruled that it cannot be justified merely by the economic interest which the operator of such an engine has in that processing. As a general rule, the data subject’s rights also override the interest of internet users in having access to that information; in specific cases, however, that balance may depend on

  • the nature of the information in question
  • its sensitivity for the data subject’s private life and
  • the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.

The ECJ pointed out that the search engine operator’s obligation does not depend on the data processing carried out by the publisher of the (linked) web page; for the latter, an independent balancing of interests and rights must take place.

Additionally, the ECJ indicated that even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive. Thus, data has to be deleted when it is no longer necessary in the light of the purposes for which it was collected or processed. That is so in particular where it appears to be inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes and in the light of the time that has elapsed.

Requests may be addressed by the data subject directly to the controller who must then duly examine their merits and, as the case may be, put an end to any processing of the data in question. Where the controller does not grant the request, the data subject may bring the matter before the supervisory authority or the judicial authority so that it carries out the necessary checks and orders the controller to take specific measures accordingly.

In the case at issue, the ECJ found that, having regard to the sensitivity of the information contained in the relevant newspaper articles for the data subject’s private life and to the fact that its initial publication had taken place 16 years earlier, the data subject may establish a right that that information should no longer be linked to his name by means of such a list of results.

Critique and annotation

With its decision the ECJ has, for the first time and at variance with the opinion of EU Advocate General Jääskinen, confirmed the hitherto disputed “right to be forgotten”. This finding has been both acclaimed and criticized, and thereby demonstrates the contradictory nature of the simultaneous demands for privacy and transparency: where our own personal data is concerned we claim maximum privacy, but when it comes to the activities of others we demand maximum transparency.

Critics in Europe and the USA accuse the ECJ of giving only comparatively general and brief consideration to the public interest. They highlight the need for debate on freedom of opinion and information. The balance between the right to privacy and the protection of personal data, on the one hand, and freedom of opinion and information as well as freedom of the press, on the other, is tipped one-sidedly in favour of the protection of privacy, they argue. If individuals and organizations are given more opportunities to suppress access to inconvenient facts, it will become more difficult for citizens and the media to access objective information. Corrupt politicians or blatant polluters could assert their “right to be forgotten” in such a way that, only a few years after the event, references to a corruption scandal or an oil spill would be untraceable.

In the USA in particular, any attempt to delete true and legally obtained data is regarded as contrary to the freedom of expression enshrined in the First Amendment to the US Constitution and as inadmissible private censorship. Such censorship, it is argued, would furthermore enable people to “rewrite history”. Neither the state nor its courts should be able to decide what is permissible on the Internet and what is not (Volokh, Freedom of Speech, Information Privacy, and the Troubling Implications of a Right to Stop People From Speaking About You, Stanford Law Review, Vol. 52, 2000, pp. 1-65).

In response to this criticism, however, it can be pointed out that the ECJ – in concurrence with the German Federal Court of Justice in the Sedlmayr case (Az. VI ZR 227/08 and VI ZR 228/08), which admittedly concerned the original article in an online archive rather than a link in the result list of a search engine – in no way decided that the information must be deleted entirely, i.e. removed also from the (online) archive of the newspaper. Rather, the court confines itself to limiting the danger posed by permanently available private information and its high-speed correlation by internet search engines. At the same time, the ECJ recognizes the possibility of exceptions where there is a considerable public interest in long-term access to information. Individuals will therefore not be put in a position to completely rewrite their histories. On the contrary, it can be assumed that the public interest in access to information will be given precedence when it comes, for example, to corrupt politicians or polluters. The ECJ, therefore, does not want the past to be forgotten. It simply advocates a return, as a matter of principle, to the pre-digital relationship between remembering and forgetting, in which oblivion is the rule and recollection the exception.



This post is part of a weekly series of articles by doctoral candidates of the Alexander von Humboldt Institute for Internet and Society. It does not necessarily represent the view of the Institute itself. For more information about the topics of these articles and associated research projects, please contact presse@hiig.de.


Emma Peters

Former Associated Researcher: Data, actors, infrastructures

