No. 19
2017
The European Union Law Working Paper Series presents research on the law and
policy of the European Union. The objective of the European Union Law Working
Paper Series is to share “work in progress”. The authors of the papers are solely
responsible for the content of their contributions and may use the citation standards of
their home country. The working papers can be found at http://ttlf.stanford.edu.
The European Union Law Working Paper Series is a joint initiative of Stanford Law
School and the University of Vienna School of Law’s LLM Program in European and
International Business Law.
If you should have any questions regarding the European Union Law Working Paper
Series, please contact Professor Dr. Siegfried Fina, Jean Monnet Professor of European
Union Law, or Dr. Roland Vogl, Executive Director of the Stanford Program in Law,
Science and Technology, at the
Álvaro Fomperosa is a Spanish attorney. He earned his LL.B. and BSc. in Economics
from Universidad Carlos III de Madrid in 2011. He also holds an LL.M. in European
Law and Economic Analysis from the College of Europe in Bruges, Belgium, and an
LL.M. in Law, Science & Technology from Stanford Law School. He has worked in
the Cabinet of the Vice President of the European Commission and Commissioner for
Competition, Mr. Joaquín Almunia, where he contributed to policy and decision-
making, both in competition and other broader EU regulatory issues. In 2013, Álvaro
joined the Brussels office of Cleary Gottlieb Steen & Hamilton, where he has been
involved in numerous antitrust cases and litigation before the European Commission
and the Court of Justice of the European Union. From 2015 to 2016, he was resident in
the Washington, D.C. office of Cleary Gottlieb.
The opinions expressed in this student paper are those of the author and not necessarily
those of the Transatlantic Technology Law Forum or any of its partner institutions, or
the sponsors of this research project.
In the seminal Google Spain case, the European Court of Justice had the opportunity to
define the applicability of the Data Protection Directive to search engines in general
and the boundaries between privacy rights and free speech in the Internet age in
particular. By broadly defining its territorial scope, the ECJ characterized search
engines as “controllers,” for which the Directive sets burdensome obligations related to
data subjects’ rights to blockage, erasure, and objection. The ECJ went further by
recognizing a broader right to request search engines to delist links to personal
information upon request by the data subject, even when the information is legally
published: the so-called Right to be Forgotten.
This paper proceeds as follows. First, I analyze the conclusions of the Court and offer
a critique on the adjudication, particularly that: 1) the ruling effectively places search
engines in permanent breach of data protection rules; 2) the decision shows an apparent preference for privacy rights and improperly balances the other fundamental rights at stake; and 3) the
Court acknowledges the existence of a Right to be Forgotten within the boundaries of
the Data Protection Directive.
Next, I assess EU law and case law of the European Court of Human Rights on the
balancing of privacy and free speech freedoms. I conclude that the holding in Google
Spain barely fits within the boundaries of the acquis, particularly in light of the
principle of proportionality. I also scrutinize the striking relinquishment of the
balancing of fundamental rights by public authorities that stems from this decision.
The Google Spain holding transfers the foundational task of defining the public
interest in the balancing of fundamental rights, which traditionally has lain with public authorities in Europe, to private economic entities: search engines, which effectively
become the gatekeepers of privacy and censors of the Internet.
Finally, I analyze the potential threat of conflict of laws stemming from the
extraterritorial application of the Right to be Forgotten, in particular vis-à-vis the
United States: it seems that the European Union intends delisting to be universal,
across all forms of the Internet, which could conflict with non-EU citizens’ rights to
truthful information. I propose using geo-filtering to ensure the effectiveness of the ruling while confining its reach to the boundaries of the European Union.
Table of Contents
I. Introduction ............................................................................................................................ 3
II. Google Spain Case .................................................................................................................. 5
II.1 Facts of the Case .............................................................................................................. 5
II.2 Analysis of the Judgment ................................................................................................. 6
II.2.1 Territorial Scope ......................................................................................................... 7
II.2.2 Material Scope: Obligations of Search Engines as Data Processors and Controllers ............ 11
II.2.3 Right to Be Forgotten ............................................................................................... 18
III. Striking the Right Balance Between Privacy and the Right to Be Informed in the
Digital Age: Some Unanswered Questions ................................................................................. 21
III.1 The Rights at Stake: Privacy v. Public Interest .............................................................. 21
III.1.1 The Balancing Under the European Convention of Human Rights and the E.U.
Charter of Fundamental Rights .............................................................................................. 22
III.1.2 The Balancing According to Google Spain.............................................................. 23
III.2 Balancing Factors: Article 29 Working Party v. Google’s Advisory Council ............... 26
III.2.1 Art. 29 Working Party Guidelines............................................................................ 27
III.2.2 Google’s Advisory Council ...................................................................................... 31
III.2.3 Actual Balancing Practice by Google ...................................................................... 33
III.3 Balancing by Search Engines ......................................................................................... 35
V. Extraterritoriality of EU Data Protection Rights: Applicable Law within the EU and
Conflict with U.S. Law. ................................................................................................................ 40
V.1 Applicable Law Within the EU ...................................................................................... 40
V.2 Conflict with the U.S. Law.............................. 42
I. Introduction
After R.H. Coase's The Problem of Social Cost, 1 the most cited law review article in United States history deals with the right to privacy: 2 already in 1890, Warren & Brandeis' The Right to
Privacy argued for the need to create a new right intertwined with the principle of inviolate
personality. 3 It does not come as a surprise that the first paragraph of Advocate General Jääskinen's ("AG Jääskinen") Opinion for the European Court of Justice ("ECJ") in the Google Spain case 4 opens by recalling Warren and Brandeis' article.
At the time of Warren and Brandeis’ article, the rise of photography and the surge of the
news industry threatened to invade dimensions of life that were previously kept private. 5
Nowadays, we are experiencing the rise of new technologies that will be present in all aspects of
our lives. Massive amounts of information about us are produced and stored faster than ever. 6
The intimate intellectual and personal space that we once had no longer exists in the
technological age, and now, more than ever, modern societies must reflect upon the value given
to privacy, so that we can design adequate protections and redress mechanisms to safeguard
personal privacy.
1
R.H. Coase, The Problem of Social Cost, 3 J.L. & ECON. 1 (1960).
2
Fred R. Shapiro & Michelle Pearse, The Most-Cited Law Review Articles of All Time, 110 MICH. L. REV.
1483, 1489 (2012).
3
Samuel D. Warren & Louis D. Brandeis, The Right to Privacy,4 HARV. L. REV. 193, 205 (1890).
4
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL v. Agencia Española de
Protección de Datos, EU:C:2013:424.
5
Warren & Brandeis, supra note 3, at 195.
6
DANIEL J. SOLOVE, THE DIGITAL PERSON: TECHNOLOGY AND PRIVACY IN THE INFORMATION AGE, 1-2
(2004).
One of the information technologies challenging society is the so-called “Googleization”
effect: the Internet eases the dissemination and preservation of information and personal data.
Information that would have been out of reach some years ago can now be conveniently found by simply searching someone's name in Google or any other search engine. 7 The question is then: what control, if any, should individuals have over the availability of such information about them? 8
In its seminal case Google Spain, the ECJ had the opportunity to address this question. 9
The Audiencia Nacional (Spanish National High Court) certified to the ECJ certain questions
related to the application of the European Data Protection Directive 95/46/EC ("Directive 95/46"), 10 the main piece of legislation on the protection of personal information in the EU, 11
until the General Data Protection Regulation comes into effect on 25 May 2018. 12
The ECJ faced three sets of questions related to Directive 95/46 concerning: 13 (i) its
territorial scope; (ii) its material scope regarding its applicability to search engines; and (iii) the Right to be Forgotten. 14
7
DANIEL J. SOLOVE & PAUL M. SCHWARTZ, INFORMATION PRIVACY LAW 9 (5th Ed., 2014).
8
Id.
9
Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos, EU:C:2014:317.
10
Order A.N., Feb. 27, 2012 (R.J.C.A. No. 2012/321) (Spain).
11
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection
of individuals with regard to the processing of personal data and on the free movement of such data.
Council Directive 95/46, 1995 O.J. (L 281) 31 (EC).
12
Regulation 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing
of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data
Protection Regulation), 2016 O.J. (L 119/1). While the General Data Protection Regulation entered into
force on 24 May 2016, it shall only apply from 25 May 2018.
13
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 6.
14
The Right to be Forgotten is not a right for information to disappear into total oblivion, but a right to
request the delisting of links from search engines. See Brendan Van Alsenoy & Marieke Koekkoek,
Internet and Jurisdiction After Google Spain: The Extraterritorial Reach of the “Right to Be Delisted”,
INT. DATA PRIVACY L. (Apr. 8, 2015),
http://idpl.oxfordjournals.org/content/early/2015/04/08/idpl.ipv003.abstract.
In this article, I offer a critical view of the Google Spain judgment. Then, I focus on the
relevance of the Right to be Forgotten to the evolution of the European conception of the right to
privacy in the digital age, and its place within the sphere of fundamental rights in the EU,
particularly with regard to the tension between privacy and freedom of speech. Finally, I also
discuss certain relevant procedural questions and underline the challenges posed by the
extraterritorial application of laws that affect the borderless, global nature of the Internet.
II. Google Spain Case
In this section I offer an overview of the facts and the legal challenges of the Google
Spain case, the solutions given by the ECJ, and a critical assessment of its main conclusions.
II.1 Facts of the Case
La Vanguardia is one of the major daily newspapers in Spain. On two occasions in early 1998, the newspaper published announcements about the auction of an apartment belonging to Mr. Mario Costeja González and his ex-wife for the repayment of social security debts. 15
In 2008, La Vanguardia digitized its library, including the issues with information about
the auction. Google searched the library through its crawlers, 16 indexed the periodicals, and
included them in the search results. Consequently, anyone who searched Mr. Costeja’s name in
Google could access the links to La Vanguardia’s related pages from the results display.
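For readers unfamiliar with the mechanics, the following is a minimal sketch, in Python, of how a crawler fetches pages and builds a name-based index of the kind at issue here. The URLs, names, and function names are hypothetical illustrations, not a description of Google's actual pipeline.

    # Toy crawler/indexer: fetch pages and record which names each page mentions,
    # so that a later query for a person's name returns the pages mentioning it.
    # Hypothetical URLs; illustrative only.
    from urllib.request import urlopen
    from collections import defaultdict

    def crawl_and_index(seed_urls, names_of_interest):
        index = defaultdict(set)  # name -> set of URLs mentioning that name
        for url in seed_urls:
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except OSError:
                continue  # skip unreachable pages
            for name in names_of_interest:
                if name.lower() in html.lower():
                    index[name].add(url)
        return index

    # Hypothetical usage: index a digitized newspaper archive, then "search" a name.
    archive = ["https://example.com/hemeroteca/1998/01/19.html"]
    print(crawl_and_index(archive, ["Mario Costeja González"]))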
Mr. Costeja sought the removal or alteration of the pages by La Vanguardia and the removal or
15
Google Spain SL, Case C-131/12, ¶ 14.
16
Crawlers are Internet bots that search the world wide web to index it. See Sergey Brin & Larry Page,
The Anatomy of a Large-Scale Hypertextual Web Search Engine, 30 COMPUTER NETWORKS AND ISDN
SYSTEMS 107, 107-17 (1998).
concealment of Google’s results. Google denied both requests to either remove or conceal the
links. In March 2010, Mr. Costeja filed a complaint with the relevant National Data Protection
Authority (“DPA”), the Agencia Española de Protección de Datos (“AEPD”, the Spanish DPA). 17
The AEPD decided that according to Spanish data protection law, 18 Google should have
taken the necessary steps to remove the data from the index and prevent access to the links. 19
However, the AEPD concluded that La Vanguardia was bound by Spanish law to publish the
auction and was thus not obligated to comply with Mr. Costeja's request. 20 Google challenged the AEPD's decision before the Audiencia Nacional. 21
II.2 Analysis of the Judgment
The Audiencia Nacional identified certain questions that needed to be clarified by the ECJ
before the Spanish high court could decide the case. The questions can be summarized to fit
within three categories of issues: (i) the territorial scope of Directive 95/46; (ii) the material scope
and the inclusion of search engines; and (iii) the Right to be Forgotten. 22
17
Google Spain SL, Case C-131/12, ¶ 14-15.
18
Ley Orgánica 15/1999, de Protección de Datos de Carácter Personal, (B.O.E. 1999, 298) (Spain),
(incorporating Directive 95/46 into national law).
19
AEPD, Jul. 30, 2010 (Decision No. R/01680/2010), at 22.
20
Id. at 22-23.
21
Google Spain SL, Case C-131/12, ¶ 18.
22
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 23.
II.2.1 Territorial Scope
The first request for interpretation related to Article 4(1) of Directive 95/46, which
establishes its applicability. Inter alia, 23 Directive 95/46 applies to the processing of personal
data where (i) the “processing” is carried out in the context of the activities of an establishment of
the “controller” within a Member State or where (ii) the controller is not established in the EU
but it “makes use” of equipment within the EU for purposes of “processing data,” unless the equipment is used only for purposes of transit through the territory of the Community. 24
The Audiencia Nacional needed to know whether Directive 95/46 applied to Google Inc.,
particularly since Google Spain limited its activity mostly to marketing and the search activity
was performed in the U.S. by Google Inc. Thus, the court asked the ECJ whether Google Spain
could be considered an establishment of Google Inc., taking into account that Google Spain’s
activity was limited to (i) promoting and selling advertising space in Spain; (ii) managing two
concrete files with Google's Spanish clients' data on behalf of Google Inc.; and (iii) referring requests for the removal of data to Google Inc. 25
Thus, the applicability of Directive 95/46 depended upon the characterization of Google
Spain as an “establishment” of Google Inc. Although in layman’s terms Google Spain is part of
Google Inc., for the purpose of determining the applicability of Directive 95/46, the data
processing activity (the search) had to be carried out in the context of the activities of Google
23
Apart from the examples cited, Directive 95/46 is also applicable where the controller is not established
on the Member State's territory, but in a place where its national law applies by virtue of international
public law. Directive 95/46, supra note 11, art. 4(1)(b).
24
Directive 95/46, supra note 11, art. 4(1)(a), 4(1)(c).
25
Google Spain SL, Case C-131/12, ¶ 20.1(a);Order A.N., supra note 10, operative ¶ 1.1.
Spain. This characterization was not obviously correct, since Google Spain's activities related to the promotion and sale of advertising space rather than to the search activity itself. 26
Beyond the establishment trigger of the territorial scope, another potential connection was
through Google’s “use of equipment” in Spain. In that regard, the Audiencia Nacional asked the
ECJ whether there is “use of equipment” in the EU where (i) Google search crawlers located and
indexed information located in E.U. servers; (ii) Google used an EU site (www.google.es) and
displayed the results in Spanish; and (iii) Google temporarily stored indexed information in an
undisclosed place. 27
The ECJ confirmed the territorial applicability of Directive 95/46 and held that Google
Spain was an establishment of Google Inc. for the purpose of Directive 95/46. According to the
ECJ, when the operator of a search engine sets up a subsidiary in a Member State with the
intention of promoting and selling advertising space, the processing of personal data carried out by the search engine must be regarded as carried out “in the context of the activities” of an establishment of the controller in the territory of a Member State, within the meaning of Article 4(1)(a) of Directive 95/46. 28
The ECJ reached this conclusion due to the “inextricable link” between the search and
advertisement activities:
the activities of the operator of the search engine and those of its establishment situated in
the Member State concerned are inextricably linked since the activities relating to the
advertising space constitute the means of rendering the search engine at issue
26
Google Spain SL, Case C-131/12, ¶ 46.
27
Google Spain SL, Case C-131/12, ¶ 20.1(b), 20.1(c); Order A.N., supra note 10, ¶1.2, 1.3.
28
Google Spain SL, Case C-131/12, ¶ 60.
economically profitable and that engine is, at the same time, the means enabling those
activities to be performed. 29
Besides, the ECJ stated that Article 4(1)(a) cannot be interpreted restrictively because the
aim of the Directive is to ensure the “effective and complete protection of the fundamental rights
and freedoms of natural persons, and in particular their right to privacy,” and the Directive
ensures effectiveness “by prescribing a particularly broad territorial scope.” 30 Focusing on search
engines in particular, and their potential exclusion from the territorial scope, the ECJ also
concluded that under these circumstances, excluding search engines “would compromise the
directive’s effectiveness and the effective and complete protection of the fundamental rights and freedoms of natural persons which the directive seeks to ensure.” 31
Thus, the ECJ adopted a broad territorial scope to ensure the protection of the
fundamental right to data protection. 32 The marketing activities of Google Spain were considered
sufficient to conclude that it was an establishment of Google Inc. and that the processing was
carried out in the context of Google Spain's activities because of the inextricable link between the two activities. 33
The conclusion of the ECJ regarding the inextricable link between the marketing activity
by Google Spain and the data processing activity (the search) by Google Inc. seems at odds with
29
Id. ¶ 56.
30
Id. ¶ 53-54.
31
Id. ¶ 58.
32
Charter of Fundamental Rights of the European Union, art. 8, 2000 O.J. (C 364) 1, 10. This broad
conceptualization of establishments to ensure the effective protection of privacy rights has been upheld in
Case C-230/14, Weltimmo s.r.o. v. Nemzeti Adatvédelmi és Információszabadság Hatóság ('Weltimmo').
EU:C:2015:639, ¶ 25-31.
33
Google Spain SL, Case C-131/12, ¶ 56.
the Working Party on the Protection of Individuals' ("Working Party") views. 34 Under the Working Party's approach, an establishment triggers the application of Directive 95/46 only if it is involved in activities relating to data processing. 35 The main factor in analyzing whether the data processing was carried out in the context of Google Spain's activities is the degree of involvement of Google Spain in the data processing activities of Google Inc.; a secondary factor is the nature of those activities. 36 The mere marketing of AdWords by Google Spain is a commercial
activity that does not qualify as data processing and is at most a weak involvement with the
natural search activity performed by Google Inc. 37 Nevertheless, the intended broad applicability
and scope of Directive 95/46 calls for a loose interpretation of the requirement that the data
processing be done in the context of the establishment’s activities. 38 In fact, AG Jääskinen, who
held in favor of Google in his opinion, took a wider stance than the ECJ and proposed that an
economic operator should be considered a single unit for the purpose of Directive 95/46. 39 This
effectively leads to territorial applicability as long as the search engine operator has an
establishment within the E.U., even where the establishment performs activities disconnected from the data processing at issue. 40
34
See Directive 95/46, supra note 11, art. 29, which sets up a Working Party on the Protection of Individuals
to clarify and analyze the application of the Directive.
35
Working Party on the Protection of Individuals, Opinion 8/2010 on applicable law, at 13, 0836-
02/10/EN WP 179 (Dec. 16, 2010). Note that this opinion has been amended by the Working Party on the
Protection of Individuals, Update of Opinion 8/2010 on applicable law in light of the CJEU judgment in
Google Spain, 176/16/EN WP 179 update (Dec. 16, 2015). However, for the purpose of critically
analyzing the ruling I refer throughout the document to the opinions of the Working Party as expressed in
the original Opinion.
36
Id. at 14.
37
See Case C-323/09, Interflora Inc. v. Marks & Spencer Plc, 2011 E.C.R. I-08625, ¶ 9-13, (establishing
AdWords and natural search as two independent services). A fortiori, the mere marketing of AdWords
hardly squares with the qualification of “inextricably linked” to the natural search.
38
Working Party, Opinion 8/2010, supra note 35, at 8.
39
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 66
40
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 67.
II.2.2 Material Scope: Obligations of Search Engines as Data Processors and Controllers
Once the ECJ established that Google Inc.’s search activities were under the territorial
scope of Directive 95/46, the question then became whether a search engine is a data processor
and/or a data controller and thus, materially covered by the scope of Directive 95/46. 41 Two
main questions determined the outcome of this characterization: whether a search is “data
processing,” and whether Google “controlled” the purpose and means of the data treatment.
The Audiencia Nacional asked the ECJ whether the activity of Google searches, including
information crawling, automatic indexing, ranking, and user display should be considered as data
processing, 42 where the “processing of personal data” refers to any operation or set of operations
which is performed upon personal data, whether automatic or not.43 Meanwhile, a “processor” is
the natural or legal person that processes personal data on behalf of the “controller.” 44
A second question in the case was whether Google Inc. could be considered a controller, 45
where a controller is defined as a natural or legal person that determines the purpose and means of the processing of personal data. 46
41
The obligations enshrined in Directive 95/46 place the burden on those characterized as “processors”
and “controllers.”
42
Google Spain SL, Case C-131/12, ¶ 20.2(a); Order A.N., supra note 10, ¶ 2.1.
43
Directive 95/46, supra note 11, art. 2(b).
44
Id. art. 2(e).
45
Google Spain SL, Case C-131/12, ¶ 20.2(b); Order A.N., supra note 10, ¶ 2.2.
46
Directive 95/46, supra note 11, art. 2(d).
If an entity is considered to be processing and controlling data under Directive 95/46, certain obligations are triggered for that controller. These obligations stem from the data
subject’s rights, including the right of the data subject to access, rectify, erase, or block the
information, 47 and the right to object to the data processing. 48 The Audiencia Nacional asked: in
the event that Google constituted a controller of personal data, could the AEPD require Google to
remove the links from the search results even when the information remained available in the
original source? 49
The ECJ concluded that the activity of a search engine, which consists of finding personal
data published or placed on the Internet by third parties, indexing it automatically, storing it on a
temporary basis, and including it in Internet users’ search results after ranking it, constitutes the
processing of personal data within the meaning of Article 2(b) of the Directive. 50 This is so as
long as the search engine collects, retrieves, records, organizes, stores, discloses, and makes the
personal data available to users. 51 The data processing by the search activity happens even if the
data have already been published on the Internet and when the data are not modified by the
search engine. 52
The ECJ also concluded that search engines determine the purpose and the means of the
search engine activity, and, as such, are controllers of the processed personal data within the
meaning of Article 2(d) of the Directive. 53 As with the territorial scope, the ECJ underlined that
47
Id. art. 12(b).
48
Id. art. 14(a).
49
Google Spain SL, Case C-131/12, ¶ 20.2(c)-20.2(d); Order A.N., supra note 10, ¶2.3, 2.4.
50
Google Spain SL, Case C-131/12, ¶ 41.
51
Google Spain SL, Case C-131/12, ¶ 28.
52
Id. ¶ 29.
53
Id. ¶ 33.
to ensure effective and complete protection of data subjects' right to privacy, the concept of a controller must be interpreted broadly. 54
The ECJ established that search engine operators are obligated to remove links published
by third parties that contain information about data subject from the list of results displayed
following a search made on the basis of a person’s name, even if such data remains available on
the source web sites and even if its publication on those web sites is lawful. 55
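As a purely illustrative sketch (and not Google's actual implementation), the name-based character of this obligation can be pictured as a filter that suppresses a delisted link only when the query matches the data subject's name, while other queries continue to return the same link. The names, URLs, and result list below are hypothetical.

    # Toy name-based delisting filter: a (query, url) pair is suppressed only when
    # the query matches the delisted name; other queries still surface the link.
    delisted = {("mario costeja gonzález", "https://example.com/auction-notice")}

    def filter_results(query, results):
        q = query.strip().lower()
        return [url for url in results if (q, url) not in delisted]

    results = ["https://example.com/auction-notice", "https://example.com/other-page"]
    print(filter_results("Mario Costeja González", results))   # auction link suppressed
    print(filter_results("1998 apartment auctions", results))  # auction link still listed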
Upon the data subject’s request to delist a link, a search engine must examine the merits
of the delinking, and if the search engine denies the request, the data subject can appeal to the
DPA. According to the ECJ, pursuant to the data subject's right to access, rectify, erase, or block
information, 56 or the right to object the data processing, 57 data subjects must have the opportunity
to directly request the search engine operator to delist a link. The search operator “must then duly
examine their merits and, as the case may be, end processing of the data in question.” 58 If the
controller decides to deny the request, “the data subject may bring the matter before the DPA or
the judicial authority so that it carries out the necessary checks and orders the controller to take
specific measures accordingly.” 59 DPAs then have the power to “order in particular the blocking, erasure or destruction of data.” 60
To support the broad interpretation of Directive 95/46, the ECJ underlined the significant
impact that search engines may have on the fundamental rights to privacy and to the protection of
personal data of data subjects, irrespective of the relevance of the original publication on the
54
Id. ¶ 34.
55
Id. ¶ 88.
56
Directive 95/46, supra note 11, art. 12(b).
57
Id. art. 14(a).
58
Google Spain SL, Case C-131/12, ¶ 77.
59
Id.
60
Id. ¶ 78.
Internet. The Court stressed that search engines provide a detailed profile of the individual
through a structured overview of his personal information available on the Internet. 61 This
information can affect an enormous number of aspects of an individual’s private life, and such
information could be impossible to gather in the absence of search engines. 62 Also, the role of the Internet and search engines in modern society renders the information included in search results ubiquitous and effectively omnipresent. 63
Since the delinking may affect other fundamental rights, such as freedom of speech, the
ECJ requires a balancing of the different fundamental rights involved. This primarily involves
weighing the data subjects' rights to privacy and to the protection of personal data against the Internet users' freedom to receive accurate information. 64 According to the
ECJ, as a general rule, privacy trumps the right to information, although the balance must take
into account the nature of the information, its sensitivity to private life, and the varying interest of
the public in having the information according to the data subject’s role in public society. 65
The ECJ legitimizes subjecting search engines to erasure obligations independent from the
source of the publication based on the fact that publishers are often not subject to EU law and
data subjects can legitimately request erasure from search engines without asking the same from
the information originator. 67 Beyond the effectiveness of the protection, the ECJ gave other
reasons for allowing the independent assessment of requests to search engines and editors.
61
Id. ¶ 80.
62
Id.
63
Id.
64
Id. ¶ 81.
65
Id.
66
Id. ¶ 84.
67
Id. ¶ 85.
Mainly, the ECJ stated that the nature of the data processing by search engines and editors differs,
and the outcome of the balancing could diverge because of the enhanced visibility that search
engines provide, which could potentially affect the fundamental right to privacy more
significantly. 68
Finally, the ECJ differentiated between the ways that search engines and publishers
process personal data: while publishers release a timely piece of information about an individual,
search engines organize and aggregate an individual’s personal information in order to offer it in
a structured fashion to its users, which decisively contributes to the dissemination of the personal data. 69 In the words of the ECJ:
Inasmuch as the activity of a search engine is therefore liable to affect significantly, and
additionally compared with that of the publishers of websites, the fundamental rights to
privacy and to the protection of personal data, the operator of the search engine as the
person determining the purposes and means of that activity must ensure, within the
framework of its responsibilities, powers and capabilities, that the activity meets the
requirements of Directive 95/46 in order that the guarantees laid down by the directive
may have full effect and that effective and complete protection of data subjects, in particular of their right to privacy, may actually be achieved. 70
68
Id. ¶ 86-87.
69
Id. ¶ 36-37.
70
Id. ¶ 38. See also id. ¶ 83.
Although the ECJ limited the search engine operator’s duty to comply with Directive
95/46 according to its responsibilities, powers, and capabilities, 71 the broad interpretation of the
application to search engines could nonetheless make it impossible for search engines to comply
with some obligations resulting from the Directive. For example, search engines may be unable
to comply with the prohibition against processing personal data that reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, or data related to an individual's health or sex life. 72
potential liabilities and severe administrative fines, which under the new General Data Protection
Regulation could amount to twenty million Euros or 4% of the global annual turnover, whichever
is higher. 73 However, such liabilities must be modulated based on the processor’s “capabilities”,
which could release search engines from the burdensome responsibility of implementing an
impossible monitoring system. Thus, there is little clarity as to the specific obligations for search
engine operators other than to consider requests for erasure and/or comply with regulators’ or
court orders.
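For illustration only, the ceiling set by Article 83(5) of the General Data Protection Regulation is simply the higher of a fixed amount and a percentage of worldwide annual turnover; the minimal sketch below, using a hypothetical turnover figure, makes the calculation explicit.

    # Article 83(5) GDPR ceiling: EUR 20 million or 4% of total worldwide annual
    # turnover of the preceding financial year, whichever is higher.
    def max_gdpr_fine(annual_turnover_eur: float) -> float:
        return max(20_000_000.0, 0.04 * annual_turnover_eur)

    # Hypothetical turnover of EUR 90 billion -> ceiling of EUR 3.6 billion.
    print(max_gdpr_fine(90_000_000_000))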
The broad interpretation of the material scope of Directive 95/46 and the characterization
of search engines as controllers could lead to an inconsistent interpretation of a rule created when
the Internet was only an incipient technology. 74 In response to this concern, AG Jääskinen
proposed that the interpretation of the definitions in Article 2 of Directive 95/46 by the ECJ
71
Id. ¶ 38, 83.
72
Directive 95/46, supra note 11, art. 8.
73
General Data Protection Regulation, supra note 12, Art 83(5)(b) in relation with art. 17.
74
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 26.
should be guided by a rule of reason, in other words the principle of proportionality, so as to avoid unreasonable and excessive extensions of the material scope of the Directive over new technologies. 75 The Working Party could be seen as supporting the need for proportionality in the consideration of whether a search engine should be considered a controller:
the principle of proportionality requires that to the extent that a search engine provider acts purely as an intermediary it should not be considered to be the principal controller with regard to the content related processing of personal data that is taking place. In this
case the principal controllers of personal data are the information providers. 76
In fact, national courts 77 and DPAs 78 had previously acknowledged the lack of control
that Google has over the data it processes and the nature of search engines as mere intermediaries.
However, the ECJ performed a formalistic interpretation of the controller test, concluding
that Google determines the means and purposes of the data processing, without any measure of
proportionality. 79 The conclusion that Google is a controller despite the lack of (i) awareness that
it is processing personal data and (ii) intentionality in the processing 80 leads to an absurd situation
in which Google is perpetually breaching data protection laws. De facto, this places search
75
Id. ¶ 30.
76
Working Party on the Protection of Individuals, Opinion 1/2008 on data protection issues related to
search engines 00737/EN WP 148 (Apr. 4, 2008), p. 14; see also Opinion of Advocate General
Jääskinen, Case C-131/12, Google Spain SL, ¶ 88.
77
See Rb. Amsterdam 26 April 2007, m.nt Jensen/Google Netherlands NL:RBAMS:2007:BA3941 (since search results are technical, passive, and automatic, the search engine is not aware of the content, and no liability can be inferred therefrom). See also Palomo v. Google Inc., T.S. 4 Mar 2013, RJ\2013\3380 (Spain), upholding A.P. Madrid, 19 Feb 2010, JUR\2010\133011, which dismissed an appeal against a first instance ruling excluding the search engine's liability for the dissemination of third-party content due to its lack of awareness.
78
See Garante per la Protezione dei dati personali, proceeding 108, doc. web n. 1892254 (Mar. 21 2012),
(concluding that Google only gathers and automatically offers links and the control remains within the
publisher of the information).
79
Google Spain SL, Case C-131/12, ¶ 33.
80
See Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 82.
engines in a permanent position of illegality in the realm of privacy laws. 81 Once search engines
are characterized as controllers, the only way for them to comply with Directive 95/46 is to assess
the legality of all the content that they index, which is obviously impossible. 82
II.2.3 Right to Be Forgotten
Finally, the Spanish court asked whether, in light of the rights to erasure and blocking, as
well as the right to object, there is a right to eliminate information at will by the data subject
under the Directive. 83 In particular, the court asked about the recognition of a right “enabling the
data subject to address himself to search engines in order to prevent indexing of the information
relating to him personally, published on third parties’” web pages, invoking his subjective wish
that such information should not be known to Internet users when he considers that it might be
prejudicial to him or outdated, even though the information in question is not prejudicial and has been lawfully published by third parties. 84
This question is the most controversial part of the judgment. 85 The ECJ held that Articles
12(b) and 14(a) of Directive 95/46, which enshrine the right of the data subject to access, rectify,
81
See id. ¶ 90.
82
See id. ¶ 89. More sensibly, in SARL Publison System v SARL Google France, the court concluded that
Google was not obliged to check the legality of the sites offered in its search results. Cour d'appel [CA]
[regional court of appeal] Paris, civ. Mar. 19, 2009, JurisData, 2009-377219, (Fr.) .
83
Google Spain SL, Case C-131/12, ¶ 20.3; Order A.N., supra note 10, ¶ 3.
84
Google Spain SL, Case C-131/12, ¶ 20.3
85
See James Ball, Costeja González and a Memorable Fight for the Right to be Forgotten, GUARDIAN
(May 14, 2014, 11:34 AM), http://www.theguardian.com/world/blog/2014/may/14/mario-costeja-
gonzalez-fight-right-forgotten; Robert Lee Bolton, The Right to Be Forgotten: Forced Amnesia in a
Technological Age, 31 J. MARSHALL J. INFO. TECH. & PRIVACY L. 133 (2015). Jonathan Zittrain, Don’t
Force Google to ‘Forget’, N. Y. TIMES (May 14, 2014),
http://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html?_r=0.
erase, or block the information, 86 or the right to object to the data processing, 87
also grant data
subjects a Right to be Forgotten. The Right to be Forgotten allows an individual to request that a
search engine remove certain search results retrieved after searching his name. An individual may
ask search engines to remove links to web pages that may contain inadequate, irrelevant, or
excessive personal information, even when a third party published it lawfully and the information is accurate. 88
Once the individual exercises his right to request removal, the search engine must assess
whether the personal information should no longer be linked to his name. 89 In balancing the
fundamental rights at stake, namely the fundamental right to privacy and the right to protection of
personal data against the economic interests of the search engine and the right of the general
public to access information, privacy generally prevails. 90 A preponderant interest of the public in having access to the information seems limited to situations where the data subject plays a role in public life. 91
In his Opinion, AG Jääskinen rejected the recognition of a Right to be Forgotten. With regard to the erasure or blocking rights under Article 12(b) of Directive 95/46,
AG Jääskinen concluded that these rights should be limited to situations where the information is
either incomplete or inaccurate, but should not merely be subject to deletion because of the
subjective preference of the data subject. 92 I infer that since Google’s data treatment constitutes a
86
Directive 95/46, supra note 11, art. 12(b).
87
Id. art. 14(a).
88
Google Spain SL, Case C-131/12, ¶ 95-96.
89
Id. ¶ 96.
90
Id. ¶ 81, 97.
91
Id. ¶ 97.
92
See Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 104.
complete reflection of the published information, it does not trigger the breaches of Directive 95/46 that would justify rectification, erasure, or blocking under Article 12(b). 93
Similarly, AG Jääskinen found that the right to object under Article 14(a) should be interpreted restrictively. According to AG Jääskinen, the criteria used in assessing the right to object must balance the interest and purpose
of the data processing against the interest for the data subject, but not his subjective preferences
to delist the information. 94 I conclude in similar terms that the compelling legitimate grounds that
trigger the right to object must constitute objectively identifiable situations. The mere subjective
perceptions of the data subject are insufficient. I find it difficult to agree with the ECJ that
Directive 95/46 enables the data subject to prevent the indexing of information, whether
prejudicial or not, even when the information has been lawfully published.
Moreover, Article 17 of the European Commission proposal for a General Data Protection
Regulation aimed at replacing the Directive 95/46 included the Right to be Forgotten as a
novelty. 95 However, the European Parliament eliminated any reference to the Right to be
Forgotten and limited its scope in the first reading, although the configuration of the Right to
Erasure in subsequent drafts included the elements that constitute the Right to be Forgotten.96
93
See id. ¶ 105.
94
See id. ¶ 108.
95
See European Commission Proposal for a Regulation of the European Parliament and of the Council on
the protection of individuals with regard to the processing of personal data and on the free movement of
such data (General Data Protection Regulation) No. 2012/0011 of 25 January 2012,
http://www.europarl.europa.eu/registre/docs_autres_institutions/commission_europeenne/com/2012/0011/COM_CO
M(2012)0011_EN.pdf.
96
See European Parliament, European Parliament legislative resolution of 12 March 2014 on the proposal
for a regulation of the European Parliament and of the Council on the protection of individuals with regard
to the processing of personal data and on the free movement of such data (General Data Protection
Regulation) (COM(2012)0011 – C7-0025/2012 – 2012/0011(COD)), Amendment 112, Article 17 (Mar.
12, 2014), http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P7-TA-2014-
0212+0+DOC+XML+V0//EN.
The Council of the European Union agreed, however, to maintain the term “Right to be
Forgotten,” while stressing the need to strengthen the balancing of rights. 97 Thus, the new
inclusion of an explicit Right to be Forgotten in the General Data Protection Regulation, and its contested breadth and qualification, show that Directive 95/46 did not include a Right to be
Forgotten. 98
III. Striking the Right Balance Between Privacy and the Right to Be Informed in the Digital Age: Some Unanswered Questions
III.1 The Rights at Stake: Privacy v. Public Interest
Despite the clear mandate of the ECJ for the protection of data subjects' privacy by search
engines, the ECJ barely addressed the standard to be used in conducting this balancing of rights.
In Europe, the right to privacy is a fundamental right that is as strong as free speech rights and the
right to be informed, and as such, it enjoys particular protection. 99 However, in Google Spain the
ECJ hinted that privacy trumps all other rights, without even referencing the case law of the
European Court of Human Rights (“ECHR”) or the recognition of the right to be informed and
the right to free speech in the E.U. Charter of Fundamental Rights (“the Charter”).
97
See Council of the European Union, Proposal for a Regulation of the European Parliament and of the
Council on the protection of individuals with regard to the processing of personal data and on the free
movement of such data (General Data Protection Regulation), (Sept. 3, 2014) 11289/1/14 REV1
http://register.consilium.europa.eu/doc/srv?l=EN&f=ST%2011289%202014%20REV%201
98
See Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL, ¶ 110.; General Data
Protection Regulation, supra note 12.
99
Alexander Tsesis, The Right to Be Forgotten and Erasure: Privacy, Data Brokers, and the Indefinite
Retention of Data, 48 WAKE FOREST L. REV.131 (2014)
III.1.1 The Balancing Under the European Convention of Human Rights and the E.U. Charter of Fundamental Rights
While Article 8 of the European Convention of Human Rights (“the Convention”) protects private and family life, Article 10 of the Convention protects freedom of speech and
press. 100
In Von Hannover the ECHR balanced both rights and set strong protection for privacy
vis-à-vis free speech. 101 Where an individual suffers an intrusion into his life, the court held that
for the balancing to take place, the individual must show that the information at issue was within
his “legitimate expectation” of privacy. Secondly, the balancing calculation must consider
whether the information contributes to a debate of general interest. 102 However, the ECHR did
not define exactly what constitutes a contribution to a debate of general interest, so there is
limited guidance for the balancing assessment. In Mosley, the ECHR underlined that where the
protection of privacy rights has an impact on Article 10 freedom of expression, the court must
strike a fair balance between the competing rights and interests arising under Articles 8 and 10.103
However, the diversity in practice among Member States as to the balancing of the competing interests of respect for private life and freedom of expression calls for a wide margin of discretion as to the relevant elements to take into account in weighing these interests. 104 The guidance provided by the Strasbourg case law for this balancing exercise is therefore limited.
100
Council of Europe, European Convention for the Protection of Human Rights and Fundamental
Freedoms art. 10, Nov. 4, 1950, C.E.T.S. 5, http://www.echr.coe.int/Documents/Convention_ENG.pdf.
101
Von Hannover v. Germany (No.1), 2004-VI Eur. Ct. H.R.
102
Barbara McDonald, Privacy, Princesses, and Paparazzi, 50 N.Y. L. SCH. L. REV. 205, 223 (2005).
103
Mosley v. United Kingdom, 774 Eur. Ct. H.R .111 (2011).
104
Id. at 108-110, 124
In the EU, the Charter has the same force as the Treaties, 105 and its provisions must be
interpreted in line with the case law of the ECHR. 106 The Charter recognizes the right of respect
for private and family life (Article 7) and the protection of personal data (Article 8) on one side
and the freedom of expression and information (Article 11) on the other, as well as the freedom
to conduct business (Article 16). Article 7 of the Charter replicates Article 8 of the Convention.
The presence of the additional Article 8 of the Charter suggests the willingness to include an even
wider protection for privacy rights than the Convention. The general principle of proportionality
applies in the assessment of conflicting rights recognized in the Charter. In accordance with
Article 52(1) of the Charter, any limitation on the exercise of the rights and freedoms recognized
by the Charter must respect the essence of those rights and freedoms and respect the
proportionality principle for which the limitation must be necessary to protect other rights and
freedoms or to meet EU objectives of general interest. The proportionality principle requires that
the restriction be appropriate, necessary, and the least onerous means to attain the objectives legitimately
pursued. 107 The principle of proportionality has also been recognized in the scope of Directive
95/46. 108
III.1.2 The Balancing According to Google Spain
In line with the ECHR, the ECJ recognized that when balancing the different fundamental
rights at stake, it is necessary to reach a fair balance between the rights to privacy and the right to
the protection of personal data, on the one hand, and the freedom of information of search engine
105
Case C-297/10, Hennigs v. Eisenbahn-Bundesamt, 2011 E.C.R. I-07965, ¶ 47.
106
Charter, supra note 32, art. 52(3).
107
Case C-283/11, Sky Österreich GmbH v. Österreichischer Rundfunk, EU:C:2013:28, ¶ 50.
108
C-92/09, Scheke v. Land Hessen, 2010 E.C.R. I-11063, ¶ 48.
users on the other. 109 The ECJ also established that, as a general rule, privacy trumps freedom of
information. 110
I find it difficult to square the Google Spain conclusion, that the rights to privacy and to
the protection of personal data should prevail over the freedom of information and the economic
interests of the search engine operator, with the principle of proportionality and with the ECHR
case law. Such a bold statement without further elaboration undermines the equal footing of
fundamental rights and gives no explanation as to why privacy rights are superior to free speech rights.
Also, the ECJ did not include in the balance the publisher's right to freedom of expression, even though this omission may undermine the effective reach of views published on the Internet. The only consideration the Court gave to the role of publishers of information is the acknowledgement that the publication of information by the original publisher may be justified on journalistic grounds. 111 I contend that the publisher's freedom of expression
should have been recognized and assessed in the ruling, especially in the current digital age
where search engines act as information gatekeepers and a lack of visibility in their search results
may lead to total exclusion of certain views. 112 The Court itself recognizes that a search engine
“may play a decisive role in the dissemination of that information,” and it “makes access to that
109
Google Spain SL, Case C-131/12, ¶ 81.
110
Id.
111
Id. ¶ 85.
112
See The Advisory Council to Google on the Right to be Forgotten, Report of The Advisory Council to
Google on the Right to be Forgotten, Feb. 6, 2015,
https://drive.google.com/a/google.com/file/d/0B1UgZshetMd4cEI3SjlvV0hNbDA/view?pli=1 at 27.
Opinion of Jimmy Wales, founder of Wikipedia and Member of Google’s Advisory Council: “I
completely oppose the legal situation in which a commercial company is forced to become the judge of
our most fundamental rights of expression and privacy, without allowing any appropriate procedure for
appeal by publishers whose works are being suppressed.”
information appreciably easier”. 113 But such factors act as a broadening element of the delisting
obligations for search engines. In my view, the court should have considered the establishment of
a subsidiary obligation on the part of the search engines to eliminate links where the request has first been addressed to the original publisher without success.
The ECJ declined to give any weight in the balancing to the search engines' economic
interests. 114 Google’s freedom to conduct business, 115 freedom of establishment, and freedom to
provide services 116 could have also been considered in the balancing of rights. 117
Finally, the necessary application of the principle of proportionality casts doubt on the
conclusion that privacy trumps free speech where search engines list links to personal
information. In the first place, as Jonathan Zittrain puts it, the judgment is oddly narrow as it
allows a data subject to request that information be delisted from search engines, but that same
underlying information remains available online, so other websites or social networks are free to
link it. 118 The differentiation of search engines from other sites reflects a misconception of the
nature of the Internet, as such differentiation does not make sense to experts. 119 It is ironic that
Mr. Mario Costeja González initiated the proceedings to eliminate information about his past
debts with the Social Security, but now that fact is known worldwide. Moreover, the links are
only removed upon a search of the data subject's name, but the search engine user could also use other
113
Google Spain SL, Case C-131/12, ¶ 87.
114
Id. ¶ 81, 97.
115
Charter, supra note 32, art. 16.
116
Consolidated Version of the Treaty on the Functioning of the European Union art. 49, 56, 2008 O.J. C
115/47.
117
C-360/10, SABAM v. Netlog, EU:C:2012:85 (2012), ¶ 44.
118
Jonathan Zittrain, Don’t Force Google to ‘Forget’, N.Y. TIMES (May 14, 2014).
119
See Eric Goldman, Search Engine Bias and the Demise of Search Engine Utopianism, 8 YALE J. L. &
TECH. 188 (2006).
keywords to trigger the display of the delisted links. 120 Thus, the obligation that search engines
delist links connecting to information protected by the right to privacy is not appropriate, as the
information is still available in the original source. Moreover, the shifting of the inquiry as to the
appropriateness of the published information from the publisher to the search engine vests
publishers with a guarantee: if the published information in fact is private information that turns
out to have been improperly published, search engines will be the entities forced to face the
consequences, not publishers, which may ultimately lead to the publication of more intrusive,
private information. In fact, in my view the least onerous solution would be to address requests to
the originators of the information. Originators can prevent search engines from searching their
publications, they are acquainted with the relevance and the context of the publication, and they
should be responsible for what they publish – not search engines, which after Google Spain bear the burden of assessing content that they neither produce nor control.
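The claim that originators can keep their publications out of search results refers, in practice, to standard exclusion mechanisms such as the robots.txt protocol, which compliant crawlers consult before fetching a page. The sketch below, using Python's standard library and hypothetical URLs and user-agent names, is illustrative only.

    # A compliant crawler consults the publisher's robots.txt before fetching, so
    # the originator can exclude pages from indexing without involving the search
    # engine. URLs and the user-agent name are hypothetical.
    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser()
    robots.set_url("https://example.com/robots.txt")
    try:
        robots.read()  # e.g. "User-agent: *" / "Disallow: /hemeroteca/" blocks the page below
    except OSError:
        pass  # robots.txt unreachable; a real crawler would apply its own default policy

    page = "https://example.com/hemeroteca/1998/01/19.html"
    if robots.can_fetch("ExampleBot", page):
        print("page may be crawled and indexed")
    else:
        print("publisher has excluded this page from crawling")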
III.2 Balancing Factors: Article 29 Working Party v. Google's Advisory Council
After the ruling, search engines were given very little instruction about the operational
aspects of the judgment and many questions remained unanswered. One of the most relevant
questions is: what criteria should search engines use to balance these fundamental rights? The
ECJ provided search engines with an idea of the criteria search engines should use; they specified
120
See Eric Goldman, Primer on European Union’s Right To Be Forgotten (Excerpt from My Internet Law
Casebook) + Bonus Linkwrap, TECH. & MARKETING L. BLOG (Aug. 21, 2014),
http://blog.ericgoldman.org/archives/2014/08/primer-on-european-unions-right-to-be-forgotten-excerpt-
from-my-internet-law-casebook-bonus-linkwrap.html.
that decisions should consider the nature of the information, its sensitivity for private life, and the interest of the public in having that information, which varies according to the role played by the data subject in public life. 121
This and other questions started to arise and both the EU and Google tried to shed some
light on how to apply the holding of the ruling to the day-to-day functioning of search engines.
III.2.1 Art. 29 Working Party Guidelines
The Working Party includes representatives of the Member States' authorities and the
EU. 122 The tasks of the Working Party include the provision of expert opinion on questions of
data protection, the promotion of a uniform application of the Directive, the issuance of
recommendations for the public, and the issuance of advice to the European Commission on measures affecting the protection of personal data. 123
In November 2014, the Working Party published a set of guidelines clarifying its views
on the application of Google Spain, and offering thirteen non-exhaustive criteria to be considered
by the DPAs in the balancing of rights when search engines deny erasure. 124 It also invited search engines to publish their delisting criteria and to make more detailed delisting statistics available. 125
121
Google Spain SL, Case C-131/12, ¶ 81.
122
Directive 95/46, supra note 11, art. 29(2).
123
European Commission, Tasks of the Article 29 Data Protection Working Party,
http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/tasks-art-29_en.pdf.
124
Working Party on the Protection of Individuals, Guidelines on the Implementation of the Court of
Justice of the European Union Judgment on “Google Spain and Inc v. Agencia Española de Protección de
Datos (AEPD) and Mario Costeja González” C-131/12 (Nov. 26, 2014), at 13-20.
125
Id. at 10.
The relevant questions to be asked according to the Working Party serve the purpose of
applying the test laid down by the ECJ and serving as a flexible working tool to help DPAs in their decision-making processes. 126
A first step in the analysis should be to check if the search results relate to a natural
person upon a search of his name. 127 As fundamental rights, privacy and data protection rights are
personal in nature and only those rights were recognized in Google Spain. 128
Once the DPA establishes the natural personhood, it must assess whether the individual is
a public figure who plays a role in public life. 129 As a rule of thumb, the Working Party proposes
to conclude that there is an overriding public interest in the availability of the information where
it protects the public from improper public or professional conduct. 130 However, according to
Article 6(3) of the Treaty of the European Union, the fundamental rights of the European
Convention of Human Rights constitute general principles of the EU, whose force equals that of
the Treaties. 131 In Mosley, the ECHR concluded that the role in public life is to be assessed in light of the contribution of the information to debates on matters of general public interest. 132 However, it also considered that the divergence between the different states as to the proper balancing of the competing interests confers on states, and consequently on the EU, a wide margin
126
Id. at 12.
127
Id. at 13, Criterion 1.
128
Charter of Fundamental Rights, supra note 32, art. 8.
129
Working Party on the Protection of Individuals, Guidelines on the Implementation of the Court of
Justice of the European Union Judgment on “Google Spain and Inc v. Agencia Española de Protección de
Datos (AEPD) and Mario Costeja González” C-131/12 (Nov. 26, 2014), at 13-14, Criterion 2.
130
Id. at 13.
131
See Koen Lenaerts & Jose A. Gutiérrez-Fons, The Constitutional Allocation of Powers and General
Principles of EU law 47 COMMON MARKET L. REV.1629 (2010).
132
Mosley v. United Kingdom, 77 Eur. Ct. H.R . 132 (2011).
of appreciation with regard to the latitude of the balancing. 133 DPAs retain an important margin
of appreciation to decide whether a piece of information enriches the public debate. Legal
certainty in this regard is consequently quite limited, as different Member States may reach very
different conclusions.
A third relevant factor relates to whether the data subject is a minor. 134 In that case, the right to information is unlikely to outweigh the interest in delisting. The European Commission set up the so-
called CEO Coalition to promote a safer Internet for minors through self-regulation among the
leading technological companies. 135 The CEO Coalition includes the owners of the most relevant
search engines, Google and Microsoft, as well as other relevant market players such as Apple and
Facebook. They recognize that “[p]rivacy is a universally applicable right, and is especially
strongly defined for minors.” 136 The companies published their own action plan according to the
strategy set out by the Coalition. Both Google and Microsoft included actions to protect minors'
privacy, but none related to their search engines’ results: Google focused on G+ and Youtube
privacy settings and Microsoft on Internet Explorer, Windows, and Xbox. 137 Despite the lack of
references to search engines in the statements, the need to protect minors' privacy is
uncontroversial.
133
Id. at 124.
134
Working Party on the Protection of Individuals, Guidelines on the Implementation of the Court of
Justice of the European Union Judgment on “Google Spain and Inc v. Agencia Española de Protección de
Datos (AEPD) and Mario Costeja González” C-131/12 (Nov. 26, 2014), at 15, Criterion 3.
135
See European Commission, Self-regulation for a Better Internet for Kids, http://ec.europa.eu/digital-
agenda/en/self-regulation-and-stakeholders-better-internet-kids.
136
Coalition to make the Internet a better place for kids, Statement of Purpose, at 5, https://ec.europa.eu/digital-agenda/sites/digital-agenda/files/ceo_coalition_statement.pdf.
137
See Google & Youtube, CEO Coalition to make the Internet a Better Place for Children,
http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=1634; Microsoft
Statement, CEO Coalition to make the Internet a Better Place for Children,
http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=1639.
Fourthly, the DPA should ask whether the data are accurate. 138 Where the information is inaccurate, the balance is more likely to tip in favor of delisting. 139
Fifthly, the DPA must assess whether the data are relevant and not excessive, asking whether the data refer to the data subject’s professional life, whether the information constitutes hate speech, slander, libel, or similar offences, or whether the information reflects a personal opinion or relates to a verified fact. 140 In general, the relevance of the data is closely related to its age: more recent information is usually more relevant. 141 Also, data related to a person’s private life, speech offences, and inaccurate facts are more likely to be delisted than information concerning the data subject’s professional life or verified facts. 142
The sixth factor states that delisting is more likely when the information is sensitive and reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or facts about a person’s health or sex life, in the sense of Article 8 of Directive 95/46. 143
The seventh factor asks whether the information is up to date and still relevant for the purposes for which it was processed. 144
Although prejudice to the data subject is not a requirement for delisting, 145 it is a factor that supports delisting. 146 Proportionality also plays a role where the impact on privacy disproportionately outweighs the public relevance, if any, that the data could have. 147
138
Working Party, Guidelines, supra note 124, at 14, Criterion 4.
139
Id.
140
Id. Criterion 5.
141
Id. at 15-16.
142
Id. at 15-17.
143
Id. at 17-18, Criterion 6.
144
Id. at 18, Criterion 7.
As the ninth factor, the DPA must ask whether the information puts the data subject at
risk for identity theft, stalking, personal injuries, and other personal risks. 148
Factor ten looks into the context in which the information was published and promotes delisting where the editor published the information without consent, or where the editor refuses to accept the revocation of consent previously granted. 149 Meanwhile, factor eleven asks whether the information is journalistic in nature, in which case delisting is harder to justify. 150 Under factor twelve, if the editor publishes the information pursuant to a legal requirement, delisting may not be advisable. 151 Finally, where the information refers to a criminal offence committed by the data subject, factor thirteen calls for a careful balancing of public interests by the different DPAs. 152
Given the sparse guidelines provided by the ECJ, Google set up an Advisory Council of experts to guide it in the practical application of the judgment. The conclusions were published on February 6, 2015. 153 The document devotes a whole section to the criteria that Google should take into account for the delisting, depending on whether each element advances the right to privacy of the data subject or the public interest. 154 The first group of classifying factors
145
Google Spain SL, Case C-131/12, ¶ 96.
146
Working Party, Guidelines, supra note 124, at 18, Criterion 8.
147
Id.
148
Id. at 18, Criterion 9.
149
Id. at 19, Criterion 10.
150
Id. at 19, Criterion 11.
151
Id. at 19-20, Criterion 12.
152
Id. at 20, Criterion 13.
153
Report of the Advisory Council, supra note 112.
154
Id. § 4 at 7-15.
refers to the nature of the information, which could act in favor of or against the delisting depending on its content. 155
Among the factors that would count in favor of delisting, Google should consider whether the information relates to an individual’s intimate or sex life; 156 financial information or Personal Identification Information (“PII”); 157 sensitive information revealing racial or ethnic origin, religious or philosophical beliefs, or facts about a person’s health or sex life; 158 false or inaccurate information, or information that places the data subject at risk; 159 information about minors; 160 and information that appears in image or video form. 161
The factors that weigh in favor of the public interest and against delisting include
or criminal activity; 164 information that contributes to a debate on a matter of general interest, such as industrial disputes or fraudulent practices; 165 information that is true and factual; 166 and information that has historical, scientific, or artistic relevance. 167
155
Id. at 9.
156
Id.
157
Id.
158
Id. at 10.
159
Id.
160
Id.
161
Id.
162
Id. at 10-11.
163
Id. at 11.
164
Id.
165
Id. at 12.
166
Id.
167
Id. at 12-13.
III.2.3 Actual Balancing Practice by Google
The principles outlined by the Working Party and the Advisory Council try to shed light
on the way DPAs and search engines should engage in the balancing of rights. There is very little information available regarding Google’s actual balancing practice. 168 And only around 1%
168
See Ellen P. Goodman et al., Open Letter to Google From 80 Internet Scholars: Release RTBF
Compliance Data, (May 13, 2015) https://medium.com/@ellgood/open-letter-to-google-from-80-internet-
scholars-release-rtbf-compliance-data-cbfc6d59f1bd. Facing a lack of information about the actual
balancing practice, a group of eighty academics published an open letter requesting Google release
information about its practice responding to the Right to be Forgotten. In particular, the academics
requested more information about: “(1) Categories of RTBF requests/requesters that are excluded or
presumptively excluded (e.g., alleged defamation, public figures) and how those categories are defined
and assessed; (2) Categories of RTBF requests/requesters that are accepted or presumptively accepted
(e.g., health information, address or telephone number, intimate information, information older than a
certain time) and how those categories are defined and assessed; (3) Proportion of requests and successful
delistings (in each case by % of requests and URLs) that concern categories including (taken from Google
anecdotes): (a) victims of crime or tragedy; (b) health information; (c) address or telephone number; (d)
intimate information or photos; (e) people incidentally mentioned in a news story; (f) information about
subjects who are minors; (g) accusations for which the claimant was subsequently exonerated, acquitted,
or not charged; and (h) political opinions no longer held; (4) Breakdown of overall requests (by % of
requests and URLs, each according to nation of origin) according to the WP29 Guidelines categories. To
the extent that Google uses different categories, such as past crimes or sex life, a breakdown by those
categories. Where requests fall into multiple categories, that complexity too can be reflected in the data.
(5) Reasons for denial of delisting (by % of requests and URLs, each according to nation of origin).
Where a decision rests on multiple grounds, that complexity too can be reflected in the data; (6) Reasons
for grant of delisting (by % of requests and URLs, each according to nation of origin). As above, multi-
factored decisions can be reflected in the data; (7) Categories of public figures denied delisting (e.g.,
public official, entertainer), including whether a Wikipedia presence is being used as a general proxy for
status as a public figure; (8) Source (e.g., professional media, social media, official public records) of
material for delisted URLs by % and nation of origin (with top 5–10 sources of URLs in each category);
(9) Proportion of overall requests and successful delistings (each by % of requests and URLs, and with
respect to both, according to nation of origin) concerning information first made available by the requestor
(and, if so, (a) whether the information was posted directly by the requestor or by a third party, and (b)
whether it is still within the requestor’s control, such as on his/her own Facebook page); (10) Proportion
of requests (by % of requests and URLs) where the information is targeted to the requester’s own
geographic location (e.g., a Spanish newspaper reporting on a Spanish person about a Spanish auction);
(11) Proportion of searches for delisted pages that actually involve the requester’s name (perhaps in the
form of % of delisted URLs that garnered certain threshold percentages of traffic from name searches);
(12) Proportion of delistings (by % of requests and URLs, each according to nation of origin) for which
the original publisher or the relevant data protection authority participated in the decision; (13)
Specification of (a) types of webmasters that are not notified by default (e.g., malicious porn sites); (b)
Google has published certain examples of cases where they decided to delist links and others where they rejected requests to delist. 170 They have delisted links to: information about a
person who was convicted of a serious crime in the last five years but whose conviction was
quashed on appeal; an article about a political activist who was stabbed at a protest; an article
about a teacher convicted for a minor crime over 10 years ago; sites showing a woman’s address;
at the request of his wife, a decades-old article about a man’s murder, which included his wife’s name; at the request of the victim, an article about a rape; comments about a decades-old crime; sites containing personal information about a doctor (but refused to delist information about a botched procedure); a site that had taken a self-published image and reposted it; information
about a man’s conviction once the conviction was spent under the UK Rehabilitation of
Offenders Act; and a link about a contest in which the data subject participated as a minor. 171 We
can see that Google is thus likely to delist information of a personal nature (e.g., an address), information that concerns crime victims or minors, information that is old or otherwise no longer relevant, and information about convictions that were quashed on appeal or are spent.
On the other hand, Google has rejected the delisting of links to: articles about a decades-old lawsuit against a newspaper; information about a priest who possessed child pornography and
proportion of delistings (by % of requests and URLs) where the webmaster additionally removes
information or applies robots.txt at source; and (c) proportion of delistings (by % of requests and URLs)
where the webmaster lodges an objection.” See also Google must be more open on 'right to be forgotten',
academics warn in letter, GUARDIAN, (May 14, 2015, 04:00),
http://www.theguardian.com/technology/2015/may/14/google-right-to-be-forgotten-academics-letter.
169
Id.
170
Transparency Report: European Privacy Requests for Search Removals, GOOGLE,
http://www.google.com/transparencyreport/removals/europeprivacy/?hl=en-US (last accessed, May 22,
2015).
171
Id.
was sentenced to jail and banished from the church; information about a couple accused of business fraud; information about the arrest of a professional for financial crimes; articles about embarrassing content published online by a media professional; information about a dismissal for sexual crimes committed on the job; articles and blog posts about the public outcry caused by an alleged abuse of social welfare; a copy of an official public document reporting on the data subject’s fraudulent activity; a site calling for the removal of a public official; and information about an investigation for sexual abuse by a former clergyman. 172 Generally speaking, the information is not delisted when it contributes to the public debate in the sense of Von Hannover. 173
In any case, the lack of more widely available information about the actual practice by
search engines, and in particular by Google, makes it difficult to draw conclusions on the broader
guidelines and policy considerations that inform the balancing in the case-by-case practice. 174
Data subjects address their delisting requests directly to the controller. Only if the controller does not grant the request is the data subject entitled to bring the matter before the DPA or the national courts. 175 In effect, Google, a private company, is in charge of deciding what information on the Internet should remain available in the public interest and what information should be censored because it intrudes on data subjects’ privacy. 176 The Working Party welcomed
172
Id.
173
Von Hannover v. Germany (No.1), 2004-VI Eur. Ct. HR.
174
Goodman, supra note 168.
175
Google Spain SL, Case C-131/12, ¶ 77.
176
University of Kent, ‘Right to be Forgotten’ or censorship, (May 13, 2014, 05:12 PM)
http://www.kent.ac.uk/newsarchive/news/comment/stories/Right_to_be_forgotten/2014.html.
the entrustment of the balancing to Google in the Google Spain ruling. 177 However, AG Jääskinen discouraged the ECJ from leaving the balancing to the search engines on a case-by-case basis, as such a ruling would lead to an unmanageable number of requests. 178 Also, the Advisory Council seemed more skeptical about shifting the balancing of fundamental rights from the public authorities to a commercial company. 179 Frank La Rue, a member of the Advisory Council, considers that “it should be a State authority that establishes the criteria and procedures for protection of privacy and data . . . .” 180
The attribution of the balancing obligation to the search engine operator could lead to uncontrolled elimination of content from the search results without oversight by a public authority. The search engine carries out the assessment of the balancing elements “on the basis of its own legal ground,” derived from its own economic interest and the users’ interest in accessing information via the search engine. 181 Search engines are private operators with an economic interest. In Google Spain, the ECJ forced these search engines to put in place a costly compliance mechanism to determine whether the data subject’s right to privacy prevails over the search engine’s economic interest and the public interest in receiving information. However, the transfer of this vital function is flawed for various reasons: first, it assumes that it is in the interest of the search engine not to delist information. That assumption is not clearly correct, and a search engine could decide to favor privacy over the right to be informed in order to reduce
177
See Working Party, Guidelines, supra note 124, at 5-6.
178
Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL ¶ 133.
179
See Opinion of Advocate General Jääskinen, Case C-131/12, Google Spain SL ¶ 29.
180
Id.
181
Working Party, Guidelines, supra note 124, at 6.
compliance costs. 182 Secondly, the procedure allows for a fair defense of both the right to privacy (as it is the data subject who initiates the procedure) and the economic interest of the search engine (as it takes the final decision on the delisting request). However, no such fair representation exists for the public interest in having access to that information or for the free speech of publishers, who have no role at all in the bilateral exchange between the data subject and the search engine. Moreover, public intervention will take place only where the search engine rejects the data subject’s request, not where the request is granted. As a consequence, the balancing of rights may be skewed against the freedom of information inasmuch as the intervention of DPAs is limited to situations where privacy rights did not prevail in the assessment at the search engine level. Finally, the entrustment of the balancing of fundamental rights to a private company sets a dangerous precedent that departs from the general architecture of the legal system in the EU, where only public authorities wield the legitimacy to define and defend the public interest. 183
There are certain procedural questions that remain unresolved. Who needs to be notified of the link takedowns? How significant is the search engines’ duty to state reasons for a denial?
182
For example, where the search engine decides to grant a delisting request, there is no obligation to state reasons for the decision, but if the request is denied, the search engine should explain the reasons for the refusal. See Working Party, Guidelines, supra note 124, at 7.
183
See generally Andrew Moravcsik & Andrea Sangiovanni, On Democracy and “Public Interest” in the
European Integration, in Renate Mayntz & Wolfgang Streeck, DIE REFORMIERBARKEIT DER
DEMOKRATIE: INNOVATIONEN UND BLOCKADEN 122-151 (2003).
IV.1 Notification to Webmasters/Editors
The Working Party considers that notifying webmasters and editors of the delisting of their content from Google’s index has no legal basis, that it can undermine the privacy rights of the data subject if the editor is able to identify him or her, and that the communication has no practical effect because editors and webmasters play no role in the delisting procedure. 184 The Working Party acknowledges, however, that in cases where the balancing assessment is complicated, the circumstances may call for a timely gathering of information from the editors. 185 I find these statements to be at odds with each other: the Working Party recognizes the existence of “legitimate expectations that webmasters may have with regard to the indexation of information and display in response to users’ queries,” but then disregards the editors’ free speech rights and even denies them a mere notification of the delisting of their publications. 186
On the other hand, the Advisory Council considers that the delisting of links without notification affects the editor’s freedom of expression, and consequently advocates that editors be notified. 187 I find this approach more responsive to transparency requirements and more effective as a mitigating factor against the exclusion of the editors’ rights from the balancing process. It should be for editors to decide whether the information respects privacy rights and whether they are willing to withdraw the content or would rather face a proceeding before the DPA. But at the very least, search engines should inform editors of the delisting of their content, so that editors can take appropriate measures to ensure respect for their free speech rights.
184
Working Party, Guidelines, supra note 124, at 3, 10.
185
Id.
186
Id.
187
See Report of the Advisory Council, supra note 112, at 17, 25.
Google has decided to inform webmasters of the delisting of their sites for the sake of transparency but, as a policy matter and to protect the data subject’s privacy, it only discloses the affected URLs and does not reveal the identity of the data subject who made the request. 188
IV.2 Notification to Users
Users expect search engines’ results to be an unbiased and accurate reflection of the information existing on the web in relation to the search terms. When information about a data subject is delisted, the search is distorted. A conceivable way to mitigate this loss of accuracy is to inform users, upon a search of an individual’s name, that links have been removed. The Advisory Council considered that, as a general rule, users should be informed of the delisting upon a search, as long as the search engine does not disclose information that allows the identification of the data subject. 189 Contrary to this view, the Working Party considers that the only acceptable notification that prevents the user from identifying the data subject is a general statement permanently inserted on the search engine’s displays. 190 But this general notification solution defeats the purpose of the notification in the first place: warning the user that the results displayed are not accurate. A general message informing the user that the information “may” not be accurate casts doubt over all searches and eliminates the certainty of accuracy for those searches not affected by delisting. I believe that the solution proposed by the Advisory Council is more robust, protects the privacy of the data subject, and warns the user only where the results have actually been altered.
188
European Privacy in Search Frequently Asked Questions, GOOGLE,
http://www.google.com/transparencyreport/removals/europeprivacy/faq/?hl=es#how_do_you_decide (last
visited January 18, 2017).
189
See Report of the Advisory Council, supra note 112, at 21.
190
Working Party, Guidelines, supra note 124, at 9-10.
IV.3 Justification to Data Subjects
The Working Party suggests that, when performing the balancing of rights, the search engine must state sufficient reasons to support a decision not to delist. 191 The Advisory Council seems to agree. 192 In this regard, I first note that the making of decisions which affect rights is generally reserved to public authorities. 193 Under EU law, public authorities have an obligation to disclose in a clear and unequivocal fashion the reasons for their decisions, thereby enabling judicial review. 194 The requirement to state reasons varies with the circumstances, including the content and the nature of the reasons given. 195 However, although such an obligation makes sense from a policy perspective, the legal basis for imposing an obligation to give reasons on a private operator like a search engine is doubtful.
V. Applicable Law
Another question left unresolved by the ECJ is the determination of the applicable
national law within the EU, where the parent company based in a third country has a designated
191
See id. at 7.
192
Report of the Advisory Council, supra note 112, at 21.
193
Joined Cases T-81/07, T-82/07, and T-83/07, KG Holding v. Commission, 2009 E.C.R. II-02411, ¶ 61.
194
Case C-367/95 P, Commission v. Sytraval and Brink’s France, 1998 E.C.R. I-1719, ¶ 63.
195
Case C-350/88, Delacre v. Commission, 1990 E.C.R. I-395, ¶ 16.
establishment in certain EU Member States, as the ECJ only provided guidance as to Article 4(1)(a) of Directive 95/46, which triggers the application of the directive in the presence of an establishment of the controller on the territory of a Member State.
Another question is whether delistings performed under Directive 95/46 would apply in
other Member States where, unlike in Spain, there are no Google establishments intended to
promote and sell advertising and to orientate the parent company’s activity towards the
inhabitants of that Member State. 196 The Working Party considers that the ECJ left open this question. 197 Google has, in any event, opted for a Europe-wide application of the delisting, so that the delisting of a link affects all Member States’ Google sites. 198 Given the singularity of the European market, this stance is correct and reasonable. However, delistings in compliance with the law of the Member State where the search engine has its headquarters are not sufficient, as Directive 95/46 does not provide for a one-stop-shop, given the lack of harmonization across Member States. 199 Consequently, controllers will be subject to the national laws of all Member States where they have an establishment, if any processing activity is carried out in the context of that establishment. 200 The lack of a one-stop-shop solution leads to an effective “race to the top” whereby the views of
196
Google Spain SL, Case C-131/12, ¶ 45.
197
Working Party on the Protection of Individuals, Update of Opinion 8/2010 on applicable law in light of
the CJEU judgment in Google Spain, 176/16/EN WP 179 update (Dec. 16, 2015), at 2.
198
Letter from Peter Fleischer, Google Global Privacy Counsel, to Isabelle Falque-Pierrotin, Chair of the
Article 29 Working Party (July 31, 2014), Response to the Questionnaire addressed to Search Engines by
the Article 29 Working Party regarding the implementation of the CJEU judgment on the “right to be
forgotten”, https://docs.google.com/file/d/0B8syaai6SSfiT0EwRUFyOENqR3M/edit?pli=1; see, e.g.,
www.google.uk, www.google.es, www.google.it, www.google.de, www.google.fr.
199
Working Party on the Protection of Individuals, Update of Opinion 8/2010 on applicable law in light of
the CJEU judgment in Google Spain, 176/16/EN WP 179 update (Dec. 16, 2015), at 6.
200
Id.
the national jurisdiction affording the most extensive data protection rights will extend all over the EU.
The real battle in the territorial reach debate concerns Google’s main site, www.google.com. European users can easily circumvent the effectiveness of the delisting by using www.google.com and opting out of the automatic redirection to the national sites, like www.google.es or www.google.at. However, if the ruling were to extend to the main site, it could prevent citizens of non-EU countries from accessing information that is legally available in their own jurisdictions.
The Advisory Council underlines that when European users type www.google.com they are generally redirected to the regional site and that 95% of searches happen on those sites. 201 This near-universal exposure to delisted search results satisfies the Council, which disfavors geolocation-based solutions to this problem due to the detrimental effects that these technologies could have if they end up restricting access to certain content. Consequently, the Council
concluded that the main site should be left untouched. 202 The report included, however, a dissent
from former German justice minister Sabine Leutheusser-Schnarrenberger. 203 The Working
Party, on the other hand, considers that for the full effectiveness of the ruling and to ensure the
201
Report of the Advisory Council, supra note 112, at 19.
202
Id. at 20.
203
Id. at 26.
total protection of privacy rights, the delisting has to affect not only the EU domains but all relevant domains, including .com. 204
The problem with the Working Party solution is the conflict of laws. In the U.S., Section 230(c)(1) of the Communications Decency Act (“CDA”) shields users and providers of interactive computer services from liability when they publish information provided by others, which for search engines means that they may wield statutory protection to fine-tune their results ranking at will and to refuse to delist links. 205 The safe harbor for hosting providers in the EU does not provide such wide protection. 206 Besides, search engines hold First Amendment rights that shield them from such scrutiny, and any attempt by public authorities to constrain their free speech would fail. 207 As a consequence, the Right to be Forgotten as designed by the ECJ in Google Spain would not be recognized in the U.S. Google has so far decided that its .com site will not delist links following Google Spain, but the views of the Working Party and the ruling’s aspiration to universal application may lead to case law obliging Google to delist links on the .com site. 208 In fact, Google has battled the French DPA, which demands worldwide delistings to ensure the effectiveness of the ECJ’s ruling in Google Spain. In May 2015, the President of the French DPA ordered Google Inc. to make delistings effective not only on the national sites, but globally. 209
204
Working Party, Guidelines, supra note 124, at 3.
205
47 U.S.C. § 230; e.g. Getachew v. Google, Inc., 491 Fed. Appx. 923 (10th Cir. 2012).
206
See Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain
legal aspects of information society services, in particular electronic commerce, in the Internal Market,
2000 O.J. (L 178).
207
See Zhang v. Baidu.com, Inc., 2014 WL 1282730 (S.D.N.Y. 2014); Langdon v. Google, Inc., 474 F.
Supp. 2d 622 (D. Del. 2007).
208
See Charter, supra note 32, art. 8.
209
See Décision de la Présidente n° 2015-047 mettant en demeure la société Google Inc., CNIL (French DPA) (21 May 2015); see also Délibération 2015-170 du bureau de la Commission nationale de
Between the overreaching territorial application to all domains and the application to the EU domains only, a more reasonable option remains. Geofiltering seems the most appropriate approach to allow for effective protection of privacy rights while respecting the territoriality principle. This solution entails modifying the results page according to the origin of the search. In other words, if a user makes a query from within the European Union, it should not matter whether she uses a national version of Google or the .com site: she would still obtain the filtered results, whereas someone searching from the U.S. would still receive the unaltered results page. 210
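To make the mechanics of this approach concrete, the following minimal Python sketch illustrates origin-based geofiltering. It is purely illustrative and rests on assumptions: the function names, the toy delisting register, and the IP-to-country lookup are hypothetical simplifications, not a description of how Google or any other search engine actually implements the ruling.

# Hypothetical sketch of origin-based geofiltering of delisted links.
# Delisted URLs are suppressed for name searches whenever the request
# originates in the EU, regardless of which domain (.es, .com, ...) is queried.

EU_COUNTRIES = {"AT", "BE", "DE", "ES", "FR", "IT", "NL", "PL", "SE"}  # abbreviated list

# Hypothetical delisting register: searched name -> URLs delisted for that name.
DELISTED = {
    "mario costeja": {"https://example.com/1998-auction-notice"},
}

def country_of(ip_address: str) -> str:
    """Resolve the requester's country from the IP address (stubbed for illustration)."""
    # A real system would query a geolocation database; VPNs and proxies can defeat
    # this lookup, which is precisely the circumvention concern raised by the French DPA.
    return "FR" if ip_address.startswith("82.") else "US"

def filter_results(query: str, results: list[str], ip_address: str) -> list[str]:
    """Return the result list, removing delisted URLs when the search originates in the EU."""
    if country_of(ip_address) not in EU_COUNTRIES:
        return results  # outside the EU, the unaltered results page is served
    suppressed = DELISTED.get(query.lower(), set())
    return [url for url in results if url not in suppressed]

if __name__ == "__main__":
    page = ["https://example.com/1998-auction-notice", "https://example.com/recent-profile"]
    print(filter_results("Mario Costeja", page, "82.0.0.1"))  # EU request: link suppressed
    print(filter_results("Mario Costeja", page, "8.8.8.8"))   # U.S. request: full results

As the stubbed lookup makes explicit, IP-based geolocation is imperfect and can be evaded through technical means, which is the core of the circumvention objection discussed below.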
This solution finds support in case law. In UEJF and LICRA v. Yahoo! and Yahoo France, two French associations sued Yahoo over the display and sale of Nazi propaganda and artifacts on its site. 211 Yahoo contested the jurisdiction of the High Court of Paris because the content originated in the U.S. The court rejected that objection, reasoning that allowing French users to view and participate in the sale constituted a wrongdoing in France. 212 After ensuring that it was technically feasible to block access for French users, the court granted the interim relief and forced Yahoo to make it “impossible” for French users to access the content in French territory. This ruling effectively endorsed geographic filtering as a means of enforcing national law within national territory. 213
On 21 January 2016, Google proposed to the French DPA to ensure the effectiveness of the Right to be Forgotten across Europe by delisting results from all of Google’s sites where the
l’informatique et des libertés n°2015-170 décidant de rendre publique la mise en demeure n°2015-047 de la société Google Inc., CNIL (French DPA) (8 Jun 2015).
210
See generally Alsenoy & Koekkoek, supra note 14.
211
Tribunal de Grande Instance [TGI] [ordinary court of original jurisdiction] Paris, May 22, 2000,
Interim Court Order No. 00/05308, 00/05309, RLDA. 2000, no 29, 1/07/2000, n° 1844 (Fr.).
212
Id.
213
The TGI confirmed this conclusion in M. et Mme X et M. Y v. Google France, Google tried to limit the
applicability of content restrictions to www.google.fr, but the TGI rejected this possibility because it did
not guarantee the impossibility of accessing that content from France. Tribunal de Grande Instance [TGI]
[ordinary court of original jurisdiction] Paris, Sept. 16, 2014, RLDI 2014/108, nº3592 (Fr.).
search request came from France. 214 The French DPA rejected the proposal because (i) users outside Europe could still see the links; (ii) users searching from Europe with a non-French IP address on a non-European site (e.g., google.com) could also see the links; and (iii) geo-blocking solutions based on IP addresses could be easily circumvented through technical means, including VPNs. 215
Given the unitary nature of the Single Market, I consider that the French DPA is right to point out that the solution is incomplete, because other users searching from the EU could still see the links on google.com; that problem could be solved by extending the delisting to all access from any EU Member State, regardless of the origin of the delisting request. I disagree, however, with the overreaching conclusions about delisting beyond Europe and about circumvention.
The territoriality principle underpins public international law: it recognizes each state’s right to exclusive competence over its own territory and the corresponding obligation to
214
Délibération de la formation restreinte n°2016-054 prononçant une sanction pécuniaire à l’encontre de
la société Google Inc. CNIL (French DPA) (10 Mar 2016), at 9. Note that Google has appealed this
decision before the Conseil d’État, the highest administrative court, application number 399,922. See also
Peter Fleischer, Adapting our approach to the European right to be forgotten, GOOGLE,
https://www.blog.google/topics/google-europe/adapting-our-approach-to-european-rig/, where Google’s Global
Privacy Counsel announced that Google would apply the same policy in the whole EU. See also Bing To
Use Location for RTBF, BING ( 12 Aug. 2016), https://blogs.bing.com/search/august-2016/bing-to-use-
location-for-rtbf, where Bing announced the same approach taken by Google to use location-based signals
to delist from all Bing versions for European users accessing Bing from the Member State from which the
delisting request was originated.
215
Délibération de la formation restreinte n°2016-054 prononçant une sanction pécuniaire à l’encontre de
la société Google Inc. CNIL (French DPA) (10 Mar 2016), at 9-10. For a more visual explanation of the
concerns of the French DPA, see Portée du déréférencement de M.Plaignant appliqué par Google, CNIL
(24 Mar. 2016), https://www.cnil.fr/fr/infographie-portee-du-dereferencement-de-mplaignant-applique-
par-google. The French DPA’s push for global delisting has faced a strong public reaction from Google; see Kent Walker, «Ne privons pas les internautes français d’informations légales», LE MONDE (19 May 2016,
2:57PM CET, updated 20 May 2016, 6:53AM CET), http://www.lemonde.fr/idees/article/2016/05/19/ne-
privons-pas-les-internautes-francais-d-informations-legales_4922590_3232.html#aPgr01sw36DfYd1P.99; Kent
Walker, A principle that should not be forgotten, GOOGLE (19 May 2016), https://blog.google/topics/google-
europe/a-principle-that-should-not-be-forgotten/; Peter Fleischer, Reflecting on the Right to be Forgotten,
GOOGLE (9 December 2016), https://blog.google/topics/google-europe/reflecting-right-be-forgotten/.
respect other states’ equivalent right. 216 The Internet does not trump the obligation to respect the
territoriality principle. 217 Although it can be circumvented by technical means, for example through a proxy, geographic filtering respects the territoriality principle while allowing for maximum effectiveness with respect to privacy rights. While universal application could marginally improve that effectiveness, its pernicious extraterritorial effects militate against its use.
VI. Conclusion
Between 29 May 2014 and 18 January 2017, data subjects made 675,624 requests to Google to evaluate the removal of 1,865,610 URLs, of which 682,250, or 43.2%, were removed and 895,405, or 56.8%, were not. 218 The regional disparity is wide: while in France and Germany
the removal rate reached 48%, in Italy it only hit 32%. 219 These numbers show the relative
importance of the Right to be Forgotten in the evolution of privacy rights in the Internet age.
Google Spain has shown the existing concern in Europe about the protection of the
fundamental right to privacy in an economic environment where new technologies favor massive
collection of data and where the most dynamic companies exploit that information. The judgment
is, however, an imperfect answer to an existing question: how do we protect privacy rights in the
Internet age?
216
M.N. SHAW, INTERNATIONAL LAW 412 (5th ed. 2003).
217
UTA KOHL, JURISDICTION AND THE INTERNET: REGULATORY COMPETENCE OVER ONLINE ACTIVITY
89 (2007).
218
European privacy requests for search removals, GOOGLE, http://www.google.com/transparencyreport/removals/europeprivacy/?hl=en (last updated 18 Jan. 2017).
219
Id. The site also offers, for every Member State, the number of applications, the number of URLs
affected, and the percentage of removal and non-removal decisions.
Overall, the ruling seems blinded by a desire to protect privacy, placing too much emphasis on that right and disregarding broader considerations. In the first place, the judgment neglects the relevance of other fundamental rights related to free speech, rejecting the relevance of the freedom to conduct a business and ignoring the freedom of expression of editors and publishers. 220 Secondly, it entrusts the balancing of rights to private entities, which effectively become the gatekeepers of privacy rights on the Internet, wielding more power over the equilibrium between free speech and privacy than any public authority in history. Thirdly, by turning the balancing into a bilateral procedure between the data subject and the search engine, the ruling prevents an effective protection of the public interest or the rights of editors, which are not represented. This may lead to an overexpansion of privacy rights at the expense of the freedoms of speech and information without allowing for effective second-guessing by public authorities, as DPAs’ review will be limited to cases where delisting is rejected.
Fourthly, by entrusting a private entity with the balancing decision, the ruling effectively deprives public authorities of the exercise of a sovereign power to determine what constitutes the public interest. Fifthly, the judgment shifts responsibility for published content from the editor to the search engines, which could in turn weaken privacy rights if the guidelines for balancing the different interests at stake are wrong; and because search engines are entrusted with a case-by-case analysis of the competing interests, legal certainty is extremely limited. The line dividing cases leading to delisting from those not allowing for the takedown of links will have to be drawn over time. DPAs and national courts may have the power to rectify the
220
This concern has been stressed in the Council of the European Union. See Proposal for a General Data
Protection Regulation, supra note 97.
balancing practice by search engines, but the risk remains that differing conceptions of what
should be taken down across national sensibilities may lead to a balkanization of the Internet in
Europe, which would harm the Single Market and further integration of the EU.
With regard to procedural matters, the ruling leaves many questions open, particularly the addressees of potential notifications about takedowns, the breadth and scope of the justifications supporting the decisions of the search engines, and potential transparency requirements that would allow the public to understand how fundamental rights are actually balanced. Another upcoming debate could relate to the role of DPAs in the bilateral procedure between the data subject and the search engine.
Also, the question of extraterritoriality threatens to expand the effects of the Right to be Forgotten beyond the European borders and might lead to a conflict of laws with the U.S., where First Amendment free speech protection remains strong and where the Right to be Forgotten has been heavily contested. The potential extraterritorial effects cast doubt on the appropriateness of the judgment, as it affects the rights of citizens in other jurisdictions. The elimination of links should be limited to access from the EU. I suggest that, consistent with both the effectiveness and territoriality principles, technical geolocation could serve the purpose of guaranteeing respect for the Right to be Forgotten in Europe, while allowing other jurisdictions to decide the weight that they confer to privacy on the Internet.
In my view, the ruling fails to take into consideration the future of an open and fair Internet, a debate that should take place democratically. The practical implementation of the General Data Protection Regulation should address all these problems from a wider perspective, enhancing the protection of European citizens on the Internet while empowering a healthy and
open development of the Internet and avoiding crippling the efforts of European information society companies.