Futur Numérique
Cahier de prospective
The Futures of Privacy
Editor
Carine Dartiguepeyrou
ISBN 978-2-915618-25-9
Fondation Télécom, Institut Mines-Télécom, February 2014
Edition: Uniqueness
Table of contents
Introduction
An Economist's Thoughts on the Future of Privacy, Patrick Waelbroeck ................. 131
Conclusion
Raising Awareness around the Concept of Privacy, Carine Dartiguepeyrou ......... 137
Introduction
But beyond that, the emerging problem is connected with the capacity to
use personal data, digital footprints, and traces of usage. Who is allowed to
use this data, the person or the service provider? Can this problem be solved
by forty-page-long contracts, as a way for the user to give their consent?
This global evolution (and what we see now is just the beginning of the
digital metamorphosis1) is pushed forward by two engines: the economy and
psycho-sociology.
Big Data technologies give companies the power to process huge quantities
of data to produce information and knowledge that can become sources
of value creation. Selling data can bring in big money, which means there is
today real economic pressure to push back the limits of privacy. By combining
open data, contents of the grey domain (where ownership rules are not clearly
defined), private contents (which users give the right to use by signing forty-page
contracts), and data resulting from access to Net services, digital operators
can create value from data and offer very innovative services. This is the heart
of the massive digital platforms that offer free services to customers so that they
can create value out of customers' access for people interested in selling products, services, or access to other sellers. Creation of knowledge through Big
Data, potential innovative services, referencing, advertising, intelligence
activities: there are many very profitable reasons to release as much
personal data as possible from the limits of privacy.
Concurrently there is a kind of psycho-sociological pressure to make open
use of personal data: a social pressure and an interest in sharing data and
contents through social networks, but also personal valorisation through public
access to personal data. The digital society could evolve toward a kind of
transparency about personal data. And rather than a Big Brother evolution of
society, it could favor ideas such as "anyway, some people I don't know have
access to my private data, so there is no need to make a difference between
private and not private."
Some fear, with reason, Big Brother scenarios, with a disappearing border between private and public personal data and a danger of
eroding the inner versus outer domains, which are so critical to building
up one's own personality. Others claim privacy is an outdated concept. Still others
think we need time to adapt, and that step by step we are going to develop
rules and tools to manage and master privacy issues.
Francis Jutand serves as Director of the Department of Scientific Affairs and
Member of the Board of Governors at the Institut Mines-Télécom. He is President
1. See La métamorphose numérique (The Digital Metamorphosis), by F. Jutand and 14 authors,
on the impact of the digital metamorphosis.
Five workshops took place in situ and on Google+. The task was not easy, as
there was the barrier of language (the working language was English), the
barrier of different experiences (research, future studies, marketing, legal,
etc.), as well as a variety of paradigms. The questions addressed were:
Will ownership still be a value worth something in ten years' time?
Do we need to have someone who owns, or should we prefer universal
access? With digital technologies, is this value rising or, on the contrary,
declining?
In ten years' time, is public data likely to define our digital identity?
Will the value of respect still be relevant? Is digital identity likely to overtake
personal identity and private life?
How is value creation likely to evolve in the coming ten years? What are
the major levers of change? Since service adds value to data (raw vs aggregated data), how are services likely to evolve? Will usage lead to an increase
or a decrease of economic intermediaries (economic actors)? As awareness of
privacy increases, is public value likely to develop, and how? What roles could
communities and groups of individuals play in providing social value?
Once we had shared our understandings and key questions around these
three themes, we then proposed to share our visions of the future. Each
member proposed a scenario for 2023. The results were surprising. Although
the first workshops showed there were various understandings of the concept
of privacy and its likely evolution, the experience of the scenarios showed
some convergence.
These were the emerging points of convergence:
More transactions, more data exchanged; flow brings value (vs stock, vs
property); value creation comes from interaction, attracting attention (vs
selling data), creating new links, from what you do with the data (vs collecting it).
Cognitive evolution: in a context of information overflow, we
remember the context where we stored the information, and the comments,
more than the events themselves; for example, we tend to pay more attention to the "Likes" than to the profile.
Risk of a surveillance society (failure of data protection legislation and of
privacy-enhancing technologies).
An expected huge technological collapse, security breach or "Black Thursday"
in the data ecosystem between 2016 and 2020.
Traceability and trust: long-term relationships, more loyalty to people and
businesses that protect your data; the meaning of trust evolves as value-chain
creation is often not transparent.
Discussion
Introduction of the afternoon session
Thibaut Kleiner, Senior Advisor in charge of Privacy, European Commission, Cabinet of Vice-President Neelie Kroes
Discussion
Third roundtable: Value creation and privacy
What are the rising questions with regard to privacy, from both the
economic and technical perspectives? Can value creation be achieved in
this field? What are the existing trade-offs? What are the challenges faced
by companies with regard to regulation? How can this be tackled? Which
shifts are required, in which fields? What are the likely scenarios in terms of
market players? How is their socio-economic contribution likely to evolve in
the coming ten years?
Keynote speech: Nicolas de Cordes, VP Marketing Vision, Orange
Armen Aghasaryan, Senior Researcher, Alcatel-Lucent Bell Labs
Stéphane Lebas, Marketing Director, SFR
Matthieu Soulé, Strategic Analyst, L'Atelier BNP Paribas
Patrick Waelbroeck, economist, member of the Chair "Values and Policies
of Personal Information," Institut Mines-Télécom
Discussion
Concluding remarks
Cultural Differences
in the Perception of Privacy
Introduction
In February 2012, the Obama White House unveiled a "Privacy Bill of
Rights" (2012, 9). Although most of its principles were recognizable as
traditional principles of fair information practices, embodied, for example,
in the OECD Privacy Guidelines, the third principle of "Respect for Context"
(PRC), introduced as the expectation that "companies will collect, use, and
disclose personal data in ways that are consistent with the context in which
consumers provide the data" (p. 47), was intriguingly novel. The Report
buoyed hopes. It signaled in the White House a serious interest in privacy,
and it portended a departure from business as usual. In addition to the Bill
of Rights, the Framework for Protecting Privacy laid out a multi-stakeholder
process, provided foundations for effective enforcement, pledged to draft new
privacy legislation, and announced an undertaking to increase interoperability
with international efforts to protect privacy (Civil 2012).
At the same time, the dockets of public interest advocacy organisations
slowly filled with privacy challenges. Courts and regulatory bodies were awash
with cases of overreaching standard practices, embarrassing gaffes, and technical loopholes that enabled surreptitious surveillance and the capture, aggregation, use, and dispersion of personal information. As awareness spread, so
did annoyance, outrage, and alarm among ordinary users of digital and information technologies, notably the Web, mobile systems, and location-based
services.
For anyone following deliberation on the subject of privacy, these observations are not new. More jarring, however, is that this panoply of information
practices, for the most part, proceeds under the halo of legality, evoking, quite
literally, gasps of disbelief among the newly informed. For privacy scholars
and activists, the level of indignation about these perfectly lawful practices
adds strength to their position that something in the relevant bodies of law
and regulation is amiss, that the status quo needs correction. Taking the cue,
governmental bodies have begun placing citizens' privacy on their active
agendas. It is at this point in the story that my article picks up.
The present moment resembles others in the recent history of privacy
in which revelations about new technologies, practices, or institutions push
beyond a threshold and momentum gathers for the position that "Something
has to be done!" At this juncture, as at others, public commentary reflects widespread anxiety over the deployment of IT, networks, and digital and information systems (including so-called Big Data) that have radically disrupted
flows of personal information.1 It always bears reminding that socio-technical
systems embedded in particular political-economic environments, and not
bare technology, are the proper agents of disruption (Nissenbaum 2010).
Acknowledging this anxiety, federal authorities have aimed at an adjustment of the status quo through such vehicles as the White House and FTC
Reports, with similar governmental reactions elsewhere in the world, e.g. the WEF
Report (2012) and the EU amendments process. Both Reports include a number
of recommendations for policy and procedure, but in this article, as indicated
above, the focus is on the White House Consumer Privacy Bill of Rights, and
within the Bill of Rights, the Principle of Respect for Context (PRC), which
holds great promise as an agent of change, yet equally could fizzle to nothing.
emerging privacy issues to be worked out by industry, civil society, and regulators. On the industry front, Google declared itself on board with Obama's
Privacy Bill of Rights, and Intel affirmed the Administration's […] "calls for
US federal privacy legislation based upon the Fair Information Practices"
(D. Hoffman 2012).
Unprecedented White House engagement with contemporary privacy
problems has buoyed hopes that change is in the air. How far the rallying
cry around "Respect for Context" will push genuine progress, however, depends critically on how this principle is interpreted. Context is a mercilessly
ambiguous term, with the potential to be all things to all people. Its meanings
range from the colloquial and general to the theorised and specific, from the
banal to the exotic, the abstract to the concrete, and shades in between. The
positive convergence of views held by longstanding antagonists may be too
good to be true if it rests on divergent interpretations. Whether the Privacy Bill
of Rights fulfills its promise as a watershed for privacy will depend on which
of these interpretations drives public or private regulators to action.
2. Meanings of Context
This article focuses on specific meanings and shades of meanings that
seem to have shaped the White House principle, embodied both in deliberations leading up to the public release of the Report and in action and commentary that has followed it. My aim is to demonstrate that some interpretations
would have no systematic impact on policy and some would lead no further
than entrenched business-as-usual. Whereas some meanings offer progressive
if limited improvement, an interpretation based on the theory of contextual
integrity opens the door to a genuine advancement in the policy environment, one that heeds the call for innovation and recognises the business interests
of commercial actors, while at the same time placing appropriate constraints on
personal information flows for the sake of privacy. I am arguing that only a
subset of uses forms a viable foundation for systematically shaping privacy
policy, and, more importantly, that not all among this subset will mark a
productive departure from business as usual.
In the influential subset, four interpretations are of particular interest; they
reflect views of persistent voices in the privacy and IT arena: context as technology system or platform, context as sector or industry, context as business
model or practice, and context as social domain.
atop (or below) it; most notably, the Web and the host of systems and
platforms it, in turn, has spawned. When these systems mediate communication, action, and transaction, we talk of these activities as taking place online,
or in Cyberspace, and because of this it has been natural to conceive of the
privacy problems associated with them, tinged with the distinctive character
of the medium, as problems associated with the online context, the context of
the Net. The language of context as applied to technology slides around quite
smoothly, however, and we readily talk of acting and communicating "in the
context of a phone call," "in the context of an online social network," in the
contexts of Twitter, Facebook, or Wikipedia, or in the contexts of the various
mobile, location-based services and applications.
These expressions suggest that contexts are defined by the properties of
respective media, systems, or platforms, whose distinctive technical characteristics shape (moderate, magnify, enable) the character of our activities,
transactions, and interactions, including the ways that information about us is
tracked, gathered, analysed, and disseminated. If contexts are understood as
defined by properties of technical systems and platforms, then respecting
contexts will mean adapting policies to these defining properties.
context, understood as respect for social domains, opens new and significant
avenues for the proposed White House policy framework, I have provided a
brief excursus into the theory of contextual integrity.
2.4.1. Contextual integrity: descriptive dimension
The heart of our concerns is appropriateness; specifically, the technologies,
systems, and practices that disturb our sense of privacy are not those that
have resulted in losses of control, nor in greater sharing of information,
but those that have resulted in inappropriate flows of personal information. Inappropriate information flows are those that violate context-specific
informational norms (from here on, "informational norms"), a
subclass of the general norms governing respective social contexts. The theory
of contextual integrity offers a structured account of these informational
norms that aims for descriptive rigor as well as normative clout.
Three key parameters define informational norms: actors, information
types, and transmission principles. They prescribe appropriate flow according
to the type of information in question, about whom it is, by whom and to
whom it is transmitted, and the conditions or constraints under which this transmission takes place. Informational norms are context-relative, or context-specific, because, resting atop a model of a differentiated social world, they
cluster around coherent but distinct social contexts. Accordingly, the parameters, too, range over distinct clusters of variables defined, to a large extent,
by respective social contexts.
Actors, the first parameter (subject, sender, recipient), are characterised
by particular context-relevant functions, or roles, as they act in capacities
associated with particular roles or functions within contexts. These functions include the perfectly mundane and familiar: physician, nurse, patient,
teacher, senator, voter, polling station volunteer, mother, friend, uncle, priest,
merchant, customer, congregant, policeman, judge, and, of course, many more.
In complex, hierarchical societies, such as the contemporary United States,
actors governed by informational norms might be collectives, including institutions, corporations, or clubs. Information type, the second parameter, ranges
over variables derived from ontologies that, for the most part, reflect the
nature of particular domains. Finally, the transmission principle, the third parameter, designates the terms or constraints under which information flows.
By isolating the transmission principle as an independent variable we
can reveal the source of error in the dominant understanding of privacy as a
right an information subject has to control information about him or herself
(through notice and consent mechanisms, for example). Seen through the
lens of contextual integrity, this understanding mistakes one part of the right for the whole;
it mistakes the transmission principle for the informational norm.
The three parameters (actors, information types, and transmission principles) are independent. None can be reduced to the other two, nor can any
one of them carry the full burden of defining privacy expectations, except
perhaps when one or two of the parameters is so obviously understood, or so
tedious to fully specify, that it need not be explicitly mentioned. This is why
past efforts to reduce privacy, say, to one class of information or to one transmission principle are doomed to fail.
When actions and practices comport with informational norms, contextual
integrity is maintained. But when actions or practices defy expectations by
disrupting entrenched, or normative, information flows, they violate contextual integrity.
The theory of contextual integrity is a theory of privacy with respect
to personal information because it posits that informational norms model
privacy expectations; it asserts that when we find people reacting with
surprise, annoyance, indignation, and protest that their privacy has been
compromised, we will find that informational norms have been contravened,
that contextual integrity has been violated.
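To make the structure of the theory easier to see, here is a minimal illustrative sketch in Python. It is not part of Nissenbaum's text, and every context, role, and norm in it is an invented assumption: it merely models an informational norm as the triple of actor roles, information type, and transmission principle, relative to a context, and flags any flow licensed by no norm of its context as a violation of contextual integrity.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Norm:
    """A context-specific informational norm: in a given social context,
    which sender role may transmit which type of information about a
    subject to which recipient role, under which transmission principle."""
    context: str
    sender: str
    recipient: str
    info_type: str
    transmission_principle: str

# Invented example norms; a real catalogue would be far richer.
ENTRENCHED_NORMS = {
    Norm("healthcare", "patient", "physician", "medical_history", "confidentiality"),
    Norm("commerce", "customer", "merchant", "payment_details", "consent"),
}

def violates_contextual_integrity(flow: Norm) -> bool:
    """Contextual integrity is maintained only if some entrenched norm of
    the flow's context licenses it; any unlicensed flow is a violation."""
    return flow not in ENTRENCHED_NORMS

# A physician passing medical history to an advertiser by sale matches no
# healthcare norm above, so the flow violates contextual integrity.
flow = Norm("healthcare", "physician", "advertiser", "medical_history", "sale")
print(violates_contextual_integrity(flow))  # True
```

The sketch also illustrates the independence of the three parameters: changing the recipient or the transmission principle alone is enough to turn a licensed flow into a violation, which is why no single parameter can carry the full burden of defining privacy expectations.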
2.4.2. Contextual integrity: prescription and policy
My claim is that context understood as social domain offers a better chance
than the other three for the Principle of Respect for Context to generate positive momentum for meaningful progress in privacy policy and law. The account
of social domain assumed by the theory of contextual integrity constitutes a
platform for connecting context with privacy through context-specific informational norms, and offers contextual integrity as a model for privacy itself.
In order to develop this claim, a few observations concerning the White House
Privacy Bill of Rights provide a necessary perspective.
forward counterparts in the other sets of guidelines, each worthy, in its own
right, of in-depth critical analysis.
Here, however, my focus dwells primarily on Respect for Context, which
Appendix B shows lining up with the Purpose Specification and Use Limitation
Principles. I would also like to draw attention to the White House's CPBR principles of Focused Collection and Individual Control, whose counterparts in
the OECD Guidelines are listed as the Collection Limitation and Use Limitation
principles. Although the first pair does not explicitly mention context, I argue
that context, and how context is interpreted, has significant bearing on how
these two important principles play out in practice.
The right of Respect for Context is summarised as a right to expect that
"companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data" (White House
Privacy Report 2012, 55). Its close kin, (i) Purpose Specification and (ii) Use
Limitation, summarised in Appendix B from the OECD Privacy Guidelines,
require that (i) "The purposes for which personal data are collected should be
specified not later than at the time of data collection and the subsequent use
limited to the fulfillment of those purposes or such others as are not incompatible with these purposes and as are specified on each occasion of change
of purpose" (p. 58); and (ii) "Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance
with Paragraph 9 [purpose specification] except (a) with the consent of the
data subject; or (b) by the authority of law" (p. 58).
4. A question of interpretation
Thus far, we have argued that fixing an interpretation of the Principle of
Respect for Context (PRC) promises substantive meaning for the Consumer
Privacy Bill of Rights (CPBR). Achieving the more important aim, however, the
aim of materially advancing the state of privacy protection in the US, requires
that we fix the right interpretation. Only social domain fully answers this
need.
Although it is not wrong to say that people may act and transact in contexts
shaped by technical systems, it is a mistake to hold that these systems fully
account for the meaning of Respect for Context. So doing allows material
design to define ethical and political precepts; it allows the powers that shape
the technical platforms of our mediated lives not only to affect our moral and
political experiences through built constraints and affordances, but, further,
to place them beyond the pale of normative judgment. Where technical platforms mediate multiple spheres of life, such as those constructed by Facebook
and Google (particularly its newly federated construct), the need to distinguish […]
5. Summary of Findings
For the Consumer Privacy Bill of Rights (CPBR) to meaningfully advance
privacy protection beyond its present state, a great deal hangs on how the
Principle of Respect for Context (PRC) is interpreted. My evaluation reveals
key implications of each conception of context: as business model, as technology, as sector, and as social domain. Respecting context as business model
offers no prospect of advancement in privacy protection beyond the present
state of affairs. Citing innovation and service as the drivers behind this interpretation, its proponents expect individuals and regulators to sign
a blank check allowing businesses to collect, use, and disclose information
based solely on the exigencies of individual businesses.
Respecting context as sector (or industry) fares slightly better, as it offers
a framework beyond the needs of individual businesses for establishing standards
and norms. How well this approach meaningfully advances privacy protection
beyond the present state depends on how sectors are defined.
Understanding context in purely technological terms implies that […]
Conclusion
Context may very well be constituted by technology, business practice, and
industry sector. It may be constituted by geographic location, relationship,
place, space, agreement, culture, religion, era, and much more besides.
In individual instances, each one could qualify and shape our expectations of
how information about ourselves is gathered, used, and disseminated. None
of them, however, provides the right level of analysis, or carries the same
moral and political weight, as social domain. This is the thesis I have defended.
Upon its basis, I offer an amendment to the Principle of Respect for Context
as it is given in the Consumer Privacy Bill of Rights: Respect for Context means
consumers have a right to expect that "companies will collect, use, and disclose
personal data in ways that are consistent with the [social] context in which
consumers provide the data."
Introduction
How to think otherwise is the lot of modern, especially French, philosophy.
Despite other approaches, such as the insight generated by innovation gurus or
the self-made luck of serendipity, the advantage of philosophical juxtaposition
is perspectivism. Michel Foucault, for example, expresses astonishment at the
limits of his (Western) thought as it runs up against an alien taxonomy. Indeed,
such is the incongruity of the classification of animals that Jorge Luis Borges cites
from the Chinese encyclopaedia, Celestial Emporium of Benevolent Knowledge,
that Les mots et les choses "a son lieu de naissance dans [ce] texte […] [et] le rire
qui secoue à sa lecture toutes les familiarités de la pensée" ("has its birthplace in
[this] text […] [and] the laughter that, on reading it, shakes all the familiarities of
thought") (Foucault 1966, 3).
What confront Foucault are the cultural specificity of his understanding and the
historicity of his knowledge, which are enclosed in an epistēmē from where it is
well-nigh impossible to think otherwise. But it is not just because he came from
the land of le même that Foucault seemed so perplexed by l'autre, which French
republican conviction excludes. There really are times when one's habitus is
exposed for what it is, viz., a perspectival set of dispositions, comportments,
ideas and emotions, which make no sense elsewhere.
Apart from the historical limits of our interpretative horizons, therefore,
the encounter between what "we" do and what "they" do is a means to problematise our thinking.1 For this reason, I would like to examine the Japanese
conception of subjectivity and how it informs debates about privacy, civil
society and surveillance. The presumption is that such an inquiry into how
1. "Is theory, once translated from the West into Japanese, so to speak, renovated or reborn?
Or is it displaced onto other intellectual and political horizons?" (Elliott, Katagiri and Sawai
2013, 2).
they are taken up there might help us to think differently about the question
of privacy here. To begin with, section one sketches a view of the subject of
privacy. In section two we outline a discourse on the uniqueness of Japanese
subjectivity, which is a potential source for thinking differently about privacy.
Section three then focuses on Japanese civil society and the spaces of privacy
it affords, which allows a contrast with Western accounts of the public-private
dichotomy. Finally, in section four we examine the claim that the notion of a
"supervised society" might better capture the ocular nature of power in Japan
today. We conclude with a discussion of privacy once it is juxtaposed with its
manifestation in Japan.
What form might any invasion of privacy as "the right to be left alone" take?
It may range from private information being released into the public realm (as
in so-called revenge porn [Jacobs 2013], which reveals how consent is dependent upon context, and choice a function of trust), to decisions being made on
our behalf without consent, or data provided in confidence to one institution
being sold to another. The concept of privacy calls for legislation on behalf of
personal autonomy (or the right to decide about all self-regarding actions);
self-determination (or the right to control the flow of information about
ourselves, even when others demand access to it); consummatory claims (or
the right to dignity, or privacy as an end that screens one from any spectacle
in extreme, compromising or unforeseen circumstances); and strategic intent
(or the right to secrecy about final intentions and ends, hence privacy as a
means to safeguard and promote self-interest) (Rule 2012, 65-66).3
What is central is the mediation of the relation between the individual and
the state (or a third party that is subject to regulations put in place by the state)
through rights. These initially concern the individual's corporeal sovereignty vis-à-vis pre-constitutional arbitrary violence, thereafter the various freedoms
(of conscience, thought and speech) to be enjoyed without interference by
others, and today the radical transformation by technology of subjectivity
and of the question of privacy itself. The common denominator is the Western
subject, who, because she authors, knows, decides, creates, imagines, envisions
and chooses, is assigned a right to a private, sovereign realm of thought and
action. The question is what happens to privacy when there is no "ghost in the
machine" (Ryle 1949, 15), or subject, on behalf of whom the right exists?
oscillates between the uchi (inner) and soto (outer) realms, which require a
constant Janus-faced switching between one's honne (true core) and tatemae
(public expectation) (Bachnik 1986; Rosenberger 1992).10
In sum, when subjectivity is non-corporeal, yet mediated by the interaction of two bodies (human or social), we might readily call it a virtual subject
in reality, for which privacy, as a question of individual autonomy, makes
no sense. Why, after all, would anyone claim ownership of ideas, feelings,
emotions, thoughts, preferences or personal information, which is the condition of possibility for the right to privacy, if they are only ever extra-corporeally constituted in face-to-face interaction, the community or language?11
In other words, Japan has not entirely fended off the world-historical "transformation […] of the emergence of new practices, dynamics, and technologies of surveillance" (Haggerty 2009, ix). In Tokyo, for example, surveillance at
a glance seems innocuous. The Tokyo Metropolitan Government (TMG) and a
high-profile governor, below which are 23 ku (wards), govern the city. One
level below are the NHAs mentioned earlier and shopkeepers' associations, or
shoutenkai (SKAs). While formal policing is shared between the National Police
Agency (NPA) and the Tokyo Metropolitan Police Authority, there are only 363
NPA cameras in Japan because local government, private corporations and
NHAs and SKAs (Murakami Wood 2012, 87) operate the majority. To be sure,
camera surveillance has increased due to several events: the Aum Shinrikyo gas
attack on the Tokyo metro in 1995; the 2002 FIFA World Cup, where surveillance mirrored British and American ideologies of crime prevention (McGrath
2004, ch. 1) and targeted foreign hooligans and illegal foreign vendors; and
the Community Security and Safety Development Ordinance of 2003, introduced by the TMG governor in a bid to crack down on crime and its supposed
cause, foreigners. In addition, since 2012 all foreign nationals are "dividualised"
via an obligatory local residents registry (jyuminhyo), which is digitised and
connected to the state's juki-net database (Ogasawara 2008).13 Nonetheless,
there has not been a centralisation of control in Tokyo, but rather the "responsibilisation" of the various actors that deploy it (Murakami Wood 2012, 88).
The effect is the surveillance of the few who do not measure up to the
Nihonjinron techno-cultural benchmark. In some respects it is a pragmatic
issue. Blanket surveillance in Tokyo is difficult due to its panoply of narrow
alleyways, while each ward is often financially or politically unable to implement surveillance cameras. Yet strategic surveillance is also political. It
follows a logic outlined by Zygmunt Bauman in Globalization: The Human
Consequences, in which he distinguishes between the tourist and the vagabond. At the global level, the tourist who searches for new experiences adopts
strategies of movement. These take advantage of privileged rights of passage
in an exclusive world of time divorced from space. At the other extreme, the
tourist's alter ego, the vagabond, precisely because of the absence of any
privileges, pursues strategies of survival to escape the ever-present threat of
stigmatising and assignment to the underclass, which is "an anonymous
human mass to be dealt with by any means possible" (Bauman 1998, 96-97).
13. As Gilles Deleuze (1995, 181-182) shows, control societies that deploy the mechanism
of surveillance constitute new ocular objects, or the "dividual," whose freedom of movement and right of association in the public sphere is dependent on the functioning of their
electronic card, which in turn is a function of a smoothly running and anonymous computer
system.
The tourists might be said to correspond to the Japanese citizen and the
vagabonds to the non-Japanese resident. Moreover, the techniques of stigmatising follow an onto-technological logic. On the one hand, the vagabond
is the target of a strategic technological intent (Western technē guided by
Japanese spirit), or surveillance and exclusion from the technological devices
that constitute the identity of the Japanese domestic tourist. On the other
hand, the tourist is the target and effect of the unintentional effects of ICT
(Japanese spirit constituted by Western technē). Technologies here are technologies of power in the double sense that they divide and conquer: division
insofar as they create difference within (Japanese) sameness; and conquering
to the extent that the technologically mediated sign of difference is off-limits
to the vagabond, who is subject to domination by surveillance proper. For
these reasons, we might speak of the strategic surveillance of the few, the
vagabonds, and the supervision of the many, the tourists, via "friendly authoritarianism." According to Sugimoto (2010, 290-291), it is authoritarian to the
extent that it encourages each member of society to internalise and share
the value system which regards control and regimentation as natural, and
to accept the instructions and orders of people in superordinate positions
without questioning.
In terms of authoritarianism, this discipline-control system employs four
main mechanisms of micro-management: firstly, the use of small groups,
such as NHAs, as the basis of mutual surveillance and deterrence of deviant
behaviour; secondly, an extensive range of mechanisms in which power is
made highly visible and tangible; thirdly, the legitimisation of various codes
in such a way that superordinates use ambiguities to their advantage; and,
fourthly, the inculcation of moralistic ideology into the psyche of every individual, with a particular stress upon minute and trivial details. However, the
administering of these authoritarian mechanisms is friendly. Firstly, it resorts,
wherever possible, to positive inducements rather than negative sanctions
to encourage competition to conform; secondly, it portrays individuals and
groups in power positions as congenial and benevolent, and uses socialization channels for subordinates to pay voluntary respect to them; thirdly, it
propagates the ideology of equality and the notion of a unique national homogeneity, ensuring that notions of difference are blurred; finally, it relies upon
leisurely and amusing entertainment, such as songs, visual arts and festivals,
to make sure that authority infiltrates without obvious pains. In this light, the
predictions of the Japanese critical theorist Masakuni Kitazawa at the dawn of
the information society are poignant. He feared the technocratic bio-politics of
a discipline-control state would create a kanri shakai, or supervised society,
that is administered through highly sophisticated "mechanisms for forecasting,
planning and control [and which propagate] a set of optimum conditions for
that society's well-being" (Kitazawa quoted in Buckley 1993, 415).
Conclusion
Through a broad excursus on the Japanese concept of Nihonjinron, we can
see that any regulation of privacy, public space and surveillance by international organisations expounding universal norms would merely reinforce the
perception of globalisation as a process of westernisation. It would not only
be at the expense of the diverse manner in which privacy is practised, but
its Western concept of subjectivity would make a legal framework grounded
in the individual nothing more than a proxy for the ravages of transnational
capitalism.
It is for these reasons that juxtaposing how privacy is taken up in Japan
is important. It provides the opportunity to problematise the subject, which
portends well if privacy is to have multiple futures. In short, because of the
Japanese extra-corporeal subject in reality, the ontological upheavals of a
dynamic of surveillance focused on "digital personae" (Lyon 2001, 15) with
virtual/informational profiles (Haggerty and Richardson 2006, 4) promise
to be of less importance in Japan than in those cultures where subject-centred
identity means the individual risks ejection from the landscape of self-identity. With the current restructuring […] [of] the nature of the individual
(Poster 1990, 185-190), it is possible that we have much to learn from them
about how to be in the future, where privacy promises to be very different
from what it is today.
Bregham Dalgliesh. After completing his graduate studies in Canada (University
of British Columbia) and Scotland (University of Edinburgh), Bregham Dalgliesh taught
in France (Sciences Po Paris, Télécom École de Management, ENSTA ParisTech, ESSEC,
New York University in France) from 2002 to 2011 before taking up his current position
as Associate Professor at the University of Tokyo. He has published widely, with the task
of critique taken up through an engagement with science and technology as socially
embedded enterprises that demand philosophical reflection because of their
constitutive effect upon the politics, culture and ethico-moral relations that
define and limit the human condition.
References
Abe, Kiyoshi. 2000. "The Information Society without Others: A Critique of Informatization
in Japan." Kwansei Gakuin University Social Sciences Review 5, 53-73.
Allison, Anne. 1994. Nightwork: Sexuality, Leisure, and Corporate Masculinity in a Tokyo
Hostess Club. Chicago: University of Chicago Press.
Bachnik, J. M. 1992. "Kejime: Defining a Shifting Self in Multiple Organisational Modes."
In N. R. Rosenberger (ed.), Japanese Sense of Self. Cambridge: Cambridge University Press, 151-172.
Bauman, Zygmunt. 1998. Globalization: The Human Consequences. New York: Columbia
University Press.
Introduction
Wolfgang Schulz
we should ponder the connection between autonomy and memory and,
when doing so, recall a saying of the German sociologist Niklas Luhmann:
"Forgetting is the main function of memory." When we think about forgetting and the right to be forgotten in the public sphere, then we talk about
things like social memory. Can there really be individual autonomy over
social memory? Of course, a politician who went astray would like that, to
have memories of bribery or other deeds erased, and we know a lot of cases
in the online sphere where a person demands that archive content dealing
with his past behaviour be taken down. Legal debates revolve around these
issues: people want things to be erased, and that means, of course, that
somehow our social memory is affected.
When we talk about things like that and look at the freedom-of-communication aspect only, we have to see that there is an ambivalence
in the concept of privacy: on the one hand, privacy enables communication, because when you know that your privacy is protected in the process
of communication, you contribute to the public debate more openly.
On the other hand, autonomy cannot be the only principle governing public
communication. You can say that the whole media system is a system that is
about social memory and about forgetting things. The news of today
somehow erases the news of yesterday.
Intervening in this kind of system that we have put in place to create social
memory means interfering with these instruments,
the social institutions we have developed. It is up to the media system to
structure public memory. That's an interesting freedom-of-speech issue; I
intend to do more work on it in the future.
self and relating with others, and deciding autonomously what personal
information to give to others and what to restrict somehow.
3. Then, of course, there are other third parties involved, one of the most
important issues being that right holders need personal data to sue
alleged copyright infringers.
I think it is helpful to keep in mind that there are at least these three categories of conflicts, and that we are very likely to face them in the future, in the next
couple of years even. Each type of conflict might call for a specific regulatory
solution.
What potential risks do we face and have to deal with when we try to come
up with solutions, when we look for a global governance concept for privacy
for the next ten years? I list here just some trends:
Personal data as a payment for services,
Data as warfare,
Knowledge asymmetries,
Recombination of Big Data,
Blurring of the private-public distinction,
Informed consent fallacy,
Data literacy,
Fragmented regulation and forum shopping.
Personal data is more and more becoming a kind of currency for online
commerce. This changes the perspective significantly, since many
legal systems frame personal rights in a way that derives from human dignity
and from autonomy, as I mentioned before. The current data protection regulation does not really see personal data as a kind of payment; in
everyday life, however, it is.
We see that data can be a kind of weapon: we need data to fight
terrorism, for example, and, the other way round, information about personal
things can be used as a kind of weapon. It cuts both ways.
Another relevant set of problems links to information asymmetries. When
I wear Google Glass, to take an example, and I look at someone and receive
information about his educational background, what texts he has recently
posted online, and what his skills are, there is an inherent asymmetry when
he does not wear the same type of glasses. We will see more of these kinds of
asymmetries in various social situations with the increasing market penetration of augmented-reality applications. How do we deal with that?
Another salient issue is Big Data: many things we have designed to protect
private life, to make communication anonymous, don't really work anymore
when you have the potential to recombine Big Data and then detect who the
acting individual actually is.
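The point about recombination can be made concrete with a toy sketch (an illustration only, not from the original text; the datasets, field names and matching rule are all invented assumptions): a "de-identified" record can often be linked back to a named individual by joining on quasi-identifiers such as postcode and birth year.

```python
# Toy illustration of re-identification by linkage (all data invented).
# An "anonymised" dataset drops the name but keeps quasi-identifiers;
# joining it with a public registry on those fields re-identifies people.

anonymised_health = [
    {"postcode": "75013", "birth_year": 1978, "diagnosis": "diabetes"},
    {"postcode": "69002", "birth_year": 1990, "diagnosis": "asthma"},
]

public_registry = [
    {"name": "A. Martin", "postcode": "75013", "birth_year": 1978},
    {"name": "B. Dupont", "postcode": "69002", "birth_year": 1990},
]

def reidentify(records, registry):
    """Link records to names wherever the quasi-identifiers match
    exactly one registry entry."""
    linked = []
    for rec in records:
        matches = [p for p in registry
                   if (p["postcode"], p["birth_year"])
                   == (rec["postcode"], rec["birth_year"])]
        if len(matches) == 1:  # a unique match means re-identification
            linked.append({"name": matches[0]["name"], **rec})
    return linked

for row in reidentify(anonymised_health, public_registry):
    print(row)  # names re-attached to "anonymous" medical records
```

The more datasets can be recombined, the more combinations of attributes become unique, which is why anonymisation that merely strips direct identifiers tends to fail at scale.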
hardware and software structure of the Internet, plays a major role. We have
to consider the interplay of all those things to get the whole picture. That is
our starting point when we do research on things like that, and we believe that
privacy governance has to be analysed along the same lines.
Since 2005, the Asia-Pacific Economic Cooperation (APEC) has also had its
own specific Privacy Framework.5
Despite different legal cultures and regimes, these texts reveal a
consensus on the way to protect personal data (1). This consensus, far from
being challenged, is currently being reaffirmed and reinforced in all international
bodies (2).
give rise to arbitrary discrimination, such as racial origin, sex life or political
opinions. It also provides additional safeguards for individuals and requires
countries to establish appropriate sanctions and remedies.
At the European level, while the data protection directive embodied a set of
principles consistent with the OECD and Council of Europe agreements, its
principles are somewhat stronger. The directive includes, inter alia, the establishment of Data Protection Authorities and a right to have disputes heard by
the courts. It also requires opt-out options to be provided for direct marketing
uses of personal data, as well as limitations on data exports to countries outside
the EU that do not have an adequate level of personal data protection. So
as to better align Convention 108 with the EU Directive, the Convention was
complemented in 2001 by an additional protocol regarding the role of independent supervisory authorities and requiring data export limitations.6
Regarding the APEC Privacy Framework, it promotes a weak
standard from the European point of view, in the sense that principles
present in other international instruments, or in the national laws of many countries, among them APEC's 21 members, are not reproduced. Thus, the
Principles in APEC's Privacy Framework "are at best an approximation of what
was regarded as acceptable information privacy principles twenty years ago
when the OECD Guidelines were developed."7 For instance, the APEC Privacy
Framework does not limit collection to lawful purposes,8 nor does it include
the purpose specification principle. Moreover, new principles have appeared,
testimony to the influence of the United States: the principles of preventing
harm and of choice are ambiguous as to their effect and are capable of a vast
number of interpretations and implementations. We also notice a complete
absence of any obligation to enforce the principles by law. However, the APEC
processes have stimulated regular discussion of data privacy issues between
the governments in the region and more systematic cooperation between
Data Protection Authorities in cross-border enforcement.9
6. Additional Protocol to the Convention for the Protection of Individuals with regard to
Automatic Processing of Personal Data, regarding supervisory authorities and transborder
data flows, Strasbourg, 8.11.2001.
7. Greenleaf, G., "Asia-Pacific developments in information privacy law and its interpretation," University of New South Wales Faculty of Law Research Series 5 (19 January 2007),
p. 8.
8. De Terwangne, C., "Is a Global Data Protection Regulatory Model Possible?", in Reinventing
Data Protection?, Gutwirth, S.; Poullet, Y.; De Hert, P.; De Terwangne, C.; Nouwt, S. (eds.), 2009,
p. 184. Also Pounder, D.C., "Why the APEC Privacy Framework is unlikely to protect
privacy," http://www.out-law.com/page-8550.
9. Greenleaf, G., "The Influence of European Data Privacy Standards outside Europe:
Implications for Globalisation of Convention 108," Edinburgh School of Law Research Paper
Series, no. 2012/12, p. 17.
transparency of the processor of personal data and gives the individual subjective
rights to control the processing of his/her personal data.13
within the EU. The first requirement covers certain key data protection principles that should be embodied in the third country's framework. The second
requirement looks at the mechanisms available to deliver a good level of compliance, to provide support to individual data subjects and to provide appropriate
redress to injured parties where rules have not been complied with. Even if
the list of adequacy decisions is not impressive, it is growing: Israel,
Uruguay and New Zealand were added recently. It is also likely that the list
will grow in the future. The draft regulation has provided for more flexibility
by allowing adequacy decisions for a territory or a processing sector within a
third country, and by introducing the possibility of finding adequacy for an
international organisation.
Finally, the growing practice of cooperation among data protection
authorities, both in Europe and on other continents, can give considerable
weight to more global privacy practices. Since 2010, the Global Privacy
Enforcement Network (GPEN) has grown to include members from Europe,
Asia, North America and the Pacific. The US Federal Trade Commission is
playing a very active role and is working together with supervisory authorities
in Europe, Canada and other APEC countries.
Since most data protection legislation is based on the same international documents, the fundamental principles of law are similar across regions
and legal systems. However, the differences in the cultural, historical and legal
approaches to data protection and privacy mean that once one moves beyond
the highest level of abstraction, there can be significant differences. I do
not think we will end up with full harmonisation across the globe. A certain
degree of diversity will always remain. It is unavoidable and even desirable.
Claire Levallois-Barth is a doctor in law and Maître de conférences at the Economic
and Social Science Department at Télécom ParisTech. She specialises in new technologies law, especially in privacy and data protection law at both the French and international levels. Her studies are related to the application of these legal aspects to new
technologies such as location-based services, Bluetooth and social networks. She is the
coordinator of the Chair Valeurs et Politiques des Informations Personnelles (Institut
Mines-Télécom). She is the general secretary of the French Association of Personal
Data Protection Officials (Association Française des Correspondants à la Protection
des Données à Caractère Personnel - AFCDP) and she is a Data Protection Official.
data in certain sectors. At the federal level, eight different privacy laws exist,
each with a different acronym and scope of application:
HIPAA (Health Insurance Portability and Accountability Act): health data,
GLBA (Gramm-Leach-Bliley Act): financial data,
COPPA (Children's Online Privacy Protection Act),
FCRA (Fair Credit Reporting Act),5
ECPA (Electronic Communications Privacy Act),
VPPA (Video Privacy Protection Act),
Cable TV Privacy Act,
CAN-SPAM Act.
Some of these laws are at least as restrictive as European data protection
laws, although their scope is more limited. In addition to these focused federal
laws, there exists a myriad of state laws dealing with targeted privacy issues.
The State of California is particularly active, having enacted laws targeting the
collection of data via the Internet as well as the so-called "eraser" law, which
permits minors to delete their personal data on Internet platforms.6 California
also has a general right of privacy included in the state's constitution. Almost
all states in the United States have laws regulating how data breaches should
be notified.
In addition to these focused statutes, the United States has a general
statute on consumer protection that has been used extensively as a means to
protect personal data. Section 5 of the Federal Trade Commission Act prohibits
any "unfair or deceptive" practice and empowers the Federal Trade Commission
(FTC) to enforce the provision against companies. Over recent years, the
Federal Trade Commission has proactively expanded the concept of unfair
and deceptive practice to include the processing of personal data by companies
in ways that do not match the reasonable expectations of consumers. The
FTC's first point of focus is the privacy policies that companies themselves
publish. If any of the statements in the privacy policy are not respected by the
company, either in spirit or in letter, the FTC will accuse the company of an
unfair and deceptive practice. The FTC has also expanded the concept of unfair and
deceptive practice to cover information security, thereby putting a relatively
high burden on companies to take measures to protect personal data against
unauthorised disclosure. The FTC has a wide range of tools at its disposal,
going from soft measures such as workshops and guidelines to more draconian measures such as sanctions and, importantly, settlement agreements.
(We will return to the subject of settlement agreements in the second part
of this article.)
5. Incidentally the FCRA includes a form of right to be forgotten.
6. For a description of Californias privacy laws, see, http://oag.ca.gov/privacy/privacy-laws.
The FTC uses these tools to send signals to the market regarding its
interpretation of the vague "unfair and deceptive" standard. Professor Solove
refers to the FTC's "new common law of privacy."7 Many states have their own
authorities (generally the attorney general) that enforce state privacy rules.
Those state authorities can issue guidelines in addition to those of the FTC.
The recent guidelines issued by the California Attorney General on mobile
applications8 contain recommendations that resemble in many respects the
position of Europe's Article 29 Working Party.9
Even in matters involving government surveillance, US and European laws
are not as far apart as they might seem. Like most European countries, the
United States has a separate set of rules for normal police investigations and
for national security operations.10 Police investigations are governed by the
"Crimes and Criminal Procedure"11 section of the US Code, whereas national
security investigations are governed by the "Foreign Intelligence Surveillance"
and "War and National Defense"12 sections of the Code. This is similar to the
legal structure in France: the Code de procédure pénale governs surveillance in
the context of criminal investigations, and the Code de la sécurité intérieure
governs surveillance in the context of national security. As can be expected,
the rules surrounding national security provide fewer safeguards and less
transparency than the rules applicable to criminal investigations. In criminal
investigations, police must obtain a court order before conducting intrusive
surveillance. In national security matters, authorisations may be given by a
separate national security court (in the US) or by a specially named person in
the Prime Minister's office (in France).
The Snowden affair has raised serious questions about the adequacy of the
US framework for national security surveillance. A recent report commissioned
by President Obama shows that the US regime for collection of data in national
security cases requires improvement, in particular to better protect the privacy
of both US and non-US citizens.13 The European Commission also listed areas
where the US could help restore trust in cross-border data flows, including the
7. Daniel Solove and Woodrow Hartzog, "The FTC's New Common Law of Privacy," August
2013, www.ssrn.com.
8. California Attorney General, "Privacy on the Go: Recommendations for the Mobile Ecosystem," January 2013, http://oag.ca.gov/sites/all/files/pdfs/privacy/privacy_on_the_go.pdf.
9. Article 29 Working Party, Opinion no. 02/2013 on apps on smart devices, WP 202, February
27, 2013.
10. Winston Maxwell and Christopher Wolf, "A Global Reality: Governmental Access to Data
in the Cloud," Hogan Lovells White Paper, May 2012.
11. Title 18, US Code, Crimes and Criminal Procedure.
12. Title 50, US Code, War and National Defense.
13. "Liberty and Security in a Changing World," Report and Recommendations of the
President's Review Group on Intelligence and Communications Technology, Dec. 12, 2013.
framework within which private actors discuss and, if possible, agree on regulatory
measures. Co-regulation is like self-regulation, except that in co-regulation
the government or regulatory authority has some influence over how the
rules are developed and/or how they are enforced. This is supposed to make
the rulemaking process more legitimate and effective compared to purely
self-regulatory solutions. It is more legitimate because the process is supervised by officials who are accountable to the democratically elected legislature. It is more effective because the resources of the state can be used to
enforce the rules.
Data protection authorities in Europe are distrustful of purely self-regulatory arrangements and prefer co-regulatory solutions in which the data
protection authority (DPA) is involved in both the formation of rules and their
enforcement. The emphasis DPAs in Europe place on binding corporate rules (BCRs)
evidences this co-regulatory preference.
Under the European data protection directive, companies are prohibited
from sending personal data outside the EEA to countries that have not been
recognised by the European Commission as providing an adequate level of
data protection. The United States is currently not viewed as providing an
adequate level of protection of personal data. One of the ways companies can
overcome the prohibition is by adopting BCRs. BCRs are a set of internal procedures that guarantee a high level of protection of personal data throughout
the organisation, including in parts of the organisation located in countries
without adequate protection. BCRs must be developed in close cooperation
with DPAs in Europe. A multinational group can propose BCRs following a
template adopted by the Article 29 Working Party, but ultimately the content
of the BCRs must be negotiated point by point with one of Europe's DPAs.
Once the lead authority is satisfied with the content of the BCRs, the file is
then sent to two other co-lead DPAs, who in turn scrutinise the content of the
file to ensure that the BCRs meet European standards. Once the BCRs have
been approved, they confer rights on third parties, who can sue the company
for any violation of the BCRs. Likewise, any breach of the BCRs can give rise
to sanctions by DPAs.
BCRs constitute co-regulation because they are developed by private
stakeholders within a framework established by regulatory authorities, and
once they have been adopted, the BCRs can be enforced by regulatory authorities in the same way as classic regulations.
The Federal Trade Commission's (FTC) extensive reliance on negotiated
settlement agreements can also be seen as a form of co-regulation. The
FTC conducts investigations and begins enforcement actions against companies that have violated the unfair and deceptive practices rule, as well as
committed other privacy violations such as violation of the US-EU safe harbor framework. One of the procedural options that the FTC can propose is a settlement
agreement with the company, which binds the company to put an end to the
relevant practices as well as submit itself to ongoing accountability obligations similar to those one sees in BCRs.
The individual settlement agreements provide for procedural and structural safeguards to help prevent violations of data privacy commitments.18 Like European BCRs, the negotiated settlement agreements provide for both internal and external audit procedures, training programs and periodic reporting to the FTC. The settlement agreements last for 20 years, giving the FTC the ability to co-regulate major Internet companies over a long period of time. The FTC settlement agreements are public, thereby permitting the FTC to use them as a means of sending signals to all companies in the relevant sector. Although the settlement agreements are not binding on companies that are not signatories, they provide third parties with guidance on what the FTC considers to be the state of the art in terms of privacy compliance. The settlement agreements inform third parties of practices that the FTC is likely to view as unacceptable, as well as of compliance measures that the FTC is likely to consider optimal.
The FTC settlement agreements can have wide-ranging effects. First, if a settlement agreement binds a major Internet platform such as Facebook, it will have an impact on a large portion of the Internet industry simply because the platform represents a large share of Internet users. Second, a settlement agreement will have indirect effects on all other players in the Internet industry, by showing best practices and FTC expectations. The FTC's settlement agreements serve a pedagogical function, thereby contributing to overall compliance with regulatory best practices in the industry.
The United States government is trying to encourage other co-regulatory solutions for data privacy. The US administration refers to this as the multi-stakeholder process. Under this process, the National Telecommunications and Information Administration (NTIA) convenes stakeholders in an effort to develop codes of conduct. The role of the NTIA is to organise multi-stakeholder meetings, facilitate the exchange of information, and apply the threat of mandatory regulatory measures should the stakeholders fail to agree on consensual measures. The NTIA acts as a maieutic regulator,19 helping to nudge stakeholders toward a consensus. The presence
18. For an example, see the Facebook settlement agreement: http://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep.
19. Nicolas Curien, "Innovation and Regulation serving the digital Revolution", The Journal of Regulation, 2011, I-1.32, pp. 572-578.
Florence Raynal
Introduction
Global privacy governance is at stake because data privacy has become a growing worldwide issue for citizens, governments and business. Personal data are necessary for almost all business and public services; they no longer know geographical borders and are an essential element of the future of the Internet economy.
Citizens have strong expectations of robust and effective protection of their right to privacy wherever their data are handled. They demand guarantees that business operators and public institutions respect their privacy before they will trust them.
Companies are increasingly integrating privacy into their business strategy and treating it as an element of competitiveness with which to differentiate themselves and develop their clients' confidence.
Regulators, legislators and governments are reforming fundamental texts such as the OECD privacy guidelines, Convention 108 or the 1995 EU Directive, or creating new approaches (e.g. the APEC privacy framework).
These historic revisions will have a direct effect on global privacy governance and will shape our regulatory environment for data privacy.
The CNIL has been actively involved in following the debates around those reforms, promoting a European approach on privacy matters, and developing European and international cooperation among data protection authorities, which is a key element of global privacy governance.
1. Transcription of the speech given during the Privacy seminar of Institut Mines-Télécom.
OECD
In July 2013, the OECD adopted its new privacy guidelines. We all know that those guidelines have no legal value as such, but they send a strong political message, as they are adopted by 34 governments from the EU but also from APEC economies (US, Canada, Japan). They represent a standard, an orientation that governments should follow locally.
The new rules put an emphasis on the accountability concept, but also push for the designation of data protection authorities around the world and underline the need for better international cooperation and interoperability.
As the European Union is in the process of changing its privacy framework, it is essential to follow these evolutions with great care and to maintain the acquis communautaire.
Council of Europe
The same is happening at the Council of Europe with the modernisation of Convention 108, which is under way and should be adopted in 2014. This text, the first binding European instrument on data protection, binds 45 of the 47 member states and has the legal value of a treaty.
Therefore, changes made to this convention are key for the European Union and will have a direct impact on our regulatory framework.
Conclusion
There is food for thought, and there are many initiatives at the moment for building global privacy governance, and the key word for data protection authorities is cooperation, for example by creating interoperability among different regimes.
Trying to agree on common or adequate values and rules is not necessarily the Holy Grail; what is important is to create tools, bridges and paths to navigate between different ecosystems. This is exactly what the CNIL is trying to achieve with APEC, through strong cooperation between the EU's WP29 and the APEC data privacy working group, or with the French-speaking countries.
Indeed, we deeply believe that one of the cornerstones is to improve cooperation between data protection authorities at the EU level but also at the international level. A good example at EU level is the WP29 enforcement taskforce on the Google case. At the international level, the international conference of privacy commissioners recently adopted a resolution calling for a strategic plan to refine the Conference and enhance its capacity for action. The idea is to define a new governance model and to develop a real and effective network at the global level for exchanging information and best practices and organising joint enforcement actions.
I am not sure that we have the solution for global privacy governance today, but there is a need and a strong willingness from all stakeholders with that objective in mind. Ideas are on the table like stones. Now, let's build the house!
Pierre-Emmanuel Struyven
Privacy can be described as the ability of individuals to disclose personal data in a controlled manner. It is a matter discussed by regulators, lawyers and consumer organisations alike. In today's digital world, privacy and identity are at the heart of many business models and consumer propositions. In this speech, I'll try to highlight how privacy and identity are at the core of the digital lifestyle and of business innovation, for pure-player digital services but also for legacy, brick-and-mortar business models.
receive a free and relevant service. Search engines or targeted advertising are typical examples of such deals between users and the services they use. We know that the storage and use of personal data is a highly regulated area. At the end of the day, the customer is the main judge: am I satisfied with the service I receive, and am I happy about the way my personal data are used?
that discover the mass Internet through mobile, and also younger demographics that are connected 24/7.
We do everything digitally today: we check our mail online and increasingly on mobile phones, we check credit card balances and bank accounts with increasingly popular mobile banking applications, we use e-administration to check social benefits.
The mobile is also going to be used to pay at the counter (NFC-based credit card payment), to secure online payment (two-factor authentication), to board public transport, and to store loyalty cards. We really live with and in a digital world, and we leave traces everywhere and all the time.
Many online and mobile services are free for the end user. This is the basic business model of search engines: through my query I tell the service what I'm looking for, thereby giving away a little piece of information about my interests, and in return I get a fairly accurate result. Brands and corporations are ready to pay to be featured in the search results, because they are exposed to relevant customers in a relevant context. Similar two-sided business models are found in price-comparison engines or social networks. Mobile and tablet usage is amplifying this already well-established model, because usage is developing fast on mobile and because using the actual location of the user creates new opportunities for targeting.
So there really is a win-win deal, with consumers trading personal information in return for service and relevance. This is a very important paradigm when one speaks about privacy, because it means that, as a consumer, I'm ready to trade some of my privacy in return for a service.
Not so different is the use of personal and usage data to improve the customer experience. Most e-commerce sites use it to maximise their performance by showing each customer the most relevant products: relevance by linking to existing purchase history or catalogue browsing, relevance by suggesting products that similar customers have purchased. Recommendation engines are increasingly important in many businesses selling physical goods or digital content. When browsing huge online catalogues (even more so when browsing happens on the go, on the mobile or the TV screen), receiving relevant recommendations or seeing relevant product categories first not only enhances my experience as a user but also increases sales.
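To make the mechanism concrete, here is a minimal sketch of the "customers who bought X also bought Y" idea behind many recommendation engines; the purchase data and the overlap-based scoring rule are illustrative assumptions, not any particular vendor's algorithm:

    from collections import Counter

    # Toy item-to-item recommendation: suggest products that customers
    # with a similar purchase history have bought.
    purchases = {
        "u1": {"tent", "stove", "lamp"},
        "u2": {"tent", "stove", "boots"},
        "u3": {"tent", "boots"},
    }

    def recommend(user, purchases, top_n=2):
        mine = purchases[user]
        scores = Counter()
        for other, items in purchases.items():
            if other == user:
                continue
            overlap = len(mine & items)      # similarity = shared purchases
            for item in items - mine:        # only suggest unseen items
                scores[item] += overlap
        return [item for item, _ in scores.most_common(top_n)]

    print(recommend("u3", purchases))        # e.g. ['stove', 'lamp']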
Targeted advertising is another example where being identified, and exposing personal data more or less explicitly, changes the experience for the end user. Targeting is a major trend for digital marketing, Web and mobile ads. With online advertising, you can target your audience in a very detailed way, possibly down to the individual user. This was not possible in traditional mass media such as TV. On the one hand, again as a consumer, if I'm using a service where ads are displayed, I'd rather receive relevant ads than ads that are of absolutely no interest to me. But on the other hand, it means that whoever is delivering the ad has identified me and has linked information about me to the identifier. It is likely that this happens completely within the ad-serving infrastructure (through the use of cookies, IP address, device ID, etc.) and that the website publisher has nothing to do with it. But in the mind of the consumer, it is hard to tell whether the targeted ads are the result of the Web publisher giving access to some of my personal data or whether it is done by using technologies such as cookies.
Targeted ads can now go one step further with localisation. Using the GPS embedded in smartphones, or using the cellular or Wi-Fi network, it is possible to know where a mobile or a tablet is located. Using localisation for targeting requires the user's consent (opt-in). A user who has opted in to such a service will receive information from brands offering rebates or promotions when he or she enters a mall. Several startups are developing indoor localisation technologies to be able to target a user's location very precisely, e.g. when entering a specific shop or even when facing a specific shelf, say sodas or cereals, to be able to send or display the right promotion at the right time.
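As a rough illustration of how such an opt-in location trigger might work, here is a minimal sketch; the geofence coordinates, user record and promotional message are hypothetical, and real indoor positioning is far more involved:

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS84 points."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    MALL = (48.8566, 2.3522, 150.0)   # hypothetical geofence: lat, lon, radius (m)

    def maybe_send_promotion(user, lat, lon):
        # Respect the opt-in: no consent, no location-based message at all.
        if not user.get("location_opt_in"):
            return None
        g_lat, g_lon, radius = MALL
        if haversine_m(lat, lon, g_lat, g_lon) <= radius:
            return f"Hi {user['name']}, 20% off today at the mall!"
        return None

    print(maybe_send_promotion({"name": "Ana", "location_opt_in": True}, 48.8567, 2.3520))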
In all cases, what the industry should put first is the trust of consumers, because in the long run everything relies on the fact that consumers trust the companies they're trading with, or they're involved with, to make fair use of the data they leave behind, again explicitly or implicitly. Therefore, data protection is also a major issue: beyond opt-in and sensible use of the data, we must protect personal data against accidental leakage or fraudulent access by unauthorised internal staff or hackers.
Big Data
Big Data is the new buzzword. It refers to the fact that it is now possible to store huge amounts of data and, with the help of new types of databases and query tools, to access new levels of customer knowledge, behaviour patterns, etc. It is now possible to accumulate data that were previously not accessible and/or not stored. Take every single purchase by every single loyalty card holder in the retail business, or the minute-by-minute location of a car equipped with a connected GPS. This challenges one major principle of European privacy regulation: that there is a specific purpose to each database.
Big Data relies on the assumption that data should be stored for the sake of finding useful information in the future, by analysing them with these new, powerful tools.
To protect privacy while still being able to tap into the potential of Big Data, the link with the individual associated with the data needs to be cut. To do this, one solution is to apply a two-stage process to the data. The first stage is to anonymise the data by changing the identifier of each user into an alias, so that there is no way to trace back the individual linked to the data. In the second stage, the data are aggregated so that no individual data are stored any more. Some form of Big Data is already used today to model road traffic and find out where there are traffic jams, for real-time alert purposes or for the analysis and planning of public transport and roads. The root information is the individual movements of cars, which can be deduced from the movements of all SIM cards in a given territory, or from all connected GPS devices.
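A minimal sketch of this two-stage idea, with made-up trip records and zone labels; in practice both the aliasing and the aggregation would have to be engineered far more carefully to resist re-identification:

    import secrets
    from collections import Counter

    def pseudonymise(records):
        """Stage 1: replace each user identifier with a random alias.
        The mapping table is discarded, so records can no longer be
        traced back directly to an individual."""
        aliases = {}
        out = []
        for user_id, zone in records:
            alias = aliases.setdefault(user_id, secrets.token_hex(8))
            out.append((alias, zone))
        del aliases                      # drop the re-identification table
        return out

    def aggregate(pseudonymised_records):
        """Stage 2: aggregate, so no per-individual rows are stored at all."""
        return Counter(zone for _, zone in pseudonymised_records)

    trips = [("+3361234", "A4-east"), ("+3367777", "A4-east"), ("+3361234", "ring-north")]
    print(aggregate(pseudonymise(trips)))   # Counter({'A4-east': 2, 'ring-north': 1})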
The use of Big Data will go much further in the future and will trigger new privacy questions. We see that an increasing number of mobile applications and devices offer to monitor simple biological variables (pulse, weight, body temperature, blood pressure) and lifestyle variables (distance walked every day, physical activity, food/calories eaten). When speaking of private data and privacy, health-related data are paramount; as a consumer, I consider them strictly private.
In fact, all these data are also likely to be accumulated over time and will open new possibilities in the future. A user of such a service might receive an alert because his daily values are leaving the normal range defined by analysing the mass of data available from all users. In that world, I am warned that I might be sick before even feeling sick and experiencing symptoms. Several startups have already started to produce such devices, some of them for casual use and some with real, critical health applications in mind. Some have already said that they will never sell individual data…
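A toy version of such a population-based alert might look as follows; the readings and the three-sigma threshold are illustrative assumptions, not a medical rule:

    import statistics

    # Flag a user whose daily resting pulse deviates strongly from the
    # range observed across all users of the (hypothetical) service.
    population_pulse = [62, 58, 71, 66, 74, 60, 69, 65, 63, 70]   # bpm

    def alert(todays_value, population, threshold=3.0):
        mu = statistics.mean(population)
        sigma = statistics.stdev(population)
        z = (todays_value - mu) / sigma
        return abs(z) > threshold    # True => outside the "normal" range

    print(alert(96, population_pulse))   # True: warn before symptoms appear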
Identity
Identity is closely linked to privacy. We know that there is a big need for secure identity in the digital world. I need to identify myself to look at my bank account and to transfer money from my account to another account. I want that identifier to be secure enough that it is not possible for fraudsters to access my personal data or, worse, my savings. And even if my identifier is compromised (lost, stolen), I want my bank to be able to detect (authenticate) that the person using my identifier is not me. This often requires some overhead for the user, such as changing passwords regularly, choosing secure passwords, or even using a temporary password received on the mobile phone or generated by a dedicated device, rather than a simple, unique, permanent password. So we must educate our customers to protect their privacy and to accept that protection may add some complexity to their daily online life.
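One widespread way such temporary passwords are generated is the time-based one-time password (TOTP) scheme of RFC 6238, used by many banks and authenticator apps; a minimal sketch (the Base32 secret below is a placeholder):

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
        """Derive the current one-time code from a shared secret (RFC 6238)."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // period             # time-step number
        msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
        mac = hmac.new(key, msg, hashlib.sha1).digest()  # HMAC-SHA1 per RFC 4226
        offset = mac[-1] & 0x0F                          # dynamic truncation
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Placeholder secret; bank and device would share a real one at enrolment.
    print(totp("JBSWY3DPEHPK3PXP"))   # e.g. '492039', valid for ~30 seconds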
Identity also provides a way to increase the trust in digital services while
keeping privacy.
Conclusion
We have tried to illustrate how personal data, privacy and identity are at the core of the digital lifestyle. At the heart of most business models for free online services is customer knowledge, and careful use of customer data improves relevance and performance. Mobiles and tablets are increasing usage levels and extending access to services everywhere and all the time. The ability to locate users creates additional targeting possibilities. Big Data is a new frontier where previously untapped data sets are used to increase customer knowledge and create value for businesses as well as the public sphere. All this has to be done while letting our customers control their privacy and protect their data against unwanted use. Trusted digital identity is needed to adequately protect users against fraudulent access.
Pierre-Emmanuel Struyven is VP Innovation and New Markets at SFR. He joined SFR in 2009. Prior to SFR, he was CEO of Streamezzo, a startup active in application software development for mobile devices. Pierre-Emmanuel spent five years with the world's largest music company, Universal Music, heading operations, product marketing and business development for the group's Mobile subsidiary and Digital division. There he was involved in mobile and digital music business development, with a focus on new mobile distribution channels, customer experience, product development and innovation. Previously, he was director of Product Marketing in the telco industry, and held various positions in IT. He graduated from Université Libre de Bruxelles (École Polytechnique).
talented artists.5 Big Data also has many applications in optimising processes. SAS reports that it helped a major consumer electronics group drastically reduce fraud by combining information efficiently across the company's IT system.6
The effectiveness of Big Data comes not only from the quantity of information, but also from the quality of the data and how well it is curated. Very often, the Big Data approach requires that data sets from different sources, often private ones, be allowed to enrich each other. This may require being able to recognise the data subjects in order to merge the data sets. Companies may need to access data from third parties, e.g. business partners or customers, and integrate them with their own data, pooling them together to obtain data with greater analytical properties. How this can be organised in full respect of data protection rules is not a trivial question.
Big Data can be combined with cloud computing and the Internet of Things to deliver its full potential. With cloud computing, data can be gathered through the Internet and stored in servers with enormous capacities, where they can be analysed effectively, possibly thanks to supercomputers. This makes it possible to collect vast amounts of data without being limited by the capacity of local computers and servers. With the Internet of Things, objects can be equipped with sensors and transmit a series of data that are then collected and analysed. This makes it possible to improve the reliability of machines, for instance by monitoring the durability of parts and being able to change them before an incident occurs, thus reducing maintenance costs. For instance, US manufacturer John Deere uses sensors added to its latest equipment to help farmers manage their fleet, decrease the downtime of their tractors and save on fuel. The information is combined with historical and real-time data regarding weather predictions, soil conditions, crop features and many other data sets.7 Connecting objects also makes it possible to track the performance and usage of products and to collect data about how they behave and how they are actually used by customers. Thanks to data analytics, it is then possible to improve user-friendliness and the customer experience, and to invent new applications too.
reveal much if they remained separate can suddenly be made very telling if they are combined and analysed on the basis of broader statistical evidence. The company Target, for instance, became known after an article in The New York Times8 revealed how it could identify a pregnant woman, in that case a teenage girl, without her parents knowing about the pregnancy. The statistician at Target explained that, through Big Data, the company had developed statistical indicators about purchasing patterns that made it possible to identify a pregnant woman and to propose targeted products to her before she gave birth. This anecdote shows what makes Big Data so sensitive: it can find out your inner secrets without you necessarily wanting to share them.
In broader terms, technology raises challenges about the control over data and its applications. A key principle of data protection regulation in the EU is the notion of consent, which is one of the legal grounds that allow the processing of personal data. EU legislation specifies that data must be collected for "specified, explicit and legitimate purposes" and not further processed in a way incompatible with those purposes. However, it is not clear whether this is easily delivered in the case of data generated automatically and processed through data-mining technologies, as it is not obvious what the ground for processing may be ex ante. Similarly, in the case of innovative mobile applications based on the Internet of Things, the nature of the data, and whether it is personal or not, may be contested.
To make things even more complicated, one should also underline that privacy itself may be evolving in society at large. With the development of social media, individuals are increasingly becoming public figures on the Internet. The Eurobarometer survey showed that 74% of Europeans think that disclosing data is increasingly part of modern life, but at the same time 72% are worried that they give away too much personal data.9 Interestingly, Febelfin, the Belgian Federation of Financial Institutions, hired an actor to pose as a mentalist when he was actually only taking advantage of information people had posted on the Internet, revealing to them how their privacy could be infringed through their own actions of sharing it online.10 For that reason, any solution to the novel privacy issues of the digital age must include users in the equation.
8. Charles Duhigg, "How companies learn your secrets", The New York Times, 6 February 2012, www.nytimes.com.
9. Special Eurobarometer 359, "Attitudes on Data Protection and Electronic Identity in the European Union", June 2011.
10. http://www.febelfin.be/fr/partager-des-informations-sur-Internet-cest-sexposer-aux-abus.
the OECD principles (and can be traced back to the 1953 European Convention on Human Rights):
- Notice: data subjects should be given notice when their data is being collected.
- Purpose: data should only be used for the purpose stated and not for any other purposes.
- Consent: data should not be disclosed without the data subject's consent.
- Security: collected data should be kept secure from any potential abuses.
- Disclosure: data subjects should be informed as to who is collecting their data.
- Access: data subjects should be allowed to access their data and make corrections to any inaccurate data.
- Accountability: data subjects should have a method available to them to hold data collectors accountable for following the above principles.
The data protection reform has set a series of objectives. A reinforced "right to be forgotten" was proposed to help people better manage data protection risks online: people will be able to delete their data if there are no legitimate reasons for retaining them. Wherever consent is required for data to be processed, it is proposed that it be given explicitly, rather than assumed as is usually the case now. In addition, people will have easier access to their own data and will be able to transfer personal data from one service provider to another more easily. There is also increased responsibility and accountability for those processing personal data: for example, companies and organisations must notify the national supervisory authority of serious data breaches as soon as possible (if feasible, within 24 hours). People will be able to refer cases to the data protection authority in their country when they are victims of a data breach or when rules on data protection are violated, even when their data is processed by an organisation based outside the EU. In addition, it is proposed that EU rules will apply even if personal data is processed abroad by companies that are active in the EU market. This will give people in the EU confidence that their data is still protected wherever it may be handled in the world.
However, the proposal has led to a heated debate. In particular, a number of private sector companies warned that, while they shared the objectives of the proposal, the solutions developed to meet them raise a series of difficulties linked with their suitability for online business. The worry of many Internet companies, and of companies involved in cloud computing and Big Data, is that this new framework will make a number of existing business models difficult to operate and will create new rigidities for business, with risks of very high fines in case of non-compliance having the potential to freeze experimentation and innovation. For instance, the obligation to ask for consent before analysing data, as well as the prohibition of profiling, have been raised as major hurdles.
The American Chamber of Commerce to the EU (a good proxy for the position of US digital businesses) underlined that making explicit consent the norm "will inhibit legitimate practices without providing a clear benefit to data subjects". It believes that profiling techniques per se do not need special regulatory treatment given the many safeguards in the draft Regulation: "At a minimum, the Regulation should make clear that the restrictions on profiling do not extend to beneficial activities such as fraud prevention, service improvement, and marketing/content customization."11 Another issue is the right to be forgotten. Among others, Facebook criticised the proposal, saying that it raises major concerns with regard to the right of others to remember and to freedom of expression on the Internet. They also pointed to a risk that it could result in measures which are technically impossible to apply in practice and therefore make for bad law.12
To date, the Council has not managed to agree on a negotiating mandate. One of the most contentious issues is whether a one-stop shop would be maintained or whether national governments would want to keep national regulators for the activities on their territories. Potentially, this could heavily burden compliance and defeat the objective of creating a single market for data. In Parliament, more than 4,000 amendments to the text were proposed through various committees. Discussions were very heated, but on 21 October 2013 the lead committee (LIBE) managed to adopt its report (prepared by MEP Albrecht). A series of elements is introduced, notably the notion of pseudonymous data, which it proposes to subject to a lighter framework. The definition of consent is slightly relaxed, as statements and actions are included as proof of consent. New provisions are introduced for international data transfers: notification of data transfers or disclosures, and a proposal that all adequacy decisions (such as the Safe Harbour decision with the US) expire five years after adoption of the regulation. The report also proposes to introduce a European Data Protection Seal, and increases sanctions in case of non-compliance to €100 million or five percent of worldwide turnover.
The processing of health data for research, statistical or scientific studies is still authorised, but data controllers would have the obligation to obtain
11. "AmCham EU position on the General Data Protection Regulation", 11 July 2012, American Chamber of Commerce to the European Union, Avenue des Arts/Kunstlaan 53, 1000 Brussels, Belgium, https://dataskydd.net/wp-content/uploads/2013/01/AmCham-EU_Position-Paper-on-Data-Protection-20120711.pdf.
12. http://thenextweb.com/facebook/2012/11/20/facebook-proposed-eu-right-to-be-forgotten-raises-major-concerns-over-freedom-of-expression-online.
consent from the data subject. Last but not least, the Albrecht report proposes to get rid of the notion of a "right to be forgotten" and replaces it with a "right to erasure". In addition, the Commission's proposal already restricted it in some cases, for instance when the data are needed to exercise freedom of expression, for public interest in public health, for historical, statistical and scientific purposes, or when required by law.
It is also worth noting that the Commission published a Communication on the international transfer of data as a consequence of the new challenges highlighted by the Snowden revelations.13 The studies found a series of shortcomings, in particular companies that had wrongfully declared themselves listed under the scheme but were not, or that did not correctly publicise the principles. Some companies also did not implement the principles in their actual corporate policies.
Finally, reliance on self-certification led to an inadequate level of enforcement. Furthermore, the large-scale access by intelligence agencies to data transferred to the US by Safe Harbour-certified companies raises additional serious questions regarding the continuity of the data protection rights of Europeans when their data is transferred to the US. On that basis, the Commission called for improvements to the Safe Harbour decision, notably in relation to enforcement by the US authorities and the obligations of private companies.
not contain provisions on pseudonymous data. However, the new Article 4(2a) proposed by the Parliament sets out the definition of pseudonymous data, which means: "personal data that cannot be attributed to a specific data subject without the use of additional information, as long as such additional information is kept separately and subject to technical and organisational measures to ensure non-attribution".15 Pseudonymous data would enable some further processing without needing new consent, which would benefit Big Data analytics. Transforming data in such a way should be possible if the data is of sufficient quality and recognisable. If this data were to be de-anonymised, for instance by combining several data sets, the lawfulness of the processing would, however, still depend on the presence of consent.
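To illustrate the definition, here is a minimal sketch of pseudonymisation in which the "additional information" is a secret key held apart from the records; the key, identifiers and record are purely hypothetical:

    import hashlib, hmac

    # The secret key is the "additional information" of Article 4(2a): as
    # long as it is stored separately, whoever processes the records below
    # cannot attribute them to a specific data subject.
    SECRET_KEY = b"kept-in-a-separate-system"   # hypothetical key management

    def pseudonym(user_id: str) -> str:
        return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

    record = {"user": pseudonym("alice@example.com"), "page_views": 42}
    print(record)   # the same user always maps to the same pseudonym,
                    # which keeps the data usable for analytics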
Another opportunity could result from clearly separating the collection of consent from further processing, as suggested by the Article 29 Working Party (WP29) opinion 03/2013 on purpose limitation. Data analysis is not problematic as long as the data controller does not handle personal data. So data about machines, for instance, should be processed without difficulty. Once the collection of personal data occurs, data protection rules have to apply, and either consent or the allowed exceptions may be relied on for further processing. The WP29 emphasises that the specific provision in Article 6(1)(b) of the Directive on further processing for historical, statistical or scientific purposes should be seen as a specification of the general rule, while not excluding that other cases could also be considered not incompatible. This gives a more prominent role to different kinds of safeguards, including technical and organisational measures for functional separation, such as full or partial anonymisation, pseudonymisation, aggregation of data, and privacy-enhancing technologies.
Thirdly, privacy impact assessments may offer opportunities to identify risks and to provide remedies to novel privacy issues arising in new contexts such as Big Data or the cloud. These impact assessments could be conducted in a way that is transparent and coordinated with the authorities, to avoid an excessive burden on individual companies. They would offer a space for discussion about emerging risks, and for tailored solutions that could be flexibly amended over time.
Along the same lines, there is certainly scope to use technology in a better way, so that it delivers privacy by design. Too often, privacy is a secondary consideration, operating as a remedy to a technological problem. By integrating privacy from the outset in the technical specifications, it is possible to limit the legal barriers to data processing. For instance, the initiative around Do-Not-Track16 offers possibilities for users to stay in control of what data they share through their browsers. Mechanisms such as the automatic deletion of personal data by apps, or default restrictions on the transmission of personal data, can also be developed.
15. Compromise Article 4, available at http://www.edri.org/files/eudatap/04COMPArticle04.pdf.
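As a small illustration of honouring the Do-Not-Track signal, here is a sketch of a server that skips setting a tracking cookie when the browser sends the "DNT: 1" header; the cookie name and value are placeholders:

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        # Browsers with Do-Not-Track enabled send the header "DNT: 1",
        # which WSGI exposes as HTTP_DNT.
        dnt = environ.get("HTTP_DNT") == "1"
        headers = [("Content-Type", "text/plain")]
        if not dnt:
            # Only set a tracking identifier when the user has not opted out.
            headers.append(("Set-Cookie", "tracking_id=abc123; Path=/"))
        start_response("200 OK", headers)
        return [b"tracking disabled\n" if dnt else b"tracking enabled\n"]

    make_server("", 8000, app).serve_forever()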
Furthermore, there may be scope to support the development of compliance models akin to binding corporate rules to increase accountability at the level of the individual data controller, thanks to standardisation and certification processes. Binding Corporate Rules (BCRs) are internal rules (such as a code of conduct) adopted by a multinational group of companies which define a global policy for international transfers of personal data within the same corporate group to entities located in countries which do not provide an adequate level of protection. It may, however, be possible to normalise the processes that underpin BCRs and to develop standards that can be applied generally at industry level. This would create alternative mechanisms for businesses to comply with privacy requirements, through industry standards that could be validated by data protection authorities and maintained through specific audit procedures. Such an alternative model could offer additional flexibility and scope for innovation.
Finally, whatever solution is envisaged, it is essential to keep users in mind. User-friendliness is essential to deliver good results, and it is important that sound principles be translated into operational, feasible steps that can be mastered by users. The cookie regulation and its implementation in the Netherlands is instructive in that respect: while the spirit of the law was very commendable, the way it was implemented led to impractical and cumbersome processes, which created irritation among users and excessive consent requests. Using behavioural economics and the testing of technological solutions would therefore seem a useful development in the field of privacy and data protection. Here again, dialogue between industry and data protection regulators should be paramount.
Thibaut Kleiner is a member of the Cabinet of Vice-President Neelie Kroes, European Commission. Thibaut is a senior advisor in charge of Internet policies (privacy, Internet governance, media and data, etc.). He has been working for the European Commission since 2001, occupying a number of positions, notably in the field of competition policy, where he was head of a unit in charge of coordination, and was a member of the cabinet of Neelie Kroes during her previous mandate, where he notably supervised state aid (including during the banking crisis). An economist by training, he holds a Master's from HEC Paris and a Ph.D. from the London School of Economics.
Nicolas de Cordes
Personal data valorisation happens through a complex and vibrant value chain. Data is first collected through various means: mobile phone operating systems or their apps, computers, communication networks, social networks, electronic notepads, readers, smart appliances, smart grids, sensors, etc.
It is stored, aggregated, processed and then exchanged by Web retailers, Internet behaviour-tracking companies, search engines, electronic medical providers, identity providers, network operators, Internet service providers, financial institutions, utility companies, public administrations, etc.
Personal data typically have three different origins: volunteered, when users declare their interests and preferences; observed, through the monitoring of usage (browser history, consumption via credit cards or online shopping, search and localisation requests on maps, etc.); and finally inferred, or deduced, when algorithms and the crossing of different data sources create new attributes and profiles of users for different purposes.
The end users of personal data are usually the companies serving the individuals in the first place, but also third parties looking to commercialise the data or to serve their customers better by enriching their knowledge of them (companies, government agencies or public organisations), and increasingly the users themselves, who can reuse these personal data to improve the services they receive.
All told, the value created through digital identity and personal data can be massive. A BCG study estimated a 22% annual growth rate for business directly related to personal data, which could deliver a €330 billion annual economic benefit for organisations in Europe by 2020. Individuals would benefit to an even greater degree, as consumer value will be more than twice as large: €670 billion by 2020. The combined total digital identity value could amount to roughly €1,000 billion, a substantial fraction of EU-27 GDP. But, like many analysts, BCG estimates that two-thirds of this potential value generation is at risk if stakeholders fail to establish a trusted flow of personal data.
1. Transcription of the speech given during the Privacy seminar of Institut Mines-Télécom.
[Figure: trust rests on two pillars: fair and reciprocal exchanges (transparency), and respect of freedom (privacy, control).]
Trust is the basis of society: it brings social stability instead of insecurity and erratic behaviour; it brings forth economic development in place of a small, cash-based economy. Trust also enables law, bills of rights and democracy to prevail over the rule of the strongest.
But trust is slow to build. One major question concerning the use of Big Data, then, is: is trust at risk in digital societies? At present, 60% to 80% of people express a lack of trust in some form when talking about their personal data. A loss or lack of trust in the digital world is not a good sign for our society, which is going digital by default.
In a non-expert legal, philosophical or linguistic view of the question, trust is both a feeling and an attitude based on two major principles: one is respect for the freedom of the individual or party we are talking with; the other is to have a fair exchange, something that to your eyes seems fair and non-discriminatory. The principle of privacy is something very important that takes several different forms. One element of privacy is linked to the emergence of automation and algorithms, which are creating a world that is a little at risk of becoming algorithm-centric instead of user-centric. So privacy, because of its two critical applications, protecting citizens against abusive government and protecting consumers against unethical business, is central.
for 50% of users), and would have to be quite high for social network posts, medical records, financial data or credit card data (for all users, €50 wouldn't be enough). More than half of people would not agree to voluntarily give away very sensitive personal data, even with financial compensation.
What is going on leads us to the problem of privacy from a different angle. The concept comes from the behavioural sciences: during exchanges, it seems that people have two accounting, or value, systems operating in their minds. First there is the social accounting or value system: I give you something, you give me something, and we find it fair. Second is the economic accounting or value system: if I give you one euro, I expect to receive one euro, or one euro plus something. This sort of balancing act really takes place in our heads, and the behavioural sciences teach us that we cannot mix those two value systems. Imagine you give a nice book as a gift to someone and say: "Here is something I found for you, I like what you read, I thought this book would be really interesting, and by the way, there was a discount and I got it for 10 euros only…" Take another example: you are at a family dinner, your stepmother has made a fantastic lunch, everybody is happy, everything is really nice; now you take out your wallet and say: "Dear stepmother, it is so fantastic I will pay you 150 euros for this dinner…" That doesn't really fly. Or if you ask your neighbour to help you carry and transport some very heavy thing, he will be happy to do it if there is just the one thing and he is on his own with you. But if at the same time you have your movers around, whom you pay, and you ask him for help (without him being paid, obviously)… it doesn't work. In these three examples, we are mixing social exchanges and monetary exchanges.
What is true between people is also true between companies. Imagine a bank that says: "We are a family business, we love our customers, everybody is part of a big family." Then, if you have a payment problem and go to see your banker, what you would expect to hear is: "Oh, I'm terribly sorry for what's happening to you. What can I do? Let's stop the reimbursement process immediately. Come back whenever you are ready." You would expect this kind of friendly, family attitude. But the banker would more likely say: "It is annoying that you are only halfway through your loan; maybe I can lend you a bridge loan, and maybe I can take a mortgage on your car." The two systems obviously cannot be mixed.
The evolution of the concept of monetising personal information is confronted with that problem. A first exchange of information takes place for one purpose; then this information is repackaged, and when it is repurposed for another usage which involves money, this clearly touches the two very sensitive accounting systems we have in our heads. This is one of the reasons why we are stuck with the personal data problem. Reconciling the monetary and social aspects of the value system is really important, and it may turn out to be less about exchanging monetary value or giving people a share of the deal, and more about giving users the opportunity to have some level of control. What I mean is that we absolutely need to respect the context and the personal intentions, and this is a particularly difficult problem.
This analysis can be seen as a sort of basic level, a background context for looking a little further into the future.
Privacy scenarios
When we talk about scenarios, we typically look at trends, then we see what the common agreement is about things, what the variables are, and what could go in one direction or the other. This usually requires a lot of thinking and brainstorming. Here is a scenario landscape, with four quadrants:
- Upper left, "Patchwork": creative silos, a world of Mini Data and alliances; users assemble services that work together, linking them through IDs.
- Upper right, "Customer is King": a user-centric permission Web architecture; personal data lockers used as foundations for ergonomic personal services.
- Bottom right, "Arm wrestlers": strong policies and law enforcement create a user counter-power; users manage their IDs and privacy settings.
- Bottom left, "Big Brother likes you": an oligopoly of GAFA + IDs; large commerce platforms drive the digital world with a 360° view of their customers.
On the horizontal axis is the possible evolution of user control over personal data, with respect for context and all other safeguards, and forms of control which might not exist yet, or are starting to exist on a small scale. On the other axis is something we haven't talked much about: commercial power and the structure of competition. Do we have a market concentrated on a few very powerful actors, or do we have a more fragmented situation with many small players?
So we have a sort of supply-and-demand landscape for privacy scenarios. We imagined scenarios in that context, privacy laws on one side, competition laws on the other, and they are linked. These types of scenarios are very simplistic views, landscapes; they show the extreme options.
On the bottom left, we find the "Big Brother likes you" box. This is a rather frightening sort of future. To some extent, there is a natural gravity, driven by the power of the platforms and the power of the economics, which tends to drag the system down into that box. There is a natural monopoly in the way the network operates that naturally tends in that direction. This is why we need to move away from that box, by educating people, by making regulations evolve and by finding ways to go elsewhere. Unless we are happy with a Big Brother scenario… Going elsewhere is a choice of society. I'm not going to promote Orange's opinion on cultural values and social decisions. Which direction we take is a political decision, to be taken by us as citizens. However, Orange has an opinion about what would be good for its own business. And we don't much like this extreme bottom-left corner, where all the information ends up in a situation which is not good for the global dynamic of the ecosystem. We obviously prefer an environment with a balanced view of the different actors, where the user has more of a say, because that will be a better basis for creativity.
We know there are a lot of regulatory levers, things that a regulator can do, from bills of rights to political debate about automated actions, trust-based standards, the emergence of third parties, privacy by design, and so on. These are all levers we could use; these are indeed the tools under discussion for moving the position towards a more favourable scenario.
To sum up
We need to increase our chances of evolving in the right direction, towards a better balance of power between users, companies and institutions. A regulatory and policy toolbox could use many approaches to that effect:
- Propose a Bill of Digital Rights fit for the 21st century.
- Organise the political debate about discriminatory automated profiling.
- Encourage private trust-based standards (charters).
personal shopping experience. When he goes out of the shop, he automatically goes back to a safer level. When he enters the shop, he can agree to say, for instance: "I will give you my details about sizing because I would like to see only things that fit me", and only sizing details would be disclosed.
We set up a data governance board that really helps the company manage and define its approach, to ensure appropriate governance for the protection of privacy and personal data in line with Group strategy, while fostering the focused development of new business opportunities around personal data.
Conclusion
Finally, some food for thought. This is what we do at Orange at present, but we are also exploring the future, and we are confronted with big questions, as we all are. The first thing, which is also my personal belief: Big Data is a necessity. We may like it or not, but the problems the world is confronted with are too big for us to just let them happen. We need to use every resource available to humankind to help solve the big crises we are potentially facing. And that means using Big Data.
One of the virtuous uses of Big Data is shown in the results of a contest that we launched in 2013, called Data for Development (D4D). We released network information, statistically anonymous information, to the research community. We asked them to try to find out how to use this Big Data to help Ivory Coast society work better and develop. In the upper left corner, we can see…
such a scenario can be imagined only for a silo system, while in today's interconnected world privacy is threatened by the consolidation of various data sources. This happens, for example, within the scope of a single large data controller covering multiple service domains (e.g. Google search, Gmail, Google+, and YouTube), where the domain-specific personal profiles are consolidated into a unified personal profile. More generally, a privacy breach can happen through the re-identification of anonymised data by crossing data sources from completely different public or private domains.
A well-known example is provided by the de-anonymisation attack on a Massachusetts hospital discharge database, achieved by joining it with a public voter database.2 It had been naively believed that the simple removal of explicit user identifiers such as name, address, phone number or social security number would be sufficient to maintain the users' confidentiality within the disclosed personal data records. Very often, however, the remaining data can be used to re-identify the subjects of data records by matching them with other data sources. This phenomenon is further amplified by the emergence of Big Data.
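A minimal sketch of such a linkage attack, with made-up records: the quasi-identifiers (ZIP code, birth date, sex) survive the removal of names and suffice to join the two sources:

    # "Anonymised" hospital records: explicit identifiers removed, but the
    # quasi-identifiers remain.
    hospital = [
        {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
        {"zip": "02139", "dob": "1962-02-13", "sex": "M", "diagnosis": "asthma"},
    ]

    # Public voter list: names attached to the same quasi-identifiers.
    voters = [
        {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    ]

    def link(hospital, voters, keys=("zip", "dob", "sex")):
        """Re-identify records by joining the two sources on quasi-identifiers."""
        index = {tuple(v[k] for k in keys): v["name"] for v in voters}
        for rec in hospital:
            name = index.get(tuple(rec[k] for k in keys))
            if name:
                yield name, rec["diagnosis"]

    print(list(link(hospital, voters)))   # [('J. Doe', 'hypertension')]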
issues. Namely, it has been demonstrated that an adversary who knows only a little about an individual subscriber can easily identify the subscriber's record in the dataset. Using the Internet Movie Database (IMDb) as the source of background knowledge, the researchers successfully identified the Netflix records of known users, uncovering non-public information about their political or even sexual preferences. This illustrates once again that, today, an individual is no longer the master of his or her privacy-versus-disclosure balance, as would be the case in a hypothetical silo information system.
targeted advertising. Much of telcos' personal data is collected by permission-based methods and is more accurate, although it is actually underutilised.5 So telcos could better leverage their subscriber data while maintaining and reinforcing their trusted relationship. To that end, they need to develop more personalised services by breaking down their internal silos. This will improve the customer experience and increase customer stickiness. Furthermore, telcos have the opportunity to take on the role of a trusted intermediary between the end user and the other actors of the personal data ecosystem. They can build personal data vaults or identity broker services which give control of the personal data to its owners (the end users) while enabling the operation of the ecosystem. Such solutions need to be heavily supported by privacy-preserving data mining and privacy-preserving personalisation technologies.
Privacy-preserving analytics
Privacy-preserving analytics refers to methods which allow exploring data and exploiting its utility without unveiling sensitive information. For example, a processing entity that carries out a computation (evaluating a specific function) over some input data should not be able to discover sensitive information contained in the data sources. As we discussed earlier, in cases where user identification is considered sensitive, simply removing the explicit identifiers from the data sources is not sufficient to ensure this privacy-preserving property.
One of the well-known approaches to this problem is homomorphic encryption, an encryption scheme which allows certain algebraic operations, such as addition and/or multiplication, to be carried out directly on encrypted data; see e.g. below.6 The source data are encrypted with private keys available only to the sources, so that their communication to a function-computation entity does not disclose any information. The latter, however, is able to carry out the operation on encrypted inputs and then sends back the results, so that each source node can decrypt and discover the correct value of the computed function. These cryptographic techniques are in general heavyweight and imply a significant computational overhead, which can be an obstacle to practical deployments.
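As an illustration of the additive case, here is a toy implementation of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The tiny hard-coded primes are for demonstration only and offer no security:

    import math, random

    def keygen(p=499, q=547):
        """Toy Paillier key generation (tiny primes, demo only)."""
        n = p * q
        lam = math.lcm(p - 1, q - 1)   # Carmichael's function for n = p*q
        g = n + 1                      # standard simple choice of generator
        mu = pow(lam, -1, n)           # modular inverse of lambda mod n
        return (n, g), (lam, mu)

    def encrypt(pub, m):
        n, g = pub
        r = random.randrange(1, n)
        while math.gcd(r, n) != 1:     # r must be invertible mod n
            r = random.randrange(1, n)
        return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(pub, priv, c):
        n, _ = pub
        lam, mu = priv
        x = pow(c, lam, n * n)
        return (((x - 1) // n) * mu) % n   # L(x) = (x - 1) / n

    pub, priv = keygen()
    c1, c2 = encrypt(pub, 123), encrypt(pub, 456)
    c_sum = (c1 * c2) % (pub[0] ** 2)          # multiply ciphertexts...
    assert decrypt(pub, priv, c_sum) == 579    # ...to add the plaintexts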
A different direction actively explored in the research community is the approach of statistical perturbation, which procures formally provable guarantees, so-called differential privacy.
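Assuming the guarantee in question is differential privacy, the canonical perturbation is the Laplace mechanism: add noise calibrated to the sensitivity of the query. A minimal sketch for a counting query, with an illustrative dataset and privacy budget:

    import random

    def laplace(scale: float) -> float:
        """Laplace(0, scale) noise: the difference of two exponentials."""
        return random.expovariate(1 / scale) - random.expovariate(1 / scale)

    def private_count(records, predicate, epsilon=0.5):
        """Release a counting query under epsilon-differential privacy.
        A count has sensitivity 1 (one person changes it by at most 1),
        so Laplace noise of scale 1/epsilon suffices."""
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace(1.0 / epsilon)

    # Hypothetical dataset: user ages; query: how many are over 40?
    ages = [23, 45, 31, 52, 67, 29, 41]
    print(private_count(ages, lambda a: a > 40, epsilon=0.5))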
5. "Telcos: Leveraging Trust Through Privacy Management", OVUM report IT012000074, May 2013.
6. Daniele Micciancio, "A first glimpse of cryptography's Holy Grail", Communications of the ACM 53(3), p. 96, March 2010.
Stéphane Lebas
One of the key concerns comes from the fact that the best way to create value is to share value, and to share it, of course, with end users. There is a lot of value in personal information and Big Data, but to start to unleash this value, you need to share it.
To go further in that direction, we can identify some startups that even promote "Privacy + Data Monetisation" services:
- www.yesprofile.com
- www.moneyformydata.com
But even so, the question remains the same: how do I minimise the impact of these requirements on the customer's experience? By having a very small privacy section, by putting the opt-out button in a hidden place… The real question is to switch from "what is the impact on the user experience" to "what could be the benefit in terms of user experience".
Large corporations need to see privacy not only as something they have to comply with, but in terms of benefits.
It could also be seen as a way to avoid the debate around sharing the value; maybe they have to focus on user experience. Not just on recommendations, but on helping users to understand what they are really doing on the Internet and on allowing them to enjoy safe browsing.
Privacy's paradox
From the user-centric perspective, customer privacy meets an unexpected paradox. When you ask Internet users: "Do you feel that your personal information, reputation and privacy are at risk on the Internet today?", 90% of respondents answer "Yes".
Of course, it is not people's first concern.
For 56% of users, they themselves should be responsible for managing and guarding online privacy.
Only 46% of people answer that the private companies that store data (social networking sites, databases, blog platforms, etc.) should be responsible for privacy (source: 123people online customer survey, 2011).
BUT: to use a smartphone or download an app, absolutely everyone validates Terms & Conditions that carry privacy implications, without reading them…
Another study says that 50% of users have never once in their life read Terms & Conditions.
If you would like a clear demonstration of that, just read all the Terms & Conditions you have to accept in order to use your mobile device…
If you look at what you allow Twitter to do on your Android device, it is scary: it is allowed to obtain your location (GPS and network), to modify or delete content on your mass storage, to have full access to your network connections, to modify your call logs, to see your contact details, to modify your contact details, and so on.
The truth is that today most people just don't care when they validate such terms and conditions, so our job is to convince the end user and the service provider that there is value in changing this situation…
At the end of the day, it is mainly a question of user experience: when I want to download an app, I want a direct download; I don't want to read terms and conditions. It is the same when I activate my smartphone. Today, service providers and handset manufacturers are using user experience against privacy regulations. We should definitely reverse that situation.
data or acknowledges the data, he his totally in control of his data. After three
clicks of customer consent the data is transmitted to the service provider. The
user is really in the middle of the relationship between the third party and
telco, and he has full control over his privacy and the content of the data (he
can modify it).
From a technical perspective, the pilot was based on an open API approach, a "code is law" approach, and an open authentication mechanism. It was of course fully compliant with French regulations.
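To make this consent flow concrete, here is a minimal sketch of a consent-gated data release, written in Python. It is an illustration only: the class and method names (DataRequest, CustomerVault, receive_request, review, consent) are invented for this example, and the actual pilot relied on standard open-authentication mechanisms rather than on code of this kind.

from dataclasses import dataclass, field

@dataclass
class DataRequest:
    third_party: str            # service provider asking for the data
    fields: list                # e.g. ["address", "phone_number"]
    status: str = "pending"     # pending -> reviewed -> released

@dataclass
class CustomerVault:
    data: dict
    requests: list = field(default_factory=list)

    def receive_request(self, request):
        # Click 1: the customer is notified of the incoming request.
        self.requests.append(request)

    def review(self, request, corrections=None):
        # Click 2: the customer inspects, and may modify, the exact data.
        if corrections:
            self.data.update(corrections)
        request.status = "reviewed"

    def consent(self, request):
        # Click 3: only now is the data transmitted to the third party.
        assert request.status == "reviewed"
        request.status = "released"
        return {k: self.data[k] for k in request.fields}

vault = CustomerVault(data={"address": "1 rue X", "phone_number": "+33 6 00 00 00 00"})
req = DataRequest(third_party="acme-insurance", fields=["address"])
vault.receive_request(req)
vault.review(req, corrections={"address": "2 rue Y"})  # the user edits the data
payload = vault.consent(req)                           # data released to the third party

The point of the design is that no data leaves the vault before the review step, which is exactly what puts the user in the middle of the relationship.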
To conclude, here are three recommendations to unleash the value of privacy features:
1. Better leverage user experience, to educate customers about privacy on the one hand and about the benefits of data sharing on the other.
2. Give control and value back to customers in order to onboard them.
3. Maybe use a massive security and privacy breach in the coming years as a tipping point? This may be the only way for mass-market users to become fully aware that privacy is a critical concern for everyone.
Stéphane Lebas is Product Marketing Director in charge of Applications and Smartphone Services within SFR. He has been working for many years on Location Based Services and API exposure to third parties, with a focus on Privacy and Customer Data Management. Since 2010, he has also been building new activities around SFR network Big Data analytics.
of interaction between the banks and their customers,5 and this also intensifies the relationship with their clients. As the total number of interactions has almost doubled, essentially because of these new channels, when people come to branches or call their relationship manager they expect a higher quality of advice, which in turn improves the quality of the relationship with the advisor.
In parallel, from the usage of electronic cards to the categorization of expenses and the digital applications developed in the last few years, there has been an explosion of data creation within retail banking activities, derived from client usage, which could be leveraged in a win-win configuration. The same will soon happen on a larger scale for insurance activities, with the emergence of new measurements, from sensors installed in cars or at home to mobile health trackers, which will allow the launch of new services and offers to individuals and corporations thanks to the intensive use of data collection and analysis.
They are also likely to request more such services, even if they have to share more information and personal data in the process.
Of course, there is the question of technical feasibility in terms of confidentiality, analysis and data rendering. Personal authentication and data reliability are prerequisites for the correct use of personal data in a manner that benefits both the individual and the business providing the service. For example, in the banking industry, there is strict regulation of the account-opening process (Know Your Customer procedures, KYC) and of suspicious transactions: you need to ensure that reliable data, or a set of proofs, is collected to guarantee the correct level of customer service.
There is also a real question about which kinds of third parties can be trusted today, in the context of the Snowden revelations about both governments and technology giants. Some forces will probably push in favor of more independent data vaults. Also likely to emerge are new forms of protection, equivalent to the so-called electronic vault, designed to avoid data loss and to prevent this data from being seen or compromised by corporations or governments.
The concept of Vendor Relationship Management (VRM, as opposed to CRM, Customer Relationship Management), presented in Doc Searls's book The Intention Economy: When Customers Take Charge,13 is probably one of the most advanced visions on the subject, and it is becoming more and more relevant in a context where new independent third parties will have to be created. The idea is to develop tools that help people make better decisions regarding their service providers and manage those relationships. The tools could be used to issue Requests For Proposal (RFPs) for services and invert the bargaining power between corporations and individuals.
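As a toy illustration of this inversion (all names are hypothetical; Doc Searls's book describes the concept, not this code), the individual broadcasts a request and the vendors compete for it:

def issue_rfp(intent, vendors):
    # The individual broadcasts a personal RFP and collects the offers.
    offers = [vendor(intent) for vendor in vendors]
    # The individual, not the vendor, ranks the responses.
    return min(offers, key=lambda offer: offer["monthly_price"])

def telco_a(intent):
    return {"vendor": "telco_a", "monthly_price": 25.0, "data_gb": intent["data_gb"]}

def telco_b(intent):
    return {"vendor": "telco_b", "monthly_price": 19.9, "data_gb": intent["data_gb"]}

best = issue_rfp({"service": "mobile_plan", "data_gb": 10}, [telco_a, telco_b])
print(best)  # the cheapest offer matching the customer's stated intent

The design choice worth noticing is that the customer's stated intent, not a vendor's customer file, drives the transaction.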
127
personal data, like movie ratings, that will become a standard for the creative
industries.
We also took as a specific example the interest in experimenting with ideas for cross-fertilisation of data collection and analysis between the culture and tourism industries, in partnership with private actors. In the summer of 2012, the Côte d'Azur Regional Tourism Committee (CRT) and the telecommunications operator Orange conducted a pilot experiment intended, firstly, to quantify and model the presence and movements of visitors in the Côte d'Azur region using data collected from their mobile phones, and secondly, to extract the meaning hidden in this data to facilitate and industrialise decision-making in managing the region's tourism offer. Each year, nearly 77 million foreign tourists visit France. The tourist audience on French soil contains growth opportunities worthy of the digital economy: for example, the audience visiting from the BRICS countries is experiencing double-digit growth. Tourism in France accounts for about one million direct jobs and almost as many indirect ones. In 2012, it generated consumption of nearly 138 billion euros, equivalent to 7% of French GDP. Given these figures and the natural proximity of the two sectors, it is legitimate for the cultural industries to draw upon the Big Data initiatives already implemented for tourism, and to consider potential economic synergies based upon shared use of data. In both industries, it is personal data that has been anonymised and aggregated which provides the real added value, allowing better decisions about investment and about their respective offers to the public.
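A minimal sketch can show why this kind of aggregation protects individuals: a count is only published for a cell containing enough distinct devices. The data shapes and the threshold below are assumptions chosen for illustration, not a description of the actual Orange/CRT pipeline.

from collections import defaultdict

def aggregate_presence(pings, k=10):
    # pings: iterable of (device_id, area, hour) tuples.
    cells = defaultdict(set)
    for device_id, area, hour in pings:
        cells[(area, hour)].add(device_id)
    # Publish a count only for cells with at least k distinct devices,
    # so that no small, potentially identifiable group is exposed.
    return {cell: len(ids) for cell, ids in cells.items() if len(ids) >= k}

pings = [("d1", "Nice-centre", 14), ("d2", "Nice-centre", 14), ("d1", "Antibes", 15)]
print(aggregate_presence(pings, k=2))  # {('Nice-centre', 14): 2}; Antibes is suppressed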
consumers greater control of their own data. Giving people greater access to electronic records of their past buying and spending habits can help them make better buying choices. For example, data that a phone company holds about your mobile use may help you choose a new tariff.15 Some of the UK's biggest companies already working on the project include Google, British Gas, Lloyds TSB and O2.
From an individual point of view, the issue with personal data is that you have to experience the real value of a given set of data you are willing to share with someone, and to do this you need to be ready to give away a little of your privacy. But you want to be in control and to have the possibility to cancel or opt out of this sharing, to be sure it is not definitive. One extreme example is that of medical records: I am ready to share my medical data with professionals and online services that I trust, but not with corporations that could use it at my expense (car insurers interested in driving behavior or medical conditions).
This question of trust is a key element and must be understood in a dynamic context: as a corporation, you can betray individuals once, but then they will no longer trust you, and much of the value at stake lies in the new data and actions that will occur in the future. Individuals are not giving access to an unlimited gold mine, and they have to be treated with respect, which is good news: the value lies more and more in the future data an individual will share with a service, rather than in the stock of information the service already holds on him (flow > stock). And for one specific reason: if businesses want to serve you better in the future, they have to earn your trust, and they cannot abuse it. If they do, you can unsubscribe from their services and go to a competitor which will potentially take better care of you.
amount at your favorite store. After you have tried it several times, and become confident that it is convenient and secure, you end up trusting the system because it has not let you down. This is what creating a virtuous circle of trust means: the more you use it, the more confident you become, and the more you want to use it in the future. As with credit, the regulator will probably have to incentivise the private sector to educate the public, and also to commit to the legibility and transparency of the terms and conditions of the services it provides. For example, under French financial regulations, you have to protect individuals against themselves when they make an investment: the bank has to confirm that the client has the capacity to understand the risk they are taking.
Raising the level of understanding about privacy and personal data will be necessary to avoid major scandals and to maintain people's trust in digital services as they become more and more sophisticated in their use of personal data. The journey is just beginning…
Matthieu Soulé is a strategic analyst at L'Atelier BNP Paribas. He graduated from Audencia Nantes Business School with a Master in Management, and has worked in several innovation centres of the BNP Paribas Group, including L'Atelier North America in San Francisco and the Center for Innovation, Technologies & Consulting (CITC) in Paris. He is also Vice-President in charge of Finance for Youth Diplomacy (www.youth-diplomacy.org). He is a regular contributor to the programmes proposed by the Fondation Télécom, including Privacy and New Business Models in the Digital Era.
An Economist's Thoughts
on the Future of Privacy
Patrick Waelbroeck
Introduction
Privacy deals with personal information that identifies the preferences of a person. The identity of the person is not so important in economics: in the neoclassical equilibrium model, consumers are anonymous, have no identity, and simply interact with one another. The identity of a person plays a role only if it influences his or her choices. For example, it is easier to give to family and close friends, or to people who share certain beliefs, than to strangers; Akerlof and Kranton (2000) discuss this notion of identity. However, the choices that we make are important because they reveal our preferences. This is known as the axiom of revealed preference in economic theory. So all our online activities, our choice of websites to visit, the comments we post, our online purchases, our posts on Twitter, can be considered personal information, because they reveal our preferences and our willingness to pay for a product or a service. This is the reason why Big Data technologies combine as many separate datasets as possible in order to gain the most precise knowledge of our online profiles.
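For readers unfamiliar with it, the weak form of the axiom can be stated compactly: if bundle x is chosen at prices p while bundle y was affordable, x is revealed preferred to y, and y may then only be chosen when x is out of reach:

\[
p \cdot y \le p \cdot x \ \text{ and } \ x \neq y
\quad \Longrightarrow \quad
p' \cdot x > p' \cdot y \quad \text{whenever } y \text{ is chosen at prices } p'.
\]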
We receive information filtered by infomediaries and platforms such as Google or Amazon. For example, Google's search engine filters search results based on a person's geo-localisation, browsing history and profile. Amazon runs algorithms to deliver customised product recommendations based on a person's browsing history and purchases; a sketch of this family of techniques appears below. These filters raise important economic questions that I discuss in Section 2.
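Amazon's production systems are proprietary, but the general family of techniques, item-to-item collaborative filtering, has been described publicly by Amazon engineers (Linden, Smith and York, 2003). A toy version, in which two items are similar when the same users buy both, might look as follows:

from collections import defaultdict
from itertools import combinations

def item_similarities(purchase_histories):
    # purchase_histories: dict mapping user -> set of items bought.
    co_counts = defaultdict(int)
    item_counts = defaultdict(int)
    for items in purchase_histories.values():
        for item in items:
            item_counts[item] += 1
        for a, b in combinations(sorted(items), 2):
            co_counts[(a, b)] += 1
    # Cosine similarity between the items' purchase vectors.
    return {pair: n / (item_counts[pair[0]] * item_counts[pair[1]]) ** 0.5
            for pair, n in co_counts.items()}

histories = {"u1": {"book", "dvd"}, "u2": {"book", "dvd"}, "u3": {"book", "cd"}}
print(item_similarities(histories))
# {('book', 'dvd'): 0.816..., ('book', 'cd'): 0.577...}

Recommending the items most similar to those a person has already bought is precisely how browsing and purchase histories are turned into revealed preferences.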
We also produce personal data that has commercial value. We leave traces and footprints unintentionally, but we voluntarily contribute to online communities such as eBay, Amazon, Wikipedia, Twitter and YouTube. I discuss the
1.3. Authenticity
Is everybody happy on the Internet? Ninety-nine percent of the faces you see on the Internet are smiling. When you scan personal ads on online dating sites, you will find that all male users claim to make more money than their peers. These are lies, and Big Data technologies need to test the reliability of the datasets they use. TripAdvisor employs 100 people full-time to filter out questionable comments. What are the consequences of a firm or a government applying an algorithm to you based on incorrect information? Authors such as Crawford and Schultz (2013) have strongly argued for a technological due process to uncover the algorithmic rules that are applied to us online. But if Internet users know the algorithm that is applied to them, they can again manipulate it, and Big Data technologies become less efficient (see the discussion in the previous subsection).
and states. Designing privacy laws that maintain the balance of power between citizens and governments is a big challenge for democracies in the future. On the one hand, consumers will be better informed and will make better decisions by using new tools to process massive data. Open data will enable citizens to better assess public policies; they will gain autonomy, and the democratic process could be reinforced. At the individual level, new connected devices will better monitor health, prevent illnesses and contribute to personal happiness. On the other hand, the traces and footprints left voluntarily or involuntarily on the Internet make it easier to monitor deviations from the mean. This can be exploited by unethical firms to manipulate consumers through the human weaknesses pointed out by behavioral economics (Calo, 2013), or by a central authority wanting to strengthen its political power by harassing minorities and stigmatizing unwanted behaviors.
Privacy and innovation. When designing privacy policies, we should not forget that future business models and innovation will depend greatly on personal information. We should therefore discuss privacy and innovation policies at the same time. The question that will need to be answered is where to strike the balance between protection and innovation.
Privacy and competition policy. There are many upcoming challenges related to competition policy, as I have already discussed. New privacy laws need to ensure that privacy protection tools are competitively supplied on the market, that search algorithms do not shut potential competitors out of the market, and that Big Data algorithms do not leave consumers with a limited set of choices.
How to share the value generated by personal information? Perhaps the biggest privacy challenge is to find a way to better share the value generated by personal data and contributions. Current discussions have focused on markets for personal data, new tax regimes on the value added generated from personal contributions, and a universal wage.
Autonomous data, licences and Digital Rights Management (DRM). Internet users could license their personal data for different uses by different companies. The licences could be enforced by DRM or by privacy by design. Data could become more autonomous, fueling innovation while respecting individual rights.
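As a purely hypothetical sketch of what such a machine-readable licence and its gatekeeper check might look like (all field names are invented, and a real scheme would need standardisation and enforcement to carry any weight):

import time

LICENCE = {
    "holder": "alice",
    "licensee": "acme-analytics",
    "purposes": {"recommendation"},           # uses the holder allows
    "forbidden": {"resale", "ad-targeting"},  # uses explicitly excluded
    "expires": time.time() + 90 * 24 * 3600,  # a 90-day term
}

def may_use(licence, licensee, purpose):
    # Gatekeeper check run before any access to the licensed data.
    return (licence["licensee"] == licensee
            and purpose in licence["purposes"]
            and purpose not in licence["forbidden"]
            and time.time() < licence["expires"])

print(may_use(LICENCE, "acme-analytics", "recommendation"))  # True
print(may_use(LICENCE, "acme-analytics", "ad-targeting"))    # False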
Patrick Waelbroeck earned a Ph.D. in economics from the University of Paris 1 Panthéon-Sorbonne. He also holds a master's degree from Yale University, for which he obtained a Fulbright scholarship. His research focuses on the economics of innovation, the economics of intellectual property, Internet economics and the economics of personal data. Patrick Waelbroeck is a member of the editorial board of the Journal of Cultural Economics. He has published widely cited articles on the subject of piracy in the cultural industries, which have influenced the public debate in France, Europe and
Conclusion
Acknowledgements
Many thanks to:
Francis Jutand, Scientific Director of Institut Mines-Télécom,
Véronique Deborde, Deputy Director (Directrice déléguée) of Fondation Télécom,
Claire Levallois-Barth for having invited Florence Raynal and Winston Maxwell,
Pierre-Antoine Chardel for having invited Helen Nissenbaum and Bregham Dalgliesh,
The partners of Fondation Télécom and of the Think Tank Futur Numérique,
Every contributor to this Cahier,
Anne Andrault for typesetting and proofreading this Cahier,
Nicolas Basset for the layout of the Cahiers de Prospective covers.
www.fondation-telecom.org
www.mines-telecom.fr
ISBN 978-2-915618-25-9
ISBN 978-2-9156-1823-2