ReCEPL SERIES 1.
PRIVACY AND CONSENT A LEGAL AND UX&HMI APPROACH FOR DATA PROTECTION
ISBN 978-88-96055-878
Editors
Lucilla Gatt
Ilaria Amelia Caggiano
Roberto Montanari
ReCEPL SERIES
1. Gatt L., Montanari R., Caggiano I.A. (eds.)
Privacy and Consent. A Legal and UX&HMI Approach
University Suor Orsola Press, 2021, ISBN 978-88-96055-878
2. Gatt L. (ed.)
Social Networks and Multimedia Habitats. Jean Monnet Chair PROTECH. “European Protection
Law of Individuals in Relation to New Technologies”. 1st International Workshop A.Y. 2019-2020
University Suor Orsola Press, 2020, ISBN 979-12-80426-00-0
SCIENZA NUOVA – UTOPIA
Research Team
Legal Area Maria Cristina Gaeta (Coordinator),
Anna Anita Mollo, Livia Aulino, Margherita Vestoso
UX&HMI Area Emanuele Garzia, Luca Cattani, Federica Protti,
Simona Collina, Roberto Montanari
UNIVERSITÀ DEGLI STUDI SUOR ORSOLA BENINCASA
This research has been anonymously refereed.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the written permission of the publisher, nor be otherwise circulated in any form of binding or cover.
Editing
Luciana Trama
Design and printing
Flavia Soprani, Carmine Marra
© Copyright 2021 by Suor Orsola Benincasa University. All rights reserved.
The individual essays remain the intellectual property of the contributors.
ISBN 978-88-96055-878
TABLE OF CONTENTS

CHAPTER 1
CONSENT AND INFORMATION IN THE PRIVACY SETTINGS OF A PC OPERATING SYSTEM: SCOPE AND METHOD OF THE RESEARCH 9
1.1 Introduction to the research 11
1.1.1 The reasons for a piece of research on the consent to the processing of personal data 11
1.1.2 The reasons for choosing a hybrid method of investigation and the relationship with technology 12
1.1.3 Legal analysis (L.A.) and experimental investigation concerning the consent to the processing of personal data 13
1.1.4 Consent to the processing of personal data and Behavioral Analysis (B.A.): introduction 15
1.1.5 Interaction between L.A. and B.A.: a functional approach to legal rules and models of protection of personal data alternative to consent 15
1.2 The context/background of the research 17
1.3 Summing up the experimental research 21

CHAPTER 2
THE EXPERIMENT: DESCRIPTION, DATA COLLECTION AND ANALYSIS, FINDINGS 25
2.1 The experiment 27
2.1.1 User Experience evaluation: an introduction to the study 27
2.1.1.1 Experiment and sampling 28
2.1.1.2 Description of the experimental protocol 28
2.1.1.3 The experiment procedure 29
2.1.2 The usability tools 30
2.1.2.1 Eye-tracker 30
2.1.2.2 Thinking-aloud 31
2.1.2.3 System Usability Scale 32
2.1.3 Statistical analysis 33
2.1.3.1 Usability metrics results 34
2.1.3.2 Descriptive statistics 36
2.1.3.3 Users’ awareness 43
2.1.4 Biometrics and users’ characteristics 48
2.1.4.1 Biometrics and users – Descriptive statistics 48
2.1.4.2 Biometrics and users – Analysis 50
Appendix 53
2.2 From Cognitive-Behavioral Analysis to Interaction Design 55
2.2.1 Some reflections on cognitive and behavioral analysis findings 55
2.2.1.1 People seem to tend not to read privacy notices carefully 60
2.2.1.2 The user consent 61
2.2.1.3 Privacy is considered a non-task-related approval 62
2.2.2 From reflections on cognitive and behavioral analysis findings to interaction design recommendations: info-view and interlocking design approach 62
2.2.2.1 HCI and information visualization in support of the users 63
2.2.2.1.1 Possible ways to support users with HCI
2.2.2.1.2 Possible ways to support users with information visualization
2.2.2.2 Interlock as a design solution for future privacy interaction 66
2.2.3 Conclusions: the urgency of a holistic approach to privacy which implies interaction design 68
2.3 Legal Analysis 71
2.3.1 Findings relevant to the Legal analysis (L.A.) on the basis of the cognitive-behavioral analysis 71
2.3.1.1 Consent to the processing of personal data and its consequences: user’s awareness 71
2.3.1.2 The purpose of the consent to the processing of personal data in terms of efficacy 80
2.3.1.3 Clues of mismatch between legal and users’ conceptions of personal data protection 91
2.3.2 Critical issues arising from the legal analysis (L.A.) of the legal and UX-HMI experiment 101
2.3.2.1 A comparison between W10 operating systems before and after the entry into force of the GDPR 101
2.3.2.2 Conclusions on the Legal Analysis (L.A.): how to strengthen the ex ante protection 104

CHAPTER 3
SOLUTIONS TO THE CRITICAL ASPECTS EMERGING FROM THE EXPERIMENT: THE LEGAL AND UX&HMI PERSPECTIVE 117
3.1 Results from the legal and UX&HMI analysis: persistent critical aspects of consent and its real role 119
3.2 A de iure condito proposal for the empowerment of information about data processing: legal design as an ex ante remedy 123
3.3 A de iure condendo (normative) proposal for ameliorating the GDPR 133

LIST OF AUTHORS 137
SOURCES 143
ANNEXES 159
Questionnaire, Italian version 159
Questionnaire, English version 171
CHAPTER 1
CONSENT AND INFORMATION
IN THE PRIVACY SETTINGS
OF A PC OPERATING SYSTEM:
SCOPE AND METHOD OF THE RESEARCH
1.1
INTRODUCTION TO THE RESEARCH
1.1.1 THE REASONS FOR A PIECE OF RESEARCH ON THE CONSENT
TO THE PROCESSING OF PERSONAL DATA 1
In Italy the issue of consent to the processing of personal data
(privacy) is mainly dealt with from the traditional methodological perspective
of legal studies, namely: 1) considering existing rules as a datum; 2) analyzing,
in a de iure condito perspective, the legislation in force at national,
European and international level, accentuating the differences between the US and EU
approaches with ideological overtones; 3) insisting on clarity and awareness of
the information at multiple levels of usability and administration.
According to this perspective, it is hard to find awareness of the
different articulations that the topic takes on in relation to specific cases or
the type of “data” processed (e.g. the request for consent to the processing of
health data at the entrance of an operating room differs from the
request for consent to the processing of data relating to one’s musical tastes
displayed at the opening of an app to download songs, which in turn differs
from the notice of video recording of one’s own image placed
at the entrance of a pharmacy).
Likewise, by analyzing the current regulation, the aforementioned
studies do not consider, or undervalue, actual and potential privacy problems
related to the effective functioning of distinct devices in the digital and non-digital world 2.
1 The author of this paragraph is Prof. Lucilla Gatt, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
2 The considerations expressed in the text are indicative of a general attitude of Italian scholarship. More problematic aspects of personal data protection, with particular regard to the relationship between the protection of privacy and technology, are to be found, for example, in
Privacy seems to be disregarded, in practice, by the users who give
consent to the processing of personal data.
1.1.2 THE REASONS FOR CHOOSING A HYBRID METHOD OF INVESTIGATION
AND THE RELATIONSHIP WITH TECHNOLOGY 3
With the Privacy and the Internet of Things: a behavioral and legal
approach project, commissioned by an important technological partner, the
researchers of the Utopia 4 Living Lab (a laboratory set up at Unisob’s Scienza
Nuova Research Center) have adopted a method of studying the issue of privacy
consent inspired by a model widespread in the Anglo-American area, in which
legal analysis is closely connected with a preliminary and/or contextual
analysis of people’s behavior in an experimental environment. In the Privacy
project we have tested the interaction of a number of users with electronic
devices when they authorize/consent to the processing of personal data 5.
The reasons for this methodological choice are to be found, on the
one hand, in the acknowledgment of the impact of technology on traditional
legal categories; on the other, in the awareness of having to proceed with
an analysis of rules which take for granted the capacity of data subjects to
protect their own interests.
We have pursued the goal of surpassing the attitude of legal research
from the 1970s to the present, which develops in a circular mode instead of a linear one,
G. Comandè, Tortious Privacy 3.0: A Quest for Research, in Huldigingsbundel vir
Johann Neethling (Essays in Honour of Johann Neethling), LexisNexis, 2015, p. 121 ff.; but see also infra nt. 3.
3 The author of this paragraph is Prof. Lucilla Gatt, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
4 The Living LAB_Utopia develops research itineraries on the possible interactions between law and new technologies. Equipped with the most modern technological instruments, Utopia is a place of cooperation and collaboration between experts and researchers in the legal and technological fields, in the sign of the sought-after interrelationship between areas of knowledge. The topics addressed by Utopia’s legal research group are varied. By way of example, we highlight: protection of personal data in the era of computerization, civil liability and automatic devices, remotely piloted aircraft systems (SAPR) and civil liability, bio-law and biotechnologies, protection of biotechnological inventions and patentability of living matter, law, computation and simulation, Online Dispute Resolution (so-called ODR), neuroscience and law (so-called neuro-law), digital inheritance, start-ups and technology transfer.
5 Ex multis, A. Acquisti, Privacy, in Riv. pol. econ., 2005, p. 319; Solove, Privacy Self-Management and the Consent Dilemma, 126 Harv. Law Rev. (2013) 1880; Mantelero, Personal Data for Decision-Making in the Age of Analytics, 32 Computer Law & Security Rev. (2016) 238-255; Strahilevitz, Toward a Positive Theory of Privacy Law, 113 Harv. Law Rev. (1999) 1; Borgesius, Informed Consent: We Can Do Better to Defend Privacy, IEEE, 2015 (vol. 13, pp. 103-107); Id., Behavioral Sciences and the Regulation of Privacy on the Internet, Amsterdam Law School Research Paper No. 2014-54.
as it tends to propose ad infinitum the same topics and issues with similar,
if not identical, methods. Thus, such research is frequently found to be
incapable of reaching scientifically original results.
This perspective induces us to evaluate the meaning of the originality
(alias advancement with respect to the state of the art) of research activity,
considering the possibility of affirming an idea of an original “scientific result”
(i.e. a point of arrival of research activity from which to start further and
different research) already present in other branches of knowledge 6 and
which, instead, appears opaque in the humanities, with specific regard to
the legal domain.
We intend to develop awareness among researchers about their
contribution to effectively developing legal research, proposing problems and
solutions truly functional to the needs of contemporary society. In this study,
we also wanted to give the research an international attitude, confronting it,
both on a methodological and a substantive level, with research from
other legal systems.
Finally, the research deals with technology in a twofold sense: technology
is both the subject of the study and a tool for conducting it.
1.1.3 LEGAL ANALYSIS (L.A.) AND EXPERIMENTAL INVESTIGATION
CONCERNING THE CONSENT TO THE PROCESSING OF PERSONAL DATA 7
The research undertaken intends to measure Italian users’ awareness
of and sensitivity towards privacy (measured by their consenting to data
processing) when installing an operating system on a personal computer.
It aims to verify how the Italian and European legislation, focused on consent
and implemented in this operating system, can assure effective protection of
users with regard to the processing of personal data, especially where the
request for processing takes place in a digital environment.
It is worth noting that the experiment, although conducted on a
small number of subjects, nevertheless has a fair degree of statistical relevance,
considering the degree of standardization in the functioning of software and
devices.
It should also be emphasized that this research combines the
empirical methodology well developed in foreign studies with the observation
of users’ behavior, as a new approach. In fact, a sample of users, although
6 N. Irti, Un diritto incalcolabile, Torino 2016, pp. 137-151.
7 The author of this paragraph is Prof. Lucilla Gatt, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
numerically much smaller than those used by other empirical studies on privacy 8,
was administered an experimental protocol and a questionnaire divided into two
macro-sections (a legal and a behavioral part). Each experiment lasted about
an hour.
High-tech equipment (an eye-tracker) was used to verify the users’
actions regarding privacy settings during the installation of the program and its
use. A number of dimensions were subject to evaluation: attention (analyzed
through graphic representations such as heat maps); task execution
time; reading order within the screens (thanks to the gaze plot sequences);
awareness and knowledge of privacy matters; usability.
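These dimensions can be made concrete with a minimal sketch. As a hedged illustration — the (x, y, duration) fixation format, screen size and grid cell size below are our own assumptions, not the actual output format of the eye-tracker used in the experiment — a heat map reduces to binning fixation dwell times over screen regions:

```python
from collections import defaultdict

def heat_map(fixations, screen_w, screen_h, cell=100):
    """Bin gaze fixations (x, y, duration_ms) into a coarse grid.

    Cells accumulating high total dwell time correspond to the 'hot'
    regions that a heat-map visualization would highlight, e.g. the
    parts of a privacy notice the user actually looked at.
    """
    grid = defaultdict(float)
    for x, y, dur in fixations:
        if 0 <= x < screen_w and 0 <= y < screen_h:
            grid[(x // cell, y // cell)] += dur
    return dict(grid)

# Hypothetical example: three fixations, two on the same screen region
fixations = [(120, 450, 300.0), (130, 460, 250.0), (900, 80, 120.0)]
hm = heat_map(fixations, 1920, 1080)
```

A gaze plot, by contrast, would keep the fixations in temporal order rather than aggregating them, which is why the two visualizations answer different questions (where attention went versus in what sequence).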
Through this multi-instrumental analysis it is possible to verify
whether the individual to whom the personal data refer is able to understand
the information supplied, whether he or she is genuinely interested in knowing it
at the moment the information is given or consent is expressed, and whether
he or she perceives a lack of understanding as a lack of protection, also considering
the digital environment in which these operations take place as well as the
device through which choices are made.
The methodological perspective adopted is that of the interrelation of
law with the analysis of behavior, based on cognitive psychology, with the aim
of verifying the latter’ s results in the legal domain. The survey is therefore
divided into two phases:
1. one highly experimental, as based on a complex examination
of the relationship between user and technology as well as the use of the
aforementioned Questionnaire and the collection and processing of data in
digital format;
2. one analytical, as based on the examination of and commentary on
the data collected within the framework of the current legislation on privacy,
both at national and at European level, in a de iure condendo (normative)
perspective.
In a nutshell, by making use of Behavioral Analysis (B.A.) and
of the legal analysis of the collected data (Legal Analysis, L.A.), a functional perspective
of legal analysis has been assumed in order to assess the legal efficiency
of the measures (essentially founded on so-called consent) adopted by
legislation on the protection of personal data.
8 This sample is representative of the Italian population by age, profession, income level, cultural level, and level of interaction skills with digital environments and IT support.
1.1.4 CONSENT TO THE PROCESSING OF PERSONAL DATA
AND BEHAVIORAL ANALYSIS (B.A.): INTRODUCTION 9
The behavioral analysis involved a sample of users profiled
according to different degrees of knowledge of the topics under investigation:
some users were aware of privacy-related aspects, including from the legal
point of view; others were substantially unaware, although sensitive to the
subject of data privacy given the social weight of the topic. The sample was
composed of volunteer individuals from the Suor Orsola Benincasa University,
including students, teachers and non-teaching staff, as well as volunteers
from outside.
The structure of the test follows the so-called usability test 10, that is,
a controlled analysis experience in which the sample is asked to perform some
tasks (i.e. operations to be performed on an interactive system presumably
familiar to users). Data highlighting situations related to the use of the tool
are collected in order to understand the experience in terms of interaction, the
degree of awareness about what has been done, and the errors committed in a
more or less systematic way.
In relation to the experience carried out in the project, we worked
to understand, using qualitative and quantitative measures, both the aspects
of usability related to the use of the interactive system (i.e. effectiveness,
efficiency and satisfaction expressed during use) and the degree of awareness
that the system allows, i.e. whether it enables users to make informed use of
their personal data.
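The quantitative side of these measures can be sketched as follows. The SUS formula is the standard scoring rule for the System Usability Scale mentioned in Chapter 2 (ten 1-5 Likert items mapped to a 0-100 score); the task-log structure, however, is a hypothetical illustration and not the project's actual data schema:

```python
def sus_score(answers):
    """System Usability Scale: ten 1-5 Likert answers -> 0-100 score.

    Odd-numbered items contribute (answer - 1), even-numbered items
    contribute (5 - answer); the sum is scaled by 2.5.
    """
    assert len(answers) == 10
    total = sum(a - 1 if i % 2 == 0 else 5 - a
                for i, a in enumerate(answers))  # i = 0 is item 1 (odd)
    return total * 2.5

def effectiveness(tasks):
    """Share of tasks completed without error (ISO 9241-11 style)."""
    done = sum(1 for t in tasks if t["completed"] and t["errors"] == 0)
    return done / len(tasks)

# Hypothetical log of four tasks from one usability session
tasks = [
    {"completed": True, "errors": 0, "seconds": 41},
    {"completed": True, "errors": 2, "seconds": 95},
    {"completed": False, "errors": 1, "seconds": 120},
    {"completed": True, "errors": 0, "seconds": 33},
]
```

Efficiency (mean time on task) and satisfaction (the SUS score itself) complete the usual triad of effectiveness, efficiency and satisfaction named in the text.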
1.1.5 INTERACTION BETWEEN L.A. AND B.A.: A FUNCTIONAL APPROACH
TO LEGAL RULES AND MODELS OF PROTECTION OF PERSONAL DATA
ALTERNATIVE TO CONSENT 11
With the conclusion of the experimental phase, a complete and in-depth
analysis of all the data collected was launched, but already from the
intermediate analysis – as illustrated above – a number of elements emerge
9 The authors of this paragraph are Prof. Roberto Montanari, Drs. Federica Protti and Dr. Emanuele Garzia. Professors Gatt, Montanari and Caggiano thank Dr. Andrea Castellano for their work in conducting the experiments. For the final part only, the experiment also made use of the contribution of Vincenzo Pascale, Marianna La Rocca, Ilenia Nigro, Valentina Platella, Marcella Capizzuto, Raimondo Casaceli. Special thanks for the preliminary study, the preparation of the research questionnaires and the organization of the experiment go to Maria Cristina Gaeta.
10 For a methodological account see: www.usabile.it/212003.htm.
11 The author of this paragraph is Prof. Lucilla Gatt, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
for proposing a hypothetical scenario where the processing of personal data
does not depend – at least not always – on the consent of the natural person.
The investigation carried out up to now highlights the limits of prior
consent, both because it is given unconsciously and because – even when it
is consciously given – it does not prevent processing that is harmful for the user.
On the contrary, the provision of prior consent could have a distortive effect,
because it tends to conceal ex post remedies on the basis of the belief that the
sole granting of consent eliminates a priori the possibility of injury.
In other words, prior consent can generate in the user a non-univocal understanding and probably a false conviction.
In addition, the digital environment and the technological support used,
that is, in a broad sense, the context of human-machine interaction necessary
for carrying out certain activities 12, should be considered when measuring the
efficiency of the legal rules and when promoting regulations that are, at least in
principle, adequate to the aforementioned context and functioning.
The experimentation conducted and the legal-behavioral analysis
of the collected data highlight the inefficiency of an ex ante protection and
lead us to consider whether an alternative model of regulation, based on the
prohibition of data processing or on special limitations on the processing of
certain categories of data, can be conceived, with the final aim of strengthening
private remedies of compensation/restitution and deterrence against processing
harmful to the user, and of protecting the fundamental right to personal data
more effectively.
12 The processing of some personal data, in particular non-sensitive data, represents – almost always – a (pre)requirement for the operation, in some cases for security purposes, of the device used. Think of the whole question of so-called telemetry, whose radius of action inevitably affects that of so-called privacy, calling back into (apparent) question certainties achieved even and above all at the level of the latter’s regulation.
1.2
THE CONTEXT/BACKGROUND OF THE RESEARCH 13
Privacy represents one of the most significant issues of current social
life and its protection constitutes a primary concern of legislation. Private
life is constantly monitored through an increasing number of identification
and tracking technologies, wired and wireless sensor and actuator networks,
enhanced communication protocols, and distributed intelligence for smart
objects (cumulatively named the Internet of Things), which collect
individuals’ data that are generally traded to other businesses.
European legislation is aimed at assuring the involved stakeholders’
(according to the EU legislative nomenclature, the data subjects) awareness
of the monitoring, collecting and managing activities of their personal data, as
well as of the objective of these activities, according to the conceptualization
of privacy as a fundamental right under EU Law. European law mandates an
authorization to be given in relation to data collection and treatment, and
the new Reg. (EU) 2016/679 (the so-called General Data Protection Regulation
or GDPR) adopts a stricter regulation and lays down new obligations for
businesses, increased harmonization and an expanded territorial scope.
However, the legislation in force, based on the supply of information as
a mandatory pre-requisite of the authorization, lacks an objective consideration
of the effective influence of information on individuals’ awareness.
Moreover, the GDPR, despite its technological neutrality, still needs
to be reconciled with the enhanced technologies of the IoT, in order to assure
effective protection of rights and compliance by businesses. For example,
on the technological side, although the GDPR has elaborated the concept of
“Privacy by Design” (a development method for privacy-friendly systems and
services, thereby going beyond mere technical solutions and addressing
13 The author of this paragraph is Prof. Ilaria Amelia Caggiano, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
organisational procedures and business models as well), its concrete
implementation remains unclear at the present moment.
Therefore, the European approach to the collection and treatment of
personal data poses crucial questions for businesses in terms of how to obtain
authorization from the users of these technologies, especially when data are obtained
through wireless devices and sensors, as stakeholders might not be aware
of the information released during their interaction with these technologies. The
storage and management of these data have to be reconsidered as well.
As a consequence of the rapid technological developments, a new
regulation on Data Protection has been recently enacted at a European level
with the purpose of creating a stronger data protection framework in the
Union as well as developing the digital economy and the internal market of
personal data (rec. 7). According to the GDPR the data subject’s consent to
the processing of his/her personal data still plays a fundamental role as
a prerequisite for the processing of personal data. Regulation 2016/679
modifies the basic system of rules about the processing of personal data,
namely Directive 95/46/EC (the so-called “mother directive”), with regard to
organizational and entrepreneurial models, responsibilities of controllers and
processors, thus shifting the risk of the activities carried out to them. This is
the “accountability” model, where the burden of proving compliance with the law
and the absence of risks of data breach falls on the gatherer of the data (rec. 86, Article 5).
However, the new act upholds the legitimate bases for processing.
Under the Regulation, controllers are required to bear the risk of
processing by adopting a series of ex ante measures. Think of the Data Protection
Impact Assessment (rec. 84), in case of a high risk to the rights and freedoms of
individuals; the design of systems and applications aimed at minimizing the
use of personal data (so-called Privacy by Design and by Default – Article 25);
the technical and organizational measures to minimize the risks to personal
data (such as pseudonymization); and the mandatory appointment, in some cases,
of a new figure, the Data Protection Officer (Article 37, rec. 97), a manager in a
third-party position (at least in principle) with the task of advising the controller/processor
in order to ensure proper management of personal data in companies
and bodies and of acting as a contact point with the Authorities. Such measures are
accompanied by: the assertion of rights of natural persons (to erasure – to
be forgotten; to data portability – Article 20); a uniform regulation within
the European Union’s data processing market 14, guaranteed by a European
authority, the European Data Protection Board (Article 68); and the limitation of the
14 Regarding the territorial scope of the legislation (Articles 3 to 5): no reference is made to the location of the terminal in the Member State but to the provision of services in EU states, so that
circulation of data outside Europe, on the basis of the conformity assessment
of guaranteed measures for data transferred outside the EU.
The above-mentioned regulatory choices reveal an approach which
favours the technological production of massive amounts of data and techniques
which allow the multiplication of data, and which regulates processing so as to minimize
the risks of loss, dispersal and dissemination, for the purpose of protecting
the data subjects. Technology (privacy by design, through anonymization
and pseudonymization) is called upon to regulate technology (since
processing is now almost automated) according to the objectives set by the
legislator, while legal rules gain their own important space as ex post measures.
As already said, in this context, consent is one of the “legitimate
bases” (according to the terminology common to the Nice Charter) for initiating
the processing of personal data 15. With respect to personal data in general,
consent is a condition of lawfulness under Article 6, and with regard to
particular categories of data (sensitive data), it excludes the prohibition of
processing (Article 9) 16.
Consent is defined as “any freely given, specific, informed and
unambiguous indication of the data subject’s wishes by which he or she,
by a statement or by a clear affirmative action, signifies agreement to the
processing of personal data relating to him or her”. In the new legal terminology,
consent must be an affirmative/positive (but not necessarily written) action
the new framework applies in full to the undertakings located Outside the European Union offering
services or products to persons located in the territory of the European Union.
15 Thus, it is confirmed that a wide area of data processing remains under the rules of consent. Consent continues to represent only one legitimate basis for processing, required where processing is not necessary for the performance of a contract or the pre-contractual stage, for the fulfillment of a legal obligation of the controller, to safeguard vital interests, for the pursuit of a public interest or the exercise of public authority, or for the pursuit of a legitimate interest of the controller or a third party, provided that these are not in conflict with the rights of the party concerned (Article 7 of the GDPR). A significant opening in the market, in this regard, is the possibility that the processing of personal data for direct marketing purposes is considered to be pursued by a legitimate interest of the data controller or third parties (rec. 47). As for the particular categories of data, further specific derogations from consent are also foreseen in this case (Article 9 GDPR). On the different structure of the lawfulness conditions in the Privacy Code and in the Regulation (and in Directive 95/46/EC), L. Bolognini, E. Pelino, Condizioni di liceità, in L. Bolognini, E. Pelino, C. Bistolfi, cit., p. 278, but this does not lead to appreciable consequences.
16 In maintaining the information + consent mechanism, the need for prior, unambiguous (explicit for sensitive data) consent, given on request (so that, as in the consumer case, pre-ticked boxes are excluded), is confirmed, favored by the presence of icons (identical in the EU), also with regard to the transfer of non-EU data, along with the existence of a right of revocation (see recs. 32-38, §§ 6-8). On the right to data protection as self-determination (as control over the collection, dissemination, processing, correctness and removal of data) already in the pre-Regulation system, see G. Sartor, Privacy, reputazione, affidamento: dialettica e implicazioni per il trattamento dei dati personali, in AA.VV., Privacy digitale. Giuristi e informatici a confronto, Giappichelli, Torino 2005, p. 81 ff.
(see Article 4(1)(11) of Reg. 2016/679) 17. Also when electronic
means are used, a positive acceptance action is required. In this regard, one
should not only consider the use of digital instruments but, inevitably, also the
already widespread diffusion of sensors (e.g. motion-detection software,
touch sensing, etc.).
Consent must be explicit only with regard to sensitive data (Article 9
GDPR and Article 8(2)(a) of the “mother directive”). Explicit consent is also required
for the purposes of profiling (Article 22 GDPR) and in the case of transfer to
a third country or an international organization (Article 49(1)(a)). Explicit
consent can be understood as consent that is clearly manifested. This is a
further qualification, different from affirmative consent (that is, consent not tacitly
expressed), laid down as a general rule by the European legislator (rec. 32) 18.
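As a purely illustrative restatement — the field and function names below are our own, not GDPR terminology in code form — the cumulative requirements of Article 4(11), together with the explicit-consent qualification required for sensitive data, profiling and third-country transfers, can be sketched as a validation routine:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    freely_given: bool        # no coercion or bundling (Art. 4(11))
    specific: bool            # tied to identified processing purposes
    informed: bool            # mandatory information supplied beforehand
    affirmative_action: bool  # a statement or clear positive act, never a pre-ticked box
    explicit: bool = False    # clearly manifested, e.g. a typed or written statement

def is_valid_basis(c: ConsentRecord, sensitive_data: bool = False,
                   profiling: bool = False,
                   third_country_transfer: bool = False) -> bool:
    """Check whether consent can serve as a legal basis for processing.

    Sensitive data (Art. 9), profiling (Art. 22) and transfers under
    Art. 49(1)(a) additionally require *explicit* consent.
    """
    base = (c.freely_given and c.specific and c.informed
            and c.affirmative_action)
    if sensitive_data or profiling or third_country_transfer:
        return base and c.explicit
    return base

# A pre-ticked box fails the affirmative-action requirement (rec. 32)
pre_ticked = ConsentRecord(True, True, True, affirmative_action=False)
```

The point of the sketch is the conjunction: all four qualities must hold simultaneously, and the explicit-consent cases add a fifth condition rather than replacing the others.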
Transparent disclosure of any information or communication relating
to the processing is mandatory (rec. 39, Art. 12). Consent,
according to a pattern that recalls neo-formalism in contract theory, cannot
be obtained in the absence of a number of mandatory pieces of information
provided to the data subject. However, information must be provided to the
data subject even when it is not fundamental to a subsequent manifestation
of will.
17 Recital 32: «any other statement or conduct which clearly indicates in this context the data subject’s acceptance of the proposed processing. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. [...] If the data subject’s consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided».
18 Art. 23 of the Italian Privacy Code (previous version) stated: «Consent is valid only if it is expressly and specifically given in relation to a clearly identified processing operation, if it is documented in writing, and if the data subject has been provided with the information referred to in Art. 13».
1.3
SUMMING UP THE EXPERIMENTAL RESEARCH 19
The experimental research stems from a research project funded by
Microsoft Italy concerning the impact of the European data protection regulation
on businesses and users dealing with enhanced technologies. In detail, the
project aimed at examining whether the regulatory strategy is effectively coherent
with individuals’ data protection in the digital era.
The exponential development of technology has led to profound
changes in social relations and in law.
One of the unifying aspects of our time is the centrality and spread
of information, which is now in itself a good that is consumed and exchanged. In the
technological era, information is massively collected and quickly transmitted
through the Internet. Through this new mode of communication, marketable
information is gathered from Internet users by bots with no regard for
people as such: personal data, as discussed in this work, being fragments of
information related to natural persons.
It is clear that the flow of information pertaining to individuals has
modified social habits, freedoms and rights 20. A new autonomous right to
personal data has been vested in natural persons, independent of the
right to personal identity and of privacy as the right to be let alone21.
19
The author of this paragraph is Prof. Ilaria Amelia Caggiano, Full Professor of Private Law
at Suor Orsola Benincasa University of Naples.
20
Rodotà, Tecnologie e diritti, Il Mulino, Bologna 1995; Id., Intervista su Privacy e libertà, edited
by P. Conti, Laterza, Bari 2005. Mantelero speaks, effectively, of a “data-centric” system: Mantelero, The
Future of Consumer Data Protection in the E.U. Rethinking the “Notice and Consent” Paradigm in the
New Era of Predictive Analytics (2014) 30 (6) Computer Law & Security Rev. 643 ff.; O. Pollicino, Un
digital right to privacy preso (troppo) sul serio dai giudici di Lussemburgo? Il ruolo degli artt. 7 e 8
della Carta di Nizza nel reasoning di Google Spain, in Dir. info., 2014, p. 569 ff.
21
The idea that the individual not only has the right to a living space that cannot be invaded
by third parties (Article 7 of the Charter of Fundamental Rights of the EU), but also has the right to
“govern”, or at least have access to, every fragment of identity held by third parties interprets
But the situation is still evolving 22.
Digital data can be aggregated from a vast constellation of sources
(databases, search engines, virtual stores, e-mail, social networks, cloud-storage
services, things in the Internet of Things 23). Once processed, they make it possible to
profile individuals, global society, or any large community. This is the world of
big data: datasets whose volume, velocity and variety allow the extraction of
additional information so as to determine business models, markets or scientific
uses of this digital knowledge 24. Personal data, available to public and
private entities, are most effectively processed by the companies with the
most extensive databases, which hold a dominant or near-monopolistic
position. The massive amount of data allows for an increasingly pervasive
profiling of individuals, through the history of their activities, their preferences,
interactions, lifestyles, in order to: propose targeted products or advertisements
(commercial purposes); predictively monitor larger social groups; and correlate
the most disparate information with great accuracy 25.
The collection of personal data is realized in various ways: by
professionals, at the conclusion of a contract for further use, namely for selling
the data to third parties; by social networks, where content users voluntarily
share their personal information and deliberately express their social identity
(data about their images, tastes, through the social display of appreciation or
sharing); and by search engines 26.
the formalization of the individual’s interest in not exposing one’s own person to the invasion of his or
her sphere by third parties.
22
Mantelero, The Future of Consumer Data Protection... cit., lucidly identifies the link
between the different phases of technological development and the content of personal data
protection regulation.
23
On the IoT, see preliminarily Paganini, Verso l’Internet delle cose, in Dir. Ind., 2015, p. 107 ff.
24
Value extraction from big data takes place through analytical methods of data mining
(algorithms). Analytics are data-tracking tools: software that lets you find correlations between data,
analyse historical series, determine trends and seasonal behaviours, simulate economic scenarios,
segment customers, and conduct data and text mining activities to better understand a wide range
of business phenomena. These are tools that enable private and public decision-makers to make
better decisions. Providing budget indicators based on historical series, understanding customers’
and employees’ behaviour in advance, and assessing the risk of funding are some practical
examples of analytics use. On the subject, see: R. Moro Visconti, Evaluation of the Big Data and
Impact on Innovation and Digital Branding, in Industrial Law, 2016, p. 46 ff.
25
Google was able to predict influenza outbreaks in real time, two weeks before government
institutions. On this point, Bogni – Defant, Big data: diritti IP e problemi della privacy, in Politica
industriale, 2015, p. 117 ff.
26
Pipoli, Social Network e concreta protezione dei dati sensibili: luci ed ombre di una
difficile convivenza, in Dir. info, 2014, p. 981 ff., where the text (at nt. 40) states that “by pressing
the appreciation and sharing keys, data on racial or ethnic origin, religious, philosophical
or other beliefs, political opinions, [...] as well as data suitable for revealing
the state of health and sex life, that is, sensitive data, can be communicated” (our translation). In the Declarations of
The circulation of data, through sale to third parties (also
for advertising purposes), is a source of profit for the professional data
controller 27. Thus, the personal interests of the user, functioning as negotiable
intangible assets 28, are not only a matter of fundamental rights and freedoms
but, because of their economic value, also a matter of patrimonial rights.
The brief scenario described above allows for reflection on the reasons
for and efficacy of the right to the protection of personal data, with particular
regard to the recent European legislative act (Regulation (EU) 2016/679).
1. Privacy setting version of 2019 – pair devices
2. Privacy setting version of 2016 – pair devices
3. Privacy setting version of 2019 – camera
4. Privacy setting version of 2016 – camera
Rights and Responsibilities of Facebook it reads that «the user grants to Facebook a non-exclusive,
transferable, sub-licensable, royalty-free licence, valid worldwide, for the use
of any IP content published on Facebook or in connection with Facebook».
27
“The collection, analysis, and conversion of all relevant user data (including sensitive
data), including through the licence, fall within the User Data Profiling activity. [...] in order to generate
segmentation of their users into homogeneous behavioural groups [...] to make Behavioural Advertising
[...] in social networks it is the users themselves who build a profile of themselves, which will be used for their own
profiling” (our translation): Pipoli, Social Network, cit., text up to nt. 84. In that regard, Facebook’s social
networking information clarifies that the so-called data use licence is only used for editorial advertising
purposes and not for sale to third parties. Though, in another place, it expressly says: “We do not share your
personal information (personal information includes name or email address that you can use to contact or
identify you) with advertising, measurement or analysis partners, unless you grant us permission”.
28
One need only mention that Google has become the world’s second-largest
company by market capitalization – $522 billion, just under the leader’s $587 billion – with sales growing
in the first quarter of 2016 (source: www.corriere.it; www.corriere.it/economia/finanza_e_risparmio/16_
aprile_26/wall-street-sorride-solo-gigante-google-2a13ebae-0b8f-11e6-a8d3-4c904844517f.shtml).
CHAPTER 2
THE EXPERIMENT:
DESCRIPTION, DATA COLLECTION
AND ANALYSIS, FINDINGS
2.1
THE EXPERIMENT
2.1.1 USER EXPERIENCE EVALUATION: AN INTRODUCTION TO THE STUDY 29
The previous chapter highlighted the importance of privacy and how
the development of technology has brought profound sociological and legal
changes to this concept.
In this chapter the focus is set on the usability aspects, i.e. privacy
settings preferences and user awareness.
Users’ interaction with privacy can be described in terms of “usability”, a
concept that goes back to Jakob Nielsen, who defined it as a measure of
the quality of users’ interaction with websites, software applications and
other similar tools30.
The process of assessing how pleasant and easy a system is to
use is called usability evaluation.
A product is ‘usable’ when it is easy to learn, easy to remember,
has few interaction errors, and is pleasant and efficient to use. Obviously,
the methods for evaluating usability and user experience depend on the
objectives and on what specific aspect is under the magnifying glass of
scientific evaluation. Methods may depend on the level of usability you want
to guarantee, or on the number of redesigns that you can make in the initial
phase, in the construction phase or in the final verification.
29
The authors of this paragraph are Emanuele Garzia, PhD student in Humanities and
Technologies: an integrated research path at Suor Orsola Benincasa University of Naples, Dr. Luca
Cattani, postdoctoral research fellow at the Department of Legal studies University of Bologna, Drs.
Federica Protti, UX designer and Prof. Simona Collina, Associate Professor in Experimental Psychology
at Suor Orsola Benincasa University of Naples and Prof. Roberto Montanari, Professor of Cognitive
Ergonomics and Interaction Design at Suor Orsola Benincasa University of Naples.
30
Nielsen J, ‘Usability Engineering’ (Morgan Kaufmann Publishers Inc. 1994).
The ISO 9241-210:2019 defines the user experience as “perceptions
and reactions of a user arising from the use or expectation of use of a product,
system or service” 31. The concepts of usability and user experience are
often overlapping. In detail, usability corresponds specifically to
pragmatic aspects (the ability to perform a task efficiently, effectively and
satisfactorily), while user experience also includes hedonic aspects: user
emotions, beliefs, preferences, psychological and physical reactions,
and the behaviours and actions that occur before, during and after use.
The main part of the chapter is going to discuss the following
topics: the experiment, description of the experimental protocol, experiment
procedure, the usability instruments in use, eye-tracker, think-aloud, System
Usability Scale.
2.1.1.1 EXPERIMENT AND SAMPLING 32
The experiment looked for connections between privacy setting
preferences, user awareness and usability. The participants in the research
activities were 75 volunteers recruited within the University (students,
professors, and other personnel) and outside the University environment (see
paragraph 1.1.4). The methodology, instruments and materials used were:
– Written form for informed consent;
– Scenario and tasks;
– Eye-tracking;
– Thinking-aloud;
– Questionnaires.
2.1.1.2 DESCRIPTION OF THE EXPERIMENTAL PROTOCOL 33
The experimental protocol was set up in this way:
– The participant signed the written form for informed consent;
– The participant filled in the Profiling Questionnaire, which helped to
define his/her demographic, technological and juridical competences.
The participant was asked to do the following tasks:
• TASK 1: install Windows 10;
• TASK 2: activate Cortana and test it through voice instructions
“Find the best pizzeria in Naples”;
31
ISO 9241-210:2019. Ergonomics of human-system interaction – Part 210: Human-centred design for interactive systems.
32
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
33
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
• TASK 3: open Microsoft Edge and browse the website of the newspaper
Il Mattino di Napoli and read the title of the most important news of
the first page. Perform the download and installation of Windows maps
from Microsoft Store;
• TASK 4: explore the panel related to privacy settings and configure
the computer in the way that best suits the needs for daily use;
• TASK 5: explore the panel related to security settings and configure
the computer in the way that best suits the needs for daily use.
• The participant filled in the Final Questionnaire, which included
the following sections:
– SECTION 1: USABILITY;
– SECTION 2: LEGAL AREA;
– SECTION 2.1: Level of user’s awareness provided by W10 regarding
protection of personal data;
– SECTION 2.2: Level of user’s understanding of the consequences
related to having provided consent to the processing of personal data;
– SECTION 2.3: Degree of user’s ability to manage the configuration of
the system with regard to processing of personal data;
– SECTION 2.4: Privacy and Safety;
– SECTION 3: USER’S ASSESSMENT: on the simplicity of the instructions
received with respect to data processing;
– SECTION 4: System Usability Scale (SUS).
2.1.1.3 EXPERIMENT PROCEDURE 34
The testing phase required using a computer located within the
laboratory, while each participant was alone in the room with an observer.
The observer started giving a general overview of the research’s
objectives and informed each participant about every aspect concerning the
testing sessions (timing, devices to be used, etc.).
The observer asked the participant to sign the written informed consent
form, and to answer some questions to fill in the table defining
demographic profiling and technological and juridical competences.
Later, the participant was asked to wear the Tobii eye-tracker and the
observer proceeded with a short calibration procedure.
The participant wore the eye-tracker during the whole test session and
was allowed to remove it only to fill in the final questionnaires. The observer
introduced the scenario to the user and explained each task one at a time.
34
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
Once the task execution phase had been completed, the user removed
the glasses and was asked to fill in the questionnaires, distributed in the
following order of priority: awareness questionnaire, System Usability Scale
questionnaire and acceptability questionnaire.
At the end of the test session the participant was allowed to review his
own video recorded through Tobii studio software (so that the observer could
also be sure that the recording had been done correctly).
2.1.2 THE USABILITY TOOLS 35
The methodology involved the use of eye-tracking and think-aloud
tools while the user was performing the tasks.
Eye-tracking was used to record eye movements, while thinking aloud
helped participants express in words the critical issues encountered while performing the tasks.
The usability instruments used were:
– eye-tracker;
– thinking-aloud tool and thinking-aloud space;
– System Usability Scale (SUS).
2.1.2.1 EYE-TRACKER 36
The study uses high-tech instruments such as an eye-tracker, which
records the participant’s ocular movements through a pair of special glasses.
Tobii Glasses consist of a miniaturized eye-tracking system, located
in the frame of a pair of glasses also equipped with a front camera, facing the
outside scene, and an eye-tracking camera for monitoring the subject’s eye
movement. The data storage module is pocket-sized and connected to the glasses
by a cable. The device is able to record ocular behaviour at a
frequency of 30 Hz. The acquired data are then analysed with the
software Tobii Studio and related to the cognitive load associated
with each performed task, based on the interpretation provided by
known cognitive models. Three characteristics make Tobii a particularly
suitable device for our purpose. 1. The goggles are ergonomically
designed to fit the wearer’s head perfectly, making the
path more natural and less conditioned. 2. The subject has complete freedom
35
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
36
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
of movement. 3. The recording of execution paths during the tasks
takes place without hampering the subjects’ view.
Some of the measurements that can be extracted from Tobii are:
– heat-maps;
– gaze plots, reading order in the displays;
– running time of the task (initial and end point).
Heat-maps and gaze plots are data visualizations that can communicate
important features of the user’s visual behaviour.
The gaze plots reveal the watching time sequence, i.e. where and for how
long the user looks at a single point. The time spent is shown by the diameter
of the circles.
Heat-maps show how gaze is distributed over the stimulus; they carry
no information about the watching order or about individual
fixations. A colour scale, moving from green to red, indicates the duration of gazes.
Red spots indicate that the subjects
have looked at these areas for a significant period of time.
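As an illustration of the aggregation behind a heat-map, the sketch below accumulates fixation durations over a 2D grid of the stimulus, so that cells with long total gaze time would be rendered “red”. The data are synthetic and purely illustrative (not actual Tobii output), and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fixations on a 1920x1080 stimulus: coordinates in pixels,
# durations in milliseconds (illustrative values, not Tobii data).
n_fix = 200
x = rng.uniform(0, 1920, n_fix)
y = rng.uniform(0, 1080, n_fix)
dur = rng.exponential(250, n_fix)

# A heat-map is essentially total fixation duration accumulated per grid
# cell: histogram2d bins the coordinates and sums the duration weights.
heat, _, _ = np.histogram2d(x, y, bins=(48, 27),
                            range=[[0, 1920], [0, 1080]], weights=dur)

# The "reddest" cell is the one with the largest accumulated gaze time.
hottest = np.unravel_index(np.argmax(heat), heat.shape)
print("hottest cell:", hottest, "total ms:", round(heat.max()))
```

A gaze plot, by contrast, would keep the individual fixations in temporal order and scale each circle's diameter by its duration, rather than summing durations per cell.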
2.1.2.2 THINKING-ALOUD 37
The thinking-aloud methodology aims to obtain a precise
understanding of the users’ decision-making process during the execution
of the usability tests, by enabling them to comment aloud on the
undertaken actions as well as on their impressions of the navigation experience.
While undertaking a navigation path in order to achieve a task, the user is called
upon to explain choices, doubts, as well as concerns to the observer, who is in
charge of following the testing session. The main purpose is to discover the
mental model followed by the user during the execution of the tasks, in order
to extrapolate data allowing the interface’s improvement.
3. Heatmap
37
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
The test sessions were carried out in the thinking-aloud space (see image
n. 4 – Thinking-aloud space) and were accompanied by the analysis of users’
ocular paths through qualitative eye tracking, in order to detect the
major difficulties that had a strong impact on the interaction processes,
even when not directly expressed by users (possibly because of shyness or
uncertainty).
2.1.2.3 SYSTEM USABILITY SCALE 38
The System Usability Scale, or SUS, was created by John Brooke in
1986. It allows the usability of different products and services
(hardware, software, applications) to be evaluated through a questionnaire.
This questionnaire consists of ten questions to which the participant
must answer by expressing the level of agreement with related propositions
on a scale of 1 to 5. The proposed propositions investigate parameters such as
the level of complexity of the product for each user, and any need of support to
use the product. The 10 propositions of SUS are:
3. Thinking-aloud space
38
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
1. I think I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think I would need the support of a technical person to be able to
use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system
very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
The main advantages of SUS are:
– Ease of administration and analysis;
– The possibility of defining whether a system is usable or not.
Recent research shows that SUS, originally designed to measure
a single dimension (perceived ease of use), also allows other
parameters to be evaluated, such as comprehensibility and user satisfaction.
Research also shows that a reference score can be fixed at 68.
However, ours was fixed at 66 39: below this level we speak of below-average
usability, above it of above-average usability.
The score obtained in the test is computed by:
– subtracting one from the score assigned to odd-numbered statements;
– subtracting the assigned score from five for even-numbered statements;
– multiplying the sum of the resulting values by the coefficient 2.5, which gives a number from 0 to 100.
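The scoring rule above can be sketched in a few lines of Python; the function name and example responses are illustrative, not part of the study.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (1st, 3rd, ...) contribute (score - 1);
    even-numbered items contribute (5 - score); the sum is scaled by 2.5,
    yielding a value between 0 and 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even index = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Example: all-neutral answers (3 on every item) land exactly at the midpoint.
print(sus_score([3] * 10))  # → 50.0
```

Note that a respondent agreeing strongly with every positive (odd) item and disagreeing with every negative (even) item scores 100, while the opposite pattern scores 0.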
2.1.3 STATISTICAL ANALYSIS 40
In the present study, a usability test is analysed along with a set of
questions of the related questionnaire, in order to find correlations between
individual characteristics such as age, gender, education and occupation, and
the extent to which users feel aware of and interested in privacy issues during
the installation process of an app. The aim of such an analysis is to profile
39
SUS is built as a Likert scale with 10 items. Scores obtained via this process are then
normalized on a 0-100 point scale and the overall mean of the 10 indicators is adopted as the SUS of
the installation procedure.
40
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
users, aiming to reach a preliminary discrimination between homogeneous
user categories. Therefore, the set of paired correlations has to be discussed,
in order to identify those variables of interest that are likely to correlate
with individual characteristics. Descriptive statistics concerning the variables
identified with such a procedure are presented in the study, followed by a
more in-depth analysis carried out with ordered probit regressions, which
allow ceteris paribus comparisons between the determinants (in this case,
individual characteristics) of the probability of being aware of and/or interested
in privacy issues.
The final sample size for the present analysis has been set at 75
individuals, nearly half of whom were aged under 25 and still in education.
The descriptive statistics, extrapolated from the sample before controlling for
these variables, may tend to reflect the above-mentioned asymmetries.
The main variables of interest are thus represented by age, education
and occupation plus a set of questions included in the questionnaire that
seem to be correlated with personal characteristics (see the Correlation
Tables below). A subsample of users (45 users) was observed during their
usability testing in order to measure performance metrics with specific
devices (eye trackers, smart watches, etc.). In addition, these 45 users went
through additional questions in order to assess the overall level of awareness
concerning data protection.
2.1.3.1 USABILITY METRICS RESULTS 41
Before proceeding to discuss descriptive statistics and results
related to the questionnaire, it is necessary to introduce the usability metrics
extrapolated from the subsample of users who agreed to be recorded during
their practical test. These metrics are basically linked to the way in which
users have completed the assigned tasks, and can be summarized as follows:
a. Performance metrics (e.g. the time needed to complete the task
“complete installation”);
b. Choice (e.g. whether the user has selected the “Quick settings” option);
c. Subjective overall usability assessment (calculated following the
System Usability Scale approach).
The average time needed to complete the installation is 38 seconds,
with a strikingly high variance and an asymmetric distribution that we are
going to discuss referring to users’ demographics.
41
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
TABLE 1 – AGE GROUPS IN THE SURVEY SAMPLE

AGE GROUPS   FREQ.   PERCENT   CUM.
15 – 24      44      58.67     58.67
25 – 34      23      30.67     89.33
35 – 44      4       5.33      94.67
45 – 54      2       2.67      97.33
55+          2       2.67      100.00
Total        75      100.00

TABLE 2 – OCCUPATIONAL STATUSES

OCCUPATION                         FREQ.   PERCENT   CUM.
Professionals                      17      22.67     22.67
Technicians                        3       4.00      26.67
White collar employees             1       1.33      28.00
Retail & Services                  1       1.33      29.33
Farmers & Workmen                  7       9.33      38.67
Students, Retired and Unemployed   46      61.33     100.00
Total                              75      100.00

TABLE 3 – GENDER COMPOSITION

GENDER   FREQ.   PERCENT   CUM.
Male     27      36.00     36.00
Female   48      64.00     100.00
Total    75      100.00
As far as the choice indicator is concerned, only 5 users out of 45
refused to adopt the “Quick settings”, representing a very low share of the total
(11.11%). Once again, we find many differences in the probability of selecting
different options, and these differences are going to be discussed with reference to
users’ demographics.
Finally, the overall value on the System Usability Scale is 60.33,
meaning that the overall usability of the installation procedure is slightly lower
compared to the norm (fixed at 66).
Metrics and users’ demographics are analysed in two steps. First,
ceteris paribus correlations are run between a set of dependent variables
(selection of fast settings, modification of privacy and security settings, re-reading of the final disclosure) and a set of possible explanatory variables
(age, gender, qualification, using Windows, having attended at least one
law course at university, being able to answer the question about data
processing, having expressed fear for the security of their data while surfing
the Internet). The four models are then re-estimated following bootstrapping
procedures in order to approximate the confidence intervals of the estimators,
given the impossibility of knowing the distribution of the variables (given the few
observations available) and thus perform statistical significance tests 42. To
sum up: no predictor significantly affects the chance that the subject will accept the fast
settings or modify the data security settings. However, the
following predictors appear statistically significant:
a. Females show a lower propensity to change the privacy settings
compared to men (-30.5%);
b. Individuals of the younger age group (15-24) show a greater
propensity to change the privacy settings compared to the age groups 25-34
and 35-44 (33.45%);
c. Having attended law courses negatively affects the probability that
the user re-reads the privacy statement, by 39.31% (probably
because the user feels prepared on the subject and does not believe that he/
she has something new to learn from reading such statements). The estimate
is significant at the 95% level.
d. Extended use of Windows as an operating system
negatively affects the propensity to re-read the disclosure, by
24.87%, even if the significance of the estimate falls to the 90% level in this case.
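The bootstrapping idea used here — resampling with replacement to approximate confidence intervals when the distribution of the variables is unknown and observations are few — can be sketched as follows. The data are synthetic and purely illustrative (not the study's dataset), and the grouping is a hypothetical stand-in for one of the binary covariates above.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, illustrative data: 1 = participant changed the privacy
# settings, 0 = did not, split by a hypothetical binary covariate.
group_a = rng.binomial(1, 0.55, size=40)
group_b = rng.binomial(1, 0.25, size=35)

def bootstrap_ci(a, b, n_boot=5000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the difference in
    proportions (b - a): resample each group with replacement and take
    the empirical quantiles of the resampled differences."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        ra = rng.choice(a, size=len(a), replace=True)
        rb = rng.choice(b, size=len(b), replace=True)
        diffs[i] = rb.mean() - ra.mean()
    lo, hi = np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

lo, hi = bootstrap_ci(group_a, group_b)
# If the 95% interval excludes zero, the difference is significant at 95%.
print(f"95% CI for the difference in proportions: [{lo:.3f}, {hi:.3f}]")
```

The same resampling scheme extends to regression coefficients: re-estimate the model on each bootstrap sample and take percentiles of the resulting coefficient distribution.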
2.1.3.2 DESCRIPTIVE STATISTICS 43
The tables below show correlations within the final questionnaire:
Correlation Tables A and B show Pearson Correlation Coefficients
between individual characteristics and a subset of questions included in the
questionnaire (questions that are not correlated with individual characteristics
are omitted in the tables).
1. Age is correlated to the following questions:
– Question AC “I think it is useful to allow the app to access my name,
my image and other info on my account”;
– Question AD “I think it is useful to allow apps to automatically
share and sync my information with wireless devices that are not explicitly
associated with my PC, tablet or phone”;
– Question AF “Do you think that the information on privacy and
personal data has been presented clearly?”;
– Question AG “When completing the installation of W10, did you
clearly understand that by clicking on the “Doing quicker” button at the bottom
of the page 1 you declared that you consent to the use of your personal data
and cookies?”.
42
Estimates from these models are included in the Appendix.
43
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
2. Education is correlated to the following questions:
– Question AG “When completing the installation of W10, did you clearly
understand that by clicking on the “Doing quicker” button at the bottom of the page,
you declared that you consent to the use of your personal data and cookies?”;
– Question AF “Do you think that the information on privacy and
personal data has been presented clearly?” (also correlated with Age).
TABLE 4 – CORRELATION TABLE A – PEARSON CORRELATION COEFFICIENTS
BETWEEN INDIVIDUAL CHARACTERISTICS AND ANSWERS TO QUESTIONS AC-AG

                 AGE      GENDER    EDUCATION  OCCUPATION  AC       AD       AE       AF       AG
AGE              1.000
GENDER           0.1461   1.000
EDUCATION        0.3487   0.0870    1.000
OCCUPATION       0.4161   -0.1374   0.6032     1.000
ANSWERS TO AC    0.8348   0.0677    0.6647     0.6284      1.000
ANSWERS TO AD    0.8582   0.1594    0.5876     0.6254      0.9347   1.000
ANSWERS TO AE    0.5846   -0.0736   0.6520     0.7224      0.8328   0.7388   1.000
ANSWERS TO AF    0.7386   -0.0059   0.6936     0.6750      0.9261   0.8668   0.8925   1.000
ANSWERS TO AG    0.8131   0.1009    0.6530     0.6755      0.9424   0.9464   0.8186   0.8793   1.000
TABLE 5 – CORRELATION TABLE B – PEARSON CORRELATION COEFFICIENTS
BETWEEN INDIVIDUAL CHARACTERISTICS AND ANSWERS TO QUESTIONS AV-AW

                 AGE      GENDER    EDUCATION  OCCUPATION  AV       AW
AGE              1.000
GENDER           0.1696   1.000
EDUCATION        0.4129   0.1049    1.000
OCCUPATION       0.3688   -0.1374   0.5916     1.000
ANSWERS TO AV    0.5362   0.0270    0.7806     0.7593      1.000
ANSWERS TO AW    0.5262   -0.0090   0.7596     0.7813      0.9730   1.000
3. Both Education and Occupation are correlated to the following
questions:
– Question AE “During the installation of Windows 10 (W10), did you
understand if there were any steps related to privacy and the protection of
your personal data?”;
– Question AV “Do you feel able to manage your personal data on
Windows 10?”;
– Question AW “Do you find that Windows 10 makes it easier to set up
your PC’s security, content and personal data?”.
It is important to stress that these paired correlations are obtained
by considering variables only two by two, omitting all other variables;
this makes it necessary to control, in a ceteris paribus estimate, whether the impact
on the probability of agreeing (or disagreeing) on a given issue/question is
consistent. For instance, a high correlation between age and certain
opinions might be due to the asymmetric composition of the sample, so
education and occupation should also be checked. In fact, older individuals
tend to possess higher educational achievements and more demanding jobs,
which could affect the way the user perceives privacy issues. Likewise, a
high correlation between age and opinions could be upwardly biased by
the omission of education and occupation in the paired correlation.
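The pairwise ("two by two") Pearson coefficients shown in Correlation Tables A and B can be computed, for example, with NumPy's correlation-matrix routine. The variables below are synthetic stand-ins for the questionnaire data, not the study's sample.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 75  # matching the study's sample size, values themselves are synthetic

# Hypothetical variables: age, an education proxy loosely tied to age,
# and a 1-5 Likert answer (e.g. Question AC) also influenced by age.
age = rng.integers(15, 60, size=n).astype(float)
education = age / 10 + rng.normal(0, 2, size=n)
answer_ac = np.clip(np.round(age / 12 + rng.normal(0, 1, size=n)), 1, 5)

# np.corrcoef returns the full matrix of pairwise Pearson coefficients,
# the same "two by two" quantities tabulated in Tables A and B.
variables = np.vstack([age, education, answer_ac])
corr = np.corrcoef(variables)
print(np.round(corr, 4))
```

As the text notes, such pairwise coefficients omit all other variables; holding education and occupation fixed requires a regression-based (ceteris paribus) estimate rather than a correlation matrix.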
Tables 6 and 7 show results for Questions AC and AD in the questionnaire.
Quite strikingly, the vast majority of respondents (about 77% in both cases) did
not feel they needed to share and synchronize personal information (such as
name and other characteristics) with the app and/or other wireless devices.
This low propensity to share personal information translates into
consistent results when referred to the users’ awareness of privacy
issues during the installation processes. Table 8 shows how nearly three
quarters of the respondents felt self-confident about the identification of
the steps related to privacy and protection of personal data, while half of the
sample was not satisfied with the presentation of privacy issues proposed by
the app in terms of clarity (Table 9).
TABLE 6 – ANSWERS TO QUESTION AC – "I think it is useful to allow
the app to access my name, my image and other info on my account"
(Scale 1-5: Totally Disagree – Totally Agree)

1 (Disagree) – 5 (Agree)   FREQ.   PERCENT   CUM.
1                          18      24.32     24.32
2                          21      28.38     52.70
3                          18      24.32     77.03
4                          13      17.57     94.59
5                          4       5.41      100.00
Total                      74      100.00
2.1 The experiment
TABLE 7 – ANSWERS TO QUESTION AD – "I think it is useful to allow apps
to automatically share and sync my information with wireless
devices that are not explicitly associated with my PC, tablet or phone"
(Scale 1-5: Totally Disagree – Totally Agree)

1 (Disagree) – 5 (Agree)   FREQ.   PERCENT   CUM.
1                          29      38.67     38.67
2                          19      25.33     64.00
3                          10      13.33     77.33
4                          12      16.00     93.33
5                          5       6.67      100.00
Total                      75      100.00
On the other hand, Table 10 shows that only a minority (slightly higher
than 30%) of the respondents stated that they had correctly understood a
specific privacy setting which was not explicitly presented during the process
and was associated with a shortcut.
This could seem to be at odds with the previous evidence concerning the
high degree of attention paid by users to privacy issues. However, the link
between this misunderstanding of implicit privacy policies (Question AG) and
the high interest in explicit privacy choices (Questions AC, AD and AE) can be
traced to the very low degree of satisfaction stated by users regarding the
clarity with which such policies and choices were presented (Question AF).
Finally, Tables 11 and 12 confirm the high degree of self-confidence shown by
respondents regarding their ability to go through the privacy settings.

TABLE 8 – ANSWERS TO QUESTION AE – "During the installation of Windows 10
(W10), did you understand if there were any steps related to privacy
and the protection of your personal data?"
(Scale 1-5: Totally Disagree – Totally Agree)

1 (Disagree) – 5 (Agree)   FREQ.   PERCENT   CUM.
1                          5       6.67      6.67
2                          6       8.00      14.67
3                          10      13.33     28.00
4                          22      29.33     57.33
5                          32      42.67     100.00
Total                      75      100.00

TABLE 9 – ANSWERS TO QUESTION AF – "Do you think that the information
on privacy and personal data has been presented clearly?"
(Scale 1-5: Totally Disagree – Totally Agree)

1 (Disagree) – 5 (Agree)   FREQ.   PERCENT   CUM.
1                          3       4.00      4.00
2                          11      14.67     18.67
3                          23      30.67     49.33
4                          22      29.33     78.67
5                          16      21.33     100.00
Total                      75      100.00
When running ordered probit regressions between individual
characteristics and the answers to the questionnaire, the statistical
software deployed in this study (STATA) fails to fit the model for some
questions. This is the case for Questions AV, AW, AD and AF. This may be
due to insurmountable skewness within the sample for certain individual
characteristics, and further analysis should be carried out by transforming
such questions into dummy variables (where the dependent variable equals 1
for positive answers and 0 otherwise). However, some useful evidence
regarding the other three variables still emerges.
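The dummy-variable transformation just described can be sketched as follows. This is an illustrative Python fragment, not the STATA code actually used; treating scores of 4 and 5 as "positive" answers is our assumption about the cut-off:

```python
# Recode a 1-5 Likert answer into a binary dummy: 1 for positive
# answers (here assumed to be scores of 4 or 5), 0 otherwise.
def to_dummy(answer, positive_from=4):
    return 1 if answer >= positive_from else 0

# Example: a column of answers on the 1-5 scale.
answers = [1, 3, 4, 5, 2, 5]
dummies = [to_dummy(a) for a in answers]
print(dummies)  # -> [0, 0, 1, 1, 0, 1]
```

The recoded 0/1 variable can then serve as the dependent variable of a binary model (probit or a linear probability model), which is less sensitive to skewness across the five original categories.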
Table 13 shows estimates of the correlation between individual
characteristics and answers to Question AC – "I think it is useful to allow the
app to access my name, my image and other info on my account" (Scale 1-5:
Totally Disagree – Totally Agree).
TABLE 10 – ANSWERS TO QUESTION AG – "When completing the installation
of W10, did you clearly understand that by clicking on the 'Doing quicker'
button at the bottom of the page, you declared that you consent to the use
of your personal data and cookies?"
(Scale 1-5: Totally Disagree – Totally Agree)

1 (Disagree) – 5 (Agree)   FREQ.   PERCENT   CUM.
1                          24      32.00     32.00
2                          18      24.00     56.00
3                          10      13.33     69.33
4                          17      22.67     92.00
5                          6       8.00      100.00
Total                      75      100.00
A few words are needed concerning the interpretation of the
coefficients. Given that the dependent variable is ordinal and discrete, the
sign of the coefficient (first column in the table) indicates whether a given
characteristic (e.g. Age) increases or decreases the probability of agreeing
with the proposition presented in the question. In this case, age is positively
correlated with the dependent variable, which means that the older the user,
the more likely he/she is to agree with the statement. In this model, the only
statistically significant coefficients are those associated with Age and
Education, and both are positive. We can therefore conclude that older and
more educated individuals are more likely to appreciate the fact that the
app accesses their names, images and other personal data.
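The sign interpretation rests on the ordered probit probability formula, P(y = j) = Φ(c_j − xβ) − Φ(c_{j−1} − xβ): raising the linear index xβ (e.g. through a positive Age coefficient) shifts probability mass toward the higher answer categories. A minimal sketch with made-up cut-points, not the fitted values of Table 13:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """P(y = j) for j = 1..len(cuts)+1, given the linear index xb."""
    bounds = [-math.inf] + list(cuts) + [math.inf]
    return [phi(hi - xb) - phi(lo - xb)
            for lo, hi in zip(bounds, bounds[1:])]

# Illustrative cut-points for a 1-5 scale and two users who differ
# only in a characteristic with a positive coefficient (e.g. Age).
cuts = [-1.0, -0.3, 0.4, 1.2]
low = ordered_probit_probs(0.0, cuts)
high = ordered_probit_probs(1.0, cuts)
# A higher linear index moves probability mass toward "Totally Agree":
assert high[4] > low[4] and high[0] < low[0]
```

The magnitude of a coefficient, by contrast, has no direct percentage reading in an ordered probit; only the sign (and significance) is interpreted in the text above.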
TABLE 11 – ANSWERS TO QUESTION AV – "Do you feel able to manage
your personal data on Windows 10?"

        FREQ.   PERCENT   CUM.
No      32      42.67     42.67
Yes     43      57.33     100.00
Total   75      100.00
TABLE 12 – ANSWERS TO QUESTION AW – "Do you find that Windows 10
makes it easier to set up your PC's security, content and personal data?"

        FREQ.   PERCENT   CUM.
No      31      41.33     41.33
Yes     44      58.67     100.00
Total   75      100.00
TABLE 13 – ORDERED PROBIT REGRESSION WITH ANSWERS TO QUESTION AC
AS DEPENDENT VARIABLE

QUESTION AC   COEF.       STD. ERR.   Z       P>Z     [95% CONF. INTERVAL]
AGE           8.232838    1.085398    7.59    0.000   6.105498    10.36018
GENDER        -.9552995   .6753561    -1.41   0.157   -2.278973   .3683742
EDUCATION     .9758989    .4172403    2.34    0.019   .1581229    1.793675
OCCUPATION    -.7201583   .2213635    -3.25   0.001   -1.154023   -.2862938
OBS           74
Wald Chi2 = 158.77   Prob>Chi2 = 0.0000   Log Pseudolikelihood = -11.318752   Pseudo R2 = 0.8986
Table 14 presents estimates of correlation between individual
characteristics and the extent to which users are aware of the fact that there
are steps related to privacy and the protection of personal data during the
installation process. The only significant coefficient is associated with Age,
suggesting that older individuals are more aware of privacy issues when they
are installing the application.
The only significant coefficient in the regression displayed in Table
15 is associated with Occupation. This variable has been operationalized
on the basis of the Italian Occupational Classification (NUP), whose
structure allows us to treat the coefficient as a discrete ordinal variable.
It is important to stress that professions are ranked according to the
International Standard Classification of Occupations (ISCO) and divided
into 8 major groups on the basis of the cognitive content and experience
required to perform the constituent tasks of a given job. As a result,
professions allocated to Major Groups 1, 2 and 3 can be considered highly
skilled, while those ranked in the higher groups (Groups 4, 5, 6, 7 and 8) are
progressively less demanding or low skilled. In this case, we have added an
additional grouping (Group 9) that accounts for students, retired workers
and unemployed people, thus leaving the underlying structure of the
variable unaltered. Consequently, the negative relationship between the
degree of agreement and the Major Groups of the NUP classification points
out that highly skilled workers are more likely to notice that triggering the
shortcut during the installation process has clear implications in terms of
privacy: "you consent to the use of your personal data and cookies". Users
employed in medium and low skilled occupations seem to be less aware of
this trade-off between the ease of a complex process (the installation) and
the level of protection of personal data.
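The occupation coding just described can be sketched as a simple lookup: ISCO/NUP Major Groups 1-3 count as highly skilled, Groups 4-8 as progressively less skilled, plus the added Group 9 for students, retired workers and unemployed people. The mapping of the specific labels below to group numbers is our illustrative assumption, not the exact coding used in the study:

```python
# Illustrative mapping of occupation labels to ISCO/NUP major groups.
MAJOR_GROUP = {
    "Professionals": 2,
    "Technicians and Associate Professionals": 3,
    "Clerks and Service Workers": 4,
    "Students": 9,       # added Group 9: not in the ISCO structure
    "Retired": 9,
    "Unemployed": 9,
}

def is_highly_skilled(occupation):
    """Major Groups 1-3 are treated as highly skilled."""
    return MAJOR_GROUP[occupation] <= 3

print(is_highly_skilled("Professionals"))  # True
print(is_highly_skilled("Students"))       # False
```

Because the group number grows as the skill requirement falls, a negative regression coefficient on this variable reads as "higher-skilled groups agree more", which is exactly the interpretation given above.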
TABLE 14 – ORDERED PROBIT REGRESSION – DEPENDENT VARIABLE:
ANSWERS TO QUESTION AE

QUESTION AE   COEF.       STD. ERR.   Z       P>Z     [95% CONF. INTERVAL]
AGE           6.491957    .3868772    16.78   0.000   5.733692    7.250222
GENDER        .1537822    .5505986    0.28    0.780   -.9253711   1.232936
EDUCATION     .0338212    .2531654    0.13    0.894   -.462374    .5300164
OCCUPATION    -.0943916   .1727589    -0.55   0.585   -.4329929   .2442097
OBS           74
Wald Chi2 = 336.10   Prob>Chi2 = 0.0000   Log Pseudolikelihood = -17.530296   Pseudo R2 = 0.8299
TABLE 15 – ORDERED PROBIT REGRESSION – DEPENDENT VARIABLE:
ANSWERS TO QUESTION AG

QUESTION AG   COEF.       STD. ERR.   Z       P>Z     [95% CONF. INTERVAL]
AGE           28.87879    29.68761    0.97    0.331   -29.30786   87.06544
GENDER        .0326724    .7313543    0.04    0.964   -1.400756   1.4661
EDUCATION     .0784844    .3136995    0.25    0.802   -.5363554   .6933242
OCCUPATION    -4.327373   1.842421    -2.35   0.019   -7.938452   -.7162939
OBS           74
Wald Chi2 = .   Prob>Chi2 = .   Log Pseudolikelihood = -10.12312   Pseudo R2 = 0.9109
2.1.3.3 USERS’ AWARENESS 44
In order to assess the extent to which users are aware of how their
choices during the installation process will affect future use of their personal
data, a subsample of the original 75 users went through an extended
questionnaire, which included additional questions related to the legal
effectiveness of their choices and the legal content of each step of the process.
Tables 16, 17, 18 and 19 summarize the main characteristics of this subsample
of 45 users.
TABLE 16 – AGE GROUPS IN THE SUBSAMPLE

AGE GROUPS   FREQ.   PERCENT   CUM.
15 – 24      31      68.89     68.89
25 – 34      9       20.00     88.89
35 – 44      2       4.44      93.33
45 – 54      1       2.22      95.56
55+          2       4.44      100.00
Total        45      100.00
TABLE 17 – GENDER COMPOSITION OF THE SUBSAMPLE

GENDER   FREQ.   PERCENT
Male     15      33.33
Female   30      66.67
Total    45      100.00
TABLE 18 – EDUCATIONAL BACKGROUND

EDUCATIONAL BACKGROUND   FREQ.   PERCENT   CUM.
Undergraduate            28      62.22     62.22
Bachelor Degree          4       8.89      71.11
Master's Degree          12      26.67     97.78
PhD                      1       2.22      100.00
Total                    45      100.00

TABLE 19 – OCCUPATIONS

OCCUPATION                                FREQ.   PERCENT   CUM.
Professionals                             2       4.44      4.44
Technicians and Associate Professionals   3       6.67      11.11
Clerks and Service Workers                2       4.44      15.56
Students                                  36      80.00     95.56
Retired                                   1       2.22      97.78
Unemployed                                1       2.22      100.00
Total                                     45      100.00
44 The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
Tables 20 and 21 show results for Questions 48 and 49 in the extended
questionnaire, both focused on the purpose/effectiveness of the consent
to process personal data but with different goals. While Q48 asked
respondents to define the legal purpose of this consent, Q49 asked whether
they thought that this consent would effectively protect their personal
data. Quite strikingly, large portions of the subsample were not able to
indicate the correct purpose of the consent (51.11%) or had no idea whether
the consent had actual implications for the use of their data (44.44%).
TABLE 20 – RESULTS FOR Q48 – Question 48: "What is the purpose of giving
consent to the processing of personal data?"

               FREQ.   PERCENT
Wrong answer   23      51.11
Right answer   22      48.89
Total          45      100.00
TABLE 21 – RESULTS FOR Q49 – Question 49: "Do you think that your consent
to data processing effectively protects your personal data?"

                FREQ.   PERCENT
Missing value   1       2.22
No              12      26.67
Yes             12      26.67
Doesn't know    20      44.44
Total           45      100.00
Tables 22 and 23 show results for Questions 52 and 55 of the
questionnaire respectively. In this case, respondents were asked to what
extent they feared negative consequences arising from the unauthorized
processing of their data, and whether they thought that such negative
consequences could be avoided simply by denying their consent to process
the data. Even though the vast majority of respondents (66.67%) stated that
they were worried about the consequences of unauthorized processing, only
about one out of five (17.78%) thought that their data could be processed
even without their consent. From a usability perspective, this may imply
that at this point of the installation process users tended not to grasp what
interests were at stake when the consent quoted in Question 48 was either
given or denied. In fact, even if the size and composition of the subsample
do not allow for inferences concerning the relationships between different
choices and different user profiles, from a usability perspective a relatively
small number of users is enough to highlight possible bottlenecks in the
system. In this case, a worryingly high percentage of users do not seem
to have fully understood, firstly, what the real purpose of the consent was
and, secondly, what its possible consequences were. This is even more
striking if we break down the results of Q21 ("Do you think that the
main operating systems such as Windows 10, Windows 7, Windows 8 and El
Capitan use your personal data in compliance with the Personal Data Protection
Code – Legislat. Decree no. 196 of 30 June 2003?") by the responses to Q52:
two thirds of those who stated that they feared unauthorized processing also
stated that they were confident that the main operating systems use personal
data in compliance with the Italian Protection Code.
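The Q21-by-Q52 breakdown is a simple cross-tabulation of two answers. A minimal sketch (the answer records below are toy values, not the experimental data):

```python
from collections import Counter

# Each record pairs the answer to Q52 (fear of unauthorized
# processing) with the answer to Q21 (trust that the main operating
# systems comply with the Italian Protection Code). Toy data only.
records = [("Yes", "Yes"), ("Yes", "Yes"), ("Yes", "No"),
           ("No", "Yes"), ("Yes", "Yes")]
crosstab = Counter(records)

# Share of "fearful" respondents (Q52 = Yes) who nevertheless trust
# the operating systems (Q21 = Yes).
fearful = [q21 for q52, q21 in records if q52 == "Yes"]
share_trusting = sum(a == "Yes" for a in fearful) / len(fearful)
print(crosstab[("Yes", "Yes")], round(share_trusting, 2))  # -> 3 0.75
```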
TABLE 22 – RESULTS FOR Q52 – Question 52: "Do you think there may be
negative consequences for you arising from the use of your personal data
without your consent?"

                FREQ.   PERCENT
Missing value   11      24.44
No              4       8.89
Yes             30      66.67
Total           45      100.00
TABLE 23 – RESULTS FOR Q55 – Question 55: "Do you think that denying
consent to the use of data could limit those negative consequences?"

                         FREQ.   PERCENT
Missing / doesn't know   13      28.89
No                       8       17.78
Yes                      24      53.33
Total                    45      100.00
Tables 24 and 25 show results for Questions 57 and 58 respectively. In
this case, respondents were asked to assess their own knowledge of
cookies and data profiling. It is interesting to notice how, at this stage of the
process, when the installation is completed, the majority of users stated that
they did not know what cookies (68.89%) and profiling (60%) were. Once again,
from the usability perspective, this implies that users were not making
conscious choices and that there is room for improvement in the way they are
informed. At this stage, we are not able to assess whether these information
bottlenecks are due to a lack of clarity in the system or to a lack of attention
paid by the users themselves. However, the data clearly show that, before users
are asked to make a choice, no adequate information concerning the purpose,
aim and consequences of the consent is delivered to them, either by attracting
their attention or by explicitly explaining its more in-depth implications.
TABLE 24 – RESULTS FOR Q57 – Question 57: "Do you know what cookies are?"

                         FREQ.   PERCENT
Missing / doesn't know   10      22.22
No                       31      68.89
Yes                      4       8.89
Total                    45      100.00
TABLE 25 – RESULTS FOR Q58 – Question 58: "Do you know what data profiling is?"

          FREQ.   PERCENT
Missing   13      28.89
No        27      60.00
Yes       5       11.11
Total     45      100.00
TABLE 26 – CORRELATION MATRIX – PEARSON CORRELATION COEFFICIENTS BETWEEN INDIVIDUAL
CHARACTERISTICS AND ANSWERS TO QUESTIONS 48, 49, 52, 55, 57 AND 58

             AGE       GENDER    EDUCATION  OCCUPATION  Q48      Q49      Q52      Q55      Q57     Q58
AGE          1.000
GENDER       0.0941    1.000
EDUCATION    0.5544    0.0167    1.000
OCCUPATION   -0.4263   -0.3147   -0.5388    1.000
Q48          -0.0769   -0.0629   -0.0547    -0.1174     1.000
Q49          -0.0858   -0.1308   -0.0374    0.0732      -0.3273  1.000
Q52          -0.1382   0.3034    0.0905     -0.1119     0.0000   0.1633   1.000
Q55          -0.0401   0.2335    0.2924     -0.2645     0.1085   0.1642   0.2725   1.000
Q57          0.2154    -0.4973   -0.1349    0.2055      -0.1494  -0.0394  -0.2533  -0.2970  1.000
Q58          0.1525    -0.1137   0.3886     -0.0136     0.0861   -0.0172  -0.0546  0.1839   0.3523  1.000
Table 26 shows the Pearson correlation coefficients between
individual characteristics and a subset of questions included in the extended
questionnaire (questions that are not correlated with individual characteristics
are omitted from the tables).
Answers to Question 57 are correlated with the following individual
characteristic:
– Gender.
It is somewhat surprising to notice that the answers to the different
additional questions included in the extended questionnaire show only weak
correlations with one another. This is particularly true for Questions 48 and 49.
We had indeed expected a correlation between respondents who were
able to provide a correct definition of the consent quoted in Question 48 and
those who were able to tell whether the consent could effectively protect their
data. As a result, the overall degree of user awareness may be overestimated
by the results of Question 48. Picking the correct answer from a list of possible
alternatives can in fact be due to a lucky guess rather than to a deep knowledge
of the subject, especially if respondents seem unable to draw the appropriate
operational consequences from their "correct" reply.
TABLE 27 – ORDINARY LEAST SQUARES REGRESSION – DEPENDENT VARIABLE:
ANSWERS TO QUESTION 57

QUESTION 57   COEF.       STD. ERR.   Z       P>Z     [95% CONF. INTERVAL]
AGE GROUP     .1708717    .0651595    2.62    0.014   .0377982    .3039451
GENDER        -.2935195   .1032323    -2.84   0.008   -.504348    -.0826909
EDUCATION     -.0712368   .064504     -1.10   0.278   -.2029715   .060498
OCCUPATION    .0401042    .0362702    1.11    0.278   -.0339694   .1141777
_CONS         .1213823    .6332162    0.19    0.849   -1.171818   1.414582
OBS           35
Prob > F = 0.0032   R-squared = 0.4018   Adj. R-sq. = 0.3220
Once again, Table 26 shows paired correlations obtained by
considering variables only two by two and omitting all other variables. It
is necessary to assess the extent to which these correlations remain
statistically significant ceteris paribus. We therefore ran an OLS regression
with the answers to Q57 as the dependent variable and all individual
characteristics as covariates. The only significant coefficients are those
related to Age and Gender, suggesting that being female decreases the
probability of knowing what cookies are (-29.35%), while moving from one
age group to the next increases this probability (+17.08%). However, given
the size and composition of the subsample, these correlations can hardly be
taken for granted and do not allow for inferences.
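In an OLS regression with a binary dependent variable (a linear probability model), each coefficient reads directly as a change in the probability of a positive answer. Using the coefficients reported above (rounded), the fitted probability is just the linear combination; the encoding of the covariates below (gender = 1 for female, age group coded 1, 2, 3, …) is our assumption:

```python
# Linear probability model for Q57 ("Do you know what cookies are?"),
# using the rounded coefficients reported in the table above.
coef = {"_cons": 0.1214, "age_group": 0.1709, "gender": -0.2935,
        "education": -0.0712, "occupation": 0.0401}

def predict(age_group, gender, education, occupation):
    return (coef["_cons"] + coef["age_group"] * age_group
            + coef["gender"] * gender + coef["education"] * education
            + coef["occupation"] * occupation)

# Moving up one age group raises the probability by ~17.1 points;
# being female lowers it by ~29.4 points, as discussed in the text.
delta_age = predict(2, 0, 1, 1) - predict(1, 0, 1, 1)
delta_gender = predict(1, 1, 1, 1) - predict(1, 0, 1, 1)
print(round(delta_age, 4), round(delta_gender, 4))  # -> 0.1709 -0.2935
```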
2.1.4 BIOMETRICS AND USERS’ CHARACTERISTICS 45
2.1.4.1 BIOMETRICS AND USERS – DESCRIPTIVE STATISTICS 46
In order to assess the extent to which users are aware of how their
choices during the installation process will affect the future use of their
personal data, it is useful to take a closer look at users' performances and thus
run an in-depth analysis of biometrics, choices and user characteristics.
To do so, both descriptive statistics and regressions are presented
in this section, with the aim of reaching a better understanding of users'
different behaviour patterns and their links with demographics and user
characteristics. The average completion time was 38.34 seconds; more than
half of the sample (58%) completed the task in less than 30 seconds, and three
quarters of the sample dedicated less than one minute to it (see Table 28). At
first sight, users do not spend much time reading, modifying and then
re-reading disclaimers and settings (Privacy, Security, etc.). It is therefore
interesting to explore how this low propensity to read and think before
completing the task varies across the sample.
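The share-below-a-threshold reading used above (58% under 30 seconds, three quarters under one minute) can be sketched as follows, with toy durations standing in for the recorded completion times:

```python
# Percentage of completion times below a threshold (toy durations
# in seconds, not the experimental records of Table 28).
times = [3, 5, 8, 10, 22, 28, 30, 41, 64, 133]

def share_below(times, seconds):
    return 100.0 * sum(t < seconds for t in times) / len(times)

print(share_below(times, 30))  # -> 60.0 (percent under 30 s)
print(share_below(times, 60))  # -> 80.0 (percent under 60 s)
```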
Descriptive statistics can be used as a preliminary frame for a more
in-depth analysis. It is possible to look at how possible determinants, such
as Gender, Age, Educational Background and the personal OS used at home,
break down performance and choice indicators. In this case, performance and
choice indicators are measured as dummy variables (1 = Yes; 0 = No), so that
the average scores for the subgroups identified by users' characteristics
can be interpreted as percentages. For instance, Table 29 shows that 78% of
the sample modified the Privacy settings, whereas only 69.70% of female users
did so. Tables 29-32 show performance and choice indicators against
users' characteristics.
Preliminary evidence from the descriptive statistics, to be tested
with OLS regressions in the next subsection, is the following:
a) Gender explains different behavioural choices concerning Privacy:
fewer female users (-24.42%) modified the Privacy settings (Table 29);
b) Educational Background explains different behavioural choices
concerning awareness: fewer users (-24.42%) re-read the disclosure amongst
those who had taken at least one law exam in a graduate course (Table 30);
c) Age explains different behavioural choices concerning Privacy:
younger users (+7.19%) modified the Privacy settings more often than the
overall sample average (Table 31);
d) The personal OS that subjects use at home explains different
behavioural choices concerning awareness: fewer users (-11.8%) re-read
the disclosure amongst those who usually use Windows compared to those
who use Apple Mac OSX (Table 32).

45 The authors of this paragraph are Emanuele Garzia, PhD student in Humanities and
Technologies: an integrated research path at Suor Orsola Benincasa University of Naples, Dr. Luca
Cattani, postdoctoral research fellow at the Department of Legal Studies, University of Bologna, Drs.
Federica Protti, UX designer, Prof. Simona Collina, Associate Professor in Experimental Psychology
at Suor Orsola Benincasa University of Naples, and Prof. Roberto Montanari, Professor of Cognitive
Ergonomics and Interaction Design at Suor Orsola Benincasa University of Naples.
46 The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
TABLE 28 – Absolute, percent and cumulative frequencies
for completion times

COMPLETION TIME (SEC)   FREQ.   PERCENT   CUM.
3                       1       2         2
4                       1       2         4
5                       5       10        14
7                       1       2         16
8                       3       6         22
9                       1       2         24
10                      3       6         30
14                      1       2         32
15                      1       2         34
17                      1       2         36
20                      1       2         38
22                      2       4         42
24                      1       2         44
27                      2       4         48
28                      1       2         50
29                      1       2         52
30                      3       6         58
31                      1       2         60
37                      1       2         62
38                      1       2         64
41                      1       2         66
44                      1       2         68
46                      1       2         70
50                      2       4         74
55                      1       2         76
62                      1       2         78
64                      1       2         80
70                      1       2         82
74                      1       2         84
84                      1       2         86
86                      1       2         88
90                      1       2         90
91                      1       2         92
96                      1       2         94
105                     1       2         96
133                     2       4         100
Total                   50      100
TABLE 29 – USERS MODIFYING PRIVACY SETTINGS AND GENDER

GENDER   PRIVACY
F        69.70%
M        94.12%
Total    78.00%
TABLE 30 – USERS RE-READING THE DISCLOSURE AND EDUCATIONAL
BACKGROUND (LEGAL)

EDUCATIONAL BACKGROUND (LEGAL)   READ
No                               53.33%
Yes                              28.57%
Total                            36.00%
TABLE 31 – USERS MODIFYING PRIVACY SETTINGS AND AGE

AGE GROUP   PRIVACY
15 – 24     85.19%
25 – 34     76.47%
35 – 44     33.33%
45+         66.67%
Total       78.00%
TABLE 32 – USERS RE-READING THE DISCLOSURE AND USUAL OS

USUAL OS                READ
Apple Mac OSX and iOS   52.94%
Android                 0.00%
Apple Mac OSX and iOS   0.00%
Linux                   0.00%
Microsoft Windows       10.00%
Microsoft Windows       41.18%
None                    100.00%
2.1.4.2 BIOMETRICS AND USERS – ANALYSIS 47
As far as the analysis of the biometric data is concerned, estimates
have been calculated in two separate steps in order to test their stability.
When the focus of the study shifts from usability issues to the description of
the population in the sample, in fact, there is room for the adoption of more
sophisticated estimation strategies that allow statistical tests of the
significance of the regressors.
As anticipated, a ceteris paribus correlation is tested between a set of
four dependent variables (selection of fast settings, modification of Privacy
and Security settings, re-reading of the final disclosure) and a set of possible
explanatory variables (age, gender, qualification, using Windows, having
attended at least one law course, being able to answer the question about
data processing, having expressed fear for the safety of their data while
surfing the Internet). In a second step, the four models are re-tested by
bootstrap, in order to approximate the confidence intervals of the estimators
without assuming knowledge of the distribution of the variables (given the
few observations available) and to perform the actual test of their statistical
significance. To sum up: there are no predictors of a subject's chances of
accepting the fast settings or of modifying the data Security settings.

47 The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.

However, the following predictors appear statistically significant:
a) Female users show a lower propensity to change the Privacy settings
than male users (-30.5%);
b) Individuals in the younger age group (15-24) show a higher
propensity to change the Privacy settings compared with the 25-34 and 35-44
age groups (+33.45%).
Both coefficients are significant at the 95% level after the bootstrap procedure.
c) Having taken law exams negatively affects the probability
that the subject will re-read the privacy statement (-39.31%). This may be
because individuals with a legal background feel prepared and self-confident
on the subject and thus skip a part they consider redundant.
The estimate is significant at the 95% level.
d) Usually using Windows as an operating system reduces the
propensity to re-read the disclosure by about 24.87%, even if the
significance of the estimate falls in this case to 90%. Moreover, this evidence is
not confirmed by the bootstrapping robustness checks.
Tables 33 and 34 show only the relevant estimates from the OLS and
bootstrapping models respectively. Tables with all variables and controls are
reported in Appendix A2.
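The bootstrap step described above can be sketched as follows: resample the observations with replacement many times, re-estimate the statistic on each resample, and take percentiles of the resulting distribution as an approximate confidence interval. This is a generic illustration (with a sample mean standing in for the regression coefficients), not the STATA procedure actually run:

```python
import random

def bootstrap_ci(data, stat, reps=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    estimates = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(reps)
    )
    lo = estimates[int(reps * alpha / 2)]
    hi = estimates[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi

# Toy 0/1 outcome (e.g. "modified the Privacy settings"); the
# bootstrap interval brackets the observed share without any
# assumption about the distribution of the data.
data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(data, mean)
assert lo <= mean(data) <= hi
```

A coefficient then counts as significant at the 95% level when its bootstrap interval excludes zero, which is the check applied to the predictors listed above.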
TABLE 33 – OLS ESTIMATES OF THE DETERMINANTS OF SELECTING FAST SETTINGS,
MODIFYING PRIVACY SETTINGS, MODIFYING SECURITY SETTINGS, AND RE-READING
THE DISCLOSURE
TABLE 34 – OLS BOOTSTRAPPED ESTIMATES OF THE DETERMINANTS OF SELECTING
FAST SETTINGS, MODIFYING PRIVACY SETTINGS, MODIFYING SECURITY SETTINGS,
AND RE-READING THE DISCLOSURE
APPENDIX
2.2
FROM COGNITIVE-BEHAVIORAL ANALYSIS
TO INTERACTION DESIGN
2.2.1 SOME REFLECTIONS ON COGNITIVE AND BEHAVIORAL
ANALYSIS FINDINGS 48
The right to privacy represents a human need and trait, so much so that
Irwin Altman defines it as both culturally specific and culturally universal.
Individual boundaries can be identified in the private and public spheres
(Altman, 1977) and, according to other scholars, these two spheres are handled
in different ways, such as through separation, reservation or anonymity
(Schoeman, 1984). The rules and behaviours that affect the private and public
areas differ across cultures (Moore, 1984) and change according to individual
psychologies, different societies, and perceptions of time and space (Solove,
2006).
An interesting article that appeared in The New Yorker investigates
Virginia Woolf's idea of privacy 49. In Woolf's book Mrs. Dalloway, the
character of Clarissa reflects on the value of personal privacy within her
marriage:
«And there is a dignity in people; a solitude; even between husband
and wife a gulf; and that one must respect, thought Clarissa, watching him
open the door; for one would not part with it oneself, or take it, against his
will, from one's husband, without losing one's independence, one's
self-respect – something, after all, priceless» 50.
Virginia Woolf emphasized the way humans appreciate that
inwardness. It is easier to understand this concept when people are forced
into moments of exposure, so that they have to fight to protect their personal
life from the outside world.

48 The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
49 Rothman, J. (2014). Virginia Woolf's Idea of Privacy. The New Yorker, www.newyorker.
com/books/joshua-rothman/virginia-woolfs-idea-of-privacy, consulted 31 January 2020.
50 Woolf V., 'Mrs. Dalloway' (Wordsworth Editions Limited 2003), p. 88.
Nevertheless, information interchange is a fundamental
characteristic of human sociality. For example, self-esteem can be
increased (Kahneman, 2000; Tversky, 2016) and some hedonistic needs can
be met (Toma, 2013) thanks to social media and dissemination processes.
Displaying our experiences on social media is a form of limited openness,
through which everyone can still feel the strength of his or her private self.
Recent advances in information technology have flooded every
aspect of our personal and professional life, often making the collection and
use of personal data invisible. Our power of control over personal data has
then become inextricably linked to problems of personal choice, autonomy
and socio-economic power. It is up to everyone to balance the risks and
benefits of our own privacy choices, and to be able to exchange, in the right
proportion, loneliness for freedom, exposure for mystery and the knowable for
the unknown.
Moreover, our study has shown that male users display a greater
propensity than female users to manage privacy settings.
The propensity to manage privacy settings
In terms of data protection and data release, there are important
differences in relation to gender in the use of the Internet and related skills
(Hargittai and Shafer, 2006; Hargittai and Litt, 2013).
Consumers may either overestimate their response to regulatory
factors in hypothetical contexts because of their own beliefs, which often make
them more inclined to share personal information (Ajzen et al. 2004), or
underestimate behavioural factors when choosing in hypothetical contexts
(Liberman et al. 2004; Loewenstein and Adler 1995). According to Klopfer
and Rubenstein, the benefits of privacy can be interpreted only in
"economic terms". However, other authors believe that the benefits also
include the objective costs of disclosure and the user's preferences
(Mullainathan and Thaler 2000; Simon 1959). In Westin's opinion (2000), most
consumers weigh the privacy implications of the various commercial and
governmental programs that require personal information in a balanced way,
carefully assessing the value for themselves and for the wider society.
There is a certain discrepancy, known as the "privacy paradox",
between attitudes and behaviour. People who claim to consider privacy
important often do not show as much concern about it in their daily behaviour
(Taddicken, 2014). Whether people make an effort to protect their privacy
depends on the costs and benefits that emerge in a specific situation (Klopfer
and Rubenstein, 1977). It should be stressed that the decision-making process
regarding privacy choices may also be conditioned by a misperception of such
costs and benefits, as well as by social norms, emotions and certain strategies
aimed at simplifying decisions and problems. Each of these factors can
steer behaviour in a different way, in relation to different attitudes (Laufer and
Wolfe, 1977; Klopfer and Rubenstein, 1977).
In 2006 Barnes observed that «adults are concerned about the
invasion of privacy, while teenagers freely give up personal information
(...) this is because teenagers are often unaware of the public nature of the
Internet» 51. Older users seem to be more aware of the value of their privacy
choices (Courtney 2008). Miltgen and Peyrat-Guillard (2014), working through
various focus groups with a sample of young people (15-24 years) and adults
(25-70 years) from seven European countries, found that people aged 45-60
perceive privacy risks better and are much more afraid of identity theft than
younger people.
According to Olphert, Damodaran, and May (2005), privacy
concerns were a central barrier to older adults' uptake of the internet.
Older people would therefore seem to perceive their digital literacy
as lower than that of younger users. It is possible that, even if some older
adults may be willing to interact on social media, their understanding of the
convenience of these sites is so limited (Quan-Haase, Williams, Kicevski,
Elueze, & Wellman, 2018) that they may be at a greater risk, especially
51
Barnes SB, ‘A privacy paradox: Social networking in the United States’ (2006) 11 (9)
First Monday.
2. The experiment: description, data collection and analysis, findings
since older adults do not have as much expertise in posting and interacting on
social media and in adjusting their privacy settings.
Our study reveals that the younger age groups (15-24 and 25-34)
show a greater propensity to change privacy settings compared to the
others. According to Dhir, Torsheim, Pallesen and Andreassen (2017), young
people seem to be more able to understand privacy issues due to their greater
experience with digital media.
The propensity to change privacy settings based on age
According to several studies, consumer behaviour is related to
consumers' own confidence in terms of risk perception (Kim et al. 2008; Vance
et al. 2008). If a business shows that it is able to handle information correctly,
consumers' confidence in that business will increase, with a corresponding
reduction of their concerns about privacy and disclosure risks (Culnan and
Armstrong, 1999).
For example, companies that gather consumer privacy information
often highlight privacy changes over time, and sometimes notify their
consumers of alterations to privacy protection, so that their data practices
become clearer than those of their competitors.
Industry self-regulation and government regulation reduce consumer
privacy concerns, decreasing the perceived risk of participation in location-based
services and increasing consumers' intention to disclose personal
information (Xu et al., 2009; Xu et al., 2012).
Our study shows that only 24.42% of those who have attended law
courses read the privacy policy. Those with a legal background (or those who
have attended at least one law course) are 39.31% less likely to read the
2.2 From Cognitive-Behavioral Analysis to Interaction Design
privacy policy than those who have never taken a law exam. It is possible to
conclude that attending law courses reduces the likelihood of reading the
privacy policy by roughly 40%.
Background differences in reading the privacy policy
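The relationship between the two percentages reported above can be made explicit with a little arithmetic. The sketch below reconstructs the implied baseline read rate from the two reported figures; the baseline itself is not reported in the data, so treat it as an illustration of how the numbers fit together, not as a finding.

```python
# Illustrative arithmetic only: the baseline read rate is reconstructed
# from the two percentages reported in the text, not taken from the data.
law_read_rate = 0.2442        # share of law-exposed respondents who read the policy
relative_reduction = 0.3931   # reported relative difference vs. non-law respondents

# If 24.42% is 39.31% lower than the baseline, the implied baseline is:
baseline = law_read_rate / (1 - relative_reduction)
print(f"implied baseline read rate: {baseline:.2%}")   # about 40.24%
```

Rounding the 39.31% relative difference gives the "roughly 40%" reduction stated in the text.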
This behaviour would appear to be reminiscent of the “privacy paradox”
mentioned earlier, which states that individuals often act in contradiction
to their preferences or attitudes, giving away confidential information in
exchange for some comfort or benefit (Pavlou, 2011).
Prominent economists and psychologists have shown that our
decisions are not entirely rational, and that biases play a very important
and sometimes paradoxical role in decision-making, even though we are unaware
of them (Loewenstein, 1996; Kahneman et al., 1982, 1991). We therefore speak
of "bounded rationality", since the assumption of perfect rationality
sometimes does not adequately capture all the possible nuances of human
behaviour (Acquisti and Grossklags, 2005).
Under conditions of high elaboration likelihood, the person's
cognitive activity is aimed at scrutinizing the information about a topic
contained in a persuasive appeal, in order to evaluate the merits of a
recommendation and thereby identify the most veridical position on an issue
(see the Elaboration Likelihood Model 52).
52
The process of persuasion was described by the Elaboration Likelihood Model (ELM)
developed by Richard E. Petty and John T. Cacioppo (1986). The model distinguishes
two paths of persuasion: a "central" route and a "peripheral" route.
Although many academic disciplines (such as law, economics,
psychology, management, marketing, political and social sciences) have
deeply explored the concept of privacy and produced a large amount of
research about it, there is no unanimous agreement on the concept’s
definition.
A common thread in the various studies is that it is worth
asking whether the individual, group or society is satisfied with their privacy
needs, and what can be done in order not to compromise the common good of
human relationships. The studies and the results of the experiment that was
carried out reveal three main problematic areas, which will now be listed
and addressed below:
1. people seem to tend not to read privacy notices carefully;
2. the user consent;
3. privacy is considered a non-task-related approval.
2.2.1.1 PEOPLE SEEM TO TEND NOT TO READ PRIVACY
NOTICES CAREFULLY 53
Privacy notices aim to provide users with useful information
regarding their privacy choices and how their personal data will be treated by
the business, company or website (how personal information is collected,
processed and shared). Most users tend not to read privacy notices in detail
(see table 28, table 30, table 33), and there are several possible explanations.
Among them, a first set of assumptions can be made on the basis of the
scientific literature and theories reported below.
For example, the information contained in privacy notices can be very complex,
and the lexical competence required to understand it can affect the
perception of the message (Hartzog, 2010). The use of a specialized vocabulary
can lead to poor understanding and difficulty during decision-making (Balebako
et al., 2014; Leon et al., 2012). Users undoubtedly need clearer privacy notices.
When the information is too scarce or too thorough, the "accumulation
problem" described by Ben-Shahar and Schneider (2011) can occur, that is, the
difficulty of digesting a large amount of information.
Privacy notices are often quite hard to read (and even more difficult to
understand): according to Hartzog's (2010) study, people appear to have
difficulty reading and interpreting privacy notices, which leaves users
without adequate support in making rational decisions about their privacy
53
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
(Milne and Culnan, 2004). In addition, privacy notices are extremely
time-consuming (McDonald and Cranor, 2008), and they are not always targeted
towards consumers' needs or written so that a large proportion of the
population, given the average educational level, is able to understand them
(Jensen and Potts, 2004; Milne, Culnan, and Greene, 2006).
Calo (2012) argues that users' commitment to reading about
how information is processed is quite limited, and that when privacy notices are
particularly complex an information overload can occur.
Acquisti and Grossklags (2005) also reveal how even those who
are highly concerned about their privacy tend to consent to their data being
processed in exchange for small benefits, and are unwilling to pay even trivial
sums to avoid giving their consent.
2.2.1.2 THE USER CONSENT 54
User consent is fundamental to operating online, so reading the privacy
policy before giving consent becomes essential, even though many users
tend not to read it in order to save time (see table 10, table 11, table 20, table
21, table 22, table 23, table 24).
A user's activity on a website, app or smart device may be recorded
after the user has chosen to give consent; if he/she refuses, the user waives
the opportunity to take full advantage of the intended activity
(Schwartz et al., 2009).
In addition to the default settings, websites may also have specific
features that induce frustration among users and consequently prompt the
disclosure of personal information (Hartzog, 2010). This practice has been
called "malicious interface design" (Conti et al., 2010). This also relates to
the fact that individuals tend to manage privacy preferences according to
context and experience, or on the basis of cultural or motivational criteria
(Marx, 2001). Because of this, the data shared by users can be used to
influence their emotions, thoughts and behaviour in many different aspects
of their lives.
Therefore, users agree to give informed consent in order to be able to
use a service fully (Skotko et al., 1977), especially when consent requests
are presented at inappropriate times for the user (Spiekermann et al., 2001).
Government regulations seem to reduce consumers' concern and
increase their confidence (Xu et al., 2009). People are often unaware of the
exact information they share and how it can be used.
54
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
Since companies may vary the terms concerning data processing at
any time, the effort of reading and understanding them is sometimes futile.
2.2.1.3 PRIVACY IS CONSIDERED A NON-TASK-RELATED APPROVAL 55
The policy makers tend to protect the privacy of individuals through
informed and rational decision-making policies which include a basic
framework of protection, such as the principles embedded in the so-called
“fair information practices”. Hence, individuals should be protected and
safeguarded, in order to mitigate the great imbalance of power that exists
between people (consumers and citizens) and data holders (governments,
companies and societies) who currently have an advantageous position.
However, as described above, our data show that, in order to fully carry
out an activity, the user tends to accept informed consent, giving priority to the
imminent task that requires the service to be completed. Practically speaking,
the user will accept all privacy-related interactions of the interface or system
in use mostly because he/she wants to start using the system as soon as possible
(see table 7, table 8, table 9). Of course this is not the only reason, and many other
factors interfere in guiding decisions on consent approval or disapproval. However,
it is possible to design a proper solution to the problem, with the main focus of
improving users' awareness during this decision phase. In particular, it
would be interesting to link every sensitive element to an interlocking process
(Norman, 2014): an approval process in which the user interacts with each
sensitive element and can approve, or not, the privacy implications for that
element in that context in a more conscious way.
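The interlocking process just described can be sketched in code. The following is a minimal illustration, not an implementation from the study: all names, such as `InterlockGate` and `ask_user`, are hypothetical, and a real system would integrate the prompt into its UI layer.

```python
# A minimal sketch (hypothetical API) of the interlocking idea: instead
# of one up-front blanket consent, each sensitive element is blocked
# until the user approves it in context.
from typing import Callable

class InterlockGate:
    """Asks for consent the moment a sensitive element is touched."""

    def __init__(self, ask_user: Callable[[str], bool]):
        self.ask_user = ask_user              # UI prompt shown in context
        self.decisions: dict[str, bool] = {}  # remembered per element

    def allow(self, element: str) -> bool:
        # Re-use an earlier in-context decision if one exists.
        if element not in self.decisions:
            self.decisions[element] = self.ask_user(
                f"This step will use your '{element}'. Allow?")
        return self.decisions[element]

# Example: only 'location' is approved; 'contacts' stays blocked.
gate = InterlockGate(ask_user=lambda prompt: "location" in prompt)
print(gate.allow("location"))   # True  - consent given in context
print(gate.allow("contacts"))   # False - task is interlocked until approval
```

The key design choice is that the prompt fires at the moment the sensitive element is touched, so the decision is made with full awareness of the task at hand rather than as a pro-toto approval at the start.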
2.2.2 FROM REFLECTIONS ON COGNITIVE AND BEHAVIOURAL ANALYSIS
FINDINGS TO INTERACTION DESIGN RECOMMENDATIONS:
INFO-VIEW AND INTERLOCKING DESIGN APPROACH 56
In this paragraph we will discuss our proposal on how to achieve an
interlocking process when a user interacts with a certain sensitive element. In
this way privacy will be linked to the user’s context and awareness.
In the light of this approach we can consider three main issues about
interactions with privacy:
1. people seem to tend not to read privacy notices carefully;
55
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
56
The authors of this paragraph are Prof. Roberto Montanari, Emanuele Garzia (PhD
student), Dr. Luca Cattani, Drs. Federica Protti, Prof. Simona Collina.
62
2.2 From Cognitive-Behavioral Analysis to Interaction Design
2. the user consent;
3. privacy is considered a non-task-related approval.
The impact of the first problem could be reduced by drawing on the
concepts of Human-Computer Interaction (HCI) through user-friendly design.
The second and third problems could be addressed by combining the approach to
the first problem with the adoption of the interlock concept (Norman, 2014),
the so-called "interlock-guided approach".
2.2.2.1 HCI AND INFORMATION VISUALIZATION IN SUPPORT OF THE USERS
2.2.2.1.1 Possible ways to support users with HCI 57
According to the academic literature, while users are encouraged to make
a cost-benefit analysis between the time and effort required to read and
understand privacy policies and any resulting benefits, the nature of these
texts and users' behaviour when reading legal texts have yet to be examined.
For instance, privacy policies, as we have already mentioned,
appear to be considered difficult to read and understand, since they are
typically structured in long and thorough sections written in a dense legal
lexicon. Strong skills seem to be required to understand them, and even
many highly educated users (possibly with a university degree) appear to
have difficulties in understanding the content of these texts (Fabian et al., 2017;
Proctor et al., 2008). These policies are often described by users as "text walls"
(Passera, 2015), texts which are literally "impenetrable" to the human eye 58.
Generally, we blame the user, or rather his or her inability or ignorance,
rather than acknowledge that the production of privacy policy documents should
be improved in order to be closer to the users' needs or education (Hartzog,
2018). According to one study from 2016, individuals with a college or
university education would need about 29-32 minutes to read a privacy policy
text of about 8000 words (Obar and Oeldorf-Hirsch, 2016). Text length and
information overload are the main reasons why users are discouraged from
reading privacy policies: they perceive the task as too time-consuming
and stressful, and they feel almost helpless in front of them (Hochhauser,
2003). The direct consequence of having too much information is that users
are discouraged from rationally performing the reading task, so that they
"skim, freeze or choose information arbitrarily" (Calo, 2012).
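As a quick plausibility check of the reading-time figure quoted above, the arithmetic works out if we assume a commonly cited careful-reading speed of roughly 250 words per minute (the speed is our assumption, not a figure from the study):

```python
# Quick arithmetic behind the Obar and Oeldorf-Hirsch (2016) figure
# quoted above: an 8000-word policy at an assumed careful reading speed.
policy_words = 8000
words_per_minute = 250           # common estimate for careful adult reading

minutes = policy_words / words_per_minute
print(f"{minutes:.0f} minutes")  # 32 minutes - the top of the 29-32 min range
```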
Therefore, user-friendly language is needed to communicate the
relevant messages (be they legal, administrative or of
57
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
58
Article 29 Working Party, Guidelines on transparency under Regulation 2016/679, 17/EN WP260 rev.01, April 2018.
any other nature) in a clear and comprehensible way even to non-expert
users (Athan et al., 2015; Butt, 2012). In order to make a document more
readable, it would be useful to divide the text into paragraphs and sections
and to provide informative titles, perhaps using boldface fonts (Passera, 2015).
To counteract "information fatigue", it is also recommended to use a structured
and navigable layout through which users can easily identify the information
they need and move to the relevant section through direct links 59 (Ben-Shahar
and Schneider, 2014; Helleringer and Sibony, 2016).
Since vagueness and ambiguity often seem to characterize legal
documents such as those concerning privacy (Tiersma, 1999; Endicott, 2000;
Endicott, 2001), readers end up confused about the intended meaning of the
text, which potentially jeopardizes their right to be properly informed (Bhatia,
2016). Terms such as "might" or "certain" should be dropped, since they do
not allow users to extract real value from the data and information in the text
(Pollach, 2005; Reidenberg et al., 2016).
Therefore, according to the literature investigated, the main issues
with texts reporting privacy policies are: the complexity of the legal language
used and the difficulty ordinary users have in understanding it, the lack of an
attractive layout, the length of the text, long reading times, and information
overload. This is consistent with the literature considered and with the results
of our experiment.
2.2.2.1.2 Possible ways to support users with information
visualization 60
HCI is the discipline that gave us the methodological ground and
the empirical findings to understand how reading a legal text on privacy
presented in a digital product can sometimes be extremely complex. The
approaches that we should use in order to solve such a problem are provided by
another discipline, Information Visualization, which still operates within the
framework of HCI.
Giving examples of how to rewrite privacy texts is not in fact the
primary aim of this approach, but mostly a task that legal practitioners have
59
European Parliament and Council of the European Union, Directive 2011/83/EU of the
European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council
Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and
repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the
Council. OJ L 304, 22.11.2011, pp. 64-88.
60
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
to perform in line with the recommendations provided, such as making the
content simpler, shorter and layered across different levels, with the relevant
information up front (especially for beginner users) and details available
for whoever wants to dig deeper. Information Visualization, also called InfoView,
and its approaches are applied to several domains of writing, such as science,
business, journalism and many other fields (Bederson & Shneiderman 2003).
InfoView aims at improving the human ability to discover, make decisions
and understand textual content, while making texts clearer, more readable,
memorable and simple to comprehend. Most InfoView tools can in fact
be considered cognitive artifacts that improve our knowledge, amplifying the
cognitive processes involved in the interaction with typical representations of
the outside world (Stuart et al. 1999).
Within the legal field, visual communication is usually quite rare, but
there are some examples of it in traffic regulation documents or in the outlines
of patent descriptions, where visual aids appear in the form of diagrams. Here
the technical data must be guaranteed by law, so that users are always in a
position to reconstruct the inventions described. As noted above, visual aids
in legal documents accompany the text with the aim of supporting readers,
without replacing any content. As a matter of fact, priority is still given to
written language, while graphic elements and visual aids illustrate and clarify
terms and the legal actions to be performed (Barton et al., 2013). Some means
of visualization (charts, diagrams, flow charts, etc.) in support of the legal
field are proposed by the Legal Design Lab 61.
In order to ensure that legal documents can be used not only by law
experts but also by people without a legal background, the design of these
documents needs to be improved. It should be reviewed from a functional
perspective, which would ensure a better understanding from both points of
view. For example, privacy policies should aim at increasing trust and ensuring
that both sides can benefit from them (Barton et al., 2013: 49-50).
Nonetheless, there is no common agreement within the community
regarding the potential virtues of the InfoView approach for legal texts.
There is still a degree of uncertainty about the correct interpretation of a visual
display, in the legal field and in others too. A visualization of a piece of
information, such as a flowchart, could be applied to the drafting of laws,
to texts dividing competences and probably, if somewhat forcedly, to criminal
and civil proceedings too. At the moment, though, it is not appropriate to use
61
Legal Design Lab, Visualization for Lawyers. Legaltechdesign, www.legaltechdesign.com/legal-design-toolbox/visualization-for-lawyers, consulted 31 January 2020.
icons or graphic representations in legal texts, for example to represent the
content of a sentence or a rule.
However, according to empirical data, users understand the content of
a contract accompanied by visual aids more quickly and precisely (Passera,
2012; Passera and Haapio, 2013; Passera et al., 2013). The same happens with
contracts or legal texts on the web (Kay, 2010; Kay and Terry, 2010; Botes,
2017). Elements such as icons, comics or vignettes reduce users' cognitive
overload, allowing them to identify relevant information in the text more
quickly and favouring the retention of legal concepts (Passera and Haapio, 2013;
Kay, 2010; Passera et al., 2013; Passera, 2017). In this way, documents with
graphic elements are perceived as more pleasant by the user, and those who
drafted them are perceived as more reliable (Passera et al., 2013; Passera et
al., 2012). In addition, visual aids also induce the editors of legal documents to
write in a simpler way (Berger-Walliser et al., 2017). It is therefore necessary
that the meaning of the graphic aids be displayed as accessibly as possible,
perhaps with the use of symbols that are already known to the user, in order
to avoid misunderstandings between the designers' intentions and the users'
interpretations (Rossi and Palmirani, 2018).
2.2.2.2 INTERLOCK AS A DESIGN SOLUTION FOR FUTURE
PRIVACY INTERACTION 62
As emerged in the experiments, users tend to accept all interactions
related to the privacy of an interface or system already in use, mainly because
they want to start using the system as soon as possible, as documented
in table 7, table 8 and table 9. Of course this is not the only reason that comes
into play in motivating the decision on consent approval or disapproval; many
other factors are involved, connected to legal topics that we have already
encountered in other chapters of this text (see Section 3, Legal Analysis).
Moreover, as emerged from the academic literature, it seems that privacy
policies offer only information, not a real choice about consent. This
approach is called the "take-it-or-leave-it approach" (Schaub, 2015), and it shows
that one of the main problems with consent is that sometimes users have no
choice but to accept (Solove, 2013). A privacy notice has to be displayed with
precise timing: if too much time goes by between when an alert is displayed
and when the user makes a decision, the user's perception of the alert may
change (Schaub et al., 2015). Therefore, it is always better to give the users
62
The authors of this paragraph are Emanuele Garzia (PhD student), Dr. Luca Cattani, Drs.
Federica Protti, Prof. Simona Collina, Prof. Roberto Montanari.
timely and concise information about their possible choices.
When a user is surfing a website, he/she is often inclined to give consent
while deliberately ignoring the notice, because he/she perceives the notice as
just a nuisance (Obar and Oeldorf-Hirsch, 2016).
Combining all these observations, we can conclude that users seem
not to be inclined to spend time reading privacy notices; they prefer to
quickly start the task to which the privacy consent refers. In other words,
users may unwittingly compromise their privacy in their eagerness to quickly
access the desired content, service or interaction.
An appropriate solution in the design of interactive documents for
privacy could be achieved by improving users' awareness in the consent
decision-making phase. We believe that the design principles for legal help
websites provided by the Legal Design Lab, in particular "Design Principle
3: Provide rich and clear interactions", combined with the interlock-guided
approach, can help solve some of the problems already described 63.
The interlocking process that this analysis suggests can be
described as a request for acknowledgment that takes place exactly when
the procedure is happening, instead of a sort of pro-toto section, which is the
current paradigm of privacy consent. In other words, if someone is asked
to give consent before the interaction itself, on the assumption that this will
cover every future privacy-related episode regarding that interaction, it is
quite likely that users will not have the necessary awareness to assess the
situation and to give an informed consent.
However, if this request took place exactly while the task was in
progress, the level of awareness would certainly be higher, and the decision
could be more motivated and more aware of its implications. For instance,
instead of approving everything as a condition of starting to use the service
or tool, or of performing the interaction, the consent request should be delivered
only when that same service, tool or interaction is about to expose the user's
private and sensitive data. Only in this way would users better understand why a
request for consent is needed and relevant for that operation, and this is the
exact logic behind the interlocking. A pre-requirement of this approach is of
course that only task-related interactions within the digital world should be
considered. The interlock approach in fact appears as a blocking warning that
waits for consent before allowing the user to proceed to the next phase, much
like tools such as installation wizards that help users learn about the
computer world.
63
Legal Design Lab, Design principles for legal help websites. Legaltechdesign, www.legaltechdesign.com/legal-design-toolbox/visualization-for-lawyers/, consulted 31 January 2020.
A clear example of this are cookie warnings. Nowadays,
when you start browsing a website to use a certain service, a
warning banner appears asking the user to allow or deny the service provider
to send profiling cookies, third-party cookies and technical cookies. To simplify
this situation, we propose the use of an interlock-guided approach: when
the user is about to take advantage of a specific service, he/she is presented
only with warning banners that are contextualized and linked to that very specific
experience. Depending on his/her choice, the user is then allowed to proceed
or not towards the service he/she intended to use.
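The contextualized cookie banners proposed above can be sketched as follows. This is a minimal illustration with hypothetical feature and category names; a real implementation would plug into the site's consent-management layer rather than use a lookup table.

```python
# A sketch (hypothetical names) of the contextual cookie-consent idea:
# each feature declares the cookie categories it actually needs, and the
# user is prompted only for those, at the moment the feature is used.
FEATURE_COOKIES = {
    "checkout":        ["technical"],
    "recommendations": ["technical", "profiling"],
    "embedded_video":  ["technical", "third_party"],
}

def request_feature(feature: str, granted: set[str], prompt) -> bool:
    """Return True if the user may proceed to the feature."""
    for category in FEATURE_COOKIES[feature]:
        if category not in granted and prompt(feature, category):
            granted.add(category)
    # The user proceeds only if every required category was granted.
    return all(c in granted for c in FEATURE_COOKIES[feature])

# Example: the user accepts technical cookies but refuses profiling.
granted: set[str] = set()
accept = lambda feature, category: category == "technical"
print(request_feature("checkout", granted, accept))         # True
print(request_feature("recommendations", granted, accept))  # False
```

Because each prompt names the feature being used and the category being requested, the decision is made in context, which is exactly the interlock logic described in the text.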
2.2.3 CONCLUSIONS: THE URGENCY OF A HOLISTIC
APPROACH TO PRIVACY THAT INVOLVES
INTERACTION DESIGN 64
During our research, we understood that privacy is a human trait and
need, and that sharing information is a characteristic of human sociality. We
highlighted the discrepancy of the "privacy paradox", which occurs when
people who are concerned with their privacy do not behave according to their
beliefs in their daily lives. Human beings may behave differently from their
own beliefs, mainly because they may be influenced by misperceptions about
costs and benefits, social norms, emotions, cultural context, or strategies of
simplification in their decision-making processes. Interaction design could
play a huge role in limiting the risk of these misperceptions, especially through
its possible interaction with law firms and the editors of legal texts.
Improved transparency in every transaction could increase
consumers' trust and have a beneficial effect on the entire system. Any limits
to an effective, transparent and clear interaction should be removed in the
mutual interest of all the players involved, and our suggestions should be
seen as a tool to improve the effectiveness of the entire consumption cycle.
In broader terms, the idea of privacy from the UI/UX point of view could
be reshaped via interaction modalities. First, if we agreed to rethink the
common idea of privacy as a one-shot experience, approval could be asked
both in the beginning phase and during the process, since systems could
consult the users for further approvals or ask them to reconsider some parts.
The HCI and interlocking approaches could help users better understand
privacy texts, but only if their language is simple and memorable, conveyed
via icons, vignettes,
64
The authors of this paragraph are Prof. Roberto Montanari, Emanuele Garzia (PhD
student), Dr. Luca Cattani, Drs. Federica Protti, Prof. Simona Collina.
etc. The use of visual aids in these kinds of texts could be increased by
linking privacy to every sensitive element through an interlocking
process, in which the user approves the interaction with a sensitive element:
will he/she approve the privacy of a certain element, in that context and in a
conscious way, or not?
Data regarding people's management of their approval and consent
choices should be published, shared and used, partly as a design tool (for
instance, it would be possible to approve 80% of data collection), partly as a
guide. It would be a sort of human-factors (HF) privacy agency, which would
provide suggestions on how companies should revise their own privacy
statements, according to the daily experiences of users, and also thanks to
data gathered and shared from their privacy-related data section. An in-depth
study would also be necessary regarding the correlations between human
factors and privacy policies in each country: this would contribute to the
creation of guidelines or specific protocols for different countries and/or age
groups, and it would make privacy policies accessible to all. It would also be
desirable to design interfaces according to the guidelines that emerged from
this and other academic research. In this way, both the content and
configuration of the different options and the explanation of their
consequences would finally be understandable to every user.
2.3
LEGAL ANALYSIS
2.3.1 FINDINGS RELEVANT TO THE LEGAL ANALYSIS (L.A.)
ON THE BASIS OF THE COGNITIVE-BEHAVIORAL ANALYSIS
2.3.1.1 CONSENT TO THE PROCESSING OF PERSONAL DATA
AND ITS CONSEQUENCES: USER’S AWARENESS 65
Introduction. In order to assess the user's level of knowledge and
awareness regarding the processing of personal data and, more particularly,
whether denying consent is sufficient to prevent the processing, in this section
we will analyze the empirical data that emerge from question number 52.
Do you think there may be negative consequences for you arising
from the use of your personal data without your consent?
In this question, users were asked to what extent they fear facing
negative consequences arising from the unauthorized processing of their data,
and whether they think that these negative consequences may be limited
simply by denying their consent to the processing of such data.
65
The author is Dr. Anna Anita Mollo, Ph.D. in Humanities and Technologies: an integrated
research path at Suor Orsola Benincasa University of Naples.
More particularly, question number 52 was compared with
questions number 21, 23, 27, 28, 29, 32, 33, 36, 48 and 49.
The results of this comparison show that users' behavior is not
compliant with the regulatory framework, for the following reasons.
Users who know the regulatory framework. From the empirical analysis,
the only positive finding concerns users who, in question
number 52, answered that there may be negative consequences from the
processing of personal data without consent.
The same sample think that it is extremely important to protect
personal data when browsing the internet (Question n. 23) 66, and in question
number 36 they answer that they read the privacy policy and cookies with
maximum attention when they downloaded the app 67.
How important is it for you to protect your personal data when
browsing the internet?
They further believe that the main operating systems provide clear
information about the protection of personal data (Question n. 21) 68 and that they
66. Inference between question number 52 and Question n. 23: How important is it for you to protect your personal data when browsing the Internet? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 23 answered: 66.67% Extremely Important; users who answered "No, I don't": 50.00% Important and 50.00% Extremely Important; users who answered "I don't know": 41.67% Extremely Important.
67. Inference between question number 52 and Question n. 36: How carefully did you read the privacy policy and cookies when you downloaded the app? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 36 answered: 6.67% No attention; 40.00% Poor attention; 33.33% Medium attention; 10.10% Very careful; 10.10% Maximum attention.
68. Inference between question number 52 and Question n. 21: Do you think that the main operating systems (such as Windows 10, Windows 7, Windows 8, El Capitan) use your personal data in compliance with the Personal Data Protection Code (Legislat. Decree no. 196 of 30 June 2003)? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 21 answered: 66.67% Yes; users who answered "No, I don't": 50.00% Yes and 50.00% No; users who answered "I don't know": 50.00% Yes and 50.00% No.
understood, while installing Windows 10, whether there were any steps concerning
privacy and the protection of personal data (Question n. 28) 69.
How carefully did you read the privacy policy and cookies when you
downloaded the app?
Do you think that the main operating systems (such as Windows 10,
Windows 8, El Capitan) use your personal data in compliance with Personal
Data Protection Code (Legislat. Decree no. 196 of 30 June 2003)?
In addition, the same sample, in question number 29, replies
that the information concerning privacy and personal data was presented
in a clear way during the installation 70.
69. Inference between question number 52 and Question n. 28: While installing Windows 10 (W10), did you understand if there were any steps concerning privacy and the protection of personal data? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 28 answered: 43.33% completely; users who answered "No, I don't": 75.00% very probably; users who answered "I don't know": 45.45% completely.
70. Inference between question number 52 and Question n. 29: Do you believe that information concerning privacy and personal data has been presented in a clear way? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 29 answered: 36.67% very probably; users who answered "No, I don't": 25.00% probably not, 25.00% possibly, 25.00% very probably, 25.00% completely; users who answered "I don't know": 41.67% completely.
While installing Windows 10 (W10), did you understand if there were any
steps concerning privacy and the protection of personal data?
These results evidently arise from users' good knowledge of
the regulatory framework related to the processing of personal data.
So only users with a good knowledge of the rules seem very
worried about the negative consequences of the use of their data without consent.
This is demonstrated by the fact that users correctly
answered questions number 48 71 and number 33 72.
Do you believe that information concerning privacy and personal data has
been presented in a clear way?
71. Inference between question number 52 and Question n. 48: What is the purpose of giving consent to the processing of personal data? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 48 answered: 36.67% To protect them, that is, to prevent abusive uses of them; 50.00% To make a conscious choice; 6.67% Self-determination; 6.67% I don't know.
72. Inference between question number 52 and Question n. 33: Did you understand what it means to consent to the processing of your personal data by third parties? Answers: 3.33% No, I didn't; 10.00% I don't know; 33.33% Yes, it means that even subjects other than those who have accepted my consent will be able to use my data for various purposes; 50.00% Yes, it means that only those who have collected my consent can use my data; 3.33% Yes, it means that my data can only be used by third parties.
What is the purpose of giving consent to the processing of personal data?
Confirming this, we can see how the same users express
perplexity about the effectiveness of their consent to the processing
of personal data, that is, about whether consent could really protect their personal data
(Question n. 49) 73.
Did you understand what it means to consent to the processing of your
personal data by third parties?
Therefore, they are well aware that the actual use of the IT tool,
although formally compliant with the regulatory requirements, is not always
sufficient to protect their personal data, even though modern operating systems
clearly provide all the necessary information to their users.
Users who don't know the normative framework: level of interest in the
protection of personal data. On the other hand, the results of the inferences
73. Inference between question number 52 and Question n. 49: Do you consider that your consent to the processing could effectively protect your personal data? A. Users who answered "Yes, I do" in Question n. 52 concerning Question n. 49 answered: 44.83% I don't know. B. Users who answered "No, I don't": 50.00% No, I don't. C. Users who answered "I don't know": 58.33% I don't know.
Do you consider that your consent to the processing could effectively
protect your personal data?
related to the other two samples of users who answered question number
52 are less encouraging.
With specific reference to users who answered question number 52
in the negative sense, it seems useful to point out that, as regards question
number 21, they appear to be divided 74: in fact, only half of this sample thinks
that the main operating systems (such as Windows 10, Windows 7, Windows 8,
El Capitan) use personal data in compliance with the Italian Data Protection Code 75.
An even more relevant circumstance is that such users believe it is
extremely important to protect their personal data when browsing the Internet 76,
in clear contradiction with the answer given to question number 52.
Although this sample does not consider it important to provide consent
in order to protect its data (question number 52), at the same time, judging from how
it answers question number 27, it seems to show a certain mistrust of the
possibilities of synchronizing apps and of letting these apps access the
information in their accounts 77.
74. Inference between question number 52 and Question n. 21: Do you think that the main operating systems (such as Windows 10, Windows 7, Windows 8, El Capitan) use your personal data in compliance with the Personal Data Protection Code (Legislat. Decree no. 196 of 30 June 2003)? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 21 answered: 50.00% Yes; 50.00% No.
75. Legislative Decree 30 June 2003, no. 196.
76. Inference between question number 52 and Question n. 23: How important is it for you to protect your personal data when browsing the Internet? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 23 answered: 0.00% Not important; 0.00% Not very important; 0.00% Moderately important; 50.00% Important; 50.00% Extremely Important.
77. Inference between question number 52 and Question n. 27: On a scale from 1 (strongly disagree) to 5 (strongly agree), show how much you agree with each of the statements listed below. B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 27 answered: 1. "I think it's helpful that apps find my position and chronology of my positions": 25.00% strongly disagree; 25.00% disagree; 50.00% slightly agree; 0.00% agree; 0.00% strongly agree; 2. "I think it's useful that Windows and Cortana can recognize my voice to improve service customization": 50.00% strongly disagree;
Then, in question number 28, they replied that they understood that installing
Windows 10 required some steps on privacy and data processing 78.
However, 50% of the same people say in question 21 that they
don't know whether the main operating systems adopt a policy consistent with
the regulatory framework. In this case, a worryingly high proportion of users
does not seem to fully understand the purpose of consent and the
possible consequences arising from it 79.
This probably means that users normally check the privacy settings
of operating systems only in the initial phase of the first installation.
In fact, the same sample, in question number 29, appears very confused
about the clarity of the way in which information concerning privacy and
personal data has been presented 80, but then, in question number 32, they
say that they have understood well the consequences for their privacy of the
settings chosen in the "Privacy" panel 81.
Indeed, those who believe that not giving consent gives rise to negative
consequences say, in question number 36, that they read the privacy and cookie
policies with medium attention when they downloaded the app 82, even though they
are aware that the purpose of giving consent to the processing of personal
data is to make a conscious choice 83.
0.00% disagree; 0.00% slightly agree; 50.00% agree; 0.00% strongly agree; 3. "I think it's helpful to allow apps to access my name, my image, and other information on my account": 50.00% strongly disagree; 25.00% disagree; 0.00% slightly agree; 25.00% agree; 0.00% strongly agree; 4. "I think it's useful to allow apps to automatically share and sync my information with wireless devices not explicitly connected to my PC, tablet, or smartphone": 50.00% strongly disagree; 25.00% disagree; 20.00% slightly agree; 0.00% agree; 0.00% strongly agree.
78. Inference between question number 52 and Question n. 28: While installing Windows 10 (W10), did you understand if there were any steps concerning privacy and the protection of personal data? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 28 answered: 0.00% Absolutely not; 0.00% probably not; 0.00% possibly; 75.00% very probably; 25.00% completely.
79. L. Cattani, p. 13 of the first paper: «When reference is made to the usability perspective it's clear that a relatively small number of users is sufficient to highlight possible bottlenecks in the system».
80. Inference between question number 52 and Question n. 29: Do you believe that information concerning privacy and personal data has been presented in a clear way? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 29 answered: 0.00% Absolutely not; 25.00% probably not; 25.00% possibly; 25.00% very probably; 25.00% completely.
81. Inference between question number 52 and Question n. 32: On a scale from 1 (absolutely not) to 5 (completely), show how much you agree with each of the statements listed below. 2. Did you understand the consequences for your privacy of the settings chosen in the "Privacy" panel? 6.67% Absolutely not; 10.00% probably not; 40.00% possibly; 10.00% very probably; 33.33% completely.
82. Inference between question number 52 and Question n. 36: How carefully did you read the privacy policy and cookies when you downloaded the app? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 36 answered: 25.00% No attention; 0.00% Poor attention; 50.00% Medium attention; 25.00% Very careful; 0.00% Maximum attention.
83. Inference between question number 52 and Question n. 48: What is the purpose of giving consent to the processing of personal data? B. Users who answered "No, I don't" in Question n. 52 concerning
This negative result could depend both on the lack of clarity of the
information provided and on the fact that users read the privacy policy
with poor attention.
In any case, the result is that users, although they do not know the
regulations, still give their consent, even though they are aware that in this
way they do not effectively protect their personal data (Question n. 49) 84.
This demonstrates the low level of awareness of how the processing of
personal data can affect decisions regarding the performance of daily activities 85.
Empirical data and the regulatory framework of reference. As
mentioned in the previous paragraph, it emerges that the rule of consent does
not comply with the legislator's requests.
This is certainly due to the fact that it is not a conscious consent
(Question 48).
So it appears quite clear that this amounts to the infringement of several
rules, first of all Art. 6, para. 1, let. a), GDPR and Art. 7, paras. 1 and 4,
GDPR 86: it is evident that many users, although formally giving their consent,
do not really understand exactly what consequences are connected to
the consent thus granted (Question 32.2).
A large part of the sample, in fact, cannot give an answer
about the consequences for their personal data if they do not give adequate consent 87.
The answers given in cases B and C of question number 21 denote that
users do not know the regulations on the processing of personal data, because
they cannot recognize whether the main operating systems process data in accordance
with current legislation 88.
Question n. 48 answered: 25.00% To protect them, that is, to prevent abusive uses of them; 50.00% To make a conscious choice; 0.00% Self-determination; 25.00% I don't know.
84. Inference between question number 52 and Question n. 49: Do you consider that your consent to the processing could effectively protect your personal data? A. Users who answered "Yes, I do" in Question n. 52 concerning Question n. 49 answered: 44.83% I don't know. B. Users who answered "No, I don't": 50.00% No, I don't. C. Users who answered "I don't know": 58.33% I don't know.
85. I. Caggiano, Consenso al trattamento dei dati personali e analisi giuridico-comportamentale. Spunti di riflessione sull'effettività della tutela dei dati personali, in Nodi virtuali, legami informali. Internet alla ricerca di regole: a trent'anni dalla nascita di Internet e a venticinque anni dalla nascita del web, Atti del Convegno di Pisa, 7-8 ottobre 2016, Pisa, 2017, p. 68 ff.
86. Regulation (EU) 2016/679.
87. Question n. 52: Do you think there may be negative consequences for you arising from the use of your personal data without your consent? Answers: Yes, I do: 65.22%; No, I don't: 8.70%; I don't know: 26.09%.
88. Inference between question number 52 and Question n. 21: Do you think that the main operating systems (such as Windows 10, Windows 7, Windows 8, El Capitan) use your personal data in
Whoever answers "I do not know" in question number 52 nevertheless believes
that the protection of personal data is extremely important when browsing the Internet.
This implies that part of the sample does not know how to protect its
personal data, even though they consider it very important 89.
It then seems appropriate to take into consideration Art. 12, para.
1, GDPR: if the information provided by the operating system is clear
(Question n. 29) 90, but users do not fully understand the consequences of
giving consent (Question n. 32.2), this means that the result requested by the
legislator has not been achieved.
This is even more evident if the information is provided in such a
way as not to allow users to understand the effect of this mode of
installation (Question n. 32.1) 91.
So, consent to the processing of personal data is often given
unconsciously and with a probable distortive effect, as the user does not have
full knowledge of the tools of protection.
Nevertheless, the belief is widespread, on the contrary, that granting
consent to the processing of personal data is sufficient to eliminate any
possibility of injury.
Probably, more attention should be focused on the behavior of users
who consent to the processing of their personal data; in analyzing the
effective capacity of the legal rules to protect certain subjects and interests, in
fact, we cannot disregard the way in which users relate to the device whose use
requires consent to the processing of personal data.
This is also confirmed by recent legislation: the European legislator,
in the context of the general regulation on the protection of data (GDPR), has
compliance with the Personal Data Protection Code (Legislat. Decree no. 196 of 30 June 2003)? B. Users who answered "No, I don't" in Question n. 52 concerning Question n. 21 answered: 50.00% Yes; 50.00% No. C. Users who answered "I don't know": 50.00% Yes; 50.00% No.
89. Inference between question number 52 and Question n. 23: How important is it for you to protect your personal data when browsing the Internet? C. Users who answered "I don't know" in Question n. 52 concerning Question n. 23 answered: 8.33% Not important; 0.00% Not very important; 25.00% Moderately important; 25.00% Important; 41.67% Extremely Important.
90. Question n. 29: Do you believe that information concerning privacy and personal data has been presented in a clear way? Users who answered "Yes, I do" in Question n. 52 concerning Question n. 29 answered: 36.67% very probably; users who answered "No, I don't": 25.00% probably not, 25.00% possibly, 25.00% very probably, 25.00% completely; users who answered "I don't know": 41.67% completely.
91. Question n. 32: On a scale from 1 (absolutely not) to 5 (completely), show how much you agree with each of the statements listed below. Users who answered "Yes, I do" in Question n. 52 concerning Question n. 32 answered: 1. If you have selected the "quicker" mode at the time of installation, would you be able to explain the effect of this mode of installation on the data you supply (e.g., types of data provided, their use, etc.)? 33.33% probably not.
regulated and integrated the legislation on data relating to electronic
communications that have the character of personal data 92.
The Commission carried out an ex post Regulatory Fitness and
Performance Programme evaluation ("REFIT evaluation") of the ePrivacy Directive
(Directive 2002/58/EC). This evaluation shows that the objectives and
principles of the current framework remain valid; however, since the last
revision of the ePrivacy Directive in 2009, important technological
and economic developments have taken place in the market 93.
More particularly, the REFIT evaluation showed that the consent rule,
intended to protect the confidentiality of terminal equipment, has failed to
achieve its objectives: end users are faced with requests to accept trackers
("tracking cookies") without understanding their meaning and, sometimes,
are even exposed to cookies without giving their consent.
In conclusion, the rule of consent, combined with a lack of attention
to and interest in the processing of personal data, makes the current data
protection system absolutely ineffective, leaving users totally unprotected.
2.3.1.2 THE PURPOSE OF THE CONSENT TO THE PROCESSING
OF PERSONAL DATA IN TERMS OF EFFICACY 94
Introduction. In this paragraph, we will try to verify the efficacy of
consent to the processing of personal data, with specific regard to users'
awareness of it and of the related risks. More precisely, efficacy is the ability to
produce awareness in the user through the tool of informed consent
given in compliance with European and national legislation 95.
In order to evaluate the efficacy of consent 96, a subsample of the
92. Proposal for a Regulation of the European Parliament and of the Council, of 10 January 2017, concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (COM(2017) 10 final), then modified on 10 February 2021.
93. Consumers and businesses have increasingly relied on new internet-based services designed to enable interpersonal communications, such as voice-over-IP, instant messaging and network-based e-mail services, instead of using traditional communication services.
94. The author of this section is Dr. Maria Cristina Gaeta, Research Fellow in Privacy Law at Suor Orsola Benincasa University of Naples, Ph.D. in Private Law at Federico II University of Naples.
95. Efficacy (or effectiveness) is a criterion of the regulatory impact assessment (RIA) and of the ex post review of legislation. However, these are usually focused not only on the efficacy of the regulatory tool but also on the criteria of efficiency, coherence, relevance, and added value. Efficiency is the achievement of the best result in the most economical way and in the shortest possible time. Coherence is congruence with the broader national and international regulatory context. Relevance means the applicability of the legislation with regard to the state of the art, and added value is what the regulatory tool can add compared to what has already been stated. At the European level, the ex post review of legislation is a practice resulting in one or more documents presenting a retrospective stock-taking of some aspects of an EU regulatory reform, with or without evaluative elements. Within the European Commission, these documents can take the form of a Commission report or a Staff Working Document, and they can be supported by external studies.
original 75 users of the experiment analysed above went through an extended
questionnaire (see Q48 – Q59), including additional questions related to the
awareness of choosing specific privacy settings and their consequences (e.g.
"Fast choice" during the installation of the Microsoft operative system) 97. The
users' choices and the legal content of every single step of the installation
process are the object of the experiment carried out 98.
Question n. 48: What is the purpose of giving consent to the processing
of personal data? (Answered: 46; Skipped: 2)
To protect them, that is, to prevent abusive uses of them: 34.78% (16)
To make a conscious choice: 47.83% (22)
Self-determination: 4.35% (2)
I don't know: 13.04% (6)
Total: 46
6. Question n. 48 and related answers
Concretely, this paragraph analyses the added sections
from III.I to III.IV of the questionnaire, which refer to questions from n. 48 to
n. 51, focused on consent to the processing of personal data.
A lack of basic privacy knowledge. With regard to the purpose of
consent, Q48 asks respondents to define what the legal purpose of this
consent is (see image n. 6) 99, and the results of the test
See European Commission Staff Working Document of 7 July 2017, on Better Regulation Guidelines (SWD (2017) 350), 14 ff.; 'Ex-post review of EU legislation: a well-established system, but incomplete' [2018] European Court of Auditors, 6 ff., www.eca.europa.eu/Lists/ECADocuments/SR18_16/SR_BETTER_REGULATION_EN.pdf (accessed 27 May 2019); 'La nuova disciplina dell'analisi e della verifica dell'impatto della regolamentazione' [2018] Senate of the Italian Republic, 7 ff., www.senato.it/application/xmanager/projects/leg18/ESPERIENZE_DIRETTIVA_AIR.pdf (accessed 27 May 2019).
96. At the outset, on the function of the consent to the processing of personal data in a digital context, see W Kuan Hon, C Millard, J Singh, 'Twenty Legal Consideration for Clouds of Things', Legal Studies Research Paper no. 247 [2016] Queen Mary University of London website, 21.
97. The "Fast choice" setting was a way to install the Windows 10 operating system, provided in the 2016 version, in which the operating system is installed without modifying the default settings chosen by the producer, including privacy settings. With the 2019 version of the Windows 10 operating system, this possibility is no longer provided, maybe because it was not compliant with the GDPR (which entered into force on 24 May 2016 and became applicable two years later, from 25 May 2018).
98. See subchapter 2.1.3.3 on the Users' awareness.
99. Question n. 48: "What is the purpose of giving consent to the processing of personal data?".
demonstrate that a large part of the sample lacked basic knowledge about
privacy.
In point of fact, more than half of the sample was not able to indicate the
correct purpose of consent, giving a wrong answer (51.11%), believing
that the request for consent aimed to protect personal data by preventing
unauthorised uses. Only a smaller part of them said that consent is needed to
make a conscious choice (see image n. 7).
7. Percentage representation of the answers to question no. 48
At the same time, answering Q49 100, which asks users whether giving
consent effectively protects their personal data, a large part of the users had no
idea whether consent had actual implications for the processing of their
data (44.44%) 101 (see images n. 8 and 9).
Question n. 49: Do you think that your consent to the processing effectively
protects your personal data? (Answered: 45; Skipped: 3)
Yes: 26.67% (12)
No: 26.67% (12)
I don't know: 46.67% (21)
Total: 45
8. Question n. 49 and related answers
100. Question n. 49: "Do you think that your consent to the processing effectively protects your personal data?".
101. See subchapter 2.1.3.3 on the Users' awareness.
9. Percentage representation of the answers to question no. 49
In this context, it is important to highlight that specific factors affect
the users' answers: among others, age and education stand out, but sex
and occupation are also important factors. With reference to age, the results
of the experiment show how users in the younger age group (15-24 years old)
have a greater propensity to change the "fast choice" privacy settings (33.45%)
than the adult age groups (25-34 and 34-45 years old). As for education,
it has been demonstrated that users who do not have a legal background
tend to re-read the privacy statement, while individuals with a legal
background feel well prepared on the matter and thus skip the parts of the privacy
policy they consider redundant (39.31%). Regarding sex, females show a
lower propensity to change privacy settings than men (-30.5%). Finally, users
employed in medium- and low-skilled occupations seem to be less aware of
the trade-off between the complex process of installing the operative
system and the level of protection of personal data 102.
The unawareness of the users and the disinterest in the protection of
personal data. Taking into account only the sample of users who answered
Q48 correctly, saying that consent is needed to make a conscious
choice (48.89%), which from now on will be defined as the "sample of users of Q48",
interrelations must be drawn with other questions in order to verify whether consent,
as a lawful basis, is a really effective tool for the protection of personal data, with
specific reference to users' awareness and interest. In particular, Q48 has been
related to Qs 21, 27, 28, 29, 32, 33 and 49 103.
102. See subchapter 2.1.3.2 on the descriptive statistics.
103. Of the users who don't know whether consent effectively protects personal data (Q49), no one answered Q50 (Question n. 50: "(If yes) Why?") or Q51 (Question n. 51: "(If no) Why?").
Analysing the interrelationships, it emerged that the sample of users
of Q48 who chose the right answer believes that operative systems use their
information in compliance with the Italian privacy law (Q21) 104. Actually, before
the entry into force of the GDPR, Italian legislation did not keep up with
technological evolution, but the scenario changed with the GDPR, which has
tried to improve data protection in this digital environment. However, data
protection issues persist regarding the concrete application of the GDPR. The
reason why users believed in compliance with the existing legislation
is probably a consequence of the fact that they are really unaware of how
operative systems work and of the risk of unauthorised processing (e.g.
due to a data breach) 105. As a matter of fact, even though the majority of all the
users (not only the sample) affirm that they are worried about the consequences of
unauthorised processing (66.67%), only 1 out of 5 thinks that his/her data
could be processed without consent (17.78%), which is the typical case
of unauthorised processing.
Furthermore, half of the same sample of users of Q48 affirms
that protecting personal data is very important (Q23) 106. It means that half
of these users abstractly give importance to personal data but, in practice,
the opposite is true. Indeed, users do not really care about their personal data, as
demonstrated by the tasks performed during the installation and
use of the Windows 10 software (lack of interest in the protection of personal
data) 107. As we will attempt to demonstrate, users almost always tend to give
their consent in order to use a service, or a better service.
Indeed, during Task 1 108, where users were asked to install Windows
10, it is clear from the recordings of the test that users usually chose the "fast choice"
option (see image n. 10).
104. Question n. 21: "Do you think operative systems (such as: Windows 10, Windows 7, Windows 8, El Capitan) use the information you provided them by respecting the Privacy legislation (d.lgs. 196/2003)?".
105. For example, there are virtual assistants that record the surrounding environment even if the user does not give his/her consent, or vehicles that record data on the driving environment without the consent of the pedestrians or of the passengers on board.
106. Question n. 23: "How much protecting personal data while surfing the internet is important to you?".
107. See subchapter 2.1.4 on biometrics and users' characteristics: «The average completion time is 38.34 seconds, while it took less than 30 seconds for more than a half of the sample to complete the task (58%) and three quarters of the sample dedicated less than one minute to the task (see Table 28). At first sight, users are not spending much time in reading, modify and then re-reading disclaimers and settings (Privacy, Security, etc.)».
108. Task 1: "Install Windows 10". Usually, users chose the "fast choice" option.
10. Recording of one of the tests where the user chose "fast choice"
The following image n. 11, instead, shows how, during the execution of
Task 4 109, in most cases users gave consent to be localised or consent
to cookies, which happened just by continuing to surf the internet without
giving their express consent.
11. Recording of one of the tests where the user gave the consent to be localised or the consent to
cookies
With regard to Q27, the answers are confusing. Indeed, the sample of
Q48 users seems not very concerned with protecting their personal data
(localisation and chronology, Q27, Answer A) 110, but the same users think it
is useful that Windows and Cortana use voice recognition (Q27, Answer B).
They also believe it is quite useless to give the consent to the processing
of personal and sensitive data (Q27, Answer C), as well as the consent
to automatically share and synchronise their information through wireless
devices 111 (Q27, Answer D) 112. The picture is further confused by the fact that
109 Task 4: “Activate Cortana and test it by having it perform the task through voice
instructions: «How can I arrive at the station?» and then: «Find the best pizzeria in Naples»”. In
most cases users gave the consent to be localised.
110 Question n. 27: “I believe it is useful giving the consent in order to make my apps able
to localize me and knowing my chronology”.
111 In more detail, comparing Q48 with Q27: Answer A: 3 (36.36%); Answer B: 5 (31.82%);
Answer C: 2 (27.27%); Answer D: 2 (54.55%).
112 In any case, statistical analysis translated this complex scenario into the result that
there is a low propensity to share personal information. See subchapter 2.1.3.2 on the descriptive
statistics.
2. The experiment: description, data collection and analysis, findings
the majority of the sample of Q48 users said that they realised there were
steps related to privacy or personal data (Q28) 113.
The absence of lawfulness, fairness and transparency of the
processing. With regard to Q29, three-quarters of all the users felt
self-confident in terms of identifying the steps related to privacy and data
protection, while half of the sample was not satisfied with the privacy policies
of some apps in terms of clarity (Q29) 114, which involves a lack of lawfulness,
fairness and transparency of the processing, as provided by article 5 GDPR 115.
These results would lead one to think that the privacy policy is clear enough,
that users know privacy law, are aware of the meaning of processing and that
they can understand when and what kind of processing takes place. The problem
lies in the fact that, as the answers to the other questions and the tasks
demonstrate, the facts are the opposite 116. A possible interpretation to resolve
the conflict between what some of the users declared in the questionnaire
and their real behaviour is based on two main factors. (i) First of all, users
do not know very well the possible kinds of processing of personal data
and the related risks; they think they are aware but sometimes they are only
confused (lack of basic knowledge about privacy). In support of this thesis
there is an important element: the users of Q48 affirmed that they are unable
to indicate the consequences of choosing the “fast choice” privacy settings
(Q32, Answer A) 117, even though they said that they can sufficiently understand
the consequences of choosing this privacy setting (Q32, Answer B) 118 and that
they also know the consequences of giving the consent to the processing of
their personal data by third parties (Q33) 119.
Indeed, if users really intended to better protect their personal
data, they would not choose the “fast choice” setting which, as represented in
113 Question n. 28: “While installing Windows 10 (W10), did you realize if there were
transitions related to privacy and personal data protection?”.
114 Question n. 29: “Do you think information concerning personal data and privacy are
clearly represented?”.
115 With regard to the privacy policy, its requirements are strictly listed in arts 13 and 14
GDPR, while art. 12 and rec. 58 GDPR provide the appropriate measures that the data controller should
take in order to act in compliance with the principle of transparency.
116 See footnote n. 8.
117 Question n. 32: “If while installing you picked the ‘fast choice’ option, would you be
able to indicate the consequences of this option on the personal data you provide (e.g. kind of data
provided, their use)?”.
118 More precisely, comparing Q48 with Q32: Answer: 2 (50.00%); Answer: 3 (45.45%).
119 Question n. 33: “Did you realize what giving third parties the consent to process your
personal data means?”.
the image below, authorises data processing in many cases. As a matter of
fact, in Task 5 120 and Task 6 121 users often decided not to change the privacy
and security settings chosen by default with the “fast choice” option.
For example, in the privacy settings users decided to leave turned on
the possibility for Microsoft to access the user’s account information
(see image n. 12).
In the same way, with regard to the security settings (Task 6), users
decided to leave turned on the automatic notifications for updating the
Windows software, the App Store and other Microsoft products. This may not
be safe, because updates sometimes slow down the computer and, in some
cases, they could be fake updates sent by hackers that could damage
or block the device that receives and installs them (see image n. 13).
12. Privacy settings: section on the account information
13. Security settings: section on Microsoft updates
120 Task 5: “Explore the panel relating to privacy settings and configure your computer in
the way that best suits your needs for daily use”.
121 Task 6: “Explore the panel related to security settings and configure your computer in
the way that best suits your needs for daily use”.
Furthermore, in performing Task 1 122, only 5 users out of 45 refused
to adopt the “fast choice”, a very small part of the sample (11.11%). This
means that users do not know the consequences of the “fast choice”
settings and/or that they do not care about the consequences of
these privacy settings, because the most important thing is to install the
software as quickly as possible in order to take advantage of its services.
(ii) Secondly, and connected to this last step of the first point, the user will
always give the consent to the processing of his/her personal data in order
to use a service, or a better one, without caring about personal
data, as demonstrated by the tasks they were required to perform (lack of interest
in the protection of personal data, in particular when balanced against other
needs) 123. Moreover, with reference to the privacy policy, the gaze plot and
the timing results respectively demonstrate that users do not really read the
privacy policy text before giving their consent, because they looked at a
fixed point (behaviour not consistent with reading), and that
they spent too little time performing the task 124. In other words, analysing
the behaviour of users required to give consent to the processing of their
personal data in order to benefit from a service, it has been demonstrated that
users generally provide consent without paying attention to the privacy
policy 125. In this way, the user’s right to self-determination is undermined,
since consent is not a freely given, specific, informed and unambiguous
indication of the data subject’s wishes.
The consent to the processing of personal data as an inadequate
regulatory tool. Finally, and with particular regard to the efficacy of the
consent, the sample of Q48 users says that they do not know whether the consent
effectively protects personal data (Q49) 126. This means that the majority of the
users know that consent is not a functional tool for the protection of their
122 For the content of Task 1 see footnote n. 108.
123 See Task 1 in footnote n. 108.
124 See again subchapter 2.1.4 on biometrics and users’ characteristics: «The average
completion time is 38.34 seconds, while it took less than 30 seconds for more than a half of the
sample to complete the task (58%) and three quarters of the sample dedicated less than one minute
to the task (see Table 28). At first sight, users are not spending much time in reading, modify and
then re-reading disclaimers and settings (Privacy, Security, etc.)».
125 I.A. Caggiano, ‘A quest for efficacy in data protection: a legal and behavioural analysis’,
Working Paper no. 10/2016, 11 ff. On the lack of interest of users towards the privacy policy and the
protection of personal data see also J. Míšek, ‘Consent to personal data processing – The panacea or
the dead end?’ (2014) 8 (1) Masaryk University Journal of Law and Technology, 76 ff.
126 Question n. 49: “Do you consider that your consent to the processing effectively
protects your personal data?”.
personal data. On the one hand, Q48 asks users to define the legal purpose of
the consent. On the other hand, Q49 asks users to evaluate, expressing their
personal opinion, whether the consent effectively protects their personal data. As
explained above, high proportions of the subsample were not able to indicate
the correct purpose of the consent (51.11%) and had no idea whether the
consent had implications on their data (44.44%) 127, demonstrating users’ low
knowledge in the field of privacy.
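The proportions reported above can be reproduced with a short sketch. The absolute counts (23 and 20 respondents) are inferred here from the reported percentages and the sample of 45 users, and are therefore assumptions, not figures stated in the text:

```python
# Descriptive proportions for the Q48 subsample.
# Counts (23, 20) are inferred from the reported percentages
# over an assumed sample of 45 users.
SAMPLE_SIZE = 45

def proportion(count: int, total: int = SAMPLE_SIZE) -> float:
    """Return the share of `count` over `total` as a percentage, rounded to 2 decimals."""
    return round(100 * count / total, 2)

# 23 users could not indicate the correct purpose of the consent;
# 20 users had no idea whether the consent had implications on their data.
wrong_purpose = proportion(23)   # 51.11
no_idea = proportion(20)         # 44.44
fast_choice_refusals = proportion(5)  # 11.11, the Task 1 figure reported earlier
print(wrong_purpose, no_idea, fast_choice_refusals)
```

The same helper reproduces the 11.11% refusal rate for Task 1 (5 users out of 45), which suggests the percentages in the chapter are all computed over the same sample of 45.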
Conclusion. Therefore, on the basis of the data analysed, we can
conclude that consent is not an adequate regulatory tool to protect
users’ personal data (it is not completely efficient) 128. Indeed, even when
data subjects give their consent, they are usually unaware of the
consequences of a specific privacy settings choice, also because they do not
really understand the operation of the technologies involved in the data
processing, nor the possible risks for their personal data. In these cases,
the processing is not lawful and is not in compliance with articles 6 and 9
GDPR and with recital 32 GDPR.
The consent of the data subject, as provided in recital 32 of the GDPR,
is highlighted as a positive act. The GDPR rule is that the express consent
to the processing of personal data is required (recital 32, GDPR), with the
exception represented by the explicit consent, which is required only
with regard to special categories of personal data (art. 9, GDPR), profiling
(art. 22, GDPR) 129 and the transfer of personal data to a third country or
an international organisation (art. 49, para 1, lett. a, GDPR). However, the
difference between express and explicit consent is unclear, and it would
appear that explicit consent is nothing more than an express consent
characterised by a greater determination in the behaviour of the user 130.
In addition, recital 32 considers lawful any positive act clearly
indicating the willingness of the user to consent to the processing of his or
her personal data, even in the case of consent provided online, currently
127 See again subchapter 2.1.3.3 on the Users’ awareness.
128 L. Gatt, R. Montanari, I.A. Caggiano, ‘Consenso al trattamento dei dati personali e analisi
giuridico-comportamentale. Spunti di riflessione sull’effettività della tutela dei dati personali’ (2017)
2 Politica del diritto 350-351; I.A. Caggiano, ‘Il consenso al trattamento dei dati personali’ (2017) DIMT
online, 20 ff.; G. Zanfir, ‘Forgetting about consent. Why the focus should be on “suitable safeguards” in
data protection law’ (2014) Reloading Data Protection 237-257.
129 The processing of personal data, and in particular profiling, is a way to improve
machine learning. In this sense D. Kamarinou, C. Millard, J. Singh, ‘Machine Learning with Personal
Data’, Legal Studies Research Paper no. 216 [2016] Queen Mary University of London website, 1 ff.
130 I.A. Caggiano, ‘Il consenso al trattamento dei dati personali’ (n. 30).
given using electronic means. Nevertheless, for consent provided online,
certain behaviours are accepted that appear more closely related to
implied consent than to express consent (or even to explicit consent).
Taking the example of a website, users are sometimes not requested to tick a box
indicating their consent when visiting a website, as long as the banner
appearing on the home page specifies that consent is deemed to have been
provided by continuing to surf the website. This does not match the definition
of a positive act.
In this already very complex context, an important role is played by
electronic communications, currently regulated under Directive 2002/58/
EC on electronic communications (well known as the ePrivacy Directive). The
Commission carried out an ex post Regulatory Fitness and Performance
Programme (‘REFIT evaluation’) of the ePrivacy Directive and verified that
the Directive has not really guaranteed an effective legal protection of
privacy in electronic communications. In particular, the REFIT evaluation
shows that important technological and economic developments have taken
place in the market since the last revision of the ePrivacy Directive in
2009. Consumers and businesses increasingly rely on new Internet-based
services enabling inter-personal communications (e.g. Voice over
IP, instant messaging and web-based e-mail services), which fall under the
name of Over-The-Top communications services (OTTs). Generally, OTTs
are not subject to the current EU electronic communications framework,
which has not kept up with technological developments, resulting in a lack
of protection of electronic communications.
The REFIT evaluation further showed that some provisions have
created an unnecessary burden on businesses and consumers. For
example, the consent rule intended to protect the privacy of users failed to
reach its objectives, as end-users face requests to accept tracking cookies
without understanding their meaning, due to the lack of basic knowledge
about privacy, and are sometimes even exposed to cookies being set
without their consent for unauthorised processing. This demonstrates that
the consent rule is ‘over-inclusive’, as it also covers aspects not related
to privacy but, at the same time, ‘under-inclusive’, as it does not cover
some tracking techniques (e.g. device fingerprinting) 131. For this reason,
a Proposal for a Regulation on privacy and electronic communications has
been published 132.
131 Explanatory memorandum of the Proposal for a Regulation of the European Parliament and
of the Council, of 10 January 2017 (then modified on 10 February 2021), concerning the respect for
private life and the protection of personal data in electronic communications and repealing Directive
2002/58/EC (COM(2017) 10 final), 5.
2.3.1.3 CLUES OF MISMATCH BETWEEN LEGAL AND USERS’ CONCEPTION
OF PERSONAL DATA PROTECTION 133
Introduction. The legal debate on privacy has for some time focused on
the features of the consent to the processing of personal data. In the spotlight
is the idea that a formal review of its application criteria is required to
make the most of such a legal solution and avoid privacy violations. On
the other hand, still today consent represents one of the cornerstones of the
legal protection of privacy and, specifically, one of the main legal bases for
processing personal data.
Articles 23 and 24 of the Personal Data Protection Code 134 explicitly state
that the processing of personal data may take place only after receiving the
consent of the party concerned, which must be explicit and free so as to
be unequivocal. Silence is excluded as a valid form of consent. In contrast,
providing information and consent in oral form is allowed as long as it refers
to non-sensitive personal data.
These normative provisions are in tune with the European guidelines. In fact,
the importance of the consent has recently been remarked by the General
Data Protection Regulation 135 (hereafter GDPR), which highlights how the
processing of personal data is generally prohibited unless it is expressly
allowed by law or the data subject has consented to it. The explicit agreement
of the data subject thus remains one of the main pillars in the formal structure
designed by law to make personal data processing possible, along with the
contract, legal obligations, vital interests of the data subject, public interest
and legitimate interest (see Article 6(1) GDPR).
Moreover, as stated in Article 7 and specified further in recital 32 of the
GDPR, consent must be freely given. Therefore, on the one hand, the purposes
behind collecting and processing data need to be explicit, legitimate, adequate,
and relevant (Article 5); on the other, the consent must be specific, informed,
unambiguous. The data subject has to be notified about the controller’s identity,
what kind of data will be processed, how it will be used and the purpose of the
132 Proposal for a Regulation of the European Parliament and of the Council, of 10 January
2017, concerning the respect for private life and the protection of personal data in electronic
communications and repealing Directive 2002/58/EC (COM(2017) 10 final), then modified on 10
February 2021.
133 The Author is Margherita Vestoso, Ph.D. candidate in Humanities and Technologies: an
integrated research path, at Suor Orsola Benincasa University of Naples.
134 Legislative Decree 30 June 2003, no. 196.
135 Regulation (EU) 2016/679.
processing operations. The idea is to keep data subjects safe from unauthorized
forms of “re-use” – e.g., secondary use or function creep cases 136.
The central role played by the consent in the regulatory structure
accompanying privacy protection has prompted legal reflection on this topic to
focus more and more on the accuracy of normative provisions, as well as
on the criteria driving their application. Less attention has been paid, instead, to
the real effectiveness of those safeguards, namely to the ways in which such
norms concretely affect the behaviour of the citizens they intend to protect.
Results from the behavioural experiment discussed in this work
actually suggest that a wider reflection on this issue, beyond the formal features
of privacy regulation, is needed. Indeed, while the regulation of personal data
processing stresses consent as a safeguard tool, the cognitive-behavioural
analysis shows a misalignment between the image of consent emerging from
the legal framework and the one coming from social perception.
Dealing with such a challenge, in this section we try to identify
experimental evidence supporting the idea of a mismatch between how
legal norms and common users conceive the protection of personal data.
The analysis mostly focuses on section III of the questionnaire, sketching
a comparison between the questions related to the role of the consent in the
processing of personal data (specifically those between n. 42 and n. 58) and
a reference set of questions (including n. 21, 23, 27, 28, 29, 32, 33, 36, 48,
49) evaluating the awareness of the subjects who took part in the experiment
with respect to the tasks performed.
The idea is that the mismatch between the normative image of the
consent and the general users’ perception of privacy safeguards is driven by
two different and somehow connected phenomena: a) an unawareness of
data protection rules; b) a failure in the identification of risks for privacy in the
interactions with ICT services. The analysis carried out on the survey results
therefore aims to detect clues about both phenomena.
Unawareness of data protection rules. A first clue that the content of
rules on personal data protection is often misunderstood comes from Question
42 137 of the survey (see Image n. 14). As highlighted by the analysis of the
136 Function creep occurs when the use of an innovative tool shifts from the original
purpose, the one for which it was developed, to a different one. For a definition of the concept
of function creep, especially in relation to surveillance data gathered by corporations or
governments, see M. Safdar, F. Ullah, I. Khan et al., ‘Function Creep in Surveillance Techniques’
(2016) 2 IJSRSET.
137 Question 42: ‘Do you believe that personal data can be processed without your consent?’.
answers, a large majority of the participants share the idea that personal
data can be processed even without consent.
As seen, the law not only requires the consent to process personal
data but also asks for the consent to be informed and freely given. According
to this normative schema, any person dealing with a request to authorize
the processing of her personal data should be able to make an aware choice.
Given the numerous occasions on which a person today is exposed to such
requests, most people, in line with these regulatory provisions, should
have an almost complete knowledge of the role played by the consent and the
constraints associated with it.
However, the results of the experiment show otherwise: they suggest
not only a poor knowledge of the rules on personal data protection but also a
lack of familiarity with the legal statements mandatorily included in the formal
request that allows the consent to be considered informed. This is a clue
supporting the idea that, in everyday practice, forms of “unaware” authorization
to the processing of personal data are widespread.
The sample is, moreover, positively correlated to questions 28 and 36
(see images n. 15 and n. 16), specifically to the answers of those who claim to
have understood all the steps concerning privacy and data protection issues
during the installation of the operating system (Question 28) or to have paid
attention to the privacy and cookies policy when downloading apps (Question
36). The correlation allows a consideration: if, on the one hand, there is
insufficient knowledge of the rules governing the processing of personal data,
on the other, users are not completely aware of such a lack, nor do they act
to bridge the gap.
Question n. 42: Do you think that personal data can be processed without your
consent?
14. Column chart of the answers to the question no. 42
Another interesting point of comparison is represented by Question 57
(see image n. 17). Looking at the answers provided to this question, we
find that more than half of the respondents (68.89%) are aware that they do not
know the distinction between general cookies and necessary cookies when
authorizing the collection of cookies.
However, by comparing this result with the answers provided to
question n. 32B, we find that participants from the same sample
are sure they have largely or completely understood the consequences for their
privacy arising from the “Privacy” panel options. This seems to throw the very
concept of informed consent into crisis by stressing how rare it actually is that
users fully understand the nature and characteristics of the object to which
their consent (i.e., the will to provide the consent) relates.
Failure in the identification of risks for privacy. The lack of awareness
about the normative frame on data protection is often coupled with a naïve
reading of the events that put privacy at risk, fostering an underestimation of
the contexts in which the probability of unlawful processing of personal data
is potentially high.
Elements in favour of this hypothesis can be grasped by comparing
the results of Question 42 with those of Question 21 138. A negative correlation
between the two questions can indeed be detected: almost all the subjects
who answered negatively to Question 42 (those who tend not to believe in the
possibility of their data being processed without their consent) are also convinced
that the main operating systems are all compliant with the norms and constraints
established by the Personal Data Protection Code, without any risk for privacy.
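The kind of association described here, between two yes/no questionnaire items, can be measured with the phi coefficient (Pearson’s r applied to binary variables). The chapter does not specify which statistical measure was used, so this is one plausible sketch; the answer vectors below are invented for illustration and are not the experiment’s data:

```python
from math import sqrt

def phi(x: list[int], y: list[int]) -> float:
    """Phi coefficient between two binary (0/1) answer vectors of equal length."""
    n = len(x)
    # Cell counts of the 2x2 contingency table.
    n11 = sum(1 for a, b in zip(x, y) if a == 1 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n00 = n - n11 - n10 - n01
    denom = sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Invented example: subjects answering "no" to Q42 (data cannot be processed
# without consent) tend to answer "yes" to Q21 (operating systems are
# compliant) -- a negative association between the two items.
q42 = [1, 1, 0, 0, 0, 1, 0, 0]   # 1 = "data can be processed without consent"
q21 = [0, 0, 1, 1, 1, 0, 1, 0]   # 1 = "operating systems are compliant"
print(phi(q42, q21))  # negative value, roughly -0.77 for these vectors
```

A phi of -1 would mean the two answers always disagree; 0 means no association. The negative sign is what the text calls the “negative correlation” between the two questions.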
15. Pie chart of the answers to the question no. 28
138 Question 21: Do you think that the main operating systems (such as Windows
10, Windows 7, Windows 8, El Capitan) use your personal data in compliance with the Personal Data
Protection Code (Legislative Decree no. 196 of 30 June 2003)?
Question n. 36: How carefully did you read the privacy policy and cookies when
you downloaded the app?
16. Column chart of the answers to the question no. 36
Another interesting correlation in this sense can be identified by
comparing the answers given by the subjects who responded positively to
Question 42 – i.e. those who believe that their data can be processed without
their consent – with questions 23 139 and 27 140. While a relevant part of them
consider it “very important” to protect their data while surfing the Internet, they
also show a very poor perception of the actions that could minimize
Question n. 57: When you authorize the collection of cookies, do you know the
distinction between general cookies and necessary cookies?
17. Pie chart of the answers to the question no. 57
139 Question 23: How important is it for you to protect your personal data when browsing
the Internet?
140 Question 27: On a scale from 1 (strongly disagree) to 5 (strongly agree), show how much
you agree with each of the statements listed below: 1. I think it’s helpful that apps find my position and
the chronology of my positions – 2. I think it’s useful that Windows and Cortana can recognize my voice to
improve service customization – 3. I think it’s helpful to allow apps to access my name, my image, and
other information on my account – 4. I think it’s useful to allow apps to automatically share and sync
my information with wireless devices not explicitly connected to my PC, tablet, or smartphone.
the risk of indiscriminate collection of their personal data. Not by chance, a
large part of them totally agree with the choice to share information such as
their voice or images if required for the improvement of computer services.
In other terms, users’ behaviour proves contradictory: they are
interested in keeping their data safe during internet navigation, but they allow
the system to have indiscriminate access to some hardware components of their
computer to gather personal information.
Such a result can be considered a clue that the lack of knowledge
about data protection rules – discussed in the previous point – is often
coupled with a failure to identify the risks connected to the
choices made when interacting with technology. In fact, although the GDPR
clearly asks to ascertain the specific purposes for which the personal data
will be processed, in order to avoid forms of improper use, a large part of
the users (as suggested by the experiment) provide their consent to very
generic requests, without any specification about the single data gathered,
the collection methods and, finally, the link between the data collected and
the improvement of the service.
Conclusion. A first remark supported by the correlations reported
above is that not only is there a misalignment between the formal
description of the privacy safeguards and the representation of the latter by
common users, but this misalignment unavoidably affects their cognitive and
behavioural models. Moreover, if we combine those results with information
concerning users’ performance during the installation – e.g. data revealing that
users do not spend much time reading, modifying and then re-reading disclaimers
and settings (see Table 28) – we can clearly see how difficult is the translation
Question n. 23: How much does it matter for you to protect your personal data
while browsing the Internet?
18. Column chart of the answers to the question no. 23
Question n. 27: On a scale from 1 (strongly disagree) to 5 (strongly agree),
show how much you agree with each of the statements listed below.
19. Column chart of the answers to the question no. 27
of the normative framework into practical actions that allow people to have a
safe interaction with technology.
The issue seems in tune with recent concerns highlighted by the
European legislator. In line with the provisions of the “Better regulation”
communication, the Commission carried out an ex-post control on the
adequacy and effectiveness of regulation (“REFIT evaluation”) 141 of the
directive on electronic private life (the so-called ePrivacy Directive). The REFIT
evaluation, as explained in the proposed Regulation on Privacy and Electronic
Communications (ePrivacy Regulation, hereinafter “Proposal”) published on
10 January 2017 142, has stressed the difficulty of framing all the possible risks
for privacy arising from massive interaction with internet-based communications.
As outlined in the explanatory memorandum of the Proposal, indeed, the
technological and economic developments that have taken place in the market over
the past years have meant that consumers and businesses increasingly
141 The European Commission’s Regulatory Fitness and Performance (REFIT) programme
aims to ensure that EU legislation delivers results for citizens and businesses effectively, efficiently
and at minimum cost. REFIT is taken into account in the preparation of the Commission’s annual work
programmes, each of which contains proposals for new initiatives and a verification of the quality of the
legislation in force in the EU.
142 Proposal for a Regulation of the European Parliament and of the Council concerning
the respect for private life and the protection of personal data in electronic communications and
repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications), Brussels,
10 January 2017, COM(2017) 10 final, then modified on 10 February 2021. See https://eur-lex.europa.eu/
legal-content/EN/TXT/HTML/?uri=CELEX:52017PC0010&from=EN.
rely on Over-the-Top communications services (“OTTs”) – new Internet-based
services enabling inter-personal communications such as Voice over
IP, instant messaging and web-based e-mail services – instead of traditional
communications services.
The protection of these communications through traditional legal
safeguard systems would have required an immediate revision of the
regulatory framework, which is mostly incompatible with the timescales of a
legislative review process. In fact, although these services are essential for
consumers’ lives and for businesses, the Union’s legal framework on electronic
communications has so far left them out, leading to a void in the protection of
communications conveyed through these new services.
The problem of keeping pace with technological developments
therefore depends on the inadequacy of traditional legal solutions in handling
the risks of a constantly evolving system such as that of IT services. The result,
as emerges from the contradictory answers considered above, is to prevent
people from having a clear frame of the normative safeguards that should
prevent their data from being used indiscriminately. At the same time, this returns a
weak image of those legal strategies, like informed consent, which have long been
considered cornerstones in the protection of personal data and which today
prove to have a nature not sufficiently dynamic to adapt to the kinds of risks
coming from new data-driven technologies.
A second remark can then be sketched in line with the correlations
described at point b of the analysis (i.e., ‘Failure in the identification of
risks for privacy’): the difficulty of translating the legal principles on personal
data processing into conscious interactions with ICT is also due to the strong
fragmentation that an individual’s identity undergoes because of the services offered
by new technologies. If, as claimed by Hildebrandt 143, ‘our essence is that we
are incomputable, meaning that any computation of our interactions can be
performed in multiple ways — leading to a plurality of potential identities’, then the
more our life interactions become computer interactions, the more we become
‘computable’, and thus the higher the number of potential identities to
which the concept of privacy should be referred.
This also explains why, in such a context, effective protection cannot be gained by relying on informed consent, even when supported by well-written legal norms, as the high degree of indeterminacy that today characterizes the concept of privacy inevitably undermines the capacity of traditional legal solutions to be effective. In this scenario, indeed, personal data about the life of a user cannot be described as autonomous pieces of information to be authorized for processing. Most of the time, they emerge by relating data deriving from all the user's micro-interactions with IT services (i.e., the so-called metadata).
143
Hildebrandt, Mireille, 'Privacy as Protection of the Incomputable Self: Agonistic Machine Learning' (2017) 19(1) Theoretical Inquiries in Law, 83.
Nor is it possible to envisage forms of consent for the collection and processing of data coming from those actions. As highlighted in a paper discussing the role of data protection in IoT environments, the more our living space becomes completely interconnected, the lower the opportunities for informed consent to be used as a tool for data protection. This depends on the concrete impossibility for most computing devices to provide an information notice and, above all, on 'the lack of the information notice and, therefore, the eventual consent, both in the case of automatic and "silent" interconnection with other devices with which the data is exchanged as well as when the object interacts with the surrounding environment without being immediately visible' 144.
The idea of informed consent as it has been understood until now, with complete information notices followed by a tick-box or electronic signature, is therefore increasingly in conflict with current ICT networks. Such a solution would not only end up preventing Internet-based services from working correctly but would eventually prove misleading, as the information to be protected by consent lies not in the data deriving from a single interaction with the IT service but in the relationship between these and other data collected through previous interactions with one's own devices or those of others.
On the other hand, it should be noted that the processing of these data and the information obtained from them can be useful for identifying possible abuses, consequently allowing the relevant safeguards to be activated. Indeed, the already mentioned Proposal for a Regulation on Privacy and Electronic Communications clearly refers to the possibility for ICT providers to use metadata in this sense. As reported in Article 6, 'Providers of electronic communications services may process electronic communications metadata if: [...] b) it is necessary for billing, calculating interconnection payments, detecting or stopping fraudulent, or abusive use of, or subscription to, electronic communications services'.
A final remark, sketched from the considerations reported above, is that the elements to be taken into account within this scenario are numerous and do not fall within the exclusive domain of law. In this vein, the idea of a conceptual separation among the design of IT tools, the study of behavioural dynamics and the development of legal protections for personal data loses its strength. As highlighted 145, the exploration of the extra-legal features of privacy law asks privacy scholars to make an interdisciplinary effort, 'inviting investigations from fields as diverse as economics, computer science, anthropology, sociology, and science and technology studies in order to understand privacy's place in society'.
144
Bolognini L., Bistolfi C., 'Challenges of the Internet of Things: Possible Solutions from Data Protection and 3D Privacy' (2016), in Schiffner S., Serna J., Ikonomou D., Rannenberg K. (eds), Privacy Technologies and Policy. APF 2016. Lecture Notes in Computer Science, vol 9857 (Springer, Cham, 2016), pp 71-80.
On the other hand, the need for law to move toward hybrid solutions
has been explicitly recognized in 2007 by a Commission Communication
on Privacy Enhancing Technologies (PETs). As claimed, ‘A further step to
pursue the aim of the legal framework, whose objective is to minimise the
processing of personal data and using anonymous or pseudonymous data
where possible, could be supported by measures called Privacy Enhancing
Technologies or PETs – that would facilitate ensuring that breaches of the
data protection rules and violations of individual’s rights are not only
something forbidden and subject to sanctions, but technically more
difficult’.
What emerges is the idea of addressing the problem of personal data protection from outside the legal domain, specifically by means of a system of ICT measures that protects privacy by preventing unnecessary and/or undesired processing of personal data, all without losing the functionality of the IT services. In this vein, the dialogue between law and other research areas such as psychology, design or computer science proves fundamental. An interdisciplinary approach could indeed ease the identification of instruments able to overcome the limits of the current enforcement system of privacy safeguards by conveying legal prescriptions in a new form.
The experiment results suggest looking in such a direction. As seen, traditional systems of legal protection fail to make people aware of the effects of their actions when interacting with ICT networks. There is a need for tools that can mediate between legal safeguards and information systems, tools able to make common users effectively aware of the risks to which IT services expose their personal data and of how to concretely avoid those risks.
145
Calo, R., 'Privacy Law's Indeterminacy' (2018) 19(1) Theoretical Inquiries in Law, 33.
2.3.2 CRITICAL ISSUES ARISING FROM THE LEGAL ANALYSIS (L.A.)
OF THE LEGAL AND UX-HMI EXPERIMENT.
2.3.2.1 A COMPARISON BETWEEN W10 OPERATIVE SYSTEMS BEFORE
AND AFTER THE ENTRY INTO FORCE OF GDPR 146
Before drawing the conclusions of the Legal Analysis of the results of the cognitive-behavioral analysis carried out in paragraph 2.3.1, this subparagraph examines in depth the update of the W10 operative system after the entry into force of the GDPR (W10 version 2019)147, in comparison with the 2016 version, which was used before the privacy reform and is now no longer available. In particular, the main differences between W10 version 2016 and W10 version 2019 concern the implementation of privacy and security in compliance with GDPR principles.
With regard to W10 v. 2016, the 'Fast choice' function (whose characteristics have already been explored previously; see in particular paragraph 2.3.1 and following of this chapter) has been eliminated, as it does not protect the user's privacy according to the
provisions of the new European regulation on data protection. Specifically, in W10 v. 2019 the privacy policy has become longer and even more complex, as shown in images nos. 20 and 21 below (growing from the already very long 39 pages of text saved in PDF format to 63 pages, without any icon, image or graphic to simplify the content). On the one hand, this provides the user with more information; on the other hand, it often implies an enormous disadvantage for users in terms of confusion and unawareness. In fact, the average user is not able to digest such a large amount of information and often ends up disoriented, not reading the contents of the privacy policy at all. Furthermore, such contents are also difficult to understand, with the consequence that the user's consent to the processing of personal data is not an aware one, because it is given without the user having actually read and understood the content of the policy. Instead, it would be appropriate to rethink the privacy policy from the point of view of so-called legal design, as will be discussed in chapter III.
20. Privacy Policy W10 version 2016
21. Privacy Policy W10 version 2019
146
The author of this section is Maria Cristina Gaeta, Research Fellow in Privacy Law at the University of Naples Suor Orsola Benincasa, Ph.D. in Private Law at the University of Naples Federico II.
147
Precisely, we analysed the W10 operative system updated on 10 November 2019.
Concerning the security settings, instead, they have been strengthened. Indeed, the operative system now requests a PIN in addition to the password of the user's Microsoft account, in order to raise the security level of the account through which most of the user's personal data are collected. The account is used to install W10, as well as, for example, to use Outlook email or the Office package, but some data processing can also take place without a Microsoft account. This happens, for example, when a user browses online on Microsoft Edge. These data are collected independently of the creation of a Microsoft account, by the sole fact that the user browses online on Microsoft Edge and consents to the use of cookies, which happens not only when the user expressly accepts cookies in the banner that appears at the first access to the webpage, but also when the user continues to browse the web page after the banner appears. As already underlined, the banner expressly indicates that the user consents to cookies not only when he or she explicitly accepts them but also when he or she continues to browse by scrolling 148 the web page or by clicking on any part of it (see subparagraph 2.3.1.2)149.
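The consent-by-interaction mechanism just described can be modelled as a small state machine. The sketch below is purely illustrative (the CookieBanner class and its event names are hypothetical, not the actual implementation of any real browser or operating system): any positive interaction, whether an explicit acceptance, a scroll or a click on the page, is recorded as consent, but only the accept button qualifies as an express positive act.

```python
class CookieBanner:
    """Toy model of a banner that records consent on any positive
    interaction (a hypothetical sketch, not a real implementation)."""

    # interactions the banner interprets as consent
    CONSENT_EVENTS = {"accept_click", "page_scroll", "page_click"}

    def __init__(self) -> None:
        self.consent_given = False   # any consent recorded?
        self.express = False         # was it an express positive act?

    def handle(self, event: str) -> None:
        if event in self.CONSENT_EVENTS:
            self.consent_given = True
            if event == "accept_click":
                self.express = True


banner = CookieBanner()
banner.handle("page_scroll")   # the user merely scrolls past the banner
print(banner.consent_given)    # True: consent is recorded anyway
print(banner.express)          # False: it was never expressly given
```

The gap between consent_given and express is precisely the legal weakness discussed above: the banner records a consent that was never expressly given.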
148
In computer science, scrolling means sliding text, with or without images and possibly videos, across a monitor or display, vertically or horizontally. In the case of web browsing, scrolling takes place through user input via an interactive device equipped with a touchscreen or via a keypress, and continues without further intervention until a further user action, or can be entirely controlled by input devices (e.g. mouse or keyboard). See J Daintith, 'Scroll', in A Dictionary of Computing (6th ed., OUP, 2008), https://www.oxfordreference.com/view/10.1093/acref/9780199234004.001.0001/acref-9780199234004-e-4638?rskey=sq9Mfc&result=4981, accessed 23 November 2020; 'Scroll', in Encyclopedic Dictionary of the Bible, translated and adapted by L. Hartman (McGraw-Hill Book Company, 1963) 2144-2146.
149
Cookies are currently regulated under Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (well known as the ePrivacy Directive or Cookie Law). However, the European Commission carried out an ex post Regulatory Fitness and Performance Programme ('REFIT evaluation') of the ePrivacy Directive, verifying that the Directive has not really guaranteed effective legal protection of privacy in electronic communications. Furthermore, with the data protection reform it became necessary to propose new rules for cookies, and a Proposal for a Regulation on privacy and electronic communications has been published. See the previous subparagraph 2.3.1 on the Legal Analysis (L.A.) of the cognitive-behavioral analysis for more details on the cookie law.
Finally, again to strengthen security, a user who wants to avoid remembering or resetting a personal password can use Windows Hello or a FIDO2-compliant security key to sign into the Microsoft account.
Windows Hello, as shown in image no. 22, is a more personal login solution, based on the user's face or fingerprint recognition (with the PIN as an alternative sign-in option), and can be used to sign into the user's account on the web or to sign into the user's device on the lock screen.
A security key (see image no. 23), instead, is a physical device that the user can employ in place of a personal username and password to sign in; it may be a USB key that the user keeps on a keychain, or an NFC device such as a smartphone or access card. Since it is used in addition to a fingerprint (Windows Hello) or a PIN, signing in requires not only the security key but also the fingerprint or the PIN 150.
22. Hello of W10
23. FIDO 2–compliant security key of W10
150
More information on Windows Hello and security keys: https://support.microsoft.com/en-us/windows/sign-in-to-your-microsoft-account-with-windows-hello-or-a-security-key-800a8c01-6b61-49f5-0660-c2159bea4d84
Having made these premises on the main distinctions between W10 v. 2016 and v. 2019, it is now necessary to apply them to the cognitive-behavioral analysis conducted. In the experiment carried out on W10 version 2016, several contradictions emerged between users' behaviour and their statements regarding data protection and technology, and these results can be confirmed also for version 2019, since they are a consequence of social consciousness 151 and user awareness, which are not altered by the new version of W10.
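The sign-in rule described above, a possession factor (the security key) combined with a biometric or knowledge factor (fingerprint or PIN), can be sketched as a simple predicate. This is a didactic simplification under our own naming, not Microsoft's or the FIDO2 specification's actual logic:

```python
def can_sign_in(has_security_key: bool,
                fingerprint_ok: bool,
                pin_ok: bool) -> bool:
    """Illustrative two-factor rule: the physical security key is
    necessary but not sufficient; the user must also verify with a
    fingerprint (Windows Hello) or a PIN."""
    return has_security_key and (fingerprint_ok or pin_ok)


print(can_sign_in(True, False, True))    # key + PIN: True
print(can_sign_in(True, False, False))   # key alone: False
print(can_sign_in(False, True, True))    # no key: False
```

The point of the combination is that neither the loss of the key nor the disclosure of the PIN alone is enough to compromise the account.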
As a matter of fact, the majority of the sample affirm that they are highly interested in protecting their data, and even when the sample is asked to weigh data protection against security, the majority prefer the former, without considering that the two aspects are strictly connected and that guaranteeing privacy requires security (cybersecurity, if the data processing takes place online). Furthermore, only a small percentage of the sample disables the 'Fast choice' function, provided only in W10 version 2016 and not in the 2019 version.
In relation to the task of changing the privacy settings already chosen, or set by default, more than half of the sample do not read the privacy policy at the time of W10 installation. Furthermore, the totality of the sample modifies the privacy settings after installing W10. However, this behavior does not depend on a real understanding of the importance of privacy and security settings, as a conscious exercise of the right to self-determination, but on the fact that, when users are asked whether they want to change some privacy or security settings (Tasks 5 and 6 respectively) after completing the first four tasks, they are almost induced to think it is better to change those settings. Even allowing for this possible alteration of the users' behavior in the execution of Tasks 5 and 6, it should be noted that the most relevant privacy settings modified are those of geo-localization and of the information given to 'Cortana' (Microsoft's virtual assistant software), even though these two settings had previously been evaluated as useful by two thirds of the users.
2.3.2.2 CONCLUSIONS ON THE LEGAL ANALYSIS (L.A.):
HOW TO STRENGTHEN THE EX ANTE PROTECTION 152
Having analysed the users' behaviors, thanks to the complete and in-depth analysis of all the data collected, we can now come to conclusions. In particular, there emerges the need to strengthen the ex ante protection in terms of (i) rethinking the privacy policy and (ii) the instrument of consent as the legal basis for the processing, as well as, more generally, (iii) fortifying the principles of privacy by design and by default and security by design and by default.
151
About social consciousness see in particular R.S. Laufer and M. Wolfe, ‘Privacy as a
Concept and a Social Issue: A Multidimensional Developmental Theory’ (1977) 33 Journal of Social
Issues, 22 ff.
152
The author of this section is Maria Cristina Gaeta, Research Fellow in Privacy Law at the University of Naples Suor Orsola Benincasa, Ph.D. in Private Law at the University of Naples Federico II.
With specific regard to data processing management, users reveal that they have misunderstood the information provided, although they affirm that they are aware and able to manage the settings. In particular, the data show that even users who declare that they care about personal data generally do not pay attention to the privacy policy, even in a non-natural environment, i.e. the experimental context where subjects are specifically required to perform the task of reading the privacy policy 153. Furthermore, users have a general tendency to prefer the operative choices that imply the simplest and shortest solution, in order to be able to use the services and features they may need, without paying attention to what they give up (personal data) in order to enjoy them. This choice is also a consequence of the fact that in the digital services market the Big Techs (also known as "Tech Giants") are able to offer higher-level services than other market operators, services the user does not intend to give up, preferring to grant the processing of their personal data 154.
153
This lack of attention to the privacy policy has in recent years found confirmation in part of the literature: JA Obar and A Oeldorf-Hirsch, 'The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services' (2020) 23 Information, Communication & Society, 128 ff.; more generally on the privacy paradox see DJ Solove, 'The Myth of the Privacy Paradox', Legal Studies Research Paper [2020] George Washington University Law School website, 1 ff.; B Agi, N Jullien, 'Is the privacy paradox in fact rational?', Legal Studies Research Paper [2018] International Federation of Library Associations, 1 ff. However, even before the GDPR reform some authors questioned the efficacy of privacy tools: M Taddicken, 'The "Privacy Paradox" in the Social Web: The Impact of Privacy Concerns, Individual Characteristics, and the Perceived Social Relevance on Different Forms of Self-Disclosure' (2014) J Comput-Mediat Comm, 248 ff.; F Schaub, B Könings and M Weber, 'Context-Adaptive Privacy: Leveraging Context Awareness to Support Privacy Decision Making' (2015) 14 IEEE Pervasive Computing, 34 ff.
154
On the conduct of an operator in a dominant position on the market (the Big Techs, in our case), with particular regard to the link between the personal data breach (being based on a consent not freely given) and anti-competitive conduct on the market, see M Midiri, 'Privacy e antitrust: una risposta ordinamentale ai Tech Giant' (2020) 14 Federalismi.it, 209 ff.; C. Osti, R. Pardolesi, 'L'antitrust ai tempi di Facebook' (2019) 2 Mercato Concorrenza Regole, 195 ff. On the 'Bundeskartellamt' case against Facebook, decided by the German Antitrust Authority, see M. Messolo, '"Bundeskartellamt" c. Facebook: tempo di aggiornare "il muro di GDPR"?' (2018) 1 Rivista Italiana di Antitrust, 8 ff.; G. Colangelo, M. Maggiolino, 'Big Data, protezione dei dati e "antitrust" sulla scia del caso "Bundeskartellamt" contro Facebook' (2017) 1 Rivista Italiana di Antitrust, 9 ff. The problem of the anti-competitive conduct of the Tech Giants is very concrete and is not related to privacy alone. Indeed, the European Commission has already fined Google three times with three different decisions: European Commission Decision, 27 June 2017, relating to a proceeding under Article 102 of the Treaty on the Functioning of the European Union and Article 54 of the EEA Agreement (Case AT.39740 – Google Search (Shopping)); European Commission Decision, 18 July 2018, relating to a proceeding under Article 102 TFEU and Article 54 of the EEA Agreement (Case AT.40099 – Google Android); European Commission Decision, 20 March 2019, relating to a proceeding under Article 102 TFEU and Article 54 of the EEA Agreement (Case AT.40411 – Google Search (AdSense)). Furthermore, the European Commission opened an investigation against Amazon for possible anti-competitive conduct in 2019 (https://ec.europa.eu/commission/presscorner/detail/en/IP_19_4291) and a second one in 2020 for the use of non-public independent seller data (https://ec.europa.eu/commission/presscorner/detail/en/ip_20_2077). Part of the literature, not only in commercial law but also in public law, has already dealt with the anti-competitive conduct of the Big Techs: A Iannotti della Valle, 'La tutela della concorrenza ai tempi di Google Android' (2020) 6 Dir. inf. e informatica (forthcoming); A. Buttà, '"Google Search (Shopping)": una panoramica sul caso "antitrust" della Commissione europea' (2018) 1 Rivista italiana di Antitrust, 16 ff.; G De Minico, 'Does the European Commission's decision on Google open new scenarios for the Legislator?' (2017) 3 Osservatorio costituzionale, 1 ff.; S. Bros, J.M. Ramos, 'Google, Google Shopping and Amazon: The Importance of Competing Business Models and Two-Sided Intermediaries in Defining Relevant Markets' (2017) 62(2) The Antitrust Bulletin, 382 ff.
Indeed, the inefficacy of privacy warnings, especially in a digital environment, has been clearly demonstrated by the cognitive-behavioral analysis carried out. The analysis shows that the information included in the privacy policy or in the banner is not read attentively by users and that, therefore, the data subject's consent is not consciously given.
In order to improve user awareness, the privacy policy should be simplified in a double direction: limiting its contents to those strictly necessary according to articles 13 and 14 of the GDPR, and explaining them in the simplest and most effective way for the average user, which concerns not only the language used but also how the information is represented graphically (see chapter III on legal design techniques), as well as strengthening data protection and security by design and by default. In this sense, the privacy policy cannot correspond to a standard model used for every type of processing. On the contrary, the privacy notice must contain information that is concise, transparent 155, intelligible, easily accessible and easy to understand, as well as written in clear and plain language, according to article 12, paragraph 1, and recital 58 of the GDPR. Indeed, the privacy notice should be clear and understandable by an average user, who in this way could be genuinely aware of the existence of any type of processing. Additionally, where appropriate, visualization principles can be used in organizing the information contained therein. Indeed, article 12, paragraph 7, of the GDPR states that information can be provided in combination with standardised icons, in order to help the data subject know and understand whether, by whom, for what purpose and for how long his or her personal data are being processed. In this light, the proposal for the GDPR, as drafted by the European Parliament 156, was supplemented by an annex containing an iconic privacy policy. The annex indeed provided standardised icons that would have helped users understand the information contained in the privacy policy. In the final version of the GDPR, however, the annex was eliminated.
Image no. 24 below reports the list of the icons proposed. The appropriate symbol (see image no. 25) should have been marked next to each icon, in the blank space in the right-hand column of image no. 24, depending on the type of processing carried out by the data controller.
24. List of the icons proposed in the Annex to the first reading of the European Parliament on COM (2012)0011
25. Symbols proposed in the Annex to the first reading of the European Parliament on COM (2012)0011
155
Concerning transparency of processing see Article 29 WP, Guidelines on transparency under Regulation 2016/679, 11 April 2018, n. 260. During its first plenary meeting the EDPB endorsed the WP29 Guidelines on transparency.
156
We refer to the Annex in the first reading of the European Parliament on COM (2012)0011, which included the iconic privacy policy; in the final version of the proposal it was eliminated. See Proposal for a Regulation of the European Parliament and of the Council of 25.01.2012 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM (2012)0011. On the GDPR proposal, with specific reference to the iconic privacy policy, see J.S. Pettersson, 'A brief evaluation of icons suggested for use in standardized information policies. Referring to the Annex in the first reading of the European Parliament on COM (2012)0011', WP Universitetstryckeriet Karlstad, 2014. More generally see S de Jong, D Spagnuelo, 'Iconified Representations of Privacy Policies: A GDPR Perspective', in Á Rocha, H Adeli, L Reis et al. (eds.), Trends and Innovations in Information Systems and Technologies (Springer, 2020).
In Italy, the Data Protection Authority recognized the importance of images and icons, to the point of making some of them available on its website even
before the entry into force of the GDPR. Particularly relevant are the images related to the data processing carried out by banks (images nos. 26 and 27)157.
26. Icon of the Italian DPA on fingerprint detection and video surveillance
27. Privacy policy model of the Italian DPA on fingerprint detection and video surveillance
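An iconic privacy policy, whether in the form proposed in the Parliament's annex or in the Italian DPA's simplified models, is conceptually a table pairing standardised statements with a fulfilled or not-fulfilled mark. The sketch below illustrates that structure with invented, generic row labels and stand-in symbols; it does not reproduce the actual rows of the annex or of the DPA's models:

```python
# Hypothetical, simplified rendering of an iconic privacy policy:
# each row pairs a standardised statement with the controller's mark
# in the right-hand column (cf. images nos. 24 and 25).
FULFILLED, NOT_FULFILLED = "✓", "✗"   # stand-ins for the graphic symbols


def render_icon_table(rows: dict) -> list:
    """Return one line per row, marked fulfilled or not fulfilled."""
    return [f"{statement}: {FULFILLED if ok else NOT_FULFILLED}"
            for statement, ok in rows.items()]


# invented example rows, for illustration only
table = render_icon_table({
    "Data minimisation respected": True,
    "No disclosure to commercial third parties": False,
})
print(table[0])   # Data minimisation respected: ✓
```

Generating the notice from such a structure would keep it uniform across controllers and machine-readable, which is the intuition behind standardised icons.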
The best-known images are those related to video surveillance systems (images nos. 28 and 29)158, showing the typical sign that can be seen in almost all video-surveilled areas. In this way, the Italian DPA has identified a simplified model of minimum information which makes it immediately obvious that an area is under video surveillance. After the entry into force of the GDPR, an updated version has been proposed by the European Data Protection Board (EDPB) in its new guidelines on the topic, where it is specified that the images are non-binding 159.
28. Image provided by the Italian DPA warning for areas under video surveillance
29. Privacy policy model of the Italian DPA on fingerprint detection and video surveillance
157
Italian Data Protection Authority, Limitations and Safeguards Applying to Taking of Fingerprints and Image Acquisition by Banks, 27 October 2005 [web doc. n. 1276947].
158
Italian Data Protection Authority, Limitations and Safeguards Applying to Taking of Fingerprints and Image Acquisition by Banks, 27 October 2005 [web doc. n. 1712680].
The aforementioned measures related to the privacy policy, whose impact is far from marginal, should effectively strengthen the ex ante protection of the data subject. Only in this direction can the strong disinterest of users towards privacy notices be counteracted, so that they can be effectively aware of the processing, protecting their interest in a lawful, fair and transparent processing 160. At the same time, the data controller and processor will not be sanctioned for infringements of the GDPR.
Concerning consent to the processing of personal data, the GDPR provides, as one of the legal bases, the rule of express consent to the processing of personal data (art. 6, para 1, let. a), and rec. 32, GDPR), with the exception constituted by explicit consent, required only with regard to special categories of personal data (art. 9, para 2, let. a), GDPR), profiling (art. 22 GDPR) and transfers of personal data to a third country or an international organisation (art. 49, para 1, let. a), GDPR)161. In addition, recital 32 considers lawful any positive act clearly indicating the user's willingness to consent to the processing of his or her personal data, as in the case of consent provided online. This mode of consent is currently very common in the use of electronic means, where certain actions are accepted that appear more closely related to implied consent than to express consent, since they do not match the definition of a positive act. Moreover, there are cases where consent is not required at all, because other legal bases for data processing exist under articles 6 and 9 GDPR, as the Proposal for a Regulation on privacy and electronic communications also shows 162.
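The consent rules summarised above can be restated as a small decision procedure: where another legal basis under articles 6 or 9 applies, consent is not required at all; explicit consent is demanded only in the special cases of articles 9, 22 and 49; otherwise express consent under article 6(1)(a) suffices. The function below is a didactic sketch of that structure (the function name and flags are our own), not a complete model of the GDPR:

```python
def consent_requirement(other_legal_basis: bool,
                        special_category: bool = False,
                        profiling: bool = False,
                        third_country_transfer: bool = False) -> str:
    """Didactic sketch of arts. 6(1)(a), 9(2)(a), 22 and 49(1)(a) GDPR:
    returns which kind of consent, if any, the processing requires."""
    if other_legal_basis:
        return "no consent needed"      # another art. 6/9 basis applies
    if special_category or profiling or third_country_transfer:
        return "explicit consent"       # arts. 9(2)(a), 22, 49(1)(a)
    return "express consent"            # art. 6(1)(a) and rec. 32


print(consent_requirement(other_legal_basis=True))        # no consent needed
print(consent_requirement(False, special_category=True))  # explicit consent
print(consent_requirement(False))                         # express consent
```

The branching makes visible what the prose argues: consent is only one element of a larger system of legal bases, not the universal gateway to lawful processing.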
159
EDPB, Guidelines on processing of personal data through video devices, 29 January
2020, n. 3.
160
Introduction to the Proposal for a Regulation on Privacy and Electronic Communications. For an in-depth analysis see J Míšek, 'Consent to Personal Data Processing: The Panacea or the Dead End?' (2014) 8 Masaryk UJL & Tech., 76 ff.
161
However, the difference between express and explicit consent is unclear, and it would appear that explicit consent is nothing more than express consent characterized by greater determination in the behaviour of the user. I.A. Caggiano, 'Il consenso al trattamento dei dati personali' [2017] DIMT, 20 ff.; G Zanfir, 'Forgetting about consent. Why the focus should be on "suitable safeguards" in data protection law' (2014) Reloading Data Protection, 11.
162
The Proposal analyses consent in detail, as Directive 2002/58/EC on electronic communications has not reached its predetermined goals, and also highlights that end users face requests to accept so-called tracking cookies without understanding their meaning and, sometimes, are even exposed to cookies set without their consent.
As demonstrated in the previous subparagraph on the L.A., the data related to the consent required of the users of the experiment, after asking them to read the privacy policy, demonstrate that consent is not an efficient regulatory tool for the protection of personal data 163. Indeed, consent plays no role in decisions pertaining to data subjects' daily activities, especially online, also considering that clauses on data processing in the digital environment are based on a 'take it or leave it' principle 164.
Therefore, the L.A. highlights the limits of the current consent to the processing of personal data, both because the user is unaware of the kind of data processed and of the type of processing carried out on his or her data, and because, even when consciously given, consent does not prevent processing that is harmful to the user unless accompanied by adequate security protection tools.
A number of relevant elements emerges for proposing a hypothetical
scenario where the processing of personal data does not depend only and
always on the data subject’s consent. It would be conceivable, indeed, to
overcame, in whole or in part, the requirement of consent, since this is no
longer a legal basis that guarantees efficacy data protection 165. When the
processing of personal data is necessarily based on consent as a legal basis, it
is necessary that this is given after reading a privacy policy prepared according
to the principles of legal design and that, at the same time, the data processing
takes place in compliance with principles of privacy and security by design and
163
See A Mantelero, 'The future of consumer data protection in the E.U. Rethinking the
"notice and consent" paradigm in the new era of predictive analytics' (2014) 30 Computer Law &
Security Review, 643 ff.; J Míšek, 'Consent to personal data processing – The panacea or the dead
end?' (2014) 8 (1) Masaryk University Journal of Law and Technology, 69 ff.; I Pollach, 'A Typology of
Communicative Strategies in Online Privacy Policies: Ethics, Power and Informed Consent' (2005) 62
Journal of Business Ethics 221 ff.; D.J. Solove, 'Privacy self-management and the Consent Dilemma'
(2013) 126 Harv. Law Rev., 1880 ff. More recently: F Zuiderveen
Borgesius, 'Informed consent: We Can Do Better to Defend Privacy' (2015) 13 (2) IEEE Security &
Privacy, 103 ff.; O Tene, C Wolf, 'The Draft EU General Data Protection Regulation: Costs and Paradoxes of Explicit
Consent', White Paper (The Future of Privacy Forum 2014); G Zanfir, 'Forgetting about consent. Why
the focus should be on "suitable safeguards" in data protection law', Working Paper [2013] University
of Craiova Faculty of Law and Administrative Sciences website, 1 ff. Allow me to refer to M.C. Gaeta,
'Data protection and self-driving cars: the consent to the processing of personal data in compliance
with GDPR' (2019) 24 (1) Communications Law, 15 ff.; M.C. Gaeta, 'La protezione dei dati personali
nell'Internet of Things: l'esempio dei veicoli autonomi' (2018) 1 Diritto inf. e informatica, 147 ff.
164
See I.A. Caggiano, ‘A quest for efficacy in data protection: a legal and behavioural
analysis’, WP 10/2017, 11 ff.
165
L. Gatt, R. Montanari, I.A. Caggiano, ‘Consenso al trattamento dei dati personali e analisi
giuridico-comportamentale. Spunti di riflessione sull’effettività della tutela dei dati personali’ (2017)
2 Politica del diritto 350 f.; I.A. Caggiano, ‘Il consenso al trattamento dei dati personali’ (n 45) 20 ff.; G
Zanfir, ‘Forgetting about consent. Why the focus should be on “suitable safeguards” in data protection
law’ (2014) Reloading Data Protection 237 ff.
2.3 Legal Analysis
by default. The legal-behavioural analysis carried out, indeed, highlights the
inefficacy of consent alone as an ex ante protection and leads to the consideration
of alternative models of regulation based on the limitation of data processing.
Assuming the (at least partial) overcoming of the consent of the data
subject as a lawful basis for the processing of personal data, a possible
way to strengthen the ex ante remedies of data protection would be the
implementation of the principle of privacy by design and by default (art 25,
GDPR)166 and that of security by design and by default. These principles are
clear examples of techno-regulation: at the time of the determination of the
means for processing and at the time of the processing itself, the controller
shall implement appropriate technical and organisational measures, which
are designed to implement users’ privacy and security.
Concerning privacy by design (art. 25, para 1, GDPR), it is therefore
essential to have a more selective approach based on processing only
information that is relevant and limited to what is necessary in relation to the
purposes of the processing (i.e. data minimization, expressly provided by art.
5, para 1, lett. c), and recalled in art 25 of GDPR). One example of privacy
by design is the pseudonymisation process 167, implemented through techniques
such as cryptography, hash functions (and their variants) and tokenization 168. It should be
remembered that pseudonymised data cannot be equated with anonymised
data 169 (regulated under art 12, Directive 2016/681/EU on depersonalisation),
166
R. D’Orazio, ‘Protezione dei dati by default e by design’, La nuova disciplina europea
della privacy (CEDAM 2016) 81 ff., points out that the principle of privacy by design cannot be
applied absolutely and unconditionally but must ‘take account of the state of the art, the cost
of implementation and the nature, scope, context and purposes of processing as well as the
risks of varying likelihood and severity for rights and freedoms of natural persons posed by the
processing’ (art. 25, par. I, GDPR). At the same time, however, the data controller is required to
adopt technical and organizational measures to ensure and demonstrate that the processing of
personal data is implemented in compliance with the GDPR (Art. 24 , para 1 GDPR), leading to a
reversal of the burden of proof on the controller, in order to avoid the sanction under Arts 83 and
84, GDPR.
167
The GDPR expressly defines pseudonymisation (art 4, para 1, n. 5, GDPR), which is a
process of dissociation of personal data from the data subject, so that the data can no
longer be attributed to an identified or identifiable subject without the use of additional information
that is kept separately and is protected by specific technical and organizational measures to achieve
and maintain the dissociation.
168
Early examples of pseudonymisation were already provided by the Art. 29 WP. For
completeness, it should be noted that pseudonymisation is only one of the possible measures of
protection of personal data which concretizes the principle of data protection by design, and it is
possible to foresee others, as expressly provided by rec. 28, GDPR.
169
According to Rec. 26 of the GDPR, data are anonymised by masking the
information which could serve to directly identify the data subject to whom the data relate. Otherwise,
2. The experiment: description, data collection and analysis, findings
given that the former continue to allow the identification of the data
subjects 170. Nevertheless, within data protection law (into which anonymisation
does not fall), pseudonymisation is an adequate ex ante protection tool,
although with some exceptions 171.
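The distinction can be illustrated with a minimal sketch in Python (the key, record and field names are invented for the example): a keyed hash replaces a direct identifier with a pseudonym, and the key plays the role of the 'additional information' that must be kept separately and protected.

```python
import hashlib
import hmac

# Hypothetical secret key: the "additional information" that must be kept
# separately and protected by technical and organisational measures.
SECRET_KEY = b"stored-separately-under-strict-access-control"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Whoever holds SECRET_KEY can recompute the pseudonym and re-link the
    record to the data subject; without the key, the link is lost. This
    reversibility is exactly why pseudonymised data are not anonymised data.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "mario.rossi@example.com", "purchase": "laptop"}
pseudonymised_record = {
    "user_pseudonym": pseudonymise(record["email"]),  # direct identifier removed
    "purchase": record["purchase"],
}
```

Tokenization would follow the same pattern, with a lookup table instead of a keyed hash; truly anonymised data, by contrast, would be produced by discarding the key (and any equivalent side information) altogether.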
Pseudonymisation could also be very relevant for mitigating the issues
raised by profiling, since it allows a transparent processing of pseudo-anonymous
data without preventing it completely. The tendency towards
pseudonymisation as a possible solution to the profiling of an identified
or identifiable user has long been supported, considering that the profiling
process can also be performed without identifying the profiled subjects. Different
types of users are thus actually profiled without it being possible to individually
identify every single profile processed, as these are pseudo-anonymous. In fact, even
non-identifying data may give a somewhat exhaustive description of a user or
group of subjects (i.e. clustering), achieving the purposes (or some of them)
pursued by profiling but without damaging the interests of the subjects from
whom the data come. With respect to the use of the pseudonymisation process
as a possible solution, recital 29 of the GDPR states that pseudonymisation
should also be encouraged in relation to big data in order to process large
amounts of data without infringing the right to data protection. To ensure this
protection, the GDPR imposes specific conditions about big data analysis:
the use of appropriate technical and organizational measures (such as data
protection by design and data protection by default) and of security measures
to ensure that the additional information needed for identification is kept
separately from the pseudonymised data.
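A toy example may help to see how profiling can work on pseudo-anonymous data (records and attributes are invented for illustration): clusters are formed from non-identifying attributes only, so the controller learns which types of users exist without being able to name any individual.

```python
from collections import defaultdict

# Hypothetical pseudonymised records: each user appears only under a pseudonym.
records = [
    {"pseudonym": "p1", "age_band": "25-34", "interest": "sport"},
    {"pseudonym": "p2", "age_band": "25-34", "interest": "sport"},
    {"pseudonym": "p3", "age_band": "55-64", "interest": "travel"},
]

def cluster_profiles(rows):
    """Group records by non-identifying attributes (age band, interest).

    The resulting clusters describe *types* of users, achieving the purposes
    of profiling without individually identifying any data subject.
    """
    clusters = defaultdict(list)
    for row in rows:
        clusters[(row["age_band"], row["interest"])].append(row["pseudonym"])
    return dict(clusters)

profiles = cluster_profiles(records)
```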
On the other hand, there is privacy by default (art. 25, para 2, GDPR),
that is, the technical and organizational measures adequate to ensure, by
default, that only the personal data necessary for each specific purpose of the
processing are processed. A good example is that of predefined boxes to
be filled in by entering only a limited number of data or, even, only one of the data
in the list provided. In the last few years this type of data collection has been
implemented more and more, thanks to the development of new technologies
and the increasingly frequent compilation of online forms, since in this way,
by default, the data controller can establish which data to collect.
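The logic of privacy by default can be sketched as a purpose-based whitelist of form fields (the purposes, field names and form data are hypothetical): whatever the user submits, the controller stores by default only the data necessary for the declared purpose.

```python
# Hypothetical mapping from processing purpose to the fields necessary for it.
NECESSARY_FIELDS = {
    "newsletter": {"email"},
    "shipping": {"name", "address", "email"},
}

def collect_by_default(purpose: str, submitted: dict) -> dict:
    """Keep only the fields necessary for the declared purpose; discard the rest."""
    allowed = NECESSARY_FIELDS[purpose]
    return {field: value for field, value in submitted.items() if field in allowed}

form_data = {
    "name": "Mario",
    "email": "m@example.com",
    "phone": "3331234567",
    "birth_date": "1990-01-01",
}
minimal = collect_by_default("newsletter", form_data)  # only the e-mail survives
```

Under this sketch, the excess data (phone number, birth date) are never stored at all, which is the sense in which minimisation operates "by default" rather than at the user's request.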
for pseudonymised data, there is the possibility to identify the data subject by accessing separately
stored information. For this reason, only pseudonymised data, and not anonymised data, are subject
to regulation in the GDPR.
170
The difference between pseudonymised data and anonymised data was clear already
to the Art. 29 WP. See Art. 29 WP, Opinion 4/2014, 11.
171
Y.-A. de Montjoye, C.A. Hidalgo, M. Verleysen, V.D. Blondel, 'Unique in the Crowd: The privacy
bounds of human mobility' (2013) 3 Scientific Reports 1376.
In addition to the principles of data protection by design and by default,
it is important to implement security measures by design and by default
which, although not expressly regulated in the GDPR, are nonetheless provided in
a broad sense in articles 32 and following of the GDPR. The concept of security
by design (e.g. designing secure hardware, passwords, PINs, face ID, fingerprints,
security keys) and by default (e.g. installing good antivirus software or providing
automatic periodic updating of software) implies that, in addition to the
functional requirements, the design, development and maintenance
of operating systems must also be taken into account to guarantee security (in
our case cybersecurity) throughout the life cycle of the operating system. In
this direction, the Data Protection Impact Assessment (DPIA), regulated
under articles 35 and following of the GDPR, can also play a very important role
in evaluating whether it is necessary to implement cybersecurity measures. In
particular, the part of the DPIA relating to risks such as offline
and online data storage, website and IT channel security, hardware security
(including security keys), and measures against malware is useful for this purpose.
Cybersecurity plays a fundamental role, as data security also implies
greater protection of the data themselves and, therefore, greater privacy for the data
subjects. The importance of cybersecurity is now also known to the legislator
who, although not intervening directly on security by design and by default,
has intervened on the regulation of the so-called essential services,
characterised by a strong impact of information and communication
technologies (ICT). Considering that this topic is not central to this work, we
simply point out that in 2016 the European legislator introduced the so-called NIS Directive (Dir. 2016/1148/EU)172, which aimed to improve national
cybersecurity capabilities by preparing Member States and requiring them
to be adequately equipped. It created European cooperation by setting up a
Cooperation Group (art. 11, NIS Directive)173, in order to collaborate at the European
level and manage cybersecurity risks in all services considered essential for
the economy and society, strictly related to ICT 174. In addition, article 9
172
Directive 2016/1148/EU of the European parliament and of the Council of 6 July
2016 concerning measures for a high common level of security of network and information systems
across the Union [2016] OJ L 194/1. The Directive was implemented in Italy by legislative decree 18
May 2018, n. 65 on Implementation of the (EU) NIS Directive.
173
The Cooperation Group is composed of representatives of the Member States, the
Commission and ENISA (art. 11, para 2, NIS Directive).
174
Concerning the essential services, the figure of the operator of essential services (OSE)
has been established. OSEs must take all appropriate and proportionate technical and organizational
measures to manage the risks posed to the security of networks and information systems and to
prevent and minimize the impact of incidents; they must also notify incidents to the competent
NIS authority or the CSIRT.
of the NIS Directive has also allowed the creation of Computer Security Incident
Response Teams (CSIRTs) in each individual Member State, which must deal
with incidents and risks and ensure effective cooperation at the Union level 175.
After the NIS Directive, the most significant regulation recently adopted on
cybersecurity is the Cybersecurity Act (Reg. 2019/881/EU), a
Regulation aimed at creating a European framework for the certification of the IT
security of ICT products and digital services, and at strengthening the role of the
European Union Agency for Cybersecurity (ENISA)176.
In a digital habitat, the UX-HMI context necessary for carrying out
certain activities should be considered in measuring the efficacy and
efficiency of legal rules and in promoting regulations adequate to that
context and functioning. In this sense, it is possible to speak
not only of privacy regulation but of privacy tech-regulation (for example, by
implementing the principles of privacy and security by design and by default).
The idea of using technologies to regulate technology itself and, in particular,
aspects related to the protection of personal data goes back to art 17 of
Directive 95/46/EC, which already introduced the technical and organizational
measures that the controller should take to protect personal data. In
those years, Privacy Enhancing Technologies (PETs), i.e. technologies that
minimise the possession of personal data without losing the functionality of an
information system, were developed 177.
In order to develop techno-regulation, sciences other than the legal
one come into play. In fact, in the 21st century, the level of in-depth analysis
required of new technology law is so high and complex that legal scholars must
175
In Italy, the CSIRT will replace, by merging them, the National CERT (operating at the Ministry
of Economic Development – MISE) and the CERT-PA (operating at the Digital Italy Agency – AGID).
176
All the information on ENISA is available at www.enisa.europa.eu. At the national level,
in 2019, the competent NIS Authorities drew up guidelines for risk management and for the
prevention and mitigation of incidents that have a significant impact on the continuity and provision
of essential services.
177
As usual, California is among the leaders in the technology sector, and in fact
one of the first studies on PETs comes from California: I. Goldberg, D. Wagner, E. Brewer, 'Privacy-enhancing
technologies for the Internet' (1997) Study of the University of California, Berkeley, https://apps.dtic.mil/dtic/tr/fulltext/u2/a391508.pdf, accessed 24 November 2020. Anonymisation technologies
are included among PETs, and a first complete definition was provided by A. Pfitzmann, M. Hansen, 'A
terminology for talking about privacy by data minimization: Anonymity, Unlinkability, Undetectability,
Unobservability, Pseudonymity, and Identity Management' (2010) v0.34, Report of the University of
Dresden, dud.inf.tu-dresden.de/literatur/Anon_Terminology_v0.34.pdf, accessed 24 November 2020. In
2003, Borking, Blarkom and Olk then reviewed the technologies from a data protection perspective in
their handbook: GW van Blarkom, JJ Borking, JGE Olk (eds), "PET". Handbook of Privacy and Privacy-Enhancing Technologies.
(The Case of Intelligent Software Agents) (CBP, 2003). See also J. Heurix, P. Zimmermann, T. Neubauer, S. Fenz, 'A
taxonomy for privacy enhancing technologies' (2015) 53 Computers & Security, 1 ff.
work in synergy with experts from other sectors of science (e.g. engineers,
computer scientists, statisticians, psychologists). As a matter of fact, on
one side, legal science needs technical science in order to understand technologies
and provide specific regulations based on their functioning; on the other side,
technical science needs the social sciences in order to put in place the principles of
techno-regulation based on current law: first of all hard law, but also soft law,
which in this context plays a fundamental role of guidance and orientation.
CHAPTER 3
SOLUTIONS TO THE CRITICAL ASPECTS
EMERGING FROM THE EXPERIMENT:
THE LEGAL AND UX&HMI PERSPECTIVE
3.1
RESULTS FROM THE LEGAL AND UX&HMI ANALYSES: PERSISTENT
CRITICAL ASPECTS ON CONSENT AND ITS REAL ROLE178
The consent of the data subject with regard to the processing of his/
her personal data has a relevant position in the overall processing, in line
with the right to data protection (privacy) as a fundamental right in the European
Union.
However, it is arguable that the informed consent-based regulatory
approach is somewhat ineffective: Terms of Service or Privacy Policy Terms
and Conditions are normally not read.
The problem of the "empty ceremony" 179, which accompanies the
subscription of any pre-arranged form, is emphasized in relation to the
informed consent to the processing of personal data, where the protection
of personal data is perceived by the party as extraneous to the economic
operation he or she is interested in at that point. «Users are formally called
to make decisions concerning the disclosure of their personal information on
the basis of a difficult trade-off between data protection and the advantages
stemming from data sharing (eg. interacting with smart-transport or a healthcare system)» 180.
It can be argued that, in the case of data protection, disclosure can play
a role different from that of addressing information asymmetries: it allows
the data subject to initiate control over the processing 181. However, it is arguable
that, for the controlling function as for the asymmetry one, the information provided
is not capable of ensuring an aware expression of the individual's will.
178
The Author of this paragraph is Prof. Ilaria Amelia Caggiano, Full Professor of Private
Law at Suor Orsola Benincasa University of Naples.
179
S. Patti, Consenso, sub art. 23, in AA.VV., La protezione dei dati personali. Commentario
a cura di C.M. Bianca – F.D. Busnelli, t. I, Padova (Cedam), 2007, p. 541 ss.
180
G. Comandè, Tortious privacy 3.0: a quest for research, in Essays in Honour of
Huldigingsbundel vir Johann Neethling, LexisNexis, 2015, 121 ff., 122.
181
Mazzamuto, Il principio del consenso e il potere della revoca, cit. at 1004.
3. Solutions to the critical aspects emerging from the experiment: UX&HMI perspective
In this regard, a broad literature in behavioural studies shows how
data protection decisions completely depart from the rational choice
paradigm and that, in any case, data subjects are willing to exchange
their data even for a minimal benefit or reward 182. This tends to happen also in
situations where they have previously stated that they want a high degree
of protection for their data. Empirical evidence then demonstrates how,
given the confused perception of the meaning and scope of data protection
in the social conscience, consent to the processing of personal data is formed
by heuristics or other cognitive shortcuts that are totally detached from the
model of informed and conscientious consent, which presupposes behaviour
inspired by rationality 183.
It is doubtful, then, that consent can be a mechanism with any real
function in protecting individuals' right to personal information. These
considerations require some attention from legal actors and the legislator
(!), who should finally acknowledge them 184.
As anticipated, the consent of the data subject, where required, is only
the foundation of a much more articulated procedure, which allows the data
subject to control the controller's activity. It could be assumed, then, that
consent, as a seal on the information provided, can at least draw the attention
of the actors involved (controller, processor) to the processing. However, even
as far as this aim is concerned, it is not clear how providing someone with
information on his/her own duties can compel him/her to obey those duties.
Assuming – as it is – that these considerations are somehow known,
what is the attitude of the legislators (and other regulators)?
182
Strahilevitz, Toward a Positive Theory of Privacy Law, 113 Harv. Law Rev. (1999) 1;
Solove, Privacy self-management and the Consent Dilemma, 126 Harv. Law Rev. (2013) 1880;
Borgesius, Informed consent: We Can Do Better to Defend Privacy, IEEE Security & Privacy, 2015 (vol. 13, p. 103-107);
Id., Behavioural Sciences and the Regulation of Privacy on the Internet, Amsterd.
Law School Research Paper no. 2014-54; Acquisti, Privacy, in Riv. pol. econ., 2005, p. 319; Barocas
– Nissenbaum, On Notice: the Trouble with Notice and Consent, Proceedings of the Engaging Data
Forum: The First International Forum on the Application and Management of Personal Electronic
Information, October 2009, available at SSRN: https://ssrn.com/abstract=256740; Ben-Shahar –
Chilton, Simplification of Privacy Disclosure: An Experimental Test, 45 Journ. Legal Studies
(S2): S41-S67 (2015).
183
S. Patti, Consenso, cited above. From a legal perspective, if one considered the irrationality
of common cognitive perception as a ground for invalidating consent, this would result in all
consents being vitiated. Without going so far as to accept such eccentric prospects, which would also
pose a serious problem of coordination with the system of legal acts, the analysis provided here seems
to demonstrate, in any case, the ineffectiveness of the law with respect to the interest it intends to
protect.
184
G. Comandè, op. cit., p. 123 unveils the mask of EU recognition of privacy as a
fundamental right: it «de facto enable(s) ample commodification of personal data, shifting […] to
private deregulation via contract rules (terms of services and privacy policies)».
For example, the Italian Data Protection Authority's decisions and
guidelines stress the promotion of better disclosure, multi-layered information,
etc., as recently requested of a non-EU business (the Google case) 185.
However, it is questionable whether even improved information can really
affect users' awareness, as evidenced by recent studies
dealing with Google's own information 186.
It seems obvious that if one persists within the logic of information
disclosure, or of its progressive, minimal improvements, chances are that this
exercise will not lead to significant results 187.
Moreover, the ex ante provision of consent to processing, which
embodies the idea of the individual's self-determination with regard to the
relevant information, in accordance with the theoretical model of fundamental
rights, cannot limit the risks of data dissemination, given also the complexity
of the market's self-governance.
In any case, the current state of surveillance and technology does
not prevent violations of data protection (think of spamming). This is true,
as experience teaches, even in the case of a user who is singularly and
especially reluctant to give consent to the processing of his/her data, and
may be explained by the massive requests for data dissemination and by
processing through algorithms 188.
As described above (sect. 1), in the world of big data, technical regulations
hardly make pertinent data controllable and comprehensible to the end user.
185
Google agreed in 2013 to comply with the requirements of the Italian Authority
in relation to the information provided: improvement and differentiation of the privacy policy in
relation to the various services provided, prior consent for profiling, filing and deletion of data. More
details are available at: www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/
docweb/3740038. The number of measures taken by the Authority on disclosure (Articles 13 and
161 of the Privacy Code) is 642 (source: Garante Privacy website).
186
O. Ben-Shahar – A. Chilton, op. cit., where it is shown that none of the techniques
of simplifying information through best practices or warnings has changed the behaviour of the
"stakeholders"; L.J. Strahilevitz – M.B. Kugler, Is Privacy Policy Irrelevant to Consumers?, 45 Journ.
Legal Studies S2, pp. S69-S95 (2017), conducted another experiment on more than 1,000
Americans by submitting to them (at random) two text versions of Google and Facebook privacy
information, one clear and the other vague, to authorize facial recognition and processing of data.
The results of the experiment confirmed that the users' choices, which also concerned that highly
intrusive processing, did not change because of the language of the information, but due to
social norms and technological experience. On this point, let me cite L. Gatt, R. Montanari, I.A.
Caggiano, Consenso al trattamento dei dati personali. Un'analisi giuridico-comportamentale. Spunti
di riflessione sull'effettività della protezione dei dati personali, in AA.VV., Nodi virtuali, legami informali.
Internet alla ricerca delle regole, edited by D. Poletti and P. Passaglia, forthcoming, p. 53 ff.
187
See Sunstein, Nudge Effect: The Politics of Liberal Paternalism, Italian translation edited by Barile,
Milan (Egea – Bocconi University), 2015.
188
It has been estimated that in 2010 90% of e-mails were nothing more than spam. Bocchiola,
Privacy. Filosofia e politica di un concetto inesistente, Roma (Luiss University Press), 2014, p. 46.
The online processing of personal data accentuates the problem of
controlling the flow of personal information in a borderless cyberspace. The
data subject easily loses track of the consent given and remains unable to
understand to which subjects the data controller has provided his/her data,
including subjects with whom he/she did not want to share them 189.
189
G. Comandè, op. cit., 2015, 121 ff., at 122 and 124.
3.2
A DE IURE CONDITO PROPOSAL FOR THE EMPOWERMENT
OF INFORMATION ABOUT DATA PROCESSING: LEGAL DESIGN
AS AN EX ANTE REMEDY 190
Introduction – This chapter deals with the causes of, and possible
remedies for, the low awareness of data subjects reading privacy
policies. It focuses on the theme of legal design, according to which legal
information must be elaborated using an interdisciplinary approach, combining
law, design thinking and technology, to encourage awareness of
rules by simplifying the processes and the ways in which they are presented to
the recipients.
In light of an overview of current laws, it is believed that rewriting
legal clauses according to the legal design method represents a possible ex
ante protection, as it should increase the awareness of the common user.
The complexity of contracts and the use of tricky language 191
reduce understanding for common users 192.
In this regard, numerous studies 193 have focused on contract
visualization 194 as a tool to improve the understanding of contracts. Also,
190
The author is Livia Aulino, Lawyer and Ph.D. candidate in Humanities and Technologies:
an integrated research path, at Suor Orsola Benincasa University of Naples.
191
Hagedoorn J., Hesen G., Contractual Complexity and the Cognitive Load of R&D Alliance
Contracts (2009) Journal of Empirical Legal Studies 818, 847.
192
Baddeley provided the first systematic interpretation of memory, still valid today,
in terms of both structures and processes, recognising its limited capacity, potentially susceptible
to overload caused by tasks that require an expensive attentional load; Baddeley AD, 'Working
memory and language: An overview' (2003) Journal of Communication Disorders 189, 208.
193
The research in this field offers significant results in support of the effectiveness
of UX/UI design techniques in facilitating understanding: according to some empirical evidence,
contracts and business documents are more easily understood by the actors involved when they
include visual representations of their content; Passera S., Haapio H., Transforming Contracts
from Legal Rules to User-Centered Communication Tools: a Human-Information Interaction Challenge (2013)
Communication Design Quarterly 38, 45.
194
On the theme, see: Passera S., Smedlund A., Linasuo M., Exploring contract visualization:
Clarification and framing strategies to shape collaborative business relationships (2016) Journal of
the lack of understanding of the contract can constitute a risk for
the consumer or the data subject, who is usually the weaker party to the
contract.
The need for greater protection of values, in particular the
awareness of the consumer or the data subject, strongly emerges and dominates
the debate about contract understanding 195.
New technologies are imposing new languages and new tools 196 on
society. The practical jurist should carry out a functionalistic reading of the
current legislation in order to understand the phenomenon rather than ignore
it 197. If, on the other hand, the jurist chooses a non-functionalistic
reading of the current legislation, he would legitimize a repressive ordering of
those newly emerging expressions and needs of the information society.
From the data of the experiment it emerged that the usability of
Microsoft's Windows 10 operating system is medium-low. In particular, the
examination of biometric data shows that even those who declare that the
protection of privacy matters to them then opt for the default data settings. The
reason is evidently that the user does not perceive the risk.
If an average user spends a few seconds reading a privacy notice
that would usually require about three hours to read, it means that he has merely
turned the pages of the legal document, giving his consent to the processing
of his personal data without any awareness.
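A back-of-the-envelope calculation shows the order of magnitude involved (the figures are assumed round numbers for illustration, not measurements from the experiment):

```python
# Assumed round numbers, not data from the experiment.
WORDS_PER_MINUTE = 200   # average adult reading speed
POLICY_WORDS = 36_000    # length of a long privacy notice

reading_minutes = POLICY_WORDS / WORDS_PER_MINUTE  # 180 minutes
reading_hours = reading_minutes / 60               # 3.0 hours

seconds_actually_spent = 10
fraction_read = seconds_actually_spent / (reading_minutes * 60)  # well under 0.1%
```

Under these assumptions, a user who spends ten seconds on a notice requiring three hours of reading has seen less than one thousandth of its text.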
In this context, faced with the development of the computer
phenomenon, the jurist needs to develop a practical approach rather than
being a mere analyst of the legislative language.
Strategic Contracting and Negotiation 69,100; Kay M., Terry M., Textured Agreements: Re-Envisioning
Electronic Consent, Proceedings of the Sixth Symposium on Usable Privacy and Security (Association
for Computing Machinery 2010) https://doi.org/10.1145/1837110.1837127; Passera S., Haapio H.
User-Centered Contract Design: New Directions in the Quest for Simpler Contracting, in Henschel R.F.,
Proceedings of the 2011 IACCM Academic Symposium on Contract and Commercial Management
(Tempe, 2011) 80,97; Passera S., Pohjonen S., Koskelainen K., Anttila S., User-friendly Contracting
Tools – A Visual Guide to Facilitate Public Procurement Contracting (2013) Proceedings of the
Academic Forum on Integrating Law and Contract Management: Proactive, Preventive and Strategic
Approaches 74, 94; Mamula T., Hagel U., The Design of Commercial Conditions (2015) Jusletter IT.
195
The scenario in which contract law operates is very different from that of the
origins of the codified model; the speed of the global transformation of political, economic
and social phenomena is such that one wonders whether it is now the contract that has subjugated
the law; Conference Diritto e confini, Venditti C., 23.05.2019, www.unisob.na.it/eventi/eventi.
htm?vr=1&id=19038.
196
In order to transcribe the legal language into electronic language (symbolic
metalanguage) and vice versa, it is necessary to resort to techniques of homogenization and
linguistic standardization in syntactic links. Frosini V., Diritto e informatica negli anni ottanta (1984)
2 Riv. trim. dir. pubbl. 390,400, now in ID., Informatica diritto e società (Giuffrè, 1988) 231.
197
Clarizia R., Informatica e conclusione del contratto (Giuffrè, 1985) 172.
Therefore, an ex ante remedy may be the “rewriting of legal clauses”
also on the internet, according to the legal design methodology.
30. Example of renewal of the drafting process of the contract
The principles and methods of providing legal information. Article
4 of European Regulation 2016/679 198 on privacy and data
protection, the so-called GDPR, defines the processing of personal data as any
operation carried out 199 with or without the aid of an automated process and
applied to personal data.
Data processing has to be carried out in accordance with the principles of
lawfulness 200 and correctness, and the data must be collected and processed for
specific, explicit and legitimate purposes, and used in terms compatible with
them 201.
Consent is one of the six legitimate bases for processing personal
data, as enshrined in article 6 of the GDPR. Therefore, the data controller must
carefully evaluate the appropriate legitimate basis for the data processing.
The Article 29 Working Party, in an opinion on the definition of consent 202, specified
that the invitation to accept the processing of data should be subject to strict
criteria, since the fundamental rights of the data subject are at stake.
198
The General Data Protection Regulation n. 2016/679, henceforth GDPR, is the European legislation on privacy and personal data protection. It was published in the Official Journal of the European Union on May 4, 2016, entered into force on May 24, 2016, and became applicable on May 25, 2018. Its main purpose is to harmonize data protection regulations within the European Union.
199
These operations are: collection, recording, organization, structuring, storage, adaptation or modification, extraction, consultation, use, communication by transmission, dissemination or any other form of making available, comparison or interconnection, limitation, cancellation or destruction. The second paragraph specifies that the concept of data processing encompasses all those operations that imply knowledge of personal data.
200
Illegally collected or processed data cannot be used in any way; otherwise, the party processing them may be subject to penalties and ordered to pay compensation for the damage caused, pursuant to art. 2050 of the Italian civil code.
201
Furthermore, according to the Regulation, the data must be accurate and up to date, relevant, complete and not excessive in relation to the purposes of the processing. They must be kept for a period not exceeding the time necessary to achieve those purposes, after which they must be deleted or anonymized.
202
Article 29 Data Protection Working Party: Opinion n. 15/2011 (2011) available at the
link: www.garanteprivacy.it/home/docweb/-/docweb-display/docweb/1895739.
3. Solutions to the critical aspects emerging from the experiment: UX&HMI perspective
Along the same lines, article 7, paragraph 4, GDPR 203 sets out the rules for assessing whether consent has been freely given. This ensures that the processing of personal data for which consent is requested cannot be turned, directly or indirectly, into the counter-performance of a contract.
Article 12 of the GDPR requires the data controller to adopt appropriate measures to provide the data subject with all the information referred to in articles 13 and 14 and in the communications under articles 15 to 22 and article 34 relating to data processing, in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for information specifically addressed to minors. This ensures the transparency and accuracy of the data processed from the very design stage of the processing, and makes compliance provable at any time 204.
European Regulation n. 2016/679 has also introduced the principles of privacy by design and by default: art. 25 requires that software be designed so as to minimize the use of personal data and the risk for the data subject, both by making choices that tend to anonymize the data to be collected and by fostering the awareness of the people who consent to the data processing.
WP29 205, on the subject of transparency, established the obligation to adapt legal communication to its addressee. It also provided an in-depth analysis of the notion of consent and of how information should be given, publishing guidelines on the topic in 2018 and 2020.
In particular, WP29 clarified that the data controller must always use clear and simple language to describe the purpose of the data processing for which consent is requested: the message should be easily understood by an average person, not just a lawyer. The data controller must ensure that consent is given on the basis of information that allows the data subject to easily identify who the controller is and to understand what he or she is consenting to. Therefore, the information necessary to
203
Article 7, paragraph 4: «When assessing whether consent is freely given, utmost
account shall be taken of whether, inter alia, the performance of a contract, including the provision of
a service, is conditional on consent to the processing of personal data that is not necessary for the
performance of that contract».
204
In compliance with the accountability principle enshrined in art. 5 of the GDPR, the data controller, taking into account the nature, context and purpose of the processing, must guarantee, and be able to demonstrate, that the processing is carried out in accordance with the legislation and in a way that does not create risks for the data subjects.
205
Article 29 Data Protection Working Party: Guidelines on transparency under regulation
2016/679, 17/EN WP260 rev.01 (April 2018).
express consent to the processing of personal data cannot be hidden within
the general conditions of contract/service 206.
Furthermore, even when consent must be expressed electronically,
the request must be clear and concise 207.
With particular reference to consent given electronically, the European Data Protection Board (EDPB) adopted new Guidelines on consent on May 4, 2020 208.
The EDPB has clarified that the consent provided by the interested
party when accepting the so-called “cookie walls” is not valid, if the acceptance
of cookies is a necessary condition to access the website.
In addition, the EDPB considered consent obtained by scrolling to be invalid, as this action can hardly amount to an unambiguous indication of consent.
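The EDPB criteria just described can be sketched in code. The fragment below is purely illustrative (the ConsentRecord fields and the is_valid_consent function are hypothetical names, not part of any real library): it treats consent as invalid whenever it was obtained through a cookie wall or mere scrolling.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # How the consent signal was obtained (hypothetical labels).
    obtained_via: str          # e.g. "explicit_click", "cookie_wall", "scrolling"
    informed: bool             # the data subject received clear information
    purpose_specific: bool     # consent refers to a specific, named purpose

def is_valid_consent(record: ConsentRecord) -> bool:
    """Sketch of the EDPB Guidelines 05/2020 criteria: consent forced
    by a cookie wall, or inferred from scrolling, is not a valid,
    unambiguous indication of the data subject's wishes."""
    if record.obtained_via in ("cookie_wall", "scrolling"):
        return False
    return record.informed and record.purpose_specific

# Consent forced by a cookie wall is invalid; an explicit, informed,
# purpose-specific click is valid.
print(is_valid_consent(ConsentRecord("cookie_wall", True, True)))     # False
print(is_valid_consent(ConsentRecord("explicit_click", True, True)))  # True
```

The point of the sketch is that the acquisition channel is checked first: however well informed the user is, a consent signal extracted through a cookie wall or scrolling never qualifies.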
Microsoft also uses cookies 209 for various purposes, including: storage of preferences and settings; sign-in and authentication; security; storage of information provided by the user to a website; social media; feedback; interest-based advertising; display of advertisements; analytics; performance.
As a result, if a user visits a Microsoft website, the site sets cookies 210.
Most web browsers automatically accept cookies, but provide controls to block
or delete them.
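At the technical level, a site sets a cookie by sending a Set-Cookie HTTP header, which the browser stores and returns on subsequent requests to the same domain. A minimal sketch using Python's standard http.cookies module (a generic illustration, not Microsoft's actual implementation; the ad_prefs cookie name is hypothetical):

```python
from http.cookies import SimpleCookie

# Server side: build a Set-Cookie header for a preferences cookie.
cookie = SimpleCookie()
cookie["ad_prefs"] = "personalized"
cookie["ad_prefs"]["max-age"] = 60 * 60 * 24 * 30  # kept for 30 days
header = cookie["ad_prefs"].OutputString()
print("Set-Cookie:", header)

# Browser side: parse the header that was received and
# read back the stored value on the next request.
received = SimpleCookie()
received.load(header)
print(received["ad_prefs"].value)  # → personalized
```

This round trip is what the banner described below silently triggers: once the header is accepted, the value travels back with every request until the cookie expires or the user deletes it.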
On Microsoft systems, if the “Cortana” assistant is used to search for something on Bing, cookies are automatically installed, as can be seen in the following
31. Example of automatic installation of cookies
206
Article 7 of the GDPR provides that where consent is requested in a written declaration that also concerns other matters, the request for consent must be presented in a manner clearly distinguishable from the rest.
207
Similarly, Recital 32 of the Regulation provides that if consent is requested electronically, the request must be separate and distinct, and cannot simply appear as a paragraph within the general conditions of contract/service.
208
European Data Protection Board (EDPB): Guidelines 05/2020 on consent under Regulation 2016/679 (May 2020). This document updates the guidelines developed in 2018.
209
Cookies are small text files, stored on the user's device, containing data that can be retrieved by the web server in the domain that set the cookie. Some cookies are placed by third parties acting on behalf of the website operator (e.g. Microsoft).
210
Companies that provide content store cookies independently.
image. The user, in fact, is notified of the use of cookies by a very small banner at
the top where it says: «this site uses cookies for analysis, personalized content
and advertising. By continuing to browse this site, you accept this use».
To block or to delete cookies, it is necessary to go to the settings in the
privacy and services section.
Presented in this way, the legal information provided to the user does not comply with the principles of clarity and transparency set out in the European Regulation and the EDPB guidelines.
Therefore, if users consent to the processing of their personal data without understanding its meaning, or keep browsing while cookies are accepted automatically, they clearly do not perceive the risk. As said before, a possible ex ante remedy could be a “rewriting of legal clauses”, including online, according to the legal design method.
The legal design approach as an ex ante remedy. Legal design addresses the need to simplify the text of the privacy statement, drafted pursuant to art. 13 of Regulation (EU) 2016/679 (GDPR), and of the consent form for the processing of personal data 211, pursuant to art. 6 212.
Legal design 213 has thus become a discipline in its own right, one that aims to achieve a correct and effective visualization of legal content, favoring communication and understanding through the use of textual, para-textual and contextual elements, as well as of information visualization 214, each of them carefully evaluated according to their own
211
In most legal drafting, privacy policies relating to the collection and processing of data are written simply to satisfy the legal requirements of mandatory disclosure, rather than to effectively inform the persons concerned about the collection and processing of their personal data. Haapio H., Hagan M., Palmirani M., Rossi A., Legal design patterns for privacy (2018) Data Protection/LegalTech Proceedings of the 21st International Legal Informatics Symposium IRIS 445, 450.
212
The notion of legal design was coined by Margaret Hagan in Hagan M., Law By Design,
(Retrieved March 2018), from www.lawbydesign.co/en/home. Legal Design is the application of
human-centered design to the world of law, in order to promote the usability and comprehensibility
of legal instruments, and in particular contracts.
213
Legal design is a method to create legal services, focusing on their usability,
usefulness and involvement. This is an approach with three main groups of resources – process,
mentality and mechanics – to be used by legal professionals. These three resources can help you
conceive, build and test better methods of acting in the legal sphere, which will involve and empower
both legal professionals and individual users. Hagan M., Law By Design, (Retrieved March 2018), from www.lawbydesign.co/en/home.
214
Information visualization is the study of (interactive) visual representations of abstract
data to reinforce human cognition. The abstract data include both numerical and non-numerical data,
such as text and geographic information. On the subject see: Card S.K., Mackinlay J.D., Shneiderman
B., Readings in Information Visualization: Using Vision to Think (Morgan Kaufmann 2007); Kerren
methodology. In this way, the user's sensation of cognitive fluidity would be significantly enhanced 215.
This new discipline brings about a hybridization of knowledge through a proactive, ex ante methodology. Legal design, in fact, combines law 216, technology, ethics, philosophy, semiotics and communication. In particular, semantic technology, law and communication interact in the creation of visual icons which, through design 217 and following the principles of transparency and clarity, help the user to understand the legal concept.
32. Multidisciplinarity of legal design
The principles of transparency and clarity, which are inherent in the
Italian legal culture 218 and more generally in the European one 219, represent
A., Stasko J.T., Fekete J.D., North C., Information Visualization – Human-Centered Issues and Perspectives, Vol. 4950 of LNCS (Springer, 2008); Mazza R., Introduction to Information Visualization (Springer, 2009); Spence R., Information Visualization: Design for Interaction (Springer, 2014).
215
Cognitive fluidity is the human tendency to prefer, and act on, things that are more familiar and easier to understand. On this point see Kahneman D., Thinking, Fast and Slow (Farrar, Straus and Giroux, 2012): “the general principle is that whatever we do to reduce cognitive tension is useful, so we should first make the sentence maximally legible”. According to Kahneman, there are four underlying causes of cognitive fluidity: an experience repeated over time, which fuels the feeling of familiarity; clear and legible type, which increases the sensation of truth; an idea subjected to priming, which leads to a feeling of positivity; and a good mood, which fuels the feeling of effortlessness.
216
The “legal” concept must go beyond the static positive law approach and therefore must not refer only to the legislative system, but must also include doctrine, jurisprudence, contractual provisions, policies. LeDA, The legal design alliance, www.legaldesignalliance.org.
217
Design is an approach aimed at more usable, useful and accessible social systems, which experiments with and creates new documents, services, technologies, rules and systems.
218
Art. 35 of legislative decree n. 206/2005 (the so-called Consumer Code), in its first paragraph, establishes the obligation to draft the clauses proposed to the consumer in writing, clearly and comprehensibly. Furthermore, art. 48 of the Consumer Code establishes that the professional must provide the consumer with information clearly and legibly; art. 51, paragraph 1, states that for distance contracts the professional must provide or make the information available to the consumer in simple and understandable language.
219
See articles 12 and 13 of the General Data Protection Regulation n. 2016/679.
the foundations of legal design. In fact the visual and textual elements used try
to represent the legal information in a clear and understandable way, without
compromising the freedom of decision of each user.
33. Basic principles of legal design
The tools of legal design change more than the graphic layout of the contract: the aim is to make the consumer or data subject aware of the legal information provided. This also affects the relationship between professional and consumer or data subject and, more generally, between the stronger party that drafts the legal document and the weaker party who needs to understand its content immediately.
The legal design methodology consists of a series of steps: mapping the existing situation (discover the status quo); focusing on the type of user (focus on a person); re-framing the challenge; developing ideas (wide brainstorming); making sense of, and prioritizing, the information to be presented in the text; building a prototype (a first draft of the text); and testing 220. The use of clear, effective and easily understandable language, as well as the insertion of graphic icons 221, facilitates
the usability and accessibility of the legal content (of the privacy policy or of
the consent form) even for those who do not have a legal background 222; this
would also favor a better relationship between professional and consumer or
data subject, based on accuracy, transparency and clarity.
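The sequence of steps just listed can be made explicit as a simple ordered pipeline. The sketch below is only an illustration of the ordering (the step labels paraphrase the text; run_pipeline is a hypothetical name, not a tool from Hagan's method):

```python
# The legal design process as an ordered pipeline (labels paraphrased
# from the steps listed in the text; illustrative only).
LEGAL_DESIGN_STEPS = [
    "discover the status quo",
    "focus on a person (the target user)",
    "re-frame the challenge",
    "develop ideas (wide brainstorming)",
    "make sense and prioritize the information",
    "build a prototype of the text",
    "test with users",
]

def run_pipeline(steps):
    """Walk the steps in order; in practice testing often sends the
    team back to prototyping, which is why the method is iterative."""
    for number, step in enumerate(steps, start=1):
        print(f"step {number}: {step}")
    return len(steps)

print(run_pipeline(LEGAL_DESIGN_STEPS))  # 7 steps in total
```

The value of writing the process down this way is that the ordering becomes testable: prototyping cannot precede prioritization, and testing always closes the loop.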
220
Hagan M., Law By Design, (Retrieved March 2018), from www.lawbydesign.co/en/home.
221
It should be noted that other studies have been carried out which, similarly to the one in question, used behavioral-analysis measurement tools and human-centered methods in order to create a set of graphic icons, the so-called DaPIS, for legal concepts, in particular on the protection of personal data. On this point see: Rossi A., Palmirani M., DaPIS: a Data Protection Icon Set to Improve Information Transparency under the GDPR (2019) available on gdprbydesign.cirsfid.unibo.it/wp-content/uploads/2019/01/report_DaPIS_jan19.pdf.
222
Some authors suggest five guidelines that can help everyone deal more easily with documents: 1) develop the information from bite, to snack, to meal; 2) show the structure with an informative title, headings, and links; 3) break up the information with short sections, lists, paragraphs, and sentences; 4) write in plain language, with common words and active verbs; 5) design for visual clarity. Jarrett C., Redish J., Summers K., Straub K., ‘Design to Read: Guidelines for People Who Do Not Read Easily’ (2010) 9 User Experience Magazine. www.effortmark.co.uk/design-read-guidelines-people-read-easily.
This need emerges even more clearly where software and devices, as often happens, collect data for purposes other than those necessary for the service provided (e.g. marketing, improvement of the provider's systems, collection of big data for analysis), often requiring data that is not strictly necessary, contrary to the minimization principle established by the GDPR. It is therefore necessary to make ordinary users aware of the purposes of that data processing, of which data types are actually necessary for the delivery of the service, and of the destination of all the additional data collected.
In this context, legal design could play an essential role, through visual tools that draw users' attention and make them more aware of the purposes for which they have given consent to the processing of their data. In particular, users must become conscious of which of their data will be collected and processed even without explicit consent (i.e. on the other legal bases referred to in article 6), and of when their activity involves data for which explicit consent to the processing is required, either because they are sensitive data (special categories of data pursuant to article 9 GDPR) or because their processing is aimed at further purposes; in the latter case it must also be specified whether the processing takes place in an automated form (Big Data).
Legal design should increase this awareness because, in a privacy by design and by default perspective, consent cannot be given automatically: the conscious will of the data subject must emerge, and it is not inherent in an automatic act.
Moreover, the need to make users more aware emerges even more clearly when the data subject is a minor: pursuant to art. 8 GDPR, in the case of a direct offer of information society services to minors, the processing of the minor's personal data is lawful if the minor is at least 16 years old (at least 14 in Italy) 223. Finally, in the case of processing of sensitive data, the purpose of the processing should be clearer and more transparent to the ordinary user, in order to guarantee a truly informed consent. This is because the qualification of explicit consent 224 remains only with regard to sensitive data, where consent is required for them.
An empirical study 225 carried out on the same subject as the
one in question has shown that the privacy policy should be divided into
223
See: Caggiano I.A., Privacy e minori nell’era digitale. Il consenso al trattamento dei dati dei minori all’indomani del Regolamento UE 2016/679, tra diritto e tecno-regolazione (2018) Familia 3, 23.
224
The explicit consent can be understood as consent not only not inferable from
conclusive behavior (expressed), but that is clearly manifested. Caggiano I.A. ‘Il consenso al
trattamento dei dati personali’ (2017) DIMT online 12.
225
The research was conducted by M. Hagan at Stanford University. 20 participants
interviewed, young adults attending Stanford University, and 100 participants from Mechanical Turk,
shorter communications, fitting on a single page, whose terms are presented in more discreet and pertinent ways so as to cover all conditions and types of situation. The other types of terms should instead be communicated to the user only when he or she is interested in making a decision on that topic. With this different approach, the user can avoid being overwhelmed by an enormous flow of information.
Hence, the user's understanding can be facilitated ex ante, without generating errors or frustration, if the service provider, with the help of a legal design expert, designs the privacy policy using user experience (UX) 226 and interaction design techniques (behavioral science), together with the application of the principles of private law.
The fact that legal information is not clear and immediately understandable conflicts with the need to guarantee a high degree of situation awareness (SA) 227 in the operational context in which the user moves.
SA is a metric from cognitive ergonomics that refers to a decision maker's understanding of environmental elements within a specific space-time frame. Here too, a reduction in user operations and a contraction of the mental workload reduce SA, with consequences for the effectiveness of monitoring and recovery.
The ineffectiveness of the current ex ante protection, which takes the form of a prohibition on data processing, emerged from the experimentation conducted and from the legal-behavioral analysis of the collected data 228.
Therefore, as anticipated in the introduction, if a functionalist 229 interpretation of the current legislation is pursued 230, it is possible to conceive another type of ex ante protection.
Consequently, in a de iure condito vision, the legal design methodology emerges as an ex ante remedy that could also guarantee a greater degree of situational awareness.
aged between 20 and 40, who ranked themselves as somewhat tech-savvy or above, were interviewed. Hagan M., The User Experience of the Internet as a Legal Help Service: Defining Standards for the next Generation of User-Friendly Online Legal Services (2016) Va. JL & Tech 395, 465.
226
User Experience (UX) concerns the interaction with a product, system or service, and specifically personal perceptions of its utility, ease of use and efficiency.
227
Situational awareness (SA) is an ergonomics concept defined by Endsley. See Endsley M., ‘Toward a Theory of Situation Awareness in Dynamic Systems’ (1995) 37 Human Factors: The Journal of the Human Factors and Ergonomics Society 32, 64.
228
Gatt L., Montanari R., Caggiano I.A., Consenso al trattamento dei dati personali e analisi
giuridico-comportamentale. Spunti di riflessione sull’effettività della tutela dei dati personali (2017)
Politica del diritto 343,360.
229
Clarizia R., Informatica e conclusione del contratto (Giuffrè, 1985), 12.
230
Alpa G., Intervento conclusivo, in Gatt L., Il contratto del Terzo Millennio. Dialogando con
Guido Alpa (Editoriale Scientifica, 2018).
3.3
A DE IURE CONDENDO (NORMATIVE) PROPOSAL
FOR AMELIORATING THE GDPR 231
If the data subject’s decision does not influence protection of personal
data, this may call for a change of paradigm.
The EU Charter of Fundamental Rights already recognizes consent as only one of the legitimate bases for the processing of personal data. Hence, one might say that consent is not a key requisite, as confirmed by the legislation applicable from 2018, which includes direct marketing among the legitimate interests that may be pursued by the data controller (recital 47), such as to dispense with the data subject's consent (article 6, par. 1(f)).
In this context, then, there may be room for a system in which the consent rule is interpreted restrictively or, in a de jure condendo perspective, no longer required. The protection of data (if any) would then be entrusted to principles and rules governing the processing, such as fairness, minimization of the risk of loss, and security assessments under the accountability principle promoted by the new discipline, which reinforces the controller's organisational responsibilities. It should also be pointed out that the fact that the
processing can rely on legitimate bases other than consent does not, at least
in principle, exclude the necessary balancing of these other interests with
fundamental freedoms.
Such a scenario would unveil the “illusion of consent”, whereby consent is regarded as a tool for controlling one's data, and overcome the false perception that people are able to independently control and protect their data.
More effective means of protecting data (against, for instance, uncontrolled or insufficiently justified data dissemination) can be considered: rules of processing, limits (or prohibitions) for certain types of
231
The Author of this paragraph is Prof. Lucilla Gatt, Full Professor of Private Law at Suor Orsola Benincasa University of Naples.
processing and/or for certain data, or more efficacious data anonymization
guarantees and internal technical rules of processing 232.
From this point of view, it is appropriate to distinguish within the types
of data and types of processing already present in European legislation.
As to the types of data, Italian law refers to personal and sensitive data (special categories of personal data, according to art. 9 Reg. 2016/679, where fundamental rights of the person are at stake). Until the new Regulation became applicable in 2018, under Italian law the processing of sensitive data still required a preliminary authorization by the national Data Protection Authority. The new European Regulation eliminates the bureaucracy of preventive authorization for special categories of (sensitive) data, but retains the distinction, limiting in principle the number of cases in which such data can be processed without consent.
In a de jure condendo perspective, this distinction could support a differentiated regime for personal and sensitive data. Consent could be maintained for a limited set of sensitive data types, for which effective attention by the data subject at the time of the request is at least conceivable. At the same time, the only alternative to the existing scenario would be to impose limits on processing that may cause injury through the processing and combination of multiple personal/sensitive data types (as in the case of profiling).
This perspective would mean that the European legislator takes a stand
with regard to the development of the data markets and the long-term impact
on the rights of the person by regulating the dissemination of information.
The central issue therefore is about rules and limitations of processing,
given the lack of normative efficacy (effectiveness) of ex ante selection
mechanisms by the party (consent).
Personal data protection, from the point of view of private law, is also limited in the remedial phase, i.e. the ex post remedies, represented at the private law level by compensation for damage (article 82 GDPR).
Italian judgements are scarce in this regard and in any case not
unequivocal 233. There appears to be a substantial rejection of claims for
232
Prescription of technical rules and conformed technologies (regulation by technology)
can be found in the GDPR – the so-called Privacy by design (Article 25) and the mechanisms and
bodies for the certification of data protection (42). Mantelero, Privacy digitale: tecnologie conformate
e regole giuridiche, in AA.VV., Privacy digitale. Giuristi e informatici a confronto, Torino (Giappichelli),
2005, p. 19 ss. On the reversibility of anonymization procedures, Ohm P., Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization 57 UCLA Law Rev. 1701 (2010); Article 29 Working Party on Data Protection, Opinion 05/2014 on anonymization techniques, 10 April 2014.
233
A recent search of a major Italian case-law database (De Jure), based on the corresponding article of the Italian Data Protection Code (art. 15, d. lgs. 196/2003), without any restriction, returns 156 results.
damages, given the difficulty of proving losses that do not concern a fundamental right (e.g. the right to privacy in the strict sense, or the right to identity) other than the right to the protection of personal data.
The development of technology has forced or induced us to “surrender” personal information, which is then stored and processed for the most diverse purposes: economic interests, sharing and manifestation of personality, public interest. As already seen, people have their personal data processed without being aware of any misuse.
However, even if a person comes to know of the illegitimate processing of his or her own personal data, he or she may fail to obtain a legal remedy. This can happen for a number of reasons:
– a more general difficulty in quantifying non-patrimonial losses;
– the trivial amount of damages (think of spamming) 234;
– the difficulty of tracing the processing that gave rise to, or contributed to, the harmful event (as in the case of unauthorized profiling revealing personal tastes that the individual intends to keep confidential).
Even in cases where it is possible to envisage a non-pecuniary loss for a personal data breach, there are issues related to the liquidation of the damage, as evidenced in recent cases. The shortage of private law remedies for the violation of the right to personal data tout court tends to confirm that the right to data protection is not effectively upheld for the private individual.
These considerations lead to two alternative scenarios.
In the first view, it is arguable that the real purpose of protecting the right to personal data (when not sensitive) is the protection of markets, or of a more general condition of human existence, which is neither controllable by nor closely referable to the natural person, whether from an ex ante (consent) or ex post (damages) perspective.
In the second view, given the current lack of protection, and with the purpose of enhancing remedies for data protection, a data breach would allow for punitive damages in favour of the data subject even when serious damages are not
234
Italian Cass., January 11, 2016, no. 222, in DeJure commented in Dir. Giust., 2016, p.
3, denying eventual damage, clarifies that the violation of the right to the protection of personal data
must exceed the minimum tolerance threshold imposed by the duties of solidarity according to art.
2 of the Constitution, which provides for the recognition and safeguarding of the inviolable rights of
man, both as a single person and in the social formations where his person acts, and demands the
fulfillment of the indispensable duties of political, economic and social solidarity. A violation of the
right to personal data can therefore only occur in the event of a significant offense to the effective
scope of the law. Likewise, with regard to so-called supersensitive data (sex rectification), Cass. May
13, 2015, 9785 in Fam Dir., 2016, p. 469 ss.
incurred, but the breach has produced a gain for the offender 235. However, even in this case, given the aggregation of personal data, a careful evaluation of the criteria for determining the recoverable amount would be needed.
However, this is another story, which contributes to dissatisfaction
with the current legal framework 236.
235
Sirena, La gestione di affari altrui. Ingerenze altruistiche, ingerenze egoistiche e
restituzione del profitto, Torino, 1999, p. 277.
236
A different set of remedies is suggested by A. Mantelero, Personal Data for Decisional Purposes in the Age of Analytics: from an Individual to a Collective Dimension of Data Protection, 32 Comp. Law & Secur. Rev. (2016) 238 ff.
LIST OF AUTHORS
LUCILLA GATT
Full Professor of Civil Law, Family Law and New Technology Law at Università
degli Studi Suor Orsola Benincasa di Napoli (UNISOB), where she was
also Professor of Private Law and Private Comparative Law. Chair
Holder of the Jean Monnet Chair PROTECH (European Protection Law
of Individuals in Relation to New Technologies) at Università degli
Studi Suor Orsola Benincasa di Napoli, and formerly Jean Monnet
Professor of European Contract Law at the University of Campania
Luigi Vanvitelli (2005-2009). Lawyer admitted to practice before
Superior Courts (Cassazionista) and Arbitrator at the Banking and
Financial Arbitrator (Arbitro Bancario Finanziario). Director of the Research Centre of European Private
Law (ReCEPL) at UNISOB and member of the scientific committee
of CIRB – Centro Interuniversitario di Ricerca Bioetica, at Università
degli Studi di Napoli Federico II. President of the Academy of European
Private Lawyers (and Director of the Neapolitan section), she has
founded the Observatory on the process of unification of private law
in Europe. She is also a member of the Interdepartmental Centre of
Project Design and Research Scienza Nuova and coordinates the
Living Lab Utopia on the possible interactions between law and new
technologies. She is in charge of the legal profile in the Ph.D. course
in Humanities and Technologies: an integrated research path and
she was coordinator of the Ph.D. Course in Legal strategy for the
development and internationalization of SMEs. She is vice-director of
Familia Journal and director of the European Journal of Privacy Law
& Technologies, as well as member of the scientific committee of
the Diritto di Internet Journal.
ILARIA AMELIA CAGGIANO
Full Professor of Private Law at Università degli Studi Suor Orsola
Benincasa di Napoli (UNISOB), where she holds, at the Law
Department, the chairs of Private Law and New Technology Law. She also
teaches Private Law in the course of Business Economics – Green
Economy of the same University. She is Director of the II Level
Master's degree in Data Protection Officer and Privacy Law. She is vice-director of the Research Centre of European Private Law (ReCEPL)
and member of the Living Lab Utopia of the Interdepartmental Centre
of Project Design and Research Scienza Nuova, both at UNISOB. Vice
editor in chief of the European Journal of Privacy Law & Technologies,
as well as Referee of numerous scientific journals including Il Foro
italiano, Familia and Global Jurist. She has been a lawyer since 2008, Arbitrator
at the Banking and Financial Arbitrator (Arbitro Bancario Finanziario), and member of the CIRB –
Centro Interuniversitario di Ricerca Bioetica at Università degli Studi
di Napoli Federico II. A former Fulbright Scholar, she has carried
out numerous research and teaching stays at foreign universities.
She is the author of articles and monographs concerning, most recently,
biotechnology, new technologies and data protection.
ROBERTO MONTANARI
Professor of Interaction Design and Cognitive Ergonomics at
Università degli Studi Suor Orsola Benincasa di Napoli, Technical
and Scientific Coordinator of the Scienza Nuova Research Centre
at the same university, and co-founder and head of research and
development of RE:Lab (re-lab.it). He has a long experience in
national, regional and international projects aimed at the creation
of user interfaces able to reconcile the human perspective and the
most advanced frontiers of technological development. His areas
of interest include design strategies in the fields of transportation,
cultural heritage and the impact of design on privacy law.
LIVIA AULINO
Ph.D.c. in Humanities and Technologies at Università degli Studi Suor
Orsola Benincasa di Napoli (UNISOB), where she also holds a II Level
University Master's degree in Family Law. She is Teaching assistant in
Civil Law, Family Law and New Technology Law at UNISOB. Member
of the Research Centre of European Private Law (ReCEPL) and
Interdepartmental Centre of Project Design and Research Scienza
Nuova (Living Lab Utopia) at UNISOB. She is member of the editorial
board of Familia Journal, of the European Journal of Privacy Law
& Technologies, and Diritto di Internet Journal. Lawyer since 2017,
qualified with honours.
LUCA CATTANI
Luca Cattani is a postdoctoral research fellow at the Department of
Legal Studies, University of Bologna, where he received his doctoral
degree in European Law and Economics in 2014 with a thesis titled
“Educational mismatch in the Italian and European labour markets”.
He was Research fellow at the Department of Economics, University
of Modena and Reggio Emilia, adjunct professor of Economics at
the University of Padua and visiting researcher at the Institute for
Employment Research, University of Warwick. His main research
interests include graduate transitions to employment, educational
and skills mismatches and the relationship between skills utilisation
and technological change.
MARIA CRISTINA GAETA
Research fellow in Privacy Law and Scientific Secretary of the
Research Centre of European Private Law (ReCEPL), both at
Università degli Studi Suor Orsola Benincasa di Napoli (UNISOB). She
is also member of the Interdepartmental Centre of Project Design
and Research Scienza Nuova (Living Lab Utopia) at UNISOB. She
is Editorial Team Coordinator of the European Journal of Privacy
Law & Technologies and member of the editorial board of Familia
Journal and Diritto di Internet Journal. Teaching assistant in New
Technology Law, Civil Law, Private Comparative Law and Consumer
Law at Università degli Studi Suor Orsola Benincasa di Napoli as well
as Teaching assistant in Private Law and Family Law at Università
degli Studi di Napoli Federico II, where she obtained a PhD in People,
Business and Markets Law in 2019. Lawyer since 2018, expert in
New Technology Law.
EMANUELE GARZIA
PhD student in Humanities and Technologies: an integrated research
path at University Suor Orsola Benincasa (Naples). He has been a subject
expert in Interaction Design at the University Suor Orsola Benincasa since
2020. He deals with methodologies for the design and evaluation
of digital interfaces, design strategies for cultural heritage and the
impact of design on privacy. He collaborates in the research activities
of the Interdepartmental Centre of Project Design and Research
Scienza Nuova (University Suor Orsola Benincasa, Scientific Director
R. Montanari).
ANNA ANITA MOLLO
PhD in Humanities and Technologies: an integrated research path at
Università degli Studi Suor Orsola Benincasa di Napoli (UNISOB). She
is member of the Research Centre of European Private Law (ReCEPL)
and member of the Living Lab Utopia of the Interdepartmental Centre
of Project Design and Research Scienza Nuova, both at University
Suor Orsola Benincasa. She is member of the editorial board of the
European Journal of Privacy Law & Technologies, Diritto di Internet
Journal and Familia Journal. Teaching assistant in New Technology
Law, Private Law and Private Comparative Law, at University Suor
Orsola Benincasa. Lawyer since 2015.
FEDERICA PROTTI
UX designer, expert in the design of interaction experiences with
physical and digital objects in their context of use. She studied
semiotics at the University of Bologna and Multimedia Communication
at the University of Turin. In 2012 she won a scholarship promoted by
the Kauffman Foundation of Kansas City (MO), for participation in an
international training program on innovation and entrepreneurship in
the United States, and attended workshops on Design Thinking
at the Hasso Plattner Institute of Design at Stanford University, CA.
In 2014 she obtained a PhD in Language and Communication Sciences
at the University of Turin with an experimental thesis, supported by
eye-tracking methodologies for the study of ocular behaviour, that
explores how information visualization supports the understanding of
urban changes. She has worked for several Italian automotive companies
as a designer of safe and easy-to-use on-board vehicle solutions.
MARGHERITA VESTOSO
PhD in Humanities and Technologies: an integrated research path
and a member of the Research Centre of European Private Law
(ReCEPL), both at Università degli Studi Suor Orsola Benincasa di
Napoli (UNISOB). She is also an honorary research fellow in Legal Informatics
and Human-Machine Interaction. Her scientific activity is focused on
topics between Law and Artificial Intelligence. In particular, she is
specialized in the integration of techniques such as Social Network
Analysis (SNA), Machine Learning (ML), and Agent-Based Simulation
Models (ABMs) in empirical legal research.
SIMONA COLLINA
Associate Professor of Cognitive Psychology at Università degli Studi
Suor Orsola Benincasa. Her research interests involve language
processing, nonverbal communication, emotions and cognitive
workload. She coordinates the Simula lab at the Scienza Nuova
Research Center. She is responsible for the Liaison Office of the
Università degli Studi Suor Orsola Benincasa in Brussels, which
focuses on sustainable growth through dialogue with local,
European and international institutions.
SOURCES
INDEX OF AUTHORS
Acquisti A., Grossklags J., ‘What Can Behavioral
Economics Teach Us about Privacy?’ in Acquisti A.
and others (eds), Digital Privacy: Theory, Technologies, and Practices (Auerbach Publications
2007) 363 ff.
Acquisti A., Grossklags J., ‘Privacy and Rationality in Decision Making’ (2005) 3 (1) IEEE Secur.
Priv. 26, 33.
Acquisti A., ‘Privacy’ [2005] Riv. pol. econ 319 ss.
Agi B., Jullien N., ‘Is the privacy paradox in fact
rational?’, Legal Studies Research Paper [2018]
International Federation of Library Associations
1, 13.
Ajello G.F., ‘La protezione dei dati personali dopo
il trattato di Lisbona. Natura e limiti di un diritto
fondamentale «disomogeneo» alla luce della
nuova proposta di General Data Protection Regulation’ (2015) 2 Osservatorio di dir. civ. comm.
421,446.
Ajzen I., Brown T.C., Carvajal F., ‘Explaining the
Discrepancy between Intentions and Actions:
The Case of Hypothetical Bias in Contingent Valuation’ (2004) 30 (9) Personality and Social
Psychology Bulletin 1108,1121.
Allen A.L., ‘Protecting One’s Own Privacy in a Big
Data Economy’ (2016) 130 Harvard Law Review
Forum 71,79.
Alpa G., ‘Intervento conclusivo’, in Gatt L., Il contratto del Terzo Millennio. Dialogando con Guido
Alpa (Editoriale Scientifica, 2018).
Altman I., ‘Privacy Regulation: Culturally Universal or Culturally Specific?’ (1977) 33 Journal of
Social Issues 66.
Ambrose M.L., ‘From the Avalanche of Numbers
to Big Data: A Comparative Historical Perspective on Data Protection in Transition’, in O’Hara
K., Nguyen M.-H.C., Haynes P. (eds), Digital Enlightenment Yearbook (IOS Press 2014), 25-48.
Arnaudo L., ‘Diritto cognitivo. Prolegomeni a una
ricerca’ (2010) 1 Politica dir, 101,135.
Ashton K., ‘That “Internet of Things” Thing. In
the real world, things matter more than ideas’
[2009] RFID.
Athan T., Governatori G., Palmirani M., Paschke
A., Wyner A., ‘LegalRuleML: Design Principles
and Foundations’ in Wolfgang Faber and Adrian
Paschke (eds), Reasoning Web. Web Logic
Rules: 11th International Summer School 2015,
Berlin, Germany, July 31- August 4, 2015, Tutorial
Lectures (Springer International Publishing 2015)
https://doi.org/10.1007/978-3-319-21768-0_6
Baddeley A.D., ‘Working memory and language:
An overview’ [2003] Journal of Communication
Disorders 189, 208.
Balebako R., Shay R. and Cranor L., ‘Is Your Inseam a Biometric? A Case Study on the Role of
Usability Studies in Developing Public Policy’
[2014] Internet Society.
Bargelli E., ‘sub art. 15, co. 2’, in Bianca C.M., Busnelli F.D. (eds) La protezione dei dati personali.
Commentario, vol. I (Cedam 2007).
Barnes S.B., ‘A privacy paradox: Social networking in the United States’ (2006) 11 (9) First
Monday.
Barocas S., Nissenbaum H., ‘On Notice: The
Trouble with Notice and Consent’, Proceedings
of the Engaging Data Forum: The First International Forum on the Application and Management
of Personal Electronic Information, October
2009, available at SSRN: https://ssrn.com/abstract=256740
Barrett L., ‘Confiding in Con Men: U.S. Privacy
Law, the GDPR, and Information Fiduciaries’
(2019) vol. 42 Seattle University Law Review
1057, 1113.
Barton T.D., Berger-Walliser G., Haapio H., ‘Visualization: Seeing Contracts for What They Are, and
What They Could Become’ (2013) 19 Journal of
Law, Business & Ethics 3, 15.
Bassini M., ‘La svolta della privacy europea: il
nuovo pacchetto sulla tutela dei dati personali’
(2016) 3 Quaderni costituzionali 587, 589.
Bederson B. and Shneiderman B., The Craft of Information Visualization: Readings and Reflections
(Morgan Kaufmann 2012).
Bem D.J., ‘Self-Perception Theory’ in Leonard
Berkowitz (ed), Advances in Experimental Social
Psychology, vol. 6 (Academic Press 1972) https://
doi.org/10.1016/S0065-2601(08)60024-6
Ben-Shahar O., Schneider C., More than You Wanted to Know: The Failure of Mandated Disclosure
(Princeton University Press 2014).
Ben-Shahar O., Schneider C.E., ‘The Failure of
Mandated Disclosure’ (2011) 159 University of
Pennsylvania Law Review 103.
Bennett C.J., Regulating privacy.
Data protection and public policy in Europe
and the United States (Cornell University Press
1992).
Berger-Walliser G., Barton T.D. and Haapio H.,
‘From Visualization to Legal Design: A Collaborative and Creative Process’ (2017) 54 (2) American Business Law Journal 347,392.
Koops B.-J., Leenes R., ‘Privacy regulation cannot be hardcoded. A critical comment on the
‘privacy by design’ provision in data-protection
law’ (2014) 28 (2) International Review of Law,
Computers & Technology 159, 171.
Bhatia J. and others, ‘A Theory of Vagueness and
Privacy Risk Perception’, in 2016 IEEE 24th International Requirements Engineering Conference
RE (IEEE, 2016).
Bianca C.M., Diritto civile, vol. I, La norma giuridica. I soggetti (Giuffrè 2002).
Bianca C.M., Introduction to Bianca C.M., Busnelli F.D. (eds) La protezione dei dati personali.
Commentario, vol. I (Cedam 2007).
Bocchiola M., Privacy. Filosofia e politica di un
concetto inesistente (Luiss University Press
2014).
Boehnen C.B., Bolme D., Flynn P., ‘Biometrics IRB
best practices and data protection’, in Proceedings of SPIE, vol. 9457, Kakadiaris I.A., Kumar A.,
Scheirer W.J. (eds), Biometric and Surveillance
Technology for Human and Activity Identification XII (SPIE 2015).
Bogni B., Defant A., ‘Big data: diritti IP e problemi
della privacy’ (2015) 2 Dir. industriale 117,126.
Bolognini L., Bistolfi C., ‘Challenges of the Internet of Things: Possible Solutions from Data
Protection and 3D Privacy’, in Schiffner S., Serna
J., Ikonomou D., Rannenberg K. (eds) Privacy
Technologies and Policy, Lecture Notes in Computer Science, vol. 9857 (Springer Cham, 2016),
71-80.
Bolognini L., Pelino E., Bistolfi C., Il Regolamento
Privacy europeo (Giuffrè 2016).
Botes M., ‘Using Comics to Communicate Legal
Contract Cancellation’ (2017) 7 The Comics Grid:
Journal of Comics Scholarship 7, 14. http://doi.
org/10.16995/cg.100
Boyne S.M., ‘Data Protection in the United States:
U.S. National Report’, Legal Studies Research Paper No. 11 [2017] Indiana University Robert H.
McKinney School of Law 1, 39.
Bryans J.W., ‘The Internet of Automotive Things: vulnerabilities, risks and policy implications’
(2017) 2 Journal of Cyber Policy 185, 194.
Busch C., De Franceschi A., ‘Granular Legal Norms: Big Data and the Personalization of Private
Law’, in Mak V., Tjong Tjin Tai E., Berlee A. (eds),
Research Handbook on Data Science and Law
(Edward Elgar 2018).
Busnelli F.D., ‘Il “trattamento dei dati personali”
nella vicenda dei diritti della persona: la tutela risarcitoria’, in Cuffaro V., Ricciuto V., Zeno-Zencovich V., (eds) Trattamento dei dati e tutela della
persona (Giuffrè 1999).
Butt P., ‘Legalese versus Plain Language’ (2001)
Amicus Curiae 28.
Caggiano I.A., ‘Privacy e minori nell’era digitale.
Il consenso al trattamento dei dati dei minori
all’indomani del Regolamento UE 2016/679, tra
diritto e tecno-regolazione’ (2018) Familia 3,23.
Caggiano I.A., ‘A quest for efficacy in data protection: a legal and behavioural analysis’, Working Paper no. 10/2016, 1, 18.
Caggiano I.A., ‘Disgorgement, Compensation and Restitution: A Comparative Approach’ (2016) 16 (2) Global Jurist 243, 266.
Caggiano I.A., ‘Il consenso al trattamento dei dati personali’ (2017) DIMT online 1, 19.
Caggiano I.A., ‘Pagamenti non autorizzati tra responsabilità e restituzioni. Una rilettura del d.lgs. 11/2010 e lo scenario delle nuove tecnologie’ (2016) Riv. dir. civ. (forthcoming).
Calderai V., ‘Consenso informato’, in Enc. dir., Annali VIII (Giuffrè 2015).
Calo M.R., ‘Against Notice Skepticism in Privacy (and Elsewhere)’ (2013) 87 Notre Dame Law Rev. 1027, 1072.
Calo R., ‘Privacy and Markets. A Love Story’ (2016) 91 Notre Dame Law Rev. 649, 690.
Calo R., ‘Privacy Law’s Indeterminacy’ (2019) 20 Theoretical Inquiries in Law 33, 52.
Camardi C., ‘Mercato delle informazioni e privacy: riflessioni generali sulla l. 675/1996’ (1998) 4 Eur. dir. priv. 1049, 1073.
Card S.K., Mackinlay J.D., Shneiderman B., Readings in Information Visualization: Using Vision to Think (Morgan Kaufmann 2007).
Cardarello C., D’Amora F., Fiore F., Adempimenti privacy per professionisti e aziende (Giuffrè 2018).
Ceccarelli V., ‘La soglia di risarcibilità del danno non patrimoniale da illecito trattamento dei dati personali’ (2015) 4 Danno e resp. 343.
Chao B., ‘Privacy losses as wrongful gains’, Working Paper no. 20.05 (2020) University of Denver Sturm College of Law, 1, 51.
Chico V., Taylor M.J., ‘Using and Disclosing Confidential Patient Information and the English Common Law: What are the information requirements of a valid consent?’ (2018) 26 (1) Medical Law Review 51, 72.
Ciccia Messina A., Bernardi N., Privacy e regolamento europeo (IPSOA 2016).
Clarizia R., Informatica e conclusione del contratto (Giuffrè 1985).
Cofone I., ‘The Dynamic Effect of Information Privacy Law’ (2017) 18 (2) Minn. JL Sci & Tech 517, 574.
Cohen L.E., ‘Turning Privacy Inside Out’ (2019) 20 Theoretical Inquiries in Law 1, 31.
Comandè G., ‘sub art. 15, co. 1’, in Bianca C.M., Busnelli F.D. (eds) La protezione dei dati personali. Commentario, vol. I (Cedam 2007).
Comellini S., Il responsabile della protezione dei dati (data protection officer-DPO) (Maggioli 2018).
Common Market Law Review – Kluwer Law Online.
Conti G., Sobiesk E., ‘Malicious Interface Design: Exploiting the User’, in Proceedings of the 19th International Conference on World Wide Web – WWW ’10 (ACM Press 2010).
Contissa G., Docter K., Lagioia F., Lippi M., Micklitz H.-W., Pałka P., Sartor G., ‘Claudette Meets GDPR: Automating the Evaluation of Privacy Policies Using Artificial Intelligence’ [2018] Study Report, funded by The European Consumer Organisation (BEUC), 1-62.
Corsi S., ‘Il quadro europeo sul flusso di dati personali verso Paesi terzi nell’era digitale’, in Sapienza Legal Papers, Studi Giuridici 2015-2016 (Iovene 2016) 135-144.
Courtney K.L., ‘Privacy and Senior Willingness to Adopt Smart Home Information Technology in Residential Care Facilities’ (2008) 47 Methods of Information in Medicine 76, 81.
Cuffaro V., ‘Il consenso dell’interessato’, in Cuffaro V., Ricciuto V. (eds) La disciplina del trattamento dei dati personali (Giappichelli 1999).
Cuffaro V., ‘Il diritto europeo sul trattamento dei dati personali’ (2018) 3 Contratto e impresa 1098, 1119.
Custers B., Dechesne F., Sears A.M., van der Hof S. and others, ‘A Comparison of Data Protection Legislation and Policies Across the EU’ [2017] Computer Law & Security Review 1, 18.
D’Acquisto G., Naldi M., Bifulco R., Pollicino O., Bassani M., Intelligenza artificiale, protezione dei dati personali e regolazione (Giappichelli 2018).
D’Acquisto G., Naldi M., Big data e privacy by design (Giappichelli 2017).
Das S., Dev J., Camp L.J., ‘Privacy Preserving Policy Model Framework’ [2019] Private Law Theory 1, 14.
Davenport J., ‘How EU data protection law could interfere with targeted ads’ [2015] The Conversation (online).
Degani L.E., Lopez A., Familiari S., L’applicazione del GDPR privacy nei servizi sociosanitari (Maggioli 2018).
DePaulo B.M., Wetzel C., Sternglanz R.W., Wilson
M.J.W., ‘Verbal and Nonverbal Dynamics of Privacy, Secrecy, and Deceit’ (2003) J. Soc. Issues,
59 (2) 391, 410.
Determann L., ‘No One Owns Data’ (2018) vol.
70 (1) Hastings Law Journal 1, 44.
Dhir A., Torsheim T., Pallesen S., Andreassen
C.S., ‘Do Online Privacy Concerns Predict Selfie
Behavior among Adolescents, Young Adults and
Adults?’ (2017) Front. Psychol. 8 815.
Di Ciommo F., ‘La risarcibilità del danno non patrimoniale da illecito trattamento dei dati personali’ (2004) 10 Danno e resp. 801, 803.
Di Ciommo F., ‘Vecchio e nuovo in materia di danno non patrimoniale da trattamento dei dati personali’ (2004) 10 Danno e resp. 820, 824.
Di Majo A., ‘Il trattamento dei dati personali tra
diritto sostanziale e modelli di tutela’, in Cuffaro
V., Ricciuto V., Zeno-Zencovich V. (eds) Trattamento dei dati e tutela della persona (Giuffrè 1999).
Di Porto F., ‘Dalla convergenza digitale-energia l’evoluzione della specie: il consumatore
«iper-connesso»’ (2016) 1 Mercato concorrenza 59,78.
Di Resta F., ‘Normativa sulla data retention:
sono ad alto rischio contenziosi anche in Italia’
[2015] Il Sole 24 Ore (online).
Di Resta F., La nuova ‘Privacy Europea’. I principali adempimenti del regolamento UE 2016/679
e profili risarcitori (Giappichelli 2018).
Dix A.J., Human-Computer Interaction (Prentice-Hall 1997).
Endicott T., ‘Law is necessarily vague’ (2001) 7
Legal Theory (4) 379, 385.
Endicott T., Vagueness in law (Oxford University
Press 2000).
Engel C., Rahal R.M., ‘Justice is in the Eyes of the
Beholder – Eye Tracking Evidence on Balancing
Normative Concerns in Torts Cases’ (2020) 3
Max Planck Institute for Research on Collective
Goods 1, 56.
Endsley M., ‘Toward a Theory of Situation Awareness in Dynamic Systems’ (1995) 37 Human
Factors Journal of the Human Factors and Ergonomics Society 32, 64.
Epstein R.E., ‘The Legal Regulation of Genetic Discrimination: Old Responses to New Technology’
(1994) 74 (1) B.U.L. Rev. 1, 23.
Equifax, Louis Harris and Associates, ‘Equifax-Harris Consumer Privacy Survey’ [1991].
Esayas S.Y., ‘The idea of ‘emergent properties’
in data privacy: towards a holistic approach’
(2017) 25 Int. Journ. Law & Info. Technology
139, 178.
Fabian B., Ermakova T. and Lentz T., ‘Large-Scale Readability Analysis of Privacy Policies’, in Proceedings of the International
Conference on Web Intelligence (Association
for Computing Machinery 2017) https://doi.
org/10.1145/3106426.3106427
Farkas T.J., ‘Data Created by the Internet of Things: The new gold without ownership?’ (2017) 23
Revista la Propiedad Inmaterial 5,17.
Fici A., Pellecchia E., ‘Il consenso al trattamento’,
in Pardolesi R. (ed) Diritto alla riservatezza e circolazione dei dati personali, vol. I (Giuffrè 2003).
Finocchiaro G., Delfini F., Diritto dell’Informatica
(UTET, 2014).
Floridi L. (ed), Protection of Information and the
Right to Privacy – A New Equilibrium? (Springer
2014).
Frosini T.E., ‘Internet e democrazia’ (2017) 4-5
Dir. inf. e informatica 657, 671.
Frosini V., ‘Diritto e informatica negli anni ottanta’
(1984) 2 Riv. trim. dir. pubbl. 390, 400, now in Id.,
Informatica diritto e società (Giuffrè, 1988) 231.
Gaeta M.C., ‘Data protection and self-driving
cars: the consent to the processing of personal
data in compliance with GDPR’ (2019) 24 (1)
Communications Law 15, 23.
Gaeta M.C., ‘La protezione dei dati personali
nell’Internet of Things: l’esempio dei veicoli autonomi’ (2018) 1 Diritto inf. e informatica 147, 179.
Galetta A., De Hert P., ‘The Proceduralisation of
Data Protection Remedies under EU Data Protection Law: Towards a More Effective and Data
Subject-oriented Remedial System?’ (2015)
8 (1) Review of European Administrative Law
125,151.
Gambino A.M., ‘Informatica giuridica e diritto
dell’informatica’, Treccani G (ed.) Enciclopedia
Giuridica (online edn 2013).
Gatt L., Montanari R., Caggiano I.A., ‘Consenso
al trattamento dei dati personali e analisi giuridico-comportamentale. Spunti di riflessione
sull’effettività della tutela dei dati personali’
(2017) 2 Politica del diritto 337, 353.
Giannone Codiglione G., ‘Risk-based approach e
trattamento dei dati personali’, in Sica S., D’Antonio V., Riccio G.M. (eds) La nuova disciplina
europea sulla privacy (CEDAM 2016).
Gibson J.J., ‘The theory of affordances’ R Shaw
& J Bransford (Eds.) Perceiving, acting, and
knowing (Lawrence Erlbaum Associates 1977).
Gillis T.B., Simons J., ‘Explanation < Justification:
GDPR and the Perils of Privacy’ (forthcoming)
Pennsylvania Journal of Law and Innovation
1,31.
Gonzalez Fuster G., The Emergence of Personal
Data Protection as a Fundamental Right of the
EU (Springer 2014).
Gorla S., Ponti C., Privacy UE: il vecchio e il nuovo.
Confronto tra D.Lgs 196/2003 “Codice Privacy”
e Regolamento Europeo 2016/679 “GDPR” (ITER
2018).
Grossklags J. and Acquisti A., ‘What Can Behavioral Economics Teach Us about Privacy?’ in De
Capitani Di Vimercati S., Gritzalis S., Lambrinoudakis C., Acquisti A. (eds), Digital Privacy (Auerbach Publications 2007) http://www.crcnetbase.com/doi/10.1201/9781420052183.ch18
Ha-Redeye O., ‘Continued Utility of Privacy Class
Actions in Deterrence’ [2019] Canada’s online
legal magazine.
Haller S., Karnouskos S., Schroth C., ‘The Internet
of Things in an enterprise context’ [2008] Future
Internet, Lecture Notes in Computer Science, vol. 5468.
Hagan M., ‘The User Experience of the Internet
as a Legal Help Service: Defining Standards for
the next Generation of User-Friendly Online Legal Services’ (2016) Va. J.L. & Tech. 395, 465.
Hagan M., Law By Design, (Retrieved March
2018), from www.lawbydesign.co/en/home/.
Hagedoorn J., Hesen G., ‘Contractual Complexity and the Cognitive Load of R&D Alliance Contracts’ (2009) Journal of Empirical Legal Studies 818, 847.
Haapio H., Hagan M., Palmirani M., Rossi A., ‘Legal design patterns for privacy’ [2018] Data
Protection/LegalTech Proceedings of the 21st
International Legal Informatics Symposium IRIS
445, 450.
Harbach M., Fahl S., Yakovleva P., Smith M., ‘Sorry,
I Don’t Get It: An Analysis of Warning Message
Texts’ in Andrew A Adams, Michael Brenner and
Matthew Smith (eds), Financial Cryptography
and Data Security (Springer 2013).
Hargittai E., Litt E., ‘New strategies for employment? Internet skills and online privacy practices during people’s job search’ (2013) 11 (3)
IEEE Secur Priv 38, 45.
Hargittai E., Shafer S., ‘Differences in Actual and
Perceived Online Skills: The Role of Gender*.’
(2006) 87 Soc. Sci. Q 432, 448.
Hartzog W., ‘Website Design as Contract’ (2011)
60 (6) American University Law Review 1635,
1671.
Hartzog W., Privacy’s Blueprint: The Battle to
Control the Design of New Technologies (Harvard University Press 2018).
Heider F., The Psychology of Interpersonal Relations (John Wiley & Sons Inc 1958).
Helleringer G., Sibony A.L., ‘European consumer
protection through the behavioral lens’ (2016)
23 Columbia Journal of European Law 607.
Hersent O., Boswarthick D., Elloumi O., The
Internet of Things: Key Applications and Protocols (2nd edition, Wiley 2012).
Hijmans H., Raab C.D., ‘Ethical Dimensions of
the GDPR in: Cole M., Boehm F. (eds.), Commentary on the General Data Protection Regulation
(Edward Elgar 2018, Forthcoming).
Hildebrandt M., ‘Privacy as Protection of the
Incomputable Self: Agonistic Machine Learning’
(2017) 19 (1) Theoretical Inquiries of Law, 83,
121.
Hochhauser M., ‘Why patients won’t understand
their HIPAA privacy notices’, https://www.privacyrights.org/blog/why-patients-wont-understand-their-hipaa-privacy-notices-hochhauser
accessed 01 January 2020.
Hoofnagle C.J., King J., Li S., Turow J., ‘How Different are Young Adults From Older Adults When
it Comes to Information Privacy Attitudes &
Policies?’ [2010] Legal Studies Research Paper
University of Pennsylvania 1, 20.
Iaselli M., Manuale operativo del D.P.O. Aggiornato al d.lgs. 10 agosto 2018, n. 101, in materia di
privacy (Maggioli 2018).
Inglesant P.G and Sasse M.A., ‘The True Cost of
Unusable Password Policies: Password Use in
the Wild’ in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
(Association for Computing Machinery 2010)
https://doi.org/10.1145/1753326.1753384.
Irti N., ‘Diritto Civile’, in Digesto sez. civ. VI (4th
edn, UTET 1990) 128.
Irti N., Dialogo su diritto e tecnica (Laterza,
2001).
Irti N., Un diritto incalcolabile (Giappichelli,
2016).
ISO 9241-210:2019, ‘Ergonomics of human-system interaction — Part 210: Human-centred
design for interactive systems’ (2019) 1, 32.
Jabłonowska A., Kuziemski M., Nowak A.M., Micklitz H.-W., Pałka P., Sartor G., ‘Consumer Law
and Artificial Intelligence: Challenges to the EU
Consumer Law and Policy Stemming from the
Business’ Use of Artificial Intelligence’, Final
report of the ARTSY project Working Paper LAW
no. 11 [2018] EUI Department of Law 1, 79.
Janeček V., Malgieri G., ‘Data Extra Commercium’,
in Lohsse S, Schulze R and Staudenmayer D.
(eds), Data as Counter-Performance – Contract
Law 2.0? (Hart Publishing/Nomos 2019) (forthcoming).
Jarrett C., Redish J., Summers K., Straub K., ‘Design to Read: Guidelines for People Who Do Not
Read Easily’ (2010) 9 User Experience Magazine. http://www.effortmark.co.uk/design-read-guidelines-people-read-easily/
Jensen C. and Potts C., ‘Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices’, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
(Association for Computing Machinery 2004)
https://doi.org/10.1145/985692.985752
Jones O., Goldsmith T.H., ‘Diritto e biologia comportamentale’ (2006) 4 i-lex (online).
Jori M.G., Diritto, nuove tecnologie e comunicazione digitale (Giuffrè, 2013).
Kahneman D., Thinking, Fast and Slow (Farrar,
Straus and Giroux, 2012).
Kahneman D., Tversky A., ‘Choices, Values, and
Frames’ (1984) 39 American Psychologist 341,
350.
Kahneman D., Knetsch J.L., Thaler R.H., ‘Anomalies:
the endowment effect, loss aversion, and status
quo bias’ (1991) J. Econ. Perspect 5 (1) 193,206.
Kahneman D., Slovic P., Tversky A. (eds), Judgment
under Uncertainty: Heuristics and Biases (Cambridge University Press 1982).
Kallgren C.A. and Wood W., ‘Access to Attitude-Relevant Information in Memory as a Determinant
of Attitude-Behavior Consistency’ (1986) 22
Journal of Experimental Social Psychology 328.
Kamarinou D., Millard C., Singh J., ‘Machine Learning with Personal Data’, (2016) Legal Studies
Research Paper no. 216 Queen Mary University
of London website 1, 47.
Kay M., Terry M., ‘Textured Agreements: Re-Envisioning Electronic Consent’, Proceedings of the
Sixth Symposium on Usable Privacy and Security (Association for Computing Machinery 2010)
https://doi.org/10.1145/1837110.1837127
Kay M., ‘Techniques and Heuristics for Improving the Visual Design of Software Agreements’
(2010) Master Thesis, University of Waterloo.
Kerr I., ‘Schrödinger’s Robot: Privacy in Uncertain States’ (2018) 20 Theoretical Inquiries in
Law 123, 154.
Kerren A., Stasko J.T., Fekete J.D., North C., Information Visualization – Human-Centered Issues
and Perspectives, Vol. 4950 of LNCS (Springer,
2008).
Kim H.-W., Xu Y. and Koh J., ‘A Comparison of Online Trust Building Factors between Potential Customers and Repeat Customers’ (2004) 5 Journal of the Association for Information Systems.
https://aisel.aisnet.org/jais/vol5/iss10/13.
Klopfer P.H., Rubenstein D.I., ‘The Concept Privacy and Its Biological Basis’ (1977) J. Soc. Issues 33 (3) 52, 65.
Kuan Hon W., Millard C., Singh J., 'Twenty Legal Considerations for Clouds of Things', Legal Studies Research Paper no. 247 (2016) Queen Mary University of London website 1, 23.
La Rocca G., 'Appunti sul Regolamento UE n. 679/2016, relativo alla protezione delle persone fisiche con riguardo al trattamento dei dati personali (Parte I)' [2017] Il Caso (online) 1-10.
La Rocca G., ‘Trattamento dei dati personali e
impresa bancaria (Reg. UE 679/2016)’ (2018) 6
Rivista di diritto bancario 2, 20.
Laufer R.S., Wolfe M., ‘Privacy as a Concept and
a Social Issue: A Multidimensional Developmental
Theory’ (1977) 33 Journal of Social Issues 22, 42.
Lee S.M., ‘Internalizing the Harm of Privacy Breaches: Do Firms Have an Incentive to Improve
Data Protection? An Event Study’ [2019] TPRC 47,
Research Conference on Communication, Information and Internet Policy 1, 59.
Leffi M., 'I trasferimenti di dati verso Stati terzi nel nuovo Regolamento UE' (2017) 1 Rivista di Diritti comparati 187, 205.
Leon P., Ur B., Shay R., Wang Y., Balebako R., Cranor
L., ‘Why Johnny can’t opt out: a usability evaluation of tools to limit online behavioral advertising’,
Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (Association for
Computing Machinery, 2012) 589-598. https://
doi.org/10.1145/2207676.2207759.
Sources: Index of authors
Lewis J.R., 'Introduction: Current Issues in Usability Evaluation' (2001) 13 (4) International Journal of Human–Computer Interaction 343, 349.
Liberman V., Samuels S.M. and Ross L., ‘The
Name of the Game: Predictive Power of Reputations Versus Situational Labels in Determining
Prisoner’s Dilemma Game Moves’ (2004) Personality and Social Psychology Bulletin 30 (9)
1175-1185.
Lisi A., Ungaro S., ‘Il “Responsabile deresponsabilizzato” e il “Titolare irresponsabile”: il teatro
dell’assurdo nell’applicazione del GDPR’ [2019]
Legal Euroconference website 1, 6.
Litman-Navarro K., 'We Read 150 Privacy Policies. They Were an Incomprehensible Disaster', Opinion, The Privacy Project [2019] The New York Times website.
Lodder A.R., Loui R., 'Data Algorithms and Privacy in Surveillance: On Stages, Numbers and the Human Factor', in Barfield W., Pagallo U. (eds) Research Handbook on the Law and Artificial Intelligence (Elgar 2018).
Loewenstein G., ‘Out of Control: Visceral Influences on Behavior’ (1996) Organizational
Behavior and Human Decision Processes 65 (3)
272-292.
Loewenstein G., Adler D., ‘A Bias in the Prediction
of Tastes’ (1995) 105 (7) Econ. J. 929, 937.
Longo A., ‘Dati sanitari alle multinazionali, senza consenso: passa la norma in Italia’ [2017] La
Repubblica (online).
Lurger B., 'Empiricism and Private Law: Behavioral Research as Part of a Legal-Empirical Governance Analysis and a Form of New Legal Realism' (2014) 1 Aust. Law Jour. 20, 39.
Lynskey O., ‘Grappling with “Data Power”: Normative Nudges from Data Protection and Privacy’ (2019) 20 Theoretical Inquiries in Law
189, 220.
Lynskey O., 'Deconstructing Data Protection: The "Added-Value" of a Right to Data Protection in the EU Legal Order' (2014) 63 (3) International and Comparative Law Quarterly 569, 597.
Mamula T., Hagel U., 'The Design of Commercial Conditions' (2015) Jusletter IT.
Mantelero A., 'Big data: i rischi della concentrazione del potere informativo digitale e gli strumenti di controllo' (2012) 1 Dir. inf. e informatica 135, 144.
Mantelero A., 'Data protection, e-tracking and intelligent systems for public transport' (2015) 5 (4) International Data Privacy Law 309, 320.
Mantelero A., 'Digital privacy: tecnologie "conformate" e regole giuridiche', in Privacy digitale. Giuristi e informatici a confronto (Giappichelli 2005).
Mantelero A., 'Personal Data for Decisional Purposes in the Age of Analytics: from an Individual to a Collective Dimension of Data Protection' (2016) 32 Comp. Law & Secur. Rev. 238, 255.
Mantelero A., Poletti D. (eds), Regolare la tecnologia: il Reg. UE 2016/679 e la protezione dei dati personali. Un dialogo tra Italia e Spagna (Pisa University Press, 2018).
Mantelero A., 'The future of consumer data protection in the E.U. Rethinking the "notice and consent" paradigm in the new era of predictive analytics' (2014) 30 Computer Law & Security Review 643, 660.
Marx G.T., 'Murky Conceptual Waters: The Public and the Private' (2001) 3 Ethics and Information Technology 157, 169.
Mazza R., Introduction to Information Visualization (Springer, 2009).
Mazzamuto S., ‘Il principio del consenso e il
potere della revoca’, in Pennetta R. (ed) Libera
circolazione e protezione dei dati personali, vol.
I (Giuffrè 2006).
McDonald A.M., Cranor L.F., ‘The cost of reading
privacy policies’ (2008) I/S: J.L. & Pol’y for Info.
4 (3).
Medaglia C.M., Serbanati A., 'An Overview of Privacy and Security Issues in the Internet of Things', in Giusto D., Iera A., Morabito G., Atzori L. (eds), The Internet of Things (Springer 2010).
Mendoza I., Bygrave A.L., ‘The Right not to be
Subject to Automated Decision based on Profiling’ Legal Studies Research Paper no. 20 [2017]
University of Oslo website 1-22.
Merla L., ‘Droni, privacy e la tutela dei dati personali’ (2016) 1 Informatica e dir. 29, 45.
Miglietti L., Il diritto alla privacy nell’esperienza
giuridica statunitense ed europea (ESI 2014).
Milne G., Culnan M., ‘Strategies for Reducing
Online Privacy Risks: Why Consumers Read (or
Don’t Read) Online Privacy Notices’ (2004) 18
(3) J. Interact. Mark. 5, 29.
Milne G.R., Culnan M.J., Greene H., ‘A longitudinal
assessment of online privacy notice readability’
(2006) 25 (2) J. Public Policy Mark 238, 249.
Privacy and Consent. A Legal and UX&HMI Approach for Data Protection
Miltgen C.L. and Peyrat-Guillard D., ‘Cultural and
Generational Influences on Privacy Concerns: A
Qualitative Study in Seven European Countries’
(2014) 23 European Journal of Information Systems 103, 125.
Míšek J., 'Consent to personal data processing – The panacea or the dead end?' (2014) 8 (1) Masaryk University Journal of Law and Technology 69-83.
Mondal S., Bours P., ‘Swipe Gesture based Continuous Authentication for Mobile Devices’ [2015]
Int. Conf. on Biometrics 458, 465.
Moore B., Privacy: studies in social and cultural
history (Routledge 1984).
Moro Visconti R., ‘Valutazione dei big data e impatto su innovazione e digital branding’ (2016)
1 Dir. Industriale 46, 53.
Mullainathan S., Thaler R.H., ‘Behavioral economics’, in Smelser N.J., Baltes P.B. (ed) International encyclopedia of the social and behavioral
sciences 1094, 1100.
Muscillo D.P., ‘Il regolamento europeo sulla privacy – 679/2016/UE adempimenti da completare entro il termine del 25 maggio 2018 e le relative sanzioni in caso di non adempimento’ [2018],
academia.edu (online) 1-16.
Navarreta E., ‘sub. art. 29, co. 9’, in Bianca C.M.,
Busnelli F.D. (eds) Tutela della privacy. Commentario alla L. 31 dicembre 1996, n. 675, Nuove leggi civ., (1999) 317 ff.
Nielsen J., Usability Engineering (Morgan Kaufmann Publishers Inc., 1994).
Nissenbaum H., ‘Contextual Integrity Up and
Down the Data Food Chain’ (2019) 20 Theoretical Inquiries in Law, 221, 256.
Norman D.A., La caffettiera del masochista. Il
design degli oggetti quotidiani (Giunti Editore
2014).
Norman D.A., User centered system design: new
perspectives on human-computer interaction
(Erlbaum 1986).
O. BenShahar, A. Chilton, 'Simplification of Privacy Disclosure: An Experimental Test' (2015) 45 Journ. Legal Studies.
O’Callaghan J., ‘Inferential Privacy and Artificial
Intelligence – A New Frontier?’ (2018) 11 (2)
Journal of Law & Economic Regulation 72, 89.
O’Dell E., ‘Compensation for breach of the proposed ePrivacy Regulation’ [2018] Cearta.ie
(online).
Obar J.A. and Oeldorf-Hirsch A., ‘The Biggest Lie
on the Internet: Ignoring the Privacy Policies
and Terms of Service Policies of Social Networking Services’ (2020) 23 Information, Communication & Society 128.
Ohm P., 'Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization' (2010) 57 UCLA Law Rev. 1701, 1777.
Oleshchuk V., ‘Internet of things and privacy
preserving technologies’ [2009] 1st International Conference on Wireless Communication,
Vehicular Technology, Information Theory and
Aerospace & Electronic Systems Technology
336, 340.
Olphert C., Damodaran L., May A., 'Towards digital inclusion: engaging older people in the "digital world"', Proceedings of the 2005 international conference on Accessible Design in the Digital World (Accessible Design, 2005).
Omri B.-S., ‘More Failed Nudges: Evidence of
Ineffective “Behaviorally Informed” Disclosures’
[2017] The Journal of Things We Like 1, 3.
Oppo G., 'Sul consenso dell'interessato', in Cuffaro V., Ricciuto V., Zeno-Zencovich V. (eds) Trattamento dei dati e tutela della persona (Giuffrè 1999).
Ovidiu V., Peter F., Patrick G. and others, ‘Internet
of Things Strategic Research Roadmap’ (2nd edn.
2011).
Palazzolo E., ‘Firme grafometriche e GDPR: lo
stato dell’arte’ [2018] Ius in itinere (online).
Pallone E.C., ‘Internet of Things e l’importanza
del diritto alla privacy tra opportunità e rischi’
(2016) 17 (55) Ciberspazio e diritto 163, 183.
Park Y.J., ‘Do Men and Women Differ in Privacy?
Gendered Privacy and (in)Equality in the Internet’ (2015) 50 Computers in Human Behavior
252, 258.
Passera S., Smedlund A., Liinasuo M., 'Exploring contract visualization: Clarification and framing strategies to shape collaborative business relationships' [2016] Journal of Strategic Contracting and Negotiation 69, 100.
Passera S., Haapio H., 'User-Centered Contract Design: New Directions in the Quest for Simpler Contracting', in Henschel R.F., Proceedings of the 2011 IACCM Academic Symposium on Contract and Commercial Management (Tempe, 2011) 80, 97.
Passera S. and Haapio H., ‘Transforming Contracts from Legal Rules to User-Centered Communication Tools: A Human-Information Interaction Challenge’ (2013) 1 Communication
Design Quarterly 38.
Passera S., 'Beyond the wall of text: How information design can make contracts user-friendly', in Marcus A. (ed) Design, User Experience, and Usability: Users and Interactions (Springer International Publishing 2015).
Passera S., 'Enhancing Contract Usability and User Experience Through Visualization – An Experimental Evaluation' (2012) 16th International Conference on Information Visualisation 376, 382.
Passera S., Beyond the Wall of Contract Text – Visualizing Contracts to Foster Understanding and
Collaboration within and across Organizations
(Aalto University 2017).
Passera S., Pohjonen S., Koskelainen K., Anttila S.,
‘User-friendly contracting tools – A visual guide to
facilitate public procurement contracting’ (2013)
Proceedings of the IACCM Academic Forum on
Contract and Commercial Management 2013,
Phoenix, USA. Available at SSRN: https://ssrn.com/abstract=2620595.
Patti S., 'Consenso', sub art. 23, in Bianca C.M., Busnelli F.D. (eds) La protezione dei dati personali. Commentario, vol. I (Cedam 2007).
Pavlou P.A., ‘State of the Information Privacy Literature: Where Are We Now and Where Should
We Go?’ (2011) MIS Quarterly 35 (4) 977-988.
Pedrazzi C., ‘Consenso dell’avente diritto’, in Enc.
dir., Annali IX, (Giuffrè 1961).
Perlingieri C., Ruggieri L. (eds.), Internet e diritto
civile (ESI 2015).
Perugini M.R., ‘Il consenso raccolto prima della
data di efficacia del GDPR sarà ancora valido?’
[2017] Europrivacy (online).
Petty R.E., Cacioppo J.T., ‘The Elaboration Likelihood Model of Persuasion’ in Communication and
Persuasion (Springer Series in Social Psychology
1986).
Pfister C., Getting Started with the Internet of
Things: Connecting Sensors and Microcontrollers to the Cloud, (O’Reilly 2011).
Pino G., ‘Il diritto all’identità personale ieri e oggi.
Informazione, mercato, dati personali’ in Pannetta R. (ed) Libera circolazione e protezione
dei dati personali, vol. I (Giuffrè 2006).
Pizzetti F., Privacy e il diritto europeo alla protezione dei dati personali. Il Regolamento europeo
2016/679, vol II (Giappichelli 2016).
Pizzetti F., Privacy e il diritto europeo alla protezione dei dati personali. Dalla direttiva 95/46 al
nuovo regolamento europeo, vol I (Giappichelli
2016).
Pollach I., ‘A Typology of Communicative Strategies in Online Privacy Policies: Ethics, Power
and Informed Consent’ (2005) 62 Journal of
Business Ethics 221.
Pollicino O., ‘Un digital right to privacy preso
(troppo) sul serio dai giudici di Lussemburgo?
Il ruolo degli artt. 7 e 8 della Carta di Nizza nel
reasoning di Google Spain’ (2014) 4/5 Dir. info.
569, 589.
Pollicino O., Bassini M., 'Trattamento dei dati
personali e ordine di protezione europeo: alla
ricerca di un punto di equilibrio’, in Belluta H.,
Ceresa-Gastaldo M. (eds), L’ordine europeo di
protezione: la tutela delle vittime di reato come
motore della cooperazione giudiziaria (Giappichelli 2016) 121, 153.
Popoli A.R., ‘Social Network e concreta protezione dei dati sensibili: luci ed ombre di una difficile
convivenza’ (2014) 6 Dir. Info 981, 1017.
Posner R.A., ‘Privacy, Surveillance, and Law’
(2008) 75 U. Chi. Law Re 245, 260.
Posner R.A., 'The Right of Privacy' (1997) 12 Georgia Law Rev 393, 422.
Priolo F., Di Cataldo V., Lattanzi R. and others, ‘Diritto alla privacy e trattamento automatizzato
dei dati fra diritto civile, diritto penale e diritto
internazionale ed europeo’ (2014) 63 I quaderni
europei 1, 106.
Proctor R., Ali A. and Vu K.-P., ‘Examining Usability of Web Privacy Policies’ (2008) 24 Int. J. Hum.
Comput. Interaction 307, 328.
Proietti G., La responsabilità nell’intelligenza artificiale e nella robotica. Attuali e futuri scenari
nella politica del diritto e nella responsabilità
contrattuale (Giuffrè Francis Lefebvre 2020).
Quan-Haase A., Williams C., Kicevski M., Elueze
I., Wellman B., ‘Dividing the Grey Divide: Deconstructing Myths About Older Adults’ Online Activities, Skills, and Attitudes’ (2018) 62 (9) Am.
Behav. Sci. 1207, 1228.
Rabin R.L., 'Perspectives on privacy, data security and tort law' (2017) 66 DePaul Law Review
313, 338.
Reed C., 'Liability of On-line Information Providers: Towards a Global Solution' (2003) 17 (3) International Review of Law, Computing & Technology 255, 265.
Reed C., Kennedy E., Silva S.N., 'Responsibility, Autonomy and Accountability: legal liability for machine learning', Legal Studies Research Paper no. 243 (2016) Queen Mary University of London website 1, 31.
Reidenberg J.R., Bhatia J., Breaux T.D., Norton T.B., 'Ambiguity in privacy policies and the impact of regulation' (2016) 45 (S2) The Journal of Legal Studies S163, S190.
Resta G., Autonomia privata e diritti della personalità (Napoli 2005).
Riccio G.M., Scorza G., Belisario E. (eds), GDPR e Normativa Privacy – Commentario (IPSOA, 2018).
Riva G.M., 'I Diritti della Personalità ed Internet', in Cassano G. (ed.), Stalking, Atti Persecutori, Cyberbullismo e Tutela dell'Oblio (Ipsoa 2017) 449-490.
Riva G.M., 'Privacy and Internet', in Cassano G. (ed.), Stalking, Atti Persecutori, Cyberbullismo e Tutela dell'Oblio (Ipsoa 2017) 391-448.
Rodotà S., 'Protezione dei dati personali e circolazione delle informazioni' [1984] Riv. crit. dir. priv. 732.
Rodotà S., 'Tecnologie dell'informazione e frontiere del sistema sociopolitico' (1982) 4 Pol. dir. 25, 34.
Rodotà S., Intervista su Privacy e libertà, Conti P. (ed) (Laterza 2005).
Rodotà S., Tecnologie e diritti (Il Mulino 1995).
Roman R., Najera P., Lopez J., 'Securing the Internet of Things' (2011) 44 (9) IEEE Computer 51, 58.
Roman R., Zhou J., Lopez J., 'On the features and challenges of security and privacy in distributed internet of things' (2013) 57 (10) Computer Networks 2266, 2279.
Romeo F., 'Dalla Giuritecnica di Vittorio Frosini alla Privacy by Design' (2016) XLII annata, Vol. XXV (2) Informatica e diritto 9, 23.
Rossi A., Palmirani M., 'From Words to Images Through Legal Visualization', in Pagallo U. and others (eds), AI Approaches to the Complexity of Legal Systems (Springer International Publishing 2018) 72-85.
Rossi A., Palmirani M., 'DaPIS: a Data Protection Icon Set to Improve Information Transparency under the GDPR' (2019) http://gdprbydesign.cirsfid.unibo.it/wp-content/uploads/2019/01/report_DaPIS_jan19.pdf.
Rothchild J.A., 'Against notice and choice: The manifest failure of the proceduralist paradigm to protect privacy online (or anywhere else)' [2018] Legal Studies Research Paper Series no. 40 Wayne State University Law School website 1, 85.
Salimbeni M.T., 'La riforma dell'articolo 4 dello Statuto dei lavoratori: l'ambigua risolutezza del legislatore' (2015) 4 Dir. inf. e informatica 589, 616.
Sandelands L.E., Larson J.R., 'When measurement causes task attitudes: A note from the laboratory' (1985) 70 J Appl Psychol 116, 121.
Santucci G., 'The Internet of Things: The way Ahead', in Vermesan O., Friess P. (eds), Internet of things – Global Technological and Societal Trends (River Publishers 2011) 53-100.
Sarma A.C., Girão J., 'Identities in the Future Internet of Things' (2009) 49 Wireless Personal Communications 353, 363.
Sartor G., 'Privacy, reputazione, affidamento: dialettica e implicazioni per il trattamento dei dati personali', in Mantelero A., Ruffo Bergadano F., Privacy digitale. Giuristi e informatici a confronto (Giappichelli 2006).
Sarzana F., Nicotra M., Diritto della Blockchain, Intelligenza Artificiale e IoT (IPSOA, 2018).
Schafer B., 'EU clarifies the European parameters of data protection' (2015) 62 Ingenia 12, 13.
Schafer B., Buchanan W., Fan L. and others, 'Computational Data Protection Law: Trusting each other Offline and Online' [2012] Legal Knowledge and Information Systems 31, 40.
Schafer B., Danidou Y., 'Legal Environments for Digital Trust: Trustmarks, Trusted Computing and the Issue of Legal Liability' (2012) 7 (3) Journal of International Commercial Law and Technology 212, 222.
Schafer B., 'Information Quality and Evidence Law: A New Role for Social Media, Digital Publishing and Copyright Law?', in Floridi L., Illari P. (eds) The Philosophy of Information Quality. Synthese Library (Studies in Epistemology, Logic, Methodology, and Philosophy of Science) vol. 358 (Springer, 2014) 217-239.
Schaub F., Balebako R., Durity A.L., Cranor L.F., ‘A
Design Space for Effective Privacy Notices’ Proceedings of the Eleventh USENIX Conference on
Usable Privacy and Security (USENIX Association 2015) 1, 17.
Schaub F., Könings B., Weber M., ‘Context-Adaptive Privacy: Leveraging Context Awareness to
Support Privacy Decision Making’ (2015) 14 (1)
IEEE Pervasive Computing 34, 43.
Schoeman F., ‘Privacy: Philosophical Dimensions’ (1984) 21 (3) Am. Philos. Q. 199, 213.
Scholz L.H., ‘Privacy Remedies’ (2019) vol. 94
(2) Indiana Law Journal 653, 688.
Schwartz P.M., Solove D., 'Notice and choice' [2009] The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children.
Sciaudone R., Caravà E., Il codice della privacy.
Commento al D. Lgs. 30 giugno 2003, n. 196 e al
D. Lgs. 10 agosto 2018 n. 101 alla luce del Regolamento (UE) 2016/679 (GDPR) (Pacini, 2019).
Scripa A., ‘Artificial Intelligence as a Digital Privacy Protector’ (2017) 31 (1) Harv. J.L. & Tech.
217, 235.
Sedgewick M.B., ‘Transborder Data Privacy as
Trade’ (2017) 105 (5) California Law Review
1512, 1542.
Shelby Z., The Wireless Embedded Internet (Wiley 2010).
Sia S., D’Antonio V., Riccio G.M. (eds), La nuova
disciplina europea della privacy (CEDAM 2016).
Sibony A.-L. and Helleringer G., ‘European Consumer Protection through the Behavioral Lens’
[2017] The Columbia Journal of European Law
607.
Sica S., ‘Danno morale e legge sulla privacy informatica’ [1997] Danno resp. 282, 286.
Simon H.A., ‘Theories of decision–making in
economics and behavioral science’ (1959) Am.
Econ. Rev 49 253-283.
Sinha G.A., ‘A Real-Property Model of Privacy’
(2019) Vol 68 (3) DePaul Law Review 567,614.
Skotko V.P., Langmeyer D., 'The effects of interaction distance and gender on self-disclosure in the dyad' (1977) 40 (2) Sociometry 178, 182.
Soffientini M. (ed), Privacy. Protezione e trattamento dei dati (IPSOA 2018).
Solove D.J., ‘Privacy self-management and the
Consent Dilemma’ (2013) 126 Harv. Law Rev.
1880, 1903.
Solove D.J., ‘Taxonomy of privacy’ (2006) Univ.
Pa. Law Rev. 154 (3) 477, 564.
Solove D.J., ‘The Myth of the Privacy Paradox’,
Legal Studies Research Paper [2020] George
Washington University Law School website 1, 42.
Spence R., Information Visualization: Design for
Interaction (Springer, 2014).
Spiekermann S., Grossklags J., Berendt B., ‘E-privacy in 2nd generation E-commerce: privacy preferences versus actual behavior’ Proceedings of
the 3rd ACM conference on Electronic Commerce
(EC ’01) (Association for Computing Machinery
2001) 38-47.
Spoto G., 'Disciplina del consenso e tutela del minore', in Sica S., D'Antonio V., Riccio G.M. (eds), La nuova disciplina europea della privacy (CEDAM 2016).
Stradella E., Palmerini E. (eds.), Law and Technology. The Challenge of Regulating Technological Development (Pisa University Press 2013).
Strahilevitz L.J., Kugler M.B., 'Is Privacy Policy Language Irrelevant to Consumers?' (2017) 45 (S2) Journ. Legal Studies 69, 95.
Stuart K.C., Mackinlay J.D., Shneiderman B.
(eds.), Readings in Information Visualization:
Using Vision to Think (Morgan Kaufmann Publishers Inc. 1999).
Taddicken M., 'The "Privacy Paradox" in the Social Web: The Impact of Privacy Concerns, Individual Characteristics, and the Perceived Social Relevance on Different Forms of Self-Disclosure' (2014) 19 J Comput-Mediat Comm 248, 273.
Tene O., Wolf C., 'The Draft EU General Data
Protection Regulation: Costs and Paradoxes of
Explicit Consent’ [2014] The Future of Privacy
Forum.
Thiene A., ‘Riservatezza e autodeterminazione
del minore nelle scelte esistenziali’ (2017) 2
Famiglia e diritto 172, 179.
Thiene A., ‘Segretezza e riappropriazione di
informazioni di carattere personale: riserbo e
oblio nel nuovo regolamento europeo’ (2017) 2
NLCC 410, 444.
Thiesse F., Michahelles F., 'Building the Internet of Things Using RFID' [2013] IEEE Internet Computing 48, 55.
Thobani S., I requisiti del consenso al trattamento dei dati personali (Maggioli 2016).
Tiersma P.M., Legal Language (University of Chicago Press 1999).
Toma C.L., Hancock J.T., 'Self-Affirmation Underlies Facebook Use' (2013) 39 (3) Pers. Soc. Psychol. Bull. 321, 331.
Tosi E., High tech law. The digital legal frame in
Italy. An overview of contracts, digital content
protection and ISP liabilities emerging issues
(Giuffrè, 2016).
Tschider C., ‘Regulating the IoT: Discrimination, Privacy, and Cybersecurity in the Artificial Intelligence Age’ (2018) 96 Denv. U. L. Rev. 87, 144.
Tversky A., Kahneman D., 'Advances in prospect theory: cumulative representation of uncertainty' (Springer, Cham 2016) 493-519.
Tyler S.W., Hertel P.T., McCallum M.C., Ellis H.C.,
‘Cognitive effort and memory’ (1979) 5 J. Exp.
Psychol. Learn. Mem. Cogn 607, 617.
Uckelmann D., Harrison M., Michahelles F., Architecting the Internet of Things (Springer 2011).
Valentino D. (ed.), Manuale di diritto dell’informatica (ESI 2016).
Van der Sloot B., ‘Where is the Harm in a Privacy
Violation? Calculating the Damages Afforded in
Privacy Cases by the European Court of Human
Rights’ (2017) 8 (4) JIPITEC (online).
Van Erp S., ‘Ownership of data and the numerus
clausus of legal object’ (2017) 6 Brigham-Kanner Prop. Rts. Conf. J. 1,15.
Vance A., Elie-Dit-Cosaque C., Straub D.W., 'Examining trust in information technology artifacts: The effects of system quality and culture' (2008) 24 (4) Manag. Inf. Syst. 73, 100.
Vasseur J.-P., Dunkels A., Interconnecting Smart
Objects with IP: The Next Internet (Elsevier 2010).
Vecchi P.M., ‘sub art. 1’, in Bianca C.M., Busnelli
F.D. (eds) La protezione dei dati personali. Commentario, vol I (Cedam 2007).
Villaronga E.F., Kieseberg P., Li T., ‘Humans Forget, Machines Remember: Artificial Intelligence
and the Right to Be Forgotten’ (2018) 34 Computer Law & Security Review 304, 313.
Visco Comandini V., ‘Il ruolo della privacy nella
competizione per l’accesso delle risorse pubblicitarie su Internet’ (2012) 1 Dir. Econ. e Tecn.
Della Privacy 1, 11.
Vivarelli A., Il consenso al trattamento dei dati
personali nell’era digitale (ESI, 2019).
Wachter S, Mittelstadt B., Russell C., ‘Counterfactual Explanations Without Opening the Black
Box: Automated Decisions and the GDPR’ (2017)
31 (2) Harvard Journal of Law & Technology
841,887.
Weber R.H., ‘Internet of Things. New security
and privacy challenges’ (2010) 26 (1) Computer law & security report 23, 30.
Welbourne E., Battle L., Cole G. and others, 'Building the Internet of Things Using RFID: The RFID
Ecosystem Experience’ (2009) 13 (3) Internet
Computing IEEE 48, 55.
Wood A., O’Brien D.R., Gasser U., ‘Privacy and
Open Data’ [2016] Networked Policy Series, Berkman Klein Center 1, 12.
Woolf V., Mrs. Dalloway (Wordsworth Editions Limited 2003).
Xu H., Teo H., Tan B.C.Y., Agarwal R., ‘Effects of
individual self-protection, industry self-regulation, and government regulation on privacy
concerns: A study of location-based services’
(2012) 23 (4) Inf. Syst. Res 1342-1363.
Xu H., Teo H.H., Tan B.C.Y., Agarwal R., ‘The Role
of Push-Pull Technology in Privacy Calculus: The
Case of Location-Based Services’ (2009) 26 (3)
Manag. Inf. Syst. 135, 174.
Yan L., Zhang Y., Yang L.T., Ning H., The Internet
of Things: From RFID to the Next-Generation Pervasive Networked Systems (Auerbach Publications 2008).
Yanisky-Ravid S., Hallisey S., ‘Equality and Privacy by Design: Ensuring Artificial Intelligence
(AI) Is Properly Trained & Fed: A New Model of AI
Data Transparency & Certification As Safe Harbor
Procedures’ (2019) 46 Fordham Urb. L. J. 428,
486.
Zanfir G., ‘Forgetting about consent. Why the focus should be on “suitable safeguards” in data
protection law’, Working Paper [2013] University of Craiova Faculty of Law and Administrative
Sciences website 1, 21.
Zeno-Zencovich V., 'Una lettura comparatistica della l. 675/96 sul trattamento dei dati personali', in Cuffaro V., Ricciuto V., Zeno-Zencovich V. (eds) Trattamento dei dati e tutela della persona (Giuffrè 1999).
Ziviz P., ‘Lesione del diritto all’immagine e risarcimento del danno’ (2000) 3 Resp. civ. prev. 710,
720.
Zuiderveen Borgesius F., ‘Behavioural Sciences
and the Regulation of Privacy on the Internet’
[2014] Legal Studies Research Paper no. 54 Amsterdam Law School website 1, 39.
Zuiderveen Borgesius F., 'Informed Consent: We Can Do Better to Defend Privacy' (2015) 13 (2) IEEE Security & Privacy 103, 107.
Sources: Index of Reports, studies and research projects; Legislation
‘EU justice committee passes amended ePrivacy directive’ [2017] DIMT (online).
‘Privacy and Technology. Technology Can Make
Your Life Safer and Easier, But Is It Worth the Risk
to Your Privacy’ [2019] Safehome.org (online)
INDEX OF REPORTS, STUDIES AND RESEARCH
PROJECTS
'Ex-post review of EU legislation: a well-established system, but incomplete' [2018] European Court of Auditors, https://www.eca.europa.eu/Lists/ECADocuments/SR18_16/SR_BETTER_REGULATION_EN.pdf
'La nuova disciplina dell'analisi e della verifica dell'impatto della regolamentazione' [2018] Senate of the Italian Republic, https://www.senato.it/application/xmanager/projects/leg18/ESPERIENZE_DIRETTIVA_AIR.pdf
'Privacy and Internet of Things: a behavioural and legal approach' [2017] University of Naples Suor Orsola Benincasa, https://www.unisob.na.it/ateneo/c008_e.htm?vr=1&lg=en
‘Training activities to implement data protection
reform’ (TAtoDPR), European Project [2017]
www.tatodpr.eu
'Leading the IoT: Gartner insight on how to lead in a connected world' [2017] Gartner, https://www.gartner.com/imagesrv/books/iot/iotEbook_digital.pdf
‘Privacy and Data Protection by Design’ [2017]
ENISA www.enisa.europa.eu
LEGISLATION
INTERNATIONAL LEGISLATION
Council of Europe, Guidelines on artificial intelligence and data protection, 25 January 2019,
Convention no. 108, T-PD(2019)01.
Declaration on ethics and data protection in artificial intelligence, 23 October 2018.
Council of Europe, Guidelines on the protection
of individuals with regard to the processing of
personal data in a world of Big Data, 23 January
2017, T-PD(2017)01.
Council of Europe, Convention for the protection
of individuals with regard to automatic processing of Personal Data, European Treaty Series,
no. 108, 28 January 1981.
EUROPEAN UNION LEGISLATION
Commission Implementing Decision 2016/1250/
EU of 12 July 2016 pursuant to Directive 95/46/
EC of the European Parliament and of the Council on the adequacy of the protection provided
by the EU-U.S. Privacy Shield [2016] OJ L 207/1.
Directive of the European Parliament and of the Council 2016/1148/EU of 6 July 2016 concerning measures for a high common level of security of network and information systems across the Union [2016] OJ L 194/1.
Regulation of the European Parliament and of
the Council 2016/679/EU of 27 April 2016 on
the protection of natural persons with regard to
the processing of personal data and on the free
movement of such data, and repealing Directive
95/46/EC (General Data Protection Regulation)
[2016] OJ L 119/1.
Directive of the European Parliament and of the
Council 2016/680/EU of 27 April 2016 on the
protection of natural persons with regard to the
processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal
offences or the execution of criminal penalties,
and on the free movement of such data, and repealing Council Framework Decision 2008/977/
JHA [2016] OJ L 119/89.
Directive of the European Parliament and of the
Council 2016/681/EU of 27 April 2016 on the
use of passenger name record (PNR) data for
the prevention, detection, investigation and prosecution of terrorist offences and serious crime
[2016] OJ L 119/132.
Directive of the European Parliament and of the
Council 2011/83/EU of 25 October 2011 on consumer rights amending council directive 93/13/
EEC and directive 1999/44/EC of the European
parliament and of the Council and repealing
council directive 85/577/EEC and directive
97/7/EC of the European parliament and of the
council [2011] OJ L 304.
Directive of the European Parliament and of the Council 2006/24/EC of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC [2006] OJ L 105/54.
NATIONAL LEGISLATIONS
Italy:
Provisions for the adaptation of national legislation to the provisions of Regulation (EU)
2016/679, D.Lgs. 10 August 2018 n. 101, OJ 205.
Provisions for the fulfilment of the obligations
resulting from Italy’s membership of the European Union - European Law 2017, L. 20 November 2017, n. 167, OJ 277.
Single text of media and radio services, d.lgs 31
July 2005, n 177, OJ 208.
Italian Data protection Code, d. lgs 30 June
2003 n 196, OJ 174.
Italian electronic Communication Code, d. lgs. 1
August 2003, n 259, OJ 214.
Italian Civil Code, R.D. 16 March 1942, n 262, OJ
79.
SOFT LAW
EUROPEAN UNION:
European Data Protection Board, Guidelines on consent under Regulation 2016/679, 4 May 2020, no. 5/2020.
European Data Protection Board, Guidelines on Article 25 Data Protection by Design and by Default, 20 October 2020, no. 4/2019.
High-Level Expert Group on Artificial Intelligence set up by the European Commission, Ethics Guidelines for Trustworthy AI, 8 April 2019.
European Commission, Communication to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions on Building Trust in Human-Centric Artificial Intelligence, COM(2019) 168 final.
European Commission, Communication to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 April 2018, on Artificial Intelligence for Europe (COM(2018) 237 final).
Proposal for a Regulation of the European Parliament and of the Council of 10 January 2017, concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (COM(2017) 10 final).
Article 29 Data Protection Working Party, Guidelines on transparency under Regulation 2016/679, 17/EN WP260 rev.01, April 2018.
Article 29 Data Protection Working Party, Guidelines on Consent under Regulation 2016/679, 28
November 2017, WP259.
European Commission Staff Working Document of 7 July 2017, on Better Regulation Guidelines (SWD(2017) 350).
Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 May 2017, on the Mid-Term Review on the implementation of the Digital Single Market Strategy (COM(2017) 288).
White Paper COM(2017) 2025 of 1 March 2017 on the Future of Europe, Reflections and Scenarios for the EU27 towards 2025 [2017].
Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 19 April 2016, on Digitising European Industry, Reaping the full benefits of a Digital Single Market (COM(2016) 180 final).
European Commission Staff Working Document of 19 April 2016, on Advancing the Internet of Things in Europe, accompanying the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions on Digitising European Industry, Reaping the full benefits of a Digital Single Market (SWD(2016) 110/2).
European Data Protection Supervisor, Opinion 4/2015, Towards a New Digital Ethics, September 2015.
European Parliament Resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)).
Council conclusions of 9 June 2016 on improving criminal justice in cyberspace [2016].
NATIONAL LEGISLATION
Italy:
Italian Data Protection Authority, Provision on Cambridge Analytica, 10 January 2019 [doc. web n. 9080914].
Italian Data Protection Authority, Measure identifying the provisions contained in general authorizations nos. 1/2016, 3/2016, 6/2016, 8/2016 and 9/2016 which are compliant with the GDPR and with Legislative Decree no. 101/2018 for the adjustment of the Privacy Code, 13 December 2018, no. 497 [doc. web n. 9068972].
Italian Data Protection Authority, Deontological rules relating to the processing of personal data carried out to conduct defensive investigations or to enforce or defend a right in judicial proceedings, published pursuant to art. 20, paragraph 4, of Legislative Decree 10 August 2018, n. 101, 29 December 2018, no. 512, OJ 12/2019 [doc. web n. 9069653].
Agency for Digital Italy (AGID), White Book on Artificial Intelligence at the service of the citizen [2017].
Italian Data Protection Authority, Interpretative
measure of some provisions of the Code SIC,
26 October 2017, no. 438, OJ 279 [doc. web n.
7221677].
Italian Data Protection Authority, Guidelines on
health records, 4 June 2015, no. 331, OJ n. 164
[doc. web n. 4084632].
Italian Data Protection Authority, Start of public consultation on the Internet of Things, 26 March 2015, no. 179 [doc. web n. 3898704].
Italian Data Protection Authority, Guidelines on the processing of personal data for online profiling, 19 March 2015, no. 161, OJ no. 103 [doc. web n. 3881513].
Italian Data Protection Authority, Approval of
the verification protocol which governs the monitoring activities by the Italian Data Protection
Authority on the instructions given to Google, 22
January 2015, no. 30 [doc. web n. 3738244].
Italian Data Protection Authority, General prescriptive provision on biometrics, 12 December
2014, no. 513, OJ 280 [doc. web n. 3556992].
Italian Data Protection Authority, Guidelines on
biometric recognition, 12 November 2014.
Italian Data Protection Authority, Guidelines on Marketing and against Spam, 4 July 2013, no. 330, OJ 174 [doc. web n. 2542348].
Article 29 Data Protection Working Party, Opinion 05/2014 on anonymisation techniques, 10 April 2014, WP216.
Italian Data Protection Authority, Guidelines on
the processing of personal data for the purpose
of publication and dissemination of web sites
exclusively dedicated to health, 25 January
2012, OJ 42 [doc. web n. 1870212].
Italian Data Protection Authority, Guidelines on
Processing Personal Data to Perform Customer
Satisfaction Surveys in the Health Care Sector, 5
May 2011, no. 182, OJ no. 120.
Italian Data Protection Authority, Work: the Data Protection Authority’s guidelines for electronic mail and the Internet, 1 March 2007, no. 13, OJ no. 58 [doc. web n. 1387522].
Italian Data Protection Authority, Guidelines for data processing of private employees, 23 November 2006, no. 53, OJ no. 285 [doc. web n. 1364939].
TABLE OF CASE-LAW
EUROPEAN UNION:
CJEU, Third Chamber, Judgment 3 October 2019, Facebook, C-18/18, ECLI:EU:C:2019:821.
CJEU, Grand Chamber, Judgment 24 September 2019, Google/CNIL, C-507/17, ECLI:EU:C:2019:772.
CJEU, Grand Chamber, Judgment 6 October 2015, Maximillian Schrems, C-362/14, EU:C:2015:650.
CJEU, Grand Chamber, Judgment 13 May 2014, Google Spain, C-131/12, EU:C:2014:317.
CJEU, Judgment 8 April 2014, Data Retention, C-293/12 and C-594/12, EU:C:2014:238.
ITALY:
Italian Supreme Court, Joint Civil Sections, Order, 22 July 2019, no. 19681, in Pluris (online).
Italian Supreme Court, civil section III, Order, 5
November 2018, no. 28084, in Pluris (online).
Italian Supreme Court, civil section I, Order, 19
June 2018, no. 16429, in Pluris (online).
Italian Supreme Court, civil section I, Order, 4
June 2018, no. 14242, in Pluris (online).
Italian Supreme Court, civil section I, Order, 25
January 2017, no. 1931, in Pluris (online).
Italian Supreme Court, civil section I, Order, 20
May 2016, no. 10510, in Pluris (online).
Italian Supreme Court, civil section VI, Order, 10
February 2016, no. 2687, in Pluris (online).
Italian Supreme Court, civil section III, Judgment,
19 July 2016, no. 14694, in DeJure (online).
Italian Supreme Court, civil section I, Judgment,
23 May 2016, no. 10638, in DeJure (online).
Italian Supreme Court, civil section VI, Order, 11 January 2016, no. 222, in DeJure (online) (with comment of Alovisio M., ‘Risarcimento del danno per diffusione indebita di dati sanitari’ in Dir. Giust. (2016) 3 ff.).
Italian Supreme Court, civil section III, Judgment, 15 October 2015, no. 20890 in Danno e
resp. (2016) 372, (with comment of Gagliardi M,
‘La prova del danno non patrimoniale in caso di
trattamento illecito dei dati personali’, 373 ff.)
Italian Supreme Court, civil section I, Judgment,
13 May 2015, no. 9785, in Fam. Dir. (2016) 469
ff.
Italian Supreme Court, civil section VI, Judgment,
5 September 2014, no. 18812, in DeJure (online).
Italian Supreme Court, civil section III, Judgment,
14 August 2014, no. 17974, in DeJure (online).
Italian Supreme Court, 15 July 2014, no. 16113, in Danno e resp. (anno) 339 ff. (with comment of Ceccarelli V.).
Italian Supreme Court, civil section, 19 May 2004, no. 10947, in Fam. dir. (anno) 468 ff.
Italian Court of Appeal of Milan, 22 July 2015, in
Danno e resp. (2015), 1047 (with comments of
Ponzanelli G., ‘Quanto vale la riservatezza di un
giocatore di calcio?’, 1057 ff. and Foffa R, ‘Il caso
Vieri: secondo tempo’ 1059).
Italian Tribunal of Palermo, 5 October 2017, no. 5261, in Altalex (online).
Italian Tribunal of Milan, 3 September 2012, in
Riv. it. medicina legale (2013) 1067 ff. (with
comment of Serani E.).
ANNEXES
QUESTIONARIO DI PROFILAZIONE
1. ETÀ
____________________________________________________
2. GENERE
• M • F
3. TITOLO DI STUDIO
____________________________________________________
4. PROFESSIONE
____________________________________________________
5. QUANTE ORE DI NAVIGAZIONE SU INTERNET IN MEDIA AL GIORNO?
• 0
• da 1 a 2
• da 2 a 3
• da 3 a 5
• più di 5
6. QUALE SISTEMA OPERATIVO UTILIZZA PREVALENTEMENTE?
• Microsoft Windows • Linux
• Apple Mac OSX e iOS • Android
• altro (specificare) _________________________
7. QUALE BROWSER UTILIZZA PREVALENTEMENTE?
• Mozilla Firefox
• Opera
• Chrome
• Safari
• Internet Explorer
• altro (specificare) ____________________________________________________
8. UTILIZZA UNO SMARTPHONE?
• Sì • No
9. LEGGE GIORNALI ONLINE?
• Sì • No
10. GUARDA IL CONTO BANCARIO ONLINE?
• Sì • No
11. HA UN ACCOUNT:
• Google
• Twitter
• Facebook
• Microsoft
12. È ABITUATO AD USARE E-COMMERCE? FA ACQUISTI ONLINE?
• Sì • No
13. CON QUALE OPERATORE:
• Amazon
• Ebay
• Yoox
• Asos
• altro (specificare) _______________________
14. È ABITUATO AD USARE APP?
• Sì • No
15. HA MAI SOSTENUTO UN ESAME UNIVERSITARIO IN MATERIE GIURIDICHE?
• Sì • No
16. QUALE? ____________________________________________________
17. SI INTERESSA DI ARGOMENTI DI DIRITTO NELLA VITA QUOTIDIANA (ES. NOTIZIE
SU LEGGI FINANZIARIE, INCHIESTE GIUDIZIARIE) SU GIORNALI, INTERNET, ETC.?
• Sì • No
18. CHE COS’È IL TRATTAMENTO DEI DATI PERSONALI?
• L’utilizzo di informazioni relative ad una persona per ricavarne il profilo
• La conservazione e/o l’utilizzo di dati personali da parte di un soggetto
diverso dal titolare dei dati
• La vendita a terze parti dei dati personali relativi ad un individuo
• Non lo so
19. QUANDO FIRMA UN DOCUMENTO CARTACEO (CONTRATTO, AUTORIZZAZIONE,
ETC.), PRESTA DI REGOLA ATTENZIONE A QUANTO È SCRITTO NEL DOCUMENTO
PRIMA DI FIRMARE?
• Sì • No
20. QUANDO NAVIGA IN INTERNET, TEME PER LA SICUREZZA DEI SUOI DATI?
• Sì • No
21. PENSA CHE I SISTEMI OPERATIVI (AD ES. WINDOWS 10, WINDOWS 6,
WINDOWS 8, EL CAPITAN) UTILIZZINO LE INFORMAZIONI CHE LEI FORNISCE
LORO NEL PIENO RISPETTO DELLA NORMATIVA SULLA PRIVACY
(D.LGS. 196/2003)?
• Sì • No
22. PENSA CHE I SISTEMI OPERATIVI CHE LEI UTILIZZA USINO LE INFORMAZIONI
CHE LEI FORNISCE LORO IN MODO TRASPARENTE?
• Sì • No
23. QUANTO È IMPORTANTE PER LEI PROTEGGERE I SUOI DATI PERSONALI
QUANDO NAVIGA IN INTERNET?
• Per nulla importante
• Poco importante
• Abbastanza importante
• Piuttosto importante
• Molto importante
24. HA MAI AVUTO PROBLEMI DI SICUREZZA INFORMATICA?
• Sì • No
SEZIONE I
25. SE HA AVUTO PROBLEMI DI SICUREZZA INFORMATICA, QUALI IN PARTICOLARE?
• Virus
• Furto di dati da parte di agenti esterni
• Perdita di dati per agenti esterni
• Altro (specificare) ____________________________________________________
26. PENSA DI SAPER CONFIGURARE IL SISTEMA OPERATIVO DA LEI UTILIZZATO
IN MODO CHE I SUOI DATI RISULTINO SICURI E PROTETTI?
• Sì • No
27. SU UNA SCALA CHE VA DA 1 (TOTALMENTE IN DISACCORDO) A 5 (TOTALMENTE D’ACCORDO), INDICHI QUANTO RITIENE DI ESSERE D’ACCORDO CON CIASCUNA DELLE AFFERMAZIONI SOTTO ELENCATE:
(1 = totalmente in disaccordo, 5 = totalmente d’accordo)
Credo sia utile acconsentire al rilevamento della mia posizione e della cronologia delle mie posizioni da parte di app   • • • • •
Penso sia utile che Windows e Cortana riconoscano la mia voce per creare suggerimenti migliori per me   • • • • •
Ritengo sia utile acconsentire alle app di accedere al mio nome, alla mia immagine e ad altre info sul mio account   • • • • •
Credo sia utile consentire alle app di condividere e sincronizzare automaticamente le mie informazioni con dispositivi wireless non associati in modo esplicito al mio PC, tablet o telefono   • • • • •
SEZIONE II.I
28. DURANTE L’INSTALLAZIONE DI WINDOWS 10 (W10), HA COMPRESO SE VI
ERANO DEI PASSAGGI RELATIVI ALLA PRIVACY ED ALLA PROTEZIONE DEI SUOI
DATI PERSONALI?
assolutamente no • • • • • assolutamente sì
29. RITIENE CHE LE INFORMAZIONI IN MATERIA DI PRIVACY E DATI PERSONALI
SIANO STATE PRESENTATE IN MODO CHIARO?
assolutamente no • • • • • assolutamente sì
Annexes: Questionario di profilazione
30. USANDO UNA SCALA CHE VA DA 1 (ASSOLUTAMENTE NO) A 5
(ASSOLUTAMENTE SÌ), RISPONDA ALLE DOMANDE SOTTO ELENCATE:
(1 = assolutamente no, 5 = assolutamente sì)
In fase di completamento dell’installazione di W10, ha compreso in maniera chiara che cliccando sul tasto “Fare più in fretta” in fondo alla pagina ha dichiarato di acconsentire all’uso dei suoi dati personali e dei cookies?   • • • • •
Ritiene che, nella fase di configurazione di Cortana, le siano state fornite in maniera chiara le informazioni relative al modo in cui Microsoft potrebbe utilizzare i suoi dati?   • • • • •
Cortana può raccogliere informazioni in merito alla sua grafia e alla sua voce per aiutarla nelle ricerche. Ritiene le sia stato chiesto in maniera chiara il consenso all’utilizzo di questi dati?   • • • • •
31. HA MAI PRESTATO IL CONSENSO AL TRATTAMENTO DEI DATI PERSONALI
IN ALTRI SISTEMI OPERATIVI?
• Sì • No
SEZIONE II.II
32. SU UNA SCALA CHE VA DA 1 (ASSOLUTAMENTE NO) A 5 (ASSOLUTAMENTE SÌ),
INDICHI QUANTO APPROVA I CONTENUTI DELLE DOMANDE SOTTO ELENCATE:
(1 = assolutamente no, 5 = assolutamente sì)
Se ha acconsentito alla modalità “Fare più in fretta” al momento dell’installazione, saprebbe indicare cosa determina questa modalità di installazione sui dati che lei fornisce (es. tipi di dati forniti, loro utilizzo)?   • • • • •
Ha compreso quali sono le conseguenze per la sua privacy delle impostazioni scelte nel pannello “Privacy”?   • • • • •
33. HA COMPRESO CHE COSA COMPORTA ACCONSENTIRE AL TRATTAMENTO
DEI SUOI DATI PERSONALI DA PARTE DI SOGGETTI TERZI?
• No
• Non lo so
• Sì, significa che anche soggetti diversi da quelli che hanno raccolto
il mio consenso, potranno usare i miei dati per finalità varie
• Sì, vuol dire che solo chi ha raccolto il mio consenso può utilizzare i miei dati
• Significa che i miei dati possono essere utilizzati solo da soggetti terzi
SEZIONE II.III
34. USANDO UNA SCALA CHE VA DA 1 (ASSOLUTAMENTE NO) A 5
(ASSOLUTAMENTE SÌ), RISPONDA ALLE DOMANDE SOTTO ELENCATE
(1 = assolutamente no, 5 = assolutamente sì)
In base alle informazioni fornite nella fase di installazione e nelle impostazioni della privacy, saprebbe richiedere l’adeguamento, modifica o cancellazione dei dati personali trattati in W10?   • • • • •
Pensa che dare o non dare il consenso al trattamento dei dati personali incida molto sul funzionamento di W10?   • • • • •
SEZIONE II.IV
35. USANDO UNA SCALA CHE VA DA 1 (ASSOLUTAMENTE NO) A 5
(ASSOLUTAMENTE SÌ), RISPONDA ALLE DOMANDE SOTTO ELENCATE
(1 = assolutamente no, 5 = assolutamente sì)
Ha compreso bene che Cortana, se abilitata, sfrutta tutti i dati presenti sul suo dispositivo?   • • • • •
Ha compreso bene che installando W10 in modalità “Fare più in fretta”, Cortana ha accesso ai suoi dati presenti sul dispositivo?   • • • • •
36. CON QUANTA ATTENZIONE HA LETTO L’INFORMATIVA SULLA PRIVACY
E SUI COOKIE QUANDO HA SCARICATO L’APP?
• Nessuna attenzione
• Scarsa
• Media
• Molta
• Massima attenzione
37. USANDO UNA SCALA CHE VA DA 1 (PER NIENTE) A 5 (COMPLETAMENTE),
RISPONDA ALLE DOMANDE SOTTO ELENCATE
(1 = per niente, 2 = poco, 3 = abbastanza, 4 = molto, 5 = completamente)
Quanto ritiene che sia sicuro il suo PC?   • • • • •
Il suo PC è attaccabile da terzi?   • • • • •
Quanto ritiene di aver memorizzato l’informativa che ha letto?   • • • • •
SEZIONE III
38. SI SENTE IN GRADO DI GESTIRE I SUOI DATI PERSONALI SU WINDOWS 10?
• Sì • No
39. TROVA CHE WINDOWS 10 FACILITI L’IMPOSTAZIONE DELLA SICUREZZA DEL SUO
PC, DEI CONTENUTI E DEI SUOI DATI PERSONALI?
• Sì • No
40. RITIENE SIA VALSA LA PENA ESEGUIRE OPERAZIONI COME QUELLE CHE LE
SONO STATE RICHIESTE PER LA GESTIONE DEI DATI PERSONALI?
• Sì, molto
• No, per niente
• Abbastanza
41. QUANDO NAVIGA SU INTERNET RITIENE CHE I SUOI DATI SIANO PROTETTI
DA FURTI, APPROPRIAZIONI NON AUTORIZZATE DA TERZI O ALTRE AZIONI SIMILI?
• Sì, molto
• Sì, abbastanza
• Sì, ma poco
• Non so
• No, per niente
42. RITIENE POSSIBILE CHE VENGANO GESTITI DEI DATI PERSONALI IN ASSENZA
DEL SUO CONSENSO?
• Sì, lo ritengo possibile
• No, non lo ritengo possibile
43. PENSA CHE CI POSSANO ESSERE DELLE CONSEGUENZE NEGATIVE PER LEI
DALL’UTILIZZO DA PARTE DI TERZI DEI SUOI DATI PERSONALI?
• Sì • No
44. POTREBBE FORNIRE QUALCHE ESEMPIO?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
45. È PIÙ IMPORTANTE PRESERVARE LA SICUREZZA DEL SUO PC O PRESERVARE
LA PRIVACY?
• Preservare la privacy
• Preservare la sicurezza del PC
• Non lo so
46. RISPONDA ALLE SEGUENTI DOMANDE:
Sarebbe disposto a pagare una somma pari a 20 euro, se così facendo, potesse evitare che i suoi dati venissero venduti ad altri per finalità commerciali?   • Sì • No • Non lo so
Sarebbe disposto a cedere un numero consistente di dati personali (es.: numero di cellulare, codice fiscale, matricola universitaria, ecc.) al solo scopo di garantire una maggiore sicurezza del suo PC e del suo contenuto?   • Sì • No • Non lo so
Quando ha navigato su internet ed è comparso l’avviso dei cookie, ha acconsentito?   • Sì • No • Non lo so
Secondo lei con Windows 10 è possibile autorizzare il trattamento soltanto di alcuni dei suoi dati personali?   • Sì • No • Non lo so
Ritiene possibile che con l’autorizzazione dei cookie lei possa scegliere soltanto alcuni operatori per il trattamento dei suoi dati personali?   • Sì • No • Non lo so
47. CHE DIFFERENZA C’È TRA L’AUTORIZZAZIONE AI COOKIE E L’AUTORIZZAZIONE
AL TRATTAMENTO DEI DATI PERSONALI?
• Nessuna
• L’autorizzazione ai dati personali è più ampia
• L’autorizzazione ai cookie è più ampia
• Non lo so
SEZIONE III.I (aggiuntiva)
48. A COSA SERVE DARE IL CONSENSO AL TRATTAMENTO DEI DATI PERSONALI?
• A proteggerli, cioè ad impedirne usi abusivi
• A esercitare una scelta consapevole
• Ad autodeterminarsi
• Non lo so
SEZIONE III.II (aggiuntiva)
49. RITIENE CHE IL CONSENSO AL TRATTAMENTO EFFETTIVAMENTE PROTEGGA I
SUOI DATI PERSONALI?
• Sì • No • Non lo so
SEZIONE III.III (aggiuntiva)
50. PERCHÉ?
• Con il consenso i miei dati personali verranno trattati solo dai soggetti
autorizzati
• Con il consenso non evito in pratica che i miei dati vengano trattati
contro la legge
• Con il consenso non potrò ricevere danno dal trattamento dei miei dati
personali
• Altro (specificare)
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SEZIONE III.IV (aggiuntiva)
51. PERCHÉ?
• Con il consenso, in pratica, non evito che i miei dati vengano trattati anche
da soggetti non autorizzati
• Con il consenso non evito in pratica che i miei dati vengano trattati contro
la legge
• Con il consenso non evito di subire dei danni dal trattamento dei miei dati
personali
• Altro (specificare)
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SEZIONE III.V (aggiuntiva)
52. PENSA CI POSSANO ESSERE CONSEGUENZE NEGATIVE PER LEI DERIVANTI
DALL’UTILIZZO DI SUOI DATI PERSONALI SENZA IL SUO CONSENSO?
• Sì • No • Non lo so
SEZIONE III.VI (aggiuntiva)
53. PERCHÉ?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SEZIONE III.VII (aggiuntiva)
54. QUALI?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SEZIONE III.VIII (aggiuntiva)
55. RITIENE CHE NEGARE IL CONSENSO ALL’UTILIZZO DEI DATI POSSA COSTITUIRE
UN LIMITE A TALI CONSEGUENZE NEGATIVE?
• Sì • No • Non lo so
56. CHE SIGNIFICA PER LEI CHE I SUOI DATI DEBBANO ESSERE PROTETTI?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
57. QUANDO AUTORIZZA I COOKIE, CONOSCE LA DISTINZIONE TRA COOKIE GENERALI
E COOKIE NECESSARI?
• Sì • No • Non lo so
58. SA COS’È LA PROFILAZIONE DEI DATI?
• Sì
• No • Non lo so
59. SE SÌ, CHE COS’È E COME SI OTTIENE?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SEZIONE IV
60. LE CHIEDIAMO DI ESPRIMERE UNA VALUTAZIONE COMPLESSIVA
DELL’INTERFACCIA CHE HA APPENA PROVATO. SU UNA SCALA CHE VA DA 1
(TOTALMENTE IN DISACCORDO) A 5 (TOTALMENTE D’ACCORDO), INDICHI
QUANTO RITIENE DI ESSERE D’ACCORDO CON CIASCUNA DELLE
AFFERMAZIONI SOTTO ELENCATE:
(1 = totalmente in disaccordo, 5 = totalmente d’accordo)
Penso che userei volentieri questo sistema   • • • • •
Ho trovato il sistema complesso da usare   • • • • •
Ho trovato il sistema molto semplice da usare   • • • • •
Penso che avrei bisogno del supporto di una persona già in grado di utilizzare il sistema   • • • • •
Ho trovato le varie funzionalità del sistema bene integrate   • • • • •
Ho trovato incoerenze tra le varie funzionalità del sistema   • • • • •
Penso che la maggior parte delle persone potrebbe imparare a utilizzare il sistema facilmente   • • • • •
Ho trovato le procedure molto macchinose da eseguire   • • • • •
Mi è parso di avere molta confidenza nell’uso delle procedure   • • • • •
Penso che dovrei imparare molte cose del sistema, prima di poter completare i compiti   • • • • •
PROFILING QUESTIONNAIRE
1. AGE
____________________________________________________
2. GENDER
• M • F
3. EDUCATIONAL QUALIFICATION
____________________________________________________
4. PROFESSION
____________________________________________________
5. HOW MANY HOURS OF INTERNET BROWSING PER DAY, ON AVERAGE?
• 0
• from 1 to 2
• from 2 to 3
• from 3 to 5
• more than 5
6. WHICH OPERATING SYSTEM DO YOU MAINLY USE?
• Microsoft Windows • Linux
• Apple Mac OS X and iOS • Android
• Other (specify) _________________________
7. WHICH BROWSER DO YOU MAINLY USE?
• Mozilla Firefox
• Opera
• Chrome
• Safari
• Internet Explorer
• Other (specify) ____________________________________________________
8. DO YOU USE A SMARTPHONE?
• Yes • No
9. DO YOU READ ONLINE NEWSPAPERS?
• Yes • No
10. DO YOU CHECK YOUR ONLINE BANK ACCOUNT?
• Yes • No
11. DO YOU HAVE AN ACCOUNT:
• Google
• Facebook
• Twitter
• Microsoft
12. ARE YOU USED TO USING E-COMMERCE? DO YOU SHOP ONLINE?
• Yes • No
13. WITH WHICH OPERATOR:
• Amazon
• Yoox
• Ebay
• Asos
• Other (specify) _______________________
14. ARE YOU USED TO USING APPS?
• Yes • No
15. HAVE YOU EVER TAKEN A UNIVERSITY EXAM IN LEGAL SUBJECTS?
• Yes • No
16. WHICH ONE? ____________________________________________________
17. ARE YOU INTERESTED IN LEGAL TOPICS IN EVERYDAY LIFE (E.G. NEWS ON FINANCIAL LAWS, JUDICIAL INQUIRIES) IN NEWSPAPERS, ON THE INTERNET, ETC.?
• Yes • No
18. WHAT IS THE PROCESSING OF PERSONAL DATA?
• The use of information relating to a person in order to build a profile of them
• The storage and/or use of personal data by a party other than the data owner
• The sale of an individual’s personal data to third parties
• I don’t know
19. WHEN YOU SIGN A PAPER DOCUMENT (CONTRACT, AUTHORIZATION, ETC.), DO YOU USUALLY PAY ATTENTION TO WHAT IS WRITTEN IN THE DOCUMENT BEFORE SIGNING IT?
• Yes • No
20. WHEN YOU BROWSE THE INTERNET, ARE YOU WORRIED ABOUT THE SECURITY OF YOUR DATA?
• Yes • No
21. DO YOU THINK OPERATING SYSTEMS (E.G. WINDOWS 10, WINDOWS 6, WINDOWS 8, EL CAPITAN) USE THE INFORMATION YOU PROVIDE THEM IN FULL COMPLIANCE WITH PRIVACY LEGISLATION (D.LGS. 196/2003)?
• Yes • No
22. DO YOU THINK THE OPERATING SYSTEMS YOU USE HANDLE THE INFORMATION YOU PROVIDE THEM TRANSPARENTLY?
• Yes • No
23. HOW IMPORTANT IS IT TO YOU TO PROTECT YOUR PERSONAL DATA WHILE BROWSING THE INTERNET?
• Not at all important
• Slightly important
• Fairly important
• Rather important
• Very important
24. HAVE YOU EVER EXPERIENCED IT SECURITY PROBLEMS?
• Yes • No
SECTION I
25. IF YOU HAVE HAD IT SECURITY PROBLEMS, WHICH ONES IN PARTICULAR?
• Viruses
• Data theft by external agents
• Data loss caused by external agents
• Other (specify) ____________________________________________________
26. DO YOU THINK YOU ARE ABLE TO CONFIGURE THE OPERATING SYSTEM YOU USE SO THAT YOUR DATA ARE SECURE AND PROTECTED?
• Yes • No
27. ON A SCALE FROM 1 (FULLY DISAGREE) TO 5 (FULLY AGREE), INDICATE HOW MUCH YOU AGREE WITH EACH OF THE STATEMENTS LISTED BELOW:
(1 = fully disagree, 5 = fully agree)
I believe it is useful to consent to apps detecting my location and my location history   • • • • •
I believe it is useful for Windows and Cortana to recognize my voice in order to provide better suggestions for me   • • • • •
I think it is useful to allow apps to access my name, my picture and other information related to my account   • • • • •
I think it is useful to allow apps to automatically share and synchronize my information with wireless devices not explicitly paired with my PC, tablet or phone   • • • • •
SECTION II.I
28. WHILE INSTALLING WINDOWS 10 (W10), DID YOU NOTICE WHETHER THERE WERE STEPS RELATING TO PRIVACY AND THE PROTECTION OF YOUR PERSONAL DATA?
absolutely not • • • • • absolutely yes
29. DO YOU THINK THE INFORMATION CONCERNING PRIVACY AND PERSONAL DATA WAS PRESENTED CLEARLY?
absolutely not • • • • • absolutely yes
30. ON A SCALE FROM 1 (ABSOLUTELY NOT) TO 5 (ABSOLUTELY YES), ANSWER THE FOLLOWING QUESTIONS:
(1 = absolutely not, 5 = absolutely yes)
When completing the installation of W10, did you clearly understand that by clicking the “Fare più in fretta” (“go faster”) button at the bottom of the page you consented to the use of your personal data and cookies?   • • • • •
Do you think that, when setting up Cortana, the information on how Microsoft might use your data was presented clearly?   • • • • •
Cortana can collect information about your handwriting and your voice in order to help you with searches. Do you think you were clearly asked for consent to the use of these data?   • • • • •
31. HAVE YOU EVER GIVEN CONSENT TO THE PROCESSING OF YOUR PERSONAL DATA IN OTHER OPERATING SYSTEMS?
• Yes • No
SECTION II.II
32. ON A SCALE FROM 1 (ABSOLUTELY NOT) TO 5 (ABSOLUTELY YES), ANSWER THE FOLLOWING QUESTIONS:
(1 = absolutely not, 5 = absolutely yes)
If you chose the “Fare più in fretta” (“go faster”) option during installation, would you be able to say what this installation mode entails for the data you provide (e.g. types of data provided, how they are used)?   • • • • •
Did you understand the consequences for your privacy of the settings chosen in the “Privacy” panel?   • • • • •
33. DID YOU UNDERSTAND WHAT CONSENTING TO THE PROCESSING OF YOUR PERSONAL DATA BY THIRD PARTIES ENTAILS?
• No
• I don’t know
• Yes, it means that parties other than those who collected my consent will also be able to use my data for various purposes
• Yes, it means that only those who collected my consent can use my data
• It means that my data can only be used by third parties
SECTION II.III
34. ON A SCALE FROM 1 (ABSOLUTELY NOT) TO 5 (ABSOLUTELY YES), ANSWER THE FOLLOWING QUESTIONS:
(1 = absolutely not, 5 = absolutely yes)
Based on the information provided during installation and in the privacy settings, would you be able to request the updating, modification or deletion of the personal data processed in W10?   • • • • •
Do you think that giving or withholding consent to the processing of personal data greatly affects how W10 works?   • • • • •
SECTION II.IV
35. ON A SCALE FROM 1 (ABSOLUTELY NOT) TO 5 (ABSOLUTELY YES), ANSWER THE FOLLOWING QUESTIONS:
(1 = absolutely not, 5 = absolutely yes)
Did you fully understand that Cortana, if enabled, uses all the data on your device?   • • • • •
Did you fully understand that by installing W10 in “Fare più in fretta” (“go faster”) mode, Cortana has access to the data on your device?   • • • • •
36. HOW CAREFULLY DID YOU READ THE PRIVACY AND COOKIE NOTICE WHEN YOU DOWNLOADED THE APP?
• No attention
• Little
• Medium
• A lot
• Full attention
37. ON A SCALE FROM 1 (NOT AT ALL) TO 5 (COMPLETELY), ANSWER THE FOLLOWING QUESTIONS:
(1 = not at all, 2 = a little, 3 = enough, 4 = a lot, 5 = completely)
How secure do you think your PC is?   • • • • •
Can your PC be attacked by third parties?   • • • • •
How well do you think you remember the notice you read?   • • • • •
SECTION III
38. DO YOU FEEL ABLE TO MANAGE YOUR PERSONAL DATA ON WINDOWS 10?
• Yes • No
39. DO YOU FIND THAT WINDOWS 10 MAKES IT EASY TO SET UP THE SECURITY OF YOUR PC, ITS CONTENTS AND YOUR PERSONAL DATA?
• Yes • No
40. DO YOU THINK IT WAS WORTH PERFORMING OPERATIONS LIKE THOSE YOU WERE ASKED TO CARRY OUT FOR MANAGING YOUR PERSONAL DATA?
• Yes, a lot
• Not at all
• Sufficiently
41. WHEN YOU BROWSE THE INTERNET, DO YOU THINK YOUR DATA ARE PROTECTED FROM THEFT, UNAUTHORIZED APPROPRIATION BY THIRD PARTIES OR OTHER SIMILAR ACTIONS?
• Yes, a lot
• Yes, sufficiently
• Yes, but not much
• I don’t know
• Not at all
42. DO YOU THINK IT IS POSSIBLE THAT YOUR PERSONAL DATA ARE PROCESSED WITHOUT YOUR CONSENT?
• Yes, I think it is possible
• No, I do not think it is possible
43. DO YOU THINK THERE COULD BE NEGATIVE CONSEQUENCES FOR YOU FROM THE USE OF YOUR PERSONAL DATA BY THIRD PARTIES?
• Yes • No • I don’t know
44. COULD YOU GIVE SOME EXAMPLES?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
45. WHICH IS MORE IMPORTANT: SAFEGUARDING YOUR PC OR PROTECTING YOUR PRIVACY?
• Protecting privacy
• Protecting the safety of my PC
• I don’t know
46. ANSWER THE FOLLOWING QUESTIONS:
yes / no / I don't know
Would you pay a sum of 20 euros if it prevented your data from being sold to others for commercial purposes?
• • •
Would you provide a large amount of personal data (e.g. mobile phone number, tax code, university registration number, etc.) solely to ensure greater security of your PC and its contents?
• • •
When you surfed the internet and a cookie alert appeared, did you accept it?
• • •
In your opinion, with Windows 10, can you authorize the processing of only some of your personal data?
• • •
Do you consider that when authorizing cookies you can choose only a few operators to process your personal data?
• • •
47. WHAT IS THE DIFFERENCE BETWEEN COOKIE AUTHORIZATION AND
AUTHORIZATION TO PROCESS YOUR PERSONAL DATA?
• None
• The authorization for personal data is wider
• Cookie authorization is wider
• I don’t know
SECTION III.I (additional)
48. WHAT IS THE PURPOSE OF GIVING CONSENT TO THE PROCESSING OF
PERSONAL DATA?
• To protect them, that is, to prevent abusive uses of them
• To make a conscious choice
• To exercise self-determination
• I don’t know
SECTION III.II (additional)
49. DO YOU CONSIDER THAT YOUR CONSENT TO THE PROCESSING EFFECTIVELY PROTECTS YOUR PERSONAL DATA?
• Yes • No • I don’t know
SECTION III.III (additional)
50. WHY?
• With my consent, my personal data will only be processed by authorized persons
• With my consent, my personal data will be processed according to the law
• With my consent, I will not suffer damages as a result of the unlawful processing of my personal data
• Other (specify)
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SECTION III.IV (additional)
51. WHY?
• Even with my consent, in practice, I cannot prevent my data from also being processed by unauthorized persons
• Even with my consent, I cannot prevent my data from being processed against the law
• Even with my consent, I will not avoid suffering damages caused by the processing of my personal data
• Other (specify)
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SECTION III.V (additional)
52. DO YOU THINK THERE MAY BE NEGATIVE CONSEQUENCES FOR YOU ARISING
FROM THE USE OF YOUR PERSONAL DATA WITHOUT YOUR CONSENT?
• Yes • No • I don’t know
SECTION III.VI (additional)
53. WHY?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SECTION III.VII (additional)
54. WHICH?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SECTION III.VIII (additional)
55. DO YOU CONSIDER THAT DENYING CONSENT TO THE USE OF YOUR DATA COULD LIMIT THOSE NEGATIVE CONSEQUENCES?
• Yes • No • I don’t know
56. WHAT DOES IT MEAN FOR YOU TO HAVE YOUR DATA PROTECTED?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
57. WHEN YOU AUTHORIZE COOKIES, DO YOU KNOW THE DISTINCTION BETWEEN GENERAL COOKIES AND NECESSARY COOKIES?
• Yes • No • I don’t know
58. DO YOU KNOW WHAT DATA PROFILING IS?
• Yes • No • I don’t know
59. IF SO, WHAT IS IT AND HOW IS IT CARRIED OUT?
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
__________________________________________________________________________________________________________________________________
SECTION IV
60. WE ASK YOU TO GIVE AN OVERALL RATING OF THE INTERFACE YOU HAVE JUST TRIED. ON A SCALE FROM 1 (TOTALLY DISAGREE) TO 5 (TOTALLY AGREE), INDICATE HOW MUCH YOU AGREE WITH EACH OF THE STATEMENTS LISTED BELOW:
totally disagree / totally agree
I think I would be happy to use this system
• • • • •
I found the system difficult to manage
• • • • •
I found the system easy to manage
• • • • •
I think I would need the support of a person who can already use the system
• • • • •
I found the various features of the system well integrated
• • • • •
I found inconsistencies between the features of the system
• • • • •
I think most people could learn to use the system easily
• • • • •
I found the procedures very tricky to follow
• • • • •
I felt very confident using the procedures
• • • • •
I think I would need to learn many things about the system before I could complete the tasks
• • • • •
Printed in January 2021
Area University Press, Università degli Studi Suor Orsola Benincasa
Via Suor Orsola, 10 – 80135 Naples
Italy