
Power in the digital age: how to govern through algorithms? (2017)

2017, Draft, Talk

Presentation at the event "Café & Chat: who governs algorithms?", organized by the Reference Institute in Internet and Society (IRIS) and by the International Studies Group on Intellectual Property, Internet and Innovation (GNet), at the UFMG Law School, in Belo Horizonte, on August 18, 2017.

Power in the digital age: How to govern through algorithms?

Marco Antônio Sousa Alves
Professor at the Faculty of Law of the Federal University of Minas Gerais

"To govern is to follow a path, or put on a path" (Foucault, Security, Territory, Population, lecture of 8 February 1978, p. 166)

First, I would like to thank the organizers for the invitation. It is a pleasure for me to be here today. I also want to welcome our guest; it is an honor for me to share this panel with you.

I would like to start with a brief warning. The proposed title for this panel was "Who governs the algorithms? Transparency and prejudice in the information society". I do not intend to move away from these issues (government, algorithms and transparency), but I would like to approach them from a different question. More than who governs algorithms, my question is how to govern with or through algorithms. In other words, I would like to understand how we are governed by algorithms. In posing the question this way, my attention turns to the strategy of power employed, following a clearly Foucauldian approach.

I would like to begin by outlining my recent research trajectory. My intention is to present the path that brought me to the question of the government of algorithms. I hope, in this way, to clarify the kind of research that I have carried out and that I intend to continue developing in the coming years.

Four years ago I defended a doctoral thesis in the Graduate Program in Philosophy of the Federal University of Minas Gerais on the emergence of the modern notion of authorship and the birth of copyright. It was a work strongly inspired by Michel Foucault's genealogical research. After the thesis, I began a new investigation, as a postdoctoral researcher, on more current issues related to the digital world and the internet. The title of this research, which I developed over the last four years, was "Internet, cyberculture, new technologies of power and subject-positions: for a diagnosis of the present time". In general terms, the research analyzed the current transformations that shape what is known as "cyberculture", highlighting cognitive impacts, new technologies of power and emerging subject positions. Several questions were addressed: What are the cognitive, political and ethical repercussions of this process? That is, how are our thinking and our cognitive abilities changed by the new practices of the digital world? What kind of power regime is set up by this new order, and how does it work? And what are the emerging subject positions?

These questions point to three central domains of our philosophical tradition: the cognitive, the political and the ethical. In short, the interest lies in the way we think, in the way power works, and in the new forms of constitution of the contemporary self. The first two years of the research were devoted to the cognitive issue, that is, to the way we think (the changes in intellectual technologies and their impact on our cognitive abilities), and also to the question of contemporary subjectivity, that is, the new practices of constitution of the self. Over the past two years, the focus of the investigation was directed to the question of power in the digital age, with emphasis on new social movements, surveillance practices and resistance strategies.

This semester I am taking a new step in my career, as Professor of Theory and Philosophy of Law in this Faculty.
I intend to continue my research, with even more emphasis on the problem of power, as well as on the challenges that the information society presents to the field of law. I also want to continue here at the Law School the study group that I have coordinated at the Faculty of Philosophy and Human Sciences since 2014. The group, called SIGA (Study Group on Information Society and Algorithmic Government), will have its first meeting next week, and all are invited to attend. This semester's study will be dedicated to the Belgian philosopher of law Antoinette Rouvroy, who has several works on the subject of algorithmic governmentality. She is currently a researcher at the FNRS (Fonds pour la Recherche Scientifique / Fund for Scientific Research), attached to the Centre de Recherche Information, Droit et Société (Research Centre on Information, Law and Society) of the Université de Namur.

This afternoon I would like to talk about this notion of algorithmic governmentality developed by Antoinette Rouvroy, who is strongly influenced by Michel Foucault's work. Considering that this research is still taking its first steps, my intention here is simply to raise some questions and suggest some provisional theses. I would like to address the following issues:

1) What does governmentality mean?
2) What is an algorithm? In what way does an algorithm govern?
3) What does it mean to be governed by algorithms? What challenges does this bring to law and to human freedom? What kind of resistance is still possible against algorithmic governmentality?

1. What does governmentality mean?

Foucault introduced the notion of 'governmentality' during his lectures at the Collège de France in the late 1970s. In his lecture on January 10, 1979, Foucault explains that he was interested in studying "the art of government", or the way an individual's conduct is shaped and controlled. In these lectures Foucault defined and explored a fresh domain of research into what he called 'governmental rationality' or, in his own neologism, 'governmentality'.

Foucault's 1978 lectures at the Collège de France, titled Security, Territory, Population, start with an analysis of the "how" of power. In this context, biopower and the apparatuses of security are studied, in addition to sovereign power and disciplinary power. Biopower refers to a set of procedures, or relations, that turn the biological features of the human species into the object of a strategy for governing an entire population. Population in this sense refers not simply to 'people' but to phenomena and variables, such as birth rate, mortality rate and marriage statistics. It thus encompasses the whole field of the social. The fundamental object of governmental security dispositifs is precisely the population. The population is a set of elements in which we can note constants and regularities, and with regard to which we can identify a number of modifiable variables on which it depends. Security technologies are in this sense an attempt to govern circulation processes at the population level in an economical and rational way. Security dispositifs aim at what has not yet happened. As a result, security strategies operate as a management of open series that can only be controlled by an estimate of probabilities. It is this new strategy of power that will be analyzed in the light of the notion of government.

Government, according to Foucault, was a term discussed not only in political tracts, but also in philosophical, religious, medical and pedagogic texts.
In addition to management by the state or the administration, "government" also signified problems of self-control, guidance for the family and for children, management of the household, directing the soul, and so on. The French verb gouverner covers a range of different meanings. It can have a material and physical meaning of 'to direct or move forward', or 'to provide support for'. It can have a moral meaning of 'to conduct someone' in a spiritual sense or, tangentially, to 'impose a regimen' (on a patient) or to be in a relationship of command and control.

According to Foucault, in his 1982 essay 'The subject and power': "'[G]overnment' [does] not refer only to political structures or to the management of states; rather, it [designates] the way in which the conduct of individuals or groups might be directed – the government of children, of souls, of communities, of the sick (...) It [does] not only cover the legitimately constituted forms of political or economic subjection but also modes of action, more or less considered or calculated, which [are] designed to act upon the possibilities of action of other people. To govern in this sense is to structure the possible field of action of others."

A focus on 'conduct' perhaps leads to the most concise definition of 'governmentality' as the 'conduct of conducts', or the regulation (conduct) of behaviors (conducts). Governmentality operates to produce a (governable) subject. In short, government refers to conduct, or to an activity meant to shape, guide, or affect the conduct of people and to forge the very constitution of the subject.

In his 1982 essay 'The subject and power', Foucault also states that power is only power (rather than mere physical force or violence) when addressed to individuals who are free to act in one way or another. Power is defined as 'actions on others' actions': that is, it presupposes rather than eliminates their capacity as agents; it acts upon, and through, an open set of practical possibilities. Quoting Foucault: "In itself the exercise of power is not violence (…). It is a total structure of actions brought to bear upon possible actions; it incites, it induces, it seduces, it makes easier or more difficult; in the extreme it constrains or forbids absolutely; it is nevertheless always a way of acting upon an acting subject or acting subjects by virtue of their acting or being capable of action. A set of actions upon other actions."

In short, government relies on freedom. Quoting Foucault again: "When one defines the exercise of power as a mode of action upon the actions of others, when one characterizes these actions by the government of men by other men - in the broadest sense of the term - one includes an important element: freedom. Power is exercised only over free subjects, and only insofar as they are free. By this we mean individual or collective subjects who are faced with a field of possibilities in which several ways of behaving, several reactions and diverse comportments, may be realized."

Power as a strategic game is an omnipresent feature of human interaction, insofar as it signifies structuring the possible field of action of others. Government refers to more or less systematized, regulated and reflected modes of power (a "technology") that go beyond the spontaneous exercise of power over others, following a specific form of reasoning (a "rationality") which defines the telos of action or the adequate means to achieve it.
Government, then, is the regulation of conduct by the more or less rational application of appropriate technical means. Foucault used the term 'rationality of government' almost interchangeably with 'art of government'. He was interested in government as an activity or practice, and in arts of government as ways of knowing what that activity consisted in and how it might be carried on. A rationality of government will thus mean a way or system of thinking about the nature of the practice of government (who can govern; what governing is; what or who is governed). With the notion of governmentality, Foucault moves from the "what" to the "how" of governance.

Finally, one last question: why the need for this 'ugly word', governmentality? Why not simply call this 'new government', or even 'governance'? The word 'govern/mentality' refers both to the processes of governing and to a mentality of government (a way of thinking about how governing happens). It is thus both an art (a practice of governing) and a rationality (a way of thinking about government). The semantic linking of governing ("gouverner") and modes of thought ("mentalité") indicates that it is not possible to study the technologies of power without an analysis of the political rationality underpinning them. As a way of thinking, governmentality represents an important methodological tool (not a theory) that provides a flexible and open-ended lens through which the minor tactics of governing are magnified. A whole field that can be described as 'governmentality studies' can thus be identified, and Antoinette Rouvroy's studies on algorithmic governmentality can be perfectly situated in this domain. Governmentality studies have shown how numbers, indicators and other means of quantification lie at the very heart of the art of governing people.

2. What is an algorithm? In what way does an algorithm govern?

What actually is an algorithm? For most computer scientists and programmers, an algorithm, at its most basic level, is the set of instructions used to solve a well-defined problem. Generally, they differentiate between the algorithm (the set of instructions) and its implementation in a particular programming language. Algorithms usually express the computational solution in terms of logical conditions (knowledge about the problem) and structures of control (strategies for solving the problem), leading to the following definition: algorithms = logic + control (a minimal code sketch of this decomposition is given at the end of this passage).

This may seem harmless, but nowadays a large part of our daily lives has become inhabited by algorithms or code operating mostly implicitly and in the background. Algorithmic action has become a significant form of action, and algorithms significant actors, in contemporary society. To see this, just keep in mind some of the services provided, for example, by Google, Facebook, Uber, Waze and Netflix. The recent turn to algorithms in the social sciences has brought about an impressive range of work. Algorithms are said to be powerful, and also dangerous. Sociologists, computer scientists, political scientists, legal scholars, anthropologists, designers, philosophers, users, activists, policy makers, and many more are talking about algorithms. In most cases, current developments are causing some fear, and we are increasingly concerned that individual autonomy is being lost in an impenetrable set of algorithms, seen as powerful entities that rule, sort, govern, shape, or otherwise control our lives. Algorithms concern us because they seem to operate under the surface or in the background; that is, they are inscrutable.
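To make the "algorithms = logic + control" formula a little more concrete, here is a minimal, purely illustrative sketch in Python. The problem chosen (listing the divisors of an integer) and the names used are my own illustration and were not part of the discussion; the point is only that the defining condition (logic) and the enumeration strategy (control) can be read as separate ingredients.

```python
# Illustrative sketch of "algorithm = logic + control".
# The *logic* is what we know about the problem (when a candidate counts as a solution);
# the *control* is the strategy used to traverse the candidate solutions.

def divisors(n: int) -> list[int]:
    result = []
    for candidate in range(1, n + 1):  # control: enumerate candidates in increasing order
        if n % candidate == 0:         # logic: the condition that defines a solution
            result.append(candidate)
    return result

if __name__ == "__main__":
    print(divisors(28))  # [1, 2, 4, 7, 14, 28]
```

Real-world algorithmic systems are, of course, rarely this small or this legible, which is precisely the concern raised next.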
We cannot directly inspect them or, in many cases, understand them as source code. They become black boxes, sometimes even to their designers. Decisions thus become encoded and encapsulated in complex, inscrutable algorithms. These algorithmic systems can also operate "automatically" (in the background), without the need for human intervention. If algorithms are inscrutable and can operate automatically and in the background, then they are most certainly actors with which we should be concerned. As such, there is a strong sense that they need to be governed more explicitly. If one accepts the argument that algorithms are important actors in contemporary society, then the question of the governance of their actions, or of these actors, naturally emerges.

It is becoming clear that algorithms govern us in some way: a kind of governance guided by learning machines and computing systems that are able to automatically capture and process data from multiple sources, using statistical calculations. Across state and private institutions, a vast array of algorithmic actors are becoming more or less interconnected, operating as technologies of calculation and regulation deployed to enact and regulate their subjects, be they citizens, migrants, tourists, suspects, customers, students, friends, and many more besides.

Antoinette Rouvroy uses the term algorithmic governmentality (gouvernementalité algorithmique), clearly drawing on Michel Foucault, to refer very broadly to a certain type of (a)normative or (a)political rationality founded on the automated collection, aggregation and analysis of big data so as to model, anticipate and pre-emptively affect possible behaviours. We can better understand algorithmic governmentality by taking into account its three stages of operation: dataveillance, datamining and profiling.

The first stage consists of the collection and automated storage of unfiltered mass data, what can be called dataveillance, which is constitutive of big data. It is estimated that the digital universe today is made up of more than 1,200 billion billion bytes, 90% of which would appear to have been produced in the last two years. This figure, which doubles every two years, is expected to be multiplied tenfold by the year 2020, reaching a total of 44 zettabytes, or 44 trillion gigabytes. The data are available in massive quantities, from different sources, in a variety of formats (text, images, sounds, geo-location, mobility data, etc.). We constantly leave digital "footprints", which are often collected by default by mechanisms that monitor online movements, CCTV, GPS tracking, traffic flow monitoring, satellite imagery, the recording of banking transactions, and so on. An increasing proportion of digital data comes from what is now referred to as the Internet of Things: the networking of "smart" devices able to communicate with each other and thus themselves to produce huge amounts of data. These networked devices emit information on the movements, activities, performance, energy consumption, lifestyles, etc. of their users. Today, when we work, consume or travel, we inevitably "produce" data. Governments collect them for purposes of security, control, resource management, and so on. Private companies collect large quantities of data for marketing and advertising purposes, to customize offers, in short, to improve their sales efficiency and therefore their profits. Individuals themselves willingly share "their" data on social networks, blogs, mailing lists, and so on.
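As a purely illustrative aside, the "collect everything by default, decide on the use later" logic of this first stage can be caricatured in a few lines of Python. The sources, field names and events below are invented for the sake of the example and describe no real system.

```python
# Caricature of "dataveillance": heterogeneous events are stored by default,
# unfiltered and without any predefined purpose, in case they later prove
# correlatable with other data.

from datetime import datetime, timezone

data_warehouse: list[dict] = []

def record(source: str, **attributes) -> None:
    """Store whatever arrives, adding only a timestamp; no selection, no stated purpose."""
    event = {"source": source, "time": datetime.now(timezone.utc).isoformat()}
    event.update(attributes)
    data_warehouse.append(event)  # retained "just in case"

# Traces produced in passing by ordinary activity (all values are made up):
record("gps", lat=-19.93, lon=-43.94)
record("card_payment", merchant="bookshop", amount=62.50)
record("social_network", action="like", target="page/123")
record("smart_meter", kwh=1.4)

print(len(data_warehouse), "events retained, none collected for a stated purpose")
```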
All these data are collected and stored in "data warehouses", as much as possible by default, devoid of any prediction about the specific end uses of this collection, in other words about the purposes that the data will serve once correlated with other data. The optimal functioning of this mode of statistical intelligibility presupposes the non-selective collection of as much data as possible, a priori independent of any specific finality. The exponential increase in big data is a result of the retention by default not only of directly useful data, but also of data which are merely of potential utility. The usefulness of each data item depends on the quantity of the other data with which it may be correlated. In the big data universe, it is therefore perhaps not going too far to think that, by means of a network effect, the potential value of each piece of data increases with the quantity of data collected.

The second stage is that of datamining as such, in other words the automated processing of these big data in order to identify subtle correlations between them. It is crucial to note that what is at stake here is a production of knowledge (statistical knowledge made up of simple correlations) based on information that is unsorted and therefore perfectly heterogeneous. This knowledge production is automated, which means that it requires minimal human intervention, and it is uninformed by any pre-existing hypothesis (unlike traditional statistics). Indifferent to the causes of phenomena, it functions on a purely statistical observation of correlations between data captured in an absolutely non-selective manner in a variety of heterogeneous contexts. The purpose of what is called machine learning is ultimately to enable the production of hypotheses directly from the data themselves. Norms seem to emerge directly from reality itself. These norms, this "knowledge", are however "only" made up of correlations.

The third stage consists in using this probabilistic statistical knowledge to anticipate individual behaviours and to associate individuals with profiles defined on the basis of the correlations discovered through datamining. In order to understand properly what algorithmic profiling consists of, it is important to grasp the crucial difference between information at the individual level, on the one hand, and the knowledge produced through profiling, on the other. Most of the time, this knowledge is not available to individuals and they cannot perceive it, but it is nevertheless applied to them in such a way as to infer knowledge or probabilistic predictions regarding their preferences, intentions and propensities which would otherwise not be evident. The aim is therefore to prompt individuals to act without forming or formulating any explicit reason, and even without any clear desire. In short, the goal is to conduct the conduct of individuals or, as Foucault says, to govern.

Algorithmic governance thus seems to signal the culmination of a dispersal of the traditional conditions of subjectification and individuation. These are being replaced by an objective, operational regulation of possible behaviours, based on "raw data" that carry no meaning on their own and whose statistical processing is primarily designed to accelerate flows, avoiding any form of "detour" or subjective "reflexive suspension" between "stimuli" and their "reflex responses". Moreover, the field of action of this "power" is not situated in the present, but in the future.
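Again as a purely illustrative aside, the second and third stages described above can be caricatured in a few more lines of Python: a "correlation" is extracted from stored traces without any prior hypothesis, and it is then applied back to a new individual as a propensity score. The variables, threshold and figures are invented for the example; a real system would rely on machine-learning techniques applied to far larger and more heterogeneous datasets.

```python
# Caricature of "datamining" (stage two) and "profiling" (stage three).
# All records and fields below are invented.

records = [
    {"night_logins": 9, "support_tickets": 4, "churned": True},
    {"night_logins": 1, "support_tickets": 0, "churned": False},
    {"night_logins": 7, "support_tickets": 5, "churned": True},
    {"night_logins": 2, "support_tickets": 1, "churned": False},
]

# Datamining: without any prior hypothesis, observe how an outcome co-occurs
# with a crude derived feature, and turn that observation into a "norm".
outcomes_by_feature: dict[bool, list[bool]] = {}
for r in records:
    feature = (r["night_logins"] + r["support_tickets"]) >= 5
    outcomes_by_feature.setdefault(feature, []).append(r["churned"])

profile = {feature: sum(outcomes) / len(outcomes)
           for feature, outcomes in outcomes_by_feature.items()}

# Profiling: a new individual is matched to the profile and assigned a propensity,
# which may then trigger a pre-emptive intervention (an offer, a restriction, ...),
# without the individual ever being asked for reasons or intentions.
new_user = {"night_logins": 6, "support_tickets": 3}
feature = (new_user["night_logins"] + new_user["support_tickets"]) >= 5
print("estimated churn propensity:", profile[feature])  # 1.0 in this toy dataset
```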
Like the security dispositifs analyzed by Michel Foucault, algorithmic governmentality is essentially related to what could come to be, to propensities rather than to actions already taken. To speak of government by data is immediately to evoke a change of approach in the detection, classification and predictive assessment of events in the world and of the behaviour and propensities of its inhabitants; that is to say, a new way of making the world "predictable", or a new way of exercising power: a new "governmentality".

3. What does it mean to be governed by algorithms? What challenges does it bring to law and to human freedom? What kind of resistance is still possible against algorithmic governmentality?

I would like to conclude my presentation with some brief remarks on certain political, ethical, social and legal repercussions of the phenomenon of algorithmic governmentality outlined above.

Today, digital data play an increasingly predominant role in informing and guiding action, in virtually all sectors of business and government. We face an abundance of data, and individuals come to be considered as temporary aggregates of data exploitable on an industrial scale. It is on the basis of these data, rather than on the basis of institutional or deliberative processes, that the categories by means of which individuals are classified, evaluated, rewarded or punished are drawn up. These same categories are used to evaluate the merits and needs of individuals, or the opportunities and dangers underlying the lives they lead.

In this view of "government by data", how can we ensure the survival of individuals as subjects of law? How can we ensure that individuals are not viewed only as temporary digital data exploitable on an industrial scale, but as subjects of law in their own right? I believe that an understanding of the rationality of the algorithmic processes involved (datamining, machine learning, etc.) is a necessary precondition for any normative reflection on big data in terms of the rule of law and fundamental rights and freedoms. The new capacities based on "data intelligence", much of which remains imperceptible or inaccessible to the ordinary citizen, can significantly magnify the asymmetry of information and power between those who hold those data and those who, voluntarily or not, "emit" them.

If the digitisation of the world does not meet with any significant reluctance from individuals, this is because it seems to be the inevitable, indissociable and necessary cost of a multitude of new services, of new functionalities of digital devices, and of the ability to engage in social interaction through digital processes. We generally look only at the most immediate effects, and it is not clear what the long-term consequences for our lives may be.

We could argue that there is a close complicity between algorithmic governmentality and advanced capitalism. Digital raw data are today the very "texture" of capitalism. This explosion of data is a hyperindexation of absolutely everything, including the personal. Something seems quite worrying here: the fact that individuals come to conceive of themselves as merely hyperquantified (the contemporary "quantified self"). Subjectivity is bypassed by automation: we no longer appeal to the human capacities of understanding and will in order to govern. It is no longer a matter of threatening or inciting, but simply of sending people signals that provoke stimuli and therefore reflexes.
It is not only that there is no longer any subjectivity; it is that the very notion of the subject is itself being completely eliminated, or radically transformed, by this collection of infra-individual data. Yet algorithmic governmentality is not without "producing" peculiar subjectivities: fragmented, the subject comes in the form of a myriad of data that link him or her to a multitude of profiles (as a consumer, a potential fraudster, a more or less trustworthy and productive employee, and so on). All of them are related to him or her without inscribing him or her in any collective context, and the individual becomes infinitely calculable, comparable, indexable and interchangeable. A profile is not, in reality, about any one person. No one fits it exactly, and no profile pertains to a single identified or identifiable individual. Algorithmic governmentality focuses not on individuals, not on subjects, but on relations. It circumvents and avoids reflexive human subjects, feeding on infra-individual data which are meaningless on their own in order to build supra-individual models of behaviours, or profiles, without ever involving the individual. Being profiled in this or that way, however, affects the opportunities that are available to us and, consequently, the realm of possibilities that defines us: not only what we have already done or are doing, but also what we could have done or could do in the future.

One might think that all this is science fiction. Not at all. If we are to believe Eric Schmidt, then Google's CEO, technology will soon become so effective that it will be very difficult for people to see or consume something that has not in some sense been tailored for them. In the marketing field, the ultimate aim is not so much to adapt supply to individuals' impulsive or spontaneous desires, but rather to adapt a person's wishes to what is on offer, by adjusting sales strategies (the time when advertising is sent out, the way the product is presented, the setting of the price, etc.). In this way, we are perhaps moving from an intention-based economy to an instinct-driven economy.

Individuals are most often described as "consumers" or "users", with promises of improving their experience, and much more rarely as "citizens". Algorithmic governmentality devalues politics; it does away with institutions and with public debate. Presenting the issues in terms of innovation, competitiveness and the individual interests of consumers or users often hides the ethical, legal and political issues of the digital revolution, at the risk of undermining the rule of law, human rights and fundamental freedoms. Insofar as "data intelligence", reviving a sort of digital behaviourism, would gradually supplant the political and legal forms through which we represent what is real, we need to ask how the law will still be able to contain, limit and restrict the dominance of algorithmic governmentality.

It is understandable that legislators seek to protect the individual, to erect barriers around him or her, but these barriers emphasize precisely what is at stake. We can give individuals rights over their personal data, and this is necessary. But these rights are not properly applicable to the algorithmic scenario. Big data is interested in categorising large quantities of persons, without being concerned with these persons individually. Subjectivity is bypassed, and we thus arrive at a kind of machinic objectivity.
The legal systems for protecting individuals with regard to the automatic processing of data must therefore, first of all, ensure that individuals, as subjects of law, have a presence, an impact, a consistency in a universe in which only temporary data exploitable on an industrial scale count. Secondly, they must prevent people from being locked into "profiles" they know nothing about and which they are unable to challenge. Giving consistency to subjects of law means always taking into account individuals' capacity for not doing or wanting everything which they are "statistically" predisposed to do or want, and always asserting their right to account, for themselves, for their own motivations. Subjects are shown no respect if we do not at the same time respect their capacity for reticence, for reservation, for not doing what the algorithms predict, and their ability to say, for themselves, what prompts them to act. It is only by reframing the concept of "subjects of law" that it is possible to imagine how applications based on data intelligence can be developed in harmony with the political beings that we are. Furthermore, this "digital revolution" calls for constant vigilance and a continually renewed examination of the relevance and appropriateness of the legal instruments for protecting our fundamental rights and freedoms.

What all this suggests is that an intensive replacement of human observation, evaluation and prediction by autonomic processes might well deprive us, in part at least, of our ability to make normative judgements and, more fundamentally still, to set new norms. The question is thus the following: when individuals are subjected to the "gaze" of multimodal observation dispositifs functioning on autonomic computing, and when forward-looking statistical evaluation and classification become the privileged vector of governmentality, are individuals not deprived of the occasions and, in the long run, of the capability to form and implement moral judgments and normative reflexivity? Do these new uses of statistics, datamining and profiling, not reduce us to impotence in the face of the immanent norms spawned by algorithmic governmentality? Does the regime of digital behaviourism not threaten, today, to undermine the very underpinnings of emancipation by eliminating the notions of critique and of project?

That same danger of "depoliticization" and "demoralization" is carried by "technological paternalism", and is rampant in any technology designed for the purpose of enforcing a certain "regularity" of behaviours, or of rendering practically impossible behaviours, attitudes or actions that were previously "simply" forbidden by morality or law. These pre-emptive dispositifs, insofar as they succeed in their regulative and normative tasks, simply bypass any conscious acceptance or contestation of the norms they enforce.

Finally, one last remark to conclude: politics is nothing more nor less than that which arises with resistance to governmentality. The possibility, the potentiality, of dissent, contestation and insurrection is what demarcates power from violence, force or domination. Power, that which allows some to drive the conduct of others, the "conduct of conduct", always presupposes the possibility, for the individuals and groups targeted, of counter-conducts. "Counter-conduct" is Foucault's preferred term for "resistance". This remains a fascinating area of study: by interrogating the "how" of government, we might perform "the art of not being governed quite so much".
Thank you for your patience and attention.

* * *