Papers by Tarleton Gillespie
Big Data & Society, 2024
Proponents of generative AI tools claim they will supplement, even replace, the work of cultural production. This raises questions about the politics of visibility: what kinds of stories do these tools tend to generate, and what do they generally not? Do these tools match the kind of diversity of representation that marginalized populations and non-normative communities have fought to secure in publishing and broadcast media? I tested three widely available generative AI tools with prompts designed to reveal these normative assumptions; I prompted the tools multiple times with each, to track the diversity of the outputs to the same query. I demonstrate that, as currently designed and trained, generative AI tools tend to reproduce normative identities and narratives, rarely representing less common arrangements and perspectives. When they do generate variety, it is often narrow, maintaining deeper normative assumptions in what remains absent.
Proceedings of the ... Annual Hawaii International Conference on System Sciences, 2017
This minitrack hosts the best emerging scholarship from sociology, anthropology, communication, information studies, and science & technology studies addressing the most pressing critical and ethical concerns around DSM. Papers interrogate how DSM support existing power structures or realign power for marginalized groups, and illuminate the ethical issues in doing research with and on DSM. Papers explore new ways of thinking about information exchange in communities and societies, in periods of rapid sociotechnical change. The challenge for our field is, at the same time, to move analytically closer to the technology to divine the assumptions that animate it, and step back to recognize how DSM technologies and institutions exert influence over public participation, culture, and knowledge.
Algorithms may now be our most important knowledge technologies, “the scientific instruments of a society at large,” and they are increasingly vital to how we organize human social interaction, produce authoritative knowledge, and choreograph our participation in public life. Search engines, recommendation systems, and edge algorithms on social networking sites: these not only help us find information, they provide a means to know what there is to know and to participate in social and political discourse. If not as pervasive and structurally central as search and recommendation, trending has emerged as an increasingly common feature of such interfaces and seems to be growing in cultural importance. It represents a fundamentally different logic for how to algorithmically navigate social media: besides identifying and highlighting what might be relevant to “you” specifically, trending algorithms identify what is popular with “us” more broadly. But while the techniques may be new, the instinct is not: what today might be identified as “trending” is the latest instantiation of the instinct to map public attention and interest, be it surveys and polling, audience metrics, market research, forecasting, or trendspotting. Understanding the calculations and motivations behind the production of these “calculated publics,” in this historical context, helps highlight how these algorithms are relevant to our collective efforts to know and be known. Rather than discuss the effect of trending algorithms, I want to ask what it means that they have become a meaningful element of public culture. Algorithms, particularly those involved in the movement of culture, are both mechanisms of distribution and valuation, part of the process by which knowledge institutions circulate and evaluate information, the process by which new media industries provide and sort culture. This essay examines the way these algorithmic techniques themselves become cultural objects, get taken up in our thinking about culture and the public to which it is addressed, and get contested both for what they do and what they reveal. We should ask not just how algorithms shape culture, but how they become culture.
Social Studies of Science, Jun 1, 2006
The term 'end-to-end' has become a familiar characterization of the architecture of the Internet, not only in engineering discourse, but in contexts as varied as political manifestos, commercial promotions, and legal arguments. Its ubiquity and opacity cloak the complexity of the technology it describes, and stand in for a richer controversy about the details of network design. This essay considers the appearance, in the 1970s, of the term 'end-to-end' in computer science discourse, and how the term became a point of contention within disputes about how to build a packet-switched network. I argue that the resolution of some of those disputes depended on the transformation of the term from descriptor to 'principle'. This transformation attempted to close specific design debates, and, in the process, made the term dramatically more useful in those discourses beyond engineering that eventually took a keen interest in the design of digital communication networks. The term, drawn from common parlance and given not only meaning but conviction, was shaped and polished so as to be mobile. As such, it actively managed and aligned disparate structural agendas, and has had subtle consequences for how the Internet has been understood, sold, legislated, and even redesigned.
Wiley-Blackwell eBooks, Jan 31, 2013
This essay examines how online content providers such as YouTube are positioning themselves to users, clients, advertisers, and policymakers. One term in particular, "platform," helps reveal the contours of this discursive work. "Platform" has been deployed by these content providers in both their populist appeals to users and their marketing pitches to advertisers and media providers, not just as technical platforms but as platforms of opportunity. Whatever tensions exist in serving all of these constituencies are elided. The term also fits their efforts to shape information policy, where they seek legislative protection on the basis of facilitating user expression, yet also claim limited liability for what those users say. As these providers increasingly become the curators of public discourse, we must examine the roles they aim to play, and the criteria they set by which they hope to be judged.
Big Data & Society, Jul 1, 2020
AI seems like the perfect response to the growing challenges of content moderation on social media platforms: the immense scale of the data, the relentlessness of the violations, and the need for human judgments without wanting humans to have to make them. The push toward automated content moderation is often justified as a necessary response to the scale: the enormity of social media platforms like Facebook and YouTube stands as the reason why AI approaches are desirable, even inevitable. But even if we could effectively automate content moderation, it is not clear that we should.
TripleC
Digital rights management technology, or DRM, provides self-enforcing technical exclusion from predetermined uses of informational works. Such technical exclusion may supplement or even supplant intellectual property laws. The deployment of DRM has been subsidized by laws prohibiting both disabling of technical controls and assisting others to disable technical controls. To date, the public debate over the deployment of DRM has been almost entirely dominated by utilitarian arguments regarding the social costs and benefits of this technology. In this paper, we examine the moral propriety of laws endorsing and encouraging the deployment of DRM. We argue that a deontological analysis, focusing on the autonomy of information users, deserves consideration. Because DRM shifts the determination of information use from users to producers, users are denied the choice whether to engage in use or misuse of the technically protected work. State sponsorship of DRM in effect treats information users as moral incompetents, incapable of deciding the proper use of information products. This analysis militates in favor of legal penalties that recognize and encourage the exercise of autonomous choice, even by punishment of blameworthy choices, rather than the encouragement of technology that limits the autonomous choices of information users.
Porn Studies, Oct 2, 2021
Hosted by Northumbria and Birmingham City Universities, the Deplatforming Sex roundtable took place via Teams in October 2021. Participants included Danielle Blunt, Stefanie Duguay, Tarleton Gillespie and Sinnamon Love. Clarissa Smith chaired the discussion, which was transcribed and then edited to cut digressions and repetitions for publication. The roundtable provided the opportunity to reflect on recent moves to excise sex and forms of sexual commerce and performance from online spaces, while marking out some key issues for future research with and about sex workers, performers and other content providers. Our discussion provided critical engagement with ongoing legislative changes that are impacting content and providers directly and indirectly.
Routledge eBooks, Oct 23, 2019
Journal of Information, Communication and Ethics in Society, May 1, 2009
Wired Shut discusses digital rights management and its effects on culture. Throughout the book, technologies are examined in a broad context. After discussing the internet and its foundations generally, Gillespie questions the decisions that have been made regarding its design. After explaining how file sharing became demonized in public opinion, Wired Shut describes the history of three different trusted systems which have met different ends. The cultural implications of Digital Rights Management are considered.
transcript Verlag eBooks, Dec 31, 2017
Social Media + Society, 2022
Public debate about content moderation has overwhelmingly focused on removal: social media platforms deleting content and suspending users, or opting not to do so. However, removal is not the only available remedy. Reducing the visibility of problematic content is becoming a commonplace element of platform governance. Platforms use machine learning classifiers to identify content they judge misleading enough, risky enough, or offensive enough that, while it does not warrant removal according to the site guidelines, it warrants demoting in algorithmic rankings and recommendations. In this essay, I document this shift and explain how reduction works. I then raise questions about what it means to use recommendation as a means of content moderation.
Media and Communication, 2023
Recent social science concerning the information technology industries has been driven by a sense of urgency around the problems social media platforms face. But it need not be our job to solve the problems these industries have created, at least not on the terms in which they offer them. When researchers are enlisted in solving the industry’s problems, we tend to repeat some of the missteps common to the study of technology and society.
With increasing attention to the labor, criteria, and implications of content moderation come opportunities for real change in the ways that platforms are governed. After high-profile exposés like The Guardian’s “Facebook Files,” it is becoming more difficult for platforms to regulate in secret. Governments around the world are increasingly seeking to influence moderation practices, and platforms now face substantial pressure from users, civil society, and industry groups to do more on specific issues like terrorism, hatred, ‘revenge porn,’ and ‘fake news.’ In light of this pressure and the opportunities it implies, this roundtable will consider options for the future of content moderation. The question is not just how the moderation apparatus should change, but what principles should guide these changes. This panel brings together perspectives from media and information studies, law, and civil society to explore a variety of approaches to regulation, from corporate self-governance ...
Online content providers such as YouTube are carefully positioning themselves to users, clients, advertisers, and policymakers, making strategic claims as to what they do and do not do, and how their place in the information landscape should be understood. One term in particular, ‘platform,’ reveals the contours of this discursive work. ‘Platform’ has been deployed in both their populist appeals and their marketing pitches: sometimes as technical platforms, sometimes as platforms from which to speak, sometimes as platforms of opportunity. Whatever tensions exist in serving all of these constituencies are carefully elided. The term also fits their efforts to shape information policy, where they seek protection for facilitating user expression, yet also seek limited liability for what those users say. As these providers become the curators of public discourse, we must examine the roles they aim to play, and the terms with which they hope to be judged.
Platforms rose up out of the exquisite chaos of the web. Their founders were inspired by the freedom it promised, but also hoped to provide spaces for the web's best and most social aspects. But as these platforms grew, the chaos found its way back onto them, for obvious reasons: if I want to say something, be it inspiring or reprehensible, I want to say it where people are likely to hear me. Today, we by and large speak on platforms when we're online. Social media platforms put people at "zero distance" (Searls, 2016) from one another, afford them new opportunities to speak and interact, and organize them into networked publics (Varnelis, 2008; boyd, 2011). Though the benefits of this may be obvious, even seem utopian at times, the perils are also painfully apparent. While scholars have long discussed the dynamics of free speech online, much of that thinking preceded the dramatic migration of online discourse to platforms (Balkin, 2004; Godwin, 2003; Lessig, 1999; Litman, 1999). By platforms, I mean sites and services that host public expression, store it on and serve it up from the cloud, organize access to it through search and recommendation, or install it onto mobile devices.