Forthcoming in Routledge Companion to Social Media and Politics, 2016, (Eds.) Bruns A.,
Skogerbo E., Christensen C., Larsson O.A. and Enli G.S., NYC, NY: Routledge
Where there is Social Media there is Politics
Karine Nahon
Abstract
This chapter discusses the role of politics in social media – the power interplays among actors
on social media as they attempt to promote their interests and values. It argues that social
media cannot exist without some kind of political involvement: where there is social media
there is politics; neutrality is the exception rather than the norm in social media. Next, it
discusses the different manifestations of this involvement and suggests classifying social
media politics into politics of architecture (platforms and networks structure) and politics of
dynamics (networks structure, information flows, and curated flows). Power is exercised in
each dimension in three key modes: influencing decisions, setting the agenda, and shaping
stakeholder preferences and norms.
Introduction
The recent general elections in Israel, held on March 17, 2015, were won by the right. An
empirical examination of the content created by politicians on social media shows that
right-wing politicians exhibited higher levels of activity in terms of the number of posts,
likes, and shares. However, users who were supporters of the center or left-wing parties
would subsequently complain that they had been certain of a center-left victory, as indicated
by their Facebook feed. It was full of posts, videos and images attacking the incumbent
right-wing Prime Minister Benjamin Netanyahu, who eventually won, and supporting the
idea of replacing the government.
The Facebook algorithm was largely responsible for the gap between these illusory hopes
and the electoral reality. Facebook presents users with only a small fraction of the
information flows created by their friends (Constine, 2014). If I had 100 friends on Facebook,
and they all posted at the same time, Facebook would show me only a few of those posts, and
I would not even know that the remainder had posted as well. This reduced feed is even more
bounded, as Facebook prioritizes homophilous content, or content with which one is more likely
to agree (Pariser, 2012). These two practices of the Facebook platform are an example of how the
self-selection power of users can be skewed. This gap between electoral preferences as
reflected in social media and the actual preferences of Israeli society, in this case, is a distinct
product of social media politics.
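To make the mechanism concrete, the following minimal sketch (in Python) simulates a feed that shows only a handful of the posts created by a user's friends and scores them by agreement with the user's political leaning. It illustrates the filtering logic described above, not Facebook's actual algorithm; the function name, the agreement weight, and the leaning scores are hypothetical assumptions introduced only for this example.

    import random

    def rank_feed(posts, user_leaning, feed_size=10, agreement_weight=2.0):
        """Toy news-feed filter: score each post by agreement with the user's
        leaning (-1 to 1) plus noise, and return only the top few posts.
        Illustrative only; the weights are arbitrary assumptions."""
        def score(post):
            agreement = 1 - abs(post["leaning"] - user_leaning)  # 1 = identical leaning
            return agreement_weight * agreement + random.random()
        return sorted(posts, key=score, reverse=True)[:feed_size]

    # 100 friends post at once, evenly split between two political camps.
    posts = [{"author": i, "leaning": -1 if i % 2 else 1} for i in range(100)]
    feed = rank_feed(posts, user_leaning=-1)  # a left-leaning user
    left_share = sum(p["leaning"] < 0 for p in feed) / len(feed)
    print(f"Share of like-minded posts in the visible feed: {left_share:.0%}")

With these toy parameters the visible feed consists entirely of like-minded posts, even though the friends were evenly split, which is enough to produce the distorted picture of public opinion described above.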
“No idea is more provocative in controversies about technology and society than the
notion that technical things have political qualities. At issue is the claim that the machines…
can embody specific forms of power and authority” (Winner, 1996, 19). Politics of social
media refers to the power interplays among actors on social media platforms, as they attempt
to promote their interests and values. The concept of social media politics remains elusive
despite the recent wealth of case and empirical studies on related topics, such as the bias of
algorithms (Gillespie, 2010, 2014); power law of networks (Barabasi, 2003; Nahon, Hemsley,
Walker, & Hussain, 2011); mediators' control of information flows (Barzilai-Nahon, 2008;
Lievrouw, 2009; Shaw & Hill, 2014) and user attention (Wihbey, 2014); agenda-setting in
networks (Wallsten, 2007; Woodly, 2008); and the politics of protocols (Elmer, 2010), search
engines (Halavais, 2008; Segev, 2010), and technology in general (Introna, 2006; Tufekci,
2014). These diverse topics have something in common: they are all different manifestations
of the politics of social media.
Social media are the collection of web- and mobile-based platforms where individuals
and groups interact. They include blogs such as WordPress, update streams such as Twitter,
general social networks such as Facebook, image-sharing platforms such as Flickr, location
platforms such as Swarm, social news forums such as Reddit, business networks such as
LinkedIn, and curation platforms such as Pinterest. Given its unique affordances and rules,
each platform is conducive to particular power dynamics.
The politics of social media may have the power to affect the behaviors, preferences
and value systems of individuals and groups according to the intentions of those wielding it.
This may have significant consequences for the sort of information people receive,
potentially shifting the gravity centers of meaning-making power. Researchers have observed
some of these changes, for example through the skews and biases of information flows; many
others, however, remain largely obscure. The scope of this concept is broad, and this
chapter does not pretend to provide a comprehensive typology. Rather, it represents an
interdisciplinary attempt to enrich the conversation in the field while suggesting, defining,
and classifying a number of ways to better understand the building blocks of the politics of
social media.
This chapter makes the normative claim that social media cannot exist without some
kind of political involvement or bias. In social media, neutrality is the exception rather than
the rule. This is followed by a description of the different manifestations of social media
politics, a review of empirical evidence and discussion of several examples of this powerful
phenomenon. As the literature on these topics is scattered, this chapter attempts to classify
and organize the different types of political manifestations of social media.
The Neutrality Myth Shattered: Power Modes in Social Media
Information technologies and social media in particular are not neutral artifacts but
significantly political and social spaces. Power relations are fundamental to any society,
whether mediated off- or online. Wherever there are people and social relationships there are
power relationships (Castells, 2009). Politics does not reside in a vacuum, but in a social
locus where actors (potentially) exercise their power. The first question we need to ask
therefore is, are social media social? If so, to what extent? Trottier and Fuchs (2014) answer
the first question in the affirmative, and identify three forms of sociality which determine the
extent to which social media are social: cognition, communication, and cooperation or
coproduction. Individuals have certain cognitive features that they use to interact with others,
“so that shared spaces of interaction are created. In some cases, these spaces are used not just
for communication but also for the coproduction of novel qualities of overall social systems
and for community building” (p. 6).
The special appeal of social media resides in their ability to not only host but also
facilitate and enhance social interactions. This is the source of their impact, but at the same
time also of their flux and complexity, as so many actors hold a stake in them. Every
decision about how technology is designed, how information is produced, shared, distributed
or accessed, involves various stakeholders jostling for different normative positions. This
dynamic, political process leads perforce to a struggle over hegemony of certain actors
(individuals, institutions, groups or networks) over others, influencing their values and
behaviors. How these issues are resolved will, thus, determine which people, under what
circumstances, can do what on Facebook, Twitter or WhatsApp. The politics of social media,
like any other technology, produces, reproduces, reinforces, and shifts power and privilege
among designers and users, but also – increasingly – among non-users. Just like in the Roman
forum, the outcomes of the politics of social media promote the interests and values of the
powerful.
True enough, social media have empowered users (especially non-professionals) with
readymade tools that enable them to share and distribute information, create viral events,
enrich content through metadata, locate people with similar interests, develop applications,
collaborate to produce knowledge, and build on others’ work and donations to create new
things much more easily than before. This empowerment was accompanied by the illusion of
social media being neutral, egalitarian, objective and democratic. In fact, however, the basic
elements of social media – their architecture (platforms and networks structure) and dynamics
(networks structure1, information flows and curated flows) – are political, non-neutral and
non-democratic in design, practices, and policies. These elements are inherently biased by the
values of particular stakeholders, which in and through social media regulate the others’
behavior in line with those values.
Since politics is the exercise of power in the resolution of issues – which for our
purposes may arise in or be addressed through social media – we need to define power in
social media. I have discussed the notion of power in networks in depth in some of my earlier
writing (Barzilai-Nahon, 2008; Nahon, 2011), and since this is not the main focus of this
chapter, I will briefly present the three main power modes critical to the understanding of the
phenomenon in question, and will later use them to inform some of its instances.2
The first power mode has to do with the capacity to influence the decisions of other
social actors. This is aligned with the ideas of political scientists in the 1950s and ‘60s, such
as Robert Dahl (1957) and Nelson Polsby (1963), who argued that decision-making analysis
1 As elaborated further below, the politics of networks structure can occur both on the architecture level and on the dynamics level.
2 For an extensive discussion of social power in general, see Lukes (2005). For a discussion of power in networks, see Castells (2009), and a special volume dedicated to network theory and power in the International Journal of Communication (Vol. 5, 2011).
would be the best way to determine which individuals and groups have more power in social
life, and that decisions involve direct, that is actual and observable, conflict. For example, in
the context of social media, Liu et al. (2011) found in a qualitative study that 36% of content
remained shared with the default Facebook privacy settings. They also found that in the
majority of cases there was a gap between the desirable privacy settings and the actual
controls users decided to apply. In most cases, users exposed content to more users than
expected, aligned with Facebook's vested interest in exercising power over its users, or
more precisely influence their decisions regarding what content to expose, and to whom, in
service of its business model.
The second major mode of power is the shaping of and control over the political
agenda, which determines how social media are designed and operate, and how potential
issues are kept out of the political process and public spheres. The conceptualization of the
second mode of power is informed by political scientists of the 1960s, ‘70s and ‘80s such as
Peter Bachrach and Morton Baratz (1962). Any satisfactory analysis of power thus involves
not only examining social media’s influence on decisions (the first mode of power), but also
examining non-decisions (suppressing challenges to the status quo or preventing new issues
from reaching the agenda) as decisions. For example, Twitter restricts tweets to 140 characters. This design has
had major ramifications on the content flowing across the service. When users are limited to
140 characters, their posts must be short, laconic, and simplistic, if not outright blunt. It is no
coincidence that Twitter is mainly used for live event updates. It has been purposely
structured this way by its designers, imitating SMS practices and consequently appropriating
this user behavior as a tool for sharing activities in the immediate present (Sagolla, 2009). By
constraining the agenda (to 140 characters), Twitter has privileged a particular type of content
(real-time posts) over other content, such as complex and nuanced arguments.
The third mode of power focuses on actions and inactions aimed at shaping and
influencing others' perceptions, cognitions, and preferences (latent or manifest). This is done
for example by securing acceptance of the status quo since no alternatives appear to exist, or
because it is seen (but actually shaped) as “natural”, unchangeable, or favorable. This
conceptualization is well aligned with Lukes (2005) and to some extent also with Foucault
(1977; 1980), who argued that power is the ability to shape the mind and construct the meaning of
phenomena. While this mode of power is difficult to observe, it has the strongest impact
among the three, as the changes occur within the social actor (person or group) affected by it.
It involves a profound transformation of value systems, making Social Actor A believe and
choose to act in a way that reinforces the system’s bias, thereby promoting the interests of
Actor B at her own expense, usually in the form of compliance. For example, Bond et al.
(2012) studied the effect of the iVote button, added by Facebook to encourage voting in the
2010 US congressional elections, on the actual behavior of 61 million users. The study
showed that the button directly influenced the “political self-expression, information seeking
and real world voting behavior of millions of people. Furthermore, the messages not only
influenced the users who received them but also the users’ friends, and friends of friends”
(Bond et al., 2012, p. 295). Critical voices raised the concern that the use of the iVote button
by Facebook was not transparent, and could be exploited to create social pressure on
particular groups and exclude others (e.g. by providing the iVote button only to people
identified by Facebook as Democrat supporters). Be that as it may, this study exemplified the
ability of social media politics to change the preferences of people who were reluctant to
vote, and mobilize them to vote.
One way to enrich the debate on the spectrum of social media politics is by
classifying it into the politics of architecture and the politics of dynamics (see Figure 1). The three modes
of power are exercised in each one of these dimensions by influencing decisions, setting the
agenda, or shaping stakeholder preferences. The overlap between the dimensions is minimal,
as the politics of dynamics focuses directly on content, while the politics of architecture
focuses on technology and only indirectly on content. Both, however, aim at changing
people’s behavior, preferences, and values.
Figure 1: Politics of Social Media Dimensions
[Figure: the politics of architecture comprises platforms and networks structure; the politics of dynamics comprises networks structure, information flows, and curated flows.]
Politics of Architecture
The architecture of social media – the design of the actual components that make up social
media (hardware, software, operating and network systems) and their interrelations – is
ultimately based on code written by developers. This code is constructed hierarchically,
much as in any other human language, where words come together to form a sentence and
many sentences form a text that expresses an idea. The lines of code form the algorithm, or
set of rules for making the technology “do something”. The algorithm is at the heart of any
automated or semi-automated technological process in social media: from making the
hardware react to users pressing on a button to unfriending someone with a click. More
important for our current purposes, the architecting of social media is a conscious, willful,
non-neutral act by various stakeholders, usually designers or developers, but also users. The
struggle over who gets to architect the platforms and affordances, and how they are
architected, is one of the key manifestations of the power struggles and political arrangements
in social media.
Technocrats tend to argue that because technology is based on algorithms devoid of
human interference, it is able to construct consistently neutral and non-discriminatory
processes. However, by the very fact that humans design it, every technology is inherently
political, involving values and interests cast in the image of its architects and subsequently
shaped by its users. Architecting social media is a conscious act of exercising power in its
three modes: influencing decisions, setting the agenda, and shaping preferences of
stakeholders around questions of affordances and use. Through the design of infrastructure,
architects influence the decisions of users regarding their privacy, what to share, how to
write, and how to behave on that platform. Through the design of infrastructure, architects
determine the rules and boundaries of users' speech and behavior on their platforms: what to
write, what types of photos to upload or not, what types of videos to share. And through the
design of infrastructure, architects not only change the decisions and boundaries of behavior,
but also shape the preferences and values of users according to their own interests.
One example of controversy manifesting a power struggle between users and the
platform is the issue of nude photos. Facebook has a policy of removing such photos.
However, users have complained that this policy also censors breastfeeding, nudes in art,
naked mannequins, and kisses of same-sex individuals. In January 2015, a group of mothers
protested online by uploading breastfeeding images. Consequently, in March 2015 Facebook
announced it would no longer remove such images, as they did not violate its rules on nudity.
The updated policy states that:
We restrict the display of nudity because some audiences within our global
community may be sensitive to this type of content.... In order to treat people fairly
and respond to reports quickly, it is essential that we have policies in place that our
global teams can apply uniformly and easily when reviewing content…. We remove
photographs of people displaying genitals or focusing in on fully exposed buttocks.
We also restrict some images of female breasts if they include the nipple, but we
always allow photos of women actively engaged in breastfeeding or showing breasts
with post-mastectomy scarring. We also allow photographs of paintings, sculptures,
and other art that depicts nude figures (“Facebook: Community Standard Page,” n.d.).
Studies on how the architect's role introduces valence into technology, and social
media in particular, have flourished in the last decade. These include analyses of censorship
policies of Facebook and YouTube and their changes over time, and their impact on user
behaviors (Gillespie, 2010; 2014); as well as studies on biases in Google content provision
and ranking for different localities (Halavais, 2008; Segev, 2010). Gillespie (2014) articulates
six dimensions of the political valence of algorithms, which he uses as a heuristic for
considering the scope of the “politics of algorithm”: (1) the choices (of those who architect
the algorithms) behind the platforms’ inclusion/exclusion rules; (2) the collection of
information not necessary for the algorithm to operate; (3) the level of obscurity of what is
relevant; (4) the way the algorithm’s technical character is positioned as an assurance of
impartiality; (5) the technology’s appropriation for purposes of political contest; and (6) how
the algorithmic presentation shapes the public’s sense of itself. Gillespie argues that the
algorithm, presented as objective results of queries, shapes users’ practices and serves as a
legitimization apparatus for the production of biased knowledge.
While it is important to focus on the platforms’ architects as those who determine the
values of social media affordances, as Gillespie suggests, the ecosystem of power struggles
involves more than a single, omnipotent type of actor. It involves many other actors:
users and non-user individuals, groups, and ephemeral networks of people and institutions.
Each of these stakeholders seeks hegemony by pushing their own value sets and interests in
the context of the various debates arising in or addressed by social media. Winner suggests
that one way for an artifact to contain political properties is through “instances in which the
invention, design, or arrangement of a specific technical device or system becomes a way of
settling an issue in the affairs of a particular community” (1986, 2). How these issues are
resolved and by whom will determine the extent to which social media will reproduce power
structures and biases, or enforce new structures of power in an attempt to regulate user
behavior.
When social media users appropriate (or attempt to appropriate) a technology by
altering the algorithm to their purposes or by using the same algorithm for purposes other
than intended – this is a political act. This is when a latent power struggle occurs between the
original architects and the users. Nevertheless, platforms and architects have an inherent
power advantage when it comes to determining affordances on their platforms. They can
decide to change the design unilaterally, arbitrarily and, again, non-neutrally. Users can
initiate a power struggle, but in a manner somewhat reminiscent of labor struggles, this would
require them to unite in order to create a critical mass that can counter the values imposed by
the architects.
Two examples illustrate this politics of platforms: one has to do with the decisions of
several leading platforms to censor any explicit illustrations of the beheadings carried out by
ISIS. These guidelines came after the beheading of American journalist James Foley in
August 2014. On August 20, the CEO of Twitter, Dick Costolo, tweeted: “We have been and
are actively suspending accounts as we discover them related to this graphic imagery”
(Costolo, 2014). Consequently, Twitter changed its policies to allow family members to
request the removal of content depicting a deceased user. Unlike Twitter, YouTube did not
feel required to make any specific changes, as its community guidelines already included
specific rules against content that incites violence or depicts violence with the intent of
causing shock. This example shows, however, that platforms do not hesitate to dictate their
values on critical questions such as the boundaries of freedom of expression.
Another example of social media politics is the “Year in Review” app introduced by
Facebook at the end of 2014. This application invited members to watch and share the
important moments of their life over the past year with the default caption “It’s been a great
year! Thanks for being a part of it”. Soon users began complaining about the app’s inadvertent
algorithmic cruelty (Meyer, 2014), which reminded them of events they did not want to remember,
such as death and divorce. Here, Facebook exemplified the third power mode by proactively
impacting the memory and awareness of people, sometimes against their choice.
Networks Structure: Between Politics of Architecture and Politics of Dynamics
The structure of networks encompasses two main aspects. First, a conceptualization derived
from the social sciences denoting the rules, practices and arrangements through which the
behavior of people is regulated in networks (Bourdieu, 1977; Durkheim, 1982; Foucault,
1978, 1990; Giddens, 1986; Weber, 1946). These rules can be social rules determined by a
group of people, but can also be created by algorithms. Both interpretations refer to
regulating users’ behavior in networks, and both are related to particular manifestations of
politics. For example, McKelvey (2010) studied the conflict around two types of algorithms:
quality-of-service and end-to-end algorithms. Each type offers a different solution for the
network neutrality issue: the former prioritizes preventing network congestion while the latter
focuses on providing equality between communication modalities. Accordingly, each
promotes different values and the dynamics of promoting these values is a manifestation of
the politics inherent in the networks’ structure.
A second aspect of the network structure refers to “the typology of interconnected
nodes” (Castells, 2009), identified by “the observed set of ties linking the members of a
population” (Watts, 2004, 48) in their social networks. This definition complicates our
discussion of politics, as social media offers different types of structures. Bruns and Moe
(2014) classify these structures on Twitter into three types: meso (follower-followee
networks), macro (hashtagged exchanges) and micro (@reply conversations). More
generally, there are two important types of network structures relevant to our discussion.
The first is a more permanent and stable model of network structure directly related to the
platform’s architecture. For example, the network of Twitter followers or Facebook friends of
a single social media account. While their number changes, it is a slow change and the
boundaries of the network are clearly identified. There is little room for power dynamics here
as the network structure is determined to a large extent by the platform designers. An
example of the politics of network structure at the architecture level is the Internet backbone,
which serves as the physical basis for the operation of social media and refers to the core
routers that connect large networks, including the network access points that control traffic
between countries. The competition over joining the exclusive group of backbone network
providers is political, not just commercial, as it affects the scope of control these
institutions have.
Ephemeral types of network structures are the shapes and patterns we see in the links
connecting people in social networks as information flows on topical issues (e.g., a
conversation around a particular hashtag). Network structures like these are constituted
dynamically around a topic, and their boundaries and membership change constantly. Importantly,
hashtags were not originally a Twitter design feature. Norms around their use emerged out of
the collective practices of users. The use of hashtags later spread to other platforms, and they
are now commonplace. The hashtag is part of the network structure in that it functions as a
classifier that allows other people to follow specific conversations or topics. It evolves as
information flows, and dissolves as that flow fades. The politics of network structures of this
ephemeral nature should be classified as politics of dynamics. In the summer of 2014, Israel
launched a military operation against Hamas rule in the Gaza Strip, Operation Protective
Edge. The event was tweeted using different hashtags, such as #IsraelUnderFire and
#GazaUnderAttack, representing competing narratives. This exemplified the politics of topical
network structures, with each side attempting to capture the attention of users around the
world and influence their awareness.
Politics of Dynamics
Most of the literature on the politics of social media focuses on instances in which algorithm
writers and platform providers introduce values into social media components, or on the way
architects shape policies and standards. However, the politics of social media is not just about
the architecture. It is also about forms of power, which operate as the dynamics of
interactions between social actors evolve. These relationship dynamics are revealed to us, as
researchers, through the information as it flows or is curated in social media.
Politics of Information Flows
In the information age, the ability to control the flows of information is a significant source of
power. The politics of information flows refers mainly to the conflict around how
information, the most critical resource in social media, is shared and distributed among users.
Studies have repeatedly demonstrated the formation of skewed information flows in social
media, which result in unequal distribution of attention, and unequal impact on behavior and
preferences (See for example, Nahon & Hemsley, 2013; Wihbey, 2014).
Not every skewed information flow is a result of a political intervention. For example,
homophilous patterns found in conversations on social media platforms may be formed by
users independently of any intervention by the platforms. However, collective patterns of
behavior (manifested by the clustering of information flows) rarely evolve without any
political intervention, let alone political implications. In this section I focus solely on patterns
that are driven or exploited by social actors, which I consider political dynamics. A
systematic review of the politics of information flows should include a discussion of (1)
mediators or gatekeepers, the actors who control information flows, and (2) clustering
effects that are the product of information flow politics.
(1) Mediators. Network gatekeepers have a tremendous impact on information flows: by
choosing which information can or cannot pass, by connecting networks or clusters to one
another, or more generally by regulating the movement of information as it flows. They can
impact the chances of one video getting millions of views, while millions of other videos
receive only a few. Network gatekeepers (people, collectives, or institutions) are those with the
discretion to control information as it flows in and among networks. However, their power is
not absolute and their impact depends to a large extent on the gated – those subjected to their
gatekeeping – and on the power dynamics with other network gatekeepers (Barzilai-Nahon,
2008; 2009; Nahon, 2011).
Network gatekeepers are social actors that control information as it flows, so by
definition they exercise power and are therefore political actors. “Actors in this system are
articulated by complex and evolving power relations based upon adaptation and
interdependence. They create, tap, or steer information flows in ways that suit their goals and
in ways that modify, enable, or disable others’ agency, across and between a range of older
and newer media settings” (Chadwick, 2013, 157).
Therefore, a major power struggle in social media is over the number and identity of
gatekeepers or mediators. Technological improvements have immensely enhanced the
individual user’s ability to both produce and disseminate data. Despite this ability, however,
true control of information flows still lies in the hands of a small number of mediators. The
huge amount of information produced every second, as well as the need to create, share, and
read content, require the user to rely on their services. They help users in all their activities in
social media, from filtering excess information through connectivity with others to producing
new content. We rely on Google to find what we are looking for, on social media opinion
leaders to keep us posted, or on Facebook and Twitter to show us the posts uploaded by our
friends. By definition, however – otherwise this mediation service would be of little practical
use – Facebook, for example, does not show us all of our friends’ posts, but only those it
selects. In return for this service, it gets to control the agenda of the information transferred
from one subscriber to another. The struggle over the number and identity of gatekeepers or
mediators is a struggle for controlling the agenda of the information conveyed and transferred
from one person to another.
While traditional gatekeeping focuses mainly on selection (e.g. by newspaper editors),
network gatekeepers have many additional information control mechanisms. The power of
network gatekeepers does not necessarily reside in their ability to stop or filter information as
it is transferred. On the contrary, it is concealed in their ability to link networks together,
allowing information to travel far and fast, and to connect people to information and ideas.
Attracting users’ attention is the name of the game for network gatekeepers (Wihbey, 2014).
Content will spread if people know it is available to be spread, and mediators bring content to
the attention of those who follow them. They become network power hubs.
(2) Clustering Effects. Skewed, clustered information flows can result from a number of
collective behavior effects, including power-law and follow-the-herd tendencies, homophilous
tendencies, and polarization. Research has demonstrated that linking in
social networks follows a power-law distribution, where a few elites receive the attention of
many and thus have a disproportionate amount of influence (Adamic & Glance, 2005;
Drezner & Farrell, 2008; Karpf, 2008; Nahon et al., 2011; Wallsten, 2011). Scientists have
shown that the structure of networks plays an important role in how, and to what degree,
information spreads. For example, Barabási uses mathematical models to show that many
social networks are scale-free, which means that the number of connections between
individuals follows a power-law distribution (a few nodes have many connections while most
have relatively few), supporting the idea that the attention of the masses is concentrated on a
few influential actors. It turns out that a power-law distribution of attention or linking is a
fairly normal social pattern evident both online and offline. Of course, capturing the attention
of others may later translate into the ability to influence them (Nahon & Hemsley, 2013).
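The concentration Barabási describes can be reproduced in a few lines of code. The sketch below is a minimal, illustrative simulation, not a model of any particular platform: it grows a network by preferential attachment, the mechanism behind scale-free structure, and measures how large a share of all links ends up attached to the top one percent of nodes. The node counts and other parameter values are arbitrary assumptions.

    import random
    from collections import Counter

    def preferential_attachment(n_nodes=10000, links_per_node=2, seed=1):
        """Toy preferential-attachment model: each new node links to existing
        nodes with probability proportional to their current degree, which
        produces a power-law (scale-free) degree distribution."""
        random.seed(seed)
        targets = [0, 1]                 # multiset of endpoints; frequency tracks degree
        degree = Counter({0: 1, 1: 1})
        for new in range(2, n_nodes):
            for _ in range(links_per_node):
                old = random.choice(targets)   # chosen proportionally to degree
                degree[old] += 1
                degree[new] += 1
                targets.extend([old, new])
        return degree

    deg = preferential_attachment()
    top = [node for node, _ in deg.most_common(len(deg) // 100)]  # top 1% of nodes
    share = sum(deg[n] for n in top) / sum(deg.values())
    print(f"Share of all links held by the top 1% of nodes: {share:.0%}")

Even in this toy model, the top one percent of nodes ends up holding a share of links far larger than one percent, mirroring the concentration of attention and influence described above.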
Figure 2 illustrates this increasingly uneven distribution through the growing market
share of the top four search engines from 2002 to 2015. Since 2010 they have captured the
attention of more than 98% of users. While the identity of the top four has changed over time
(AOL and MSN were replaced by Baidu and Bing after 2010), the main question we need to
address is what this implies in terms of the variety and value bias of the information we are
exposed to. These leading search engines exercise all three forms of power. Specifically,
through their responses to billions of queries they control the personal agendas of every one
of us: from how to find a particular plumber to what type of news to read. What information
to exclude or include and with what priority are clearly manifestations of power over
information flows and political decisions. The power-law tendency is also evident in other
types of social media: in the blogosphere, with the dominance of blogs such as the Huffington
Post; in micro-blogging, with Twitter; and in general social media, with the dominance of Facebook –
making the social media landscape significantly political.
Figure 2: Four Top Search Engines’ Market Share, 2002–2015
[Figure: combined market share of the four major search engines versus all other search engines, by year. Source: Alexa, Netmarketshare, Search Engine Watch]
The power law is just one dominant information flow tendency with political
implications in social media. Empirical studies have also demonstrated that social networks
induce homophily, fragmentation, and polarization (Benkler, 2006; Sunstein, 2009). In
particular, homophily, the tendency of people with similar attributes to associate with each
other more frequently than they associate with others (Lazarsfeld & Merton, 1954;
McPherson, Smith-Lovin, & Cook, 2001), has long been recognized as a factor in linking
behavior. More recently, it has been shown to be statistically confounded with influence
(Shalizi & Thomas, 2011), meaning that statistics cannot be used to differentiate between
similar behavior due to homophilous linking or due to the influence of Actor A over B.
Scholars who have attempted to quantitatively distinguish homophily from personal influence
(e.g. Aral, Muchnik, & Sundararajan, 2009; Centola, González-Avella, Eguíluz, & San
Miguel, 2007) have been refuted by prominent statisticians who show that the two processes
are generically confounded (Shalizi & Thomas, 2011).
Thus, a complex relationship exists between homophily and influence. Homophilous
links in social media arise because people interact with similar others. As they interact over
time they co-influence each other and become more similar. Thus, over time, homophily
changes the group structure, in a process indistinguishable from social influence (Centola et
al., 2007). Homophily is fundamentally a mechanism of selection, but at the same time it is
also a mechanism of (albeit latent) influence at the individual and group levels. It is induced
by social structure and, in turn, influences those structures in what Centola et al. (2007) refer
to as a co-evolutionary model. Going back to our example at the beginning of the chapter on
the Israeli elections, users’ homophilous tendency was exploited and intensified by Facebook,
resulting in a more skewed clustering of information flow, and ultimately in an illusory
electoral reality.
Politics of Curated Flows
“Data and data sets are not objective; they are creations of human design. We give numbers
their voice, draw inferences from them, and define their meaning through our interpretations.
Hidden biases in both the collection and analysis stages present considerable risks”
(Crawford, 2013). The vastness of information in social media is created and shared by users,
not the platforms themselves, and continues to explode on a daily basis. These big data are
the basis of social media platforms’ business model. Many stakeholders engage with this
information: users, platforms, third-party companies, governments, researchers and more.
Politics also occurs after information flows, at the curation phase. “To curate, historically,
has meant to take charge of or organize, to pull together, sift through, select for presentation,
to heal and to preserve. Traditionally reserved for those who worked with physical materials
in museum or library settings, curation today has evolved to apply to what we are all doing
online” (Mihailidis & Cohen, 2013). For the purposes of the current discussion, I would like
to borrow the term “curated flows” from Thorson and Wells (2015), which refers to curation
in the broadest sense: to select and organize, to filter abundance into a collection of
manageable size, to search, reframe and remix – or, in short, to manage information flows after
they have flowed, particularly in social media.
Being a participatory space, social media empower users to perform functions
previously reserved for professional curators, such as archiving, annotating, appropriating and
recirculating real-time information (Jenkins, Purushotma, Weigel, Clinton, & Robison, 2009).
In the social media and big data era, there are suddenly many stakeholders who curate
content, raising many different information issues: What is the most appropriate way to
harvest information flows? What are the ethical considerations regarding privacy that need to
be addressed when archiving public data? Who is responsible if a post is taken out of
context? Who can access this data? The resolution of these and other emerging issues is a
political act, given that interpretations and behaviors that rely on curated flows are inherently
biased as they depend on what and how we collect from these information flows, how we
clean the noise, how we understand the data, and most importantly, what our power
motivations are.
For example, interpreting the impact politicians have on social media will depend on
multiple decisions: the hashtags or keywords we collect, the languages, the platforms, the
technical constraints (such as the API), and the way we understand the context. Each of these
will require a political decision, no matter whether it is made manually or automatically.
Someone behind the scenes will have to decide how to curate flows, and this decision
will impact others. The politicization of curation occurs with small amounts of data, and all the
more so with big data. Expecting the curation of flows to be a neutral, objective and accurate
process just because big data are too big for humans to handle directly is a myth
(boyd & Crawford, 2012).
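A small sketch can show how consequential even the first of these decisions, the choice of hashtags or keywords, can be. The example below is illustrative only: the posts, their stances, and the keyword lists are toy data (the hashtags echo the Gaza example above), and real collection would in any case pass through a platform API with its own constraints.

    # Toy stream of posts: each post carries a stance and the hashtags its author used.
    posts = [
        {"stance": "pro", "tags": {"#IsraelUnderFire"}},
        {"stance": "pro", "tags": {"#IsraelUnderFire", "#Gaza"}},
        {"stance": "con", "tags": {"#GazaUnderAttack"}},
        {"stance": "con", "tags": {"#GazaUnderAttack", "#Gaza"}},
        {"stance": "con", "tags": {"#Gaza"}},
    ]

    def curate(stream, keywords):
        """Keep only the posts that mention at least one of the chosen keywords."""
        return [p for p in stream if p["tags"] & keywords]

    for keywords in ({"#IsraelUnderFire"}, {"#GazaUnderAttack"}, {"#Gaza"}):
        sample = curate(posts, keywords)
        pro = sum(p["stance"] == "pro" for p in sample)
        con = sum(p["stance"] == "con" for p in sample)
        print(f"Collected with {sorted(keywords)}: {pro} pro vs. {con} con posts")

Collecting with #IsraelUnderFire retains only one side of the debate, collecting with #GazaUnderAttack only the other, and the broader #Gaza query yields a different mix again: the same stream produces three different pictures of the conversation before any analysis has even begun.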
In another key example, Crawford and Gillespie (2014) show how the flagging
mechanism for reporting offensive content on social media platforms “serves both as a
solution to the problem of curating massive collections of user-generated content and as a
rhetorical justification for platform owners when they decide to remove content”. In practice,
it is a political mechanism of negotiating contentious public issues between users, groups,
moderators and platforms which attempt to promote certain values by reconciling “their
ability to directly know or quantify community values with their practical and rhetorical
value” (p. 3).
Power struggles in Wikipedia will serve as our final example of a political struggle
around curated flows. Its stated mission is to provide a free encyclopedia that people all over
the world can use and contribute to. However, multiple studies have shown that it is rife with
struggles around determining why and how content is included that have, for the most part,
taken place behind the scenes and far from the public eye. These dynamics occur among
editors who represent different value systems. In January 2006, a Wikipedia entry was
created to document the controversy surrounding the publication of cartoons depicting the
prophet Muhammad by the Danish broadsheet Jyllands-Posten; this in turn sparked
a controversy within Wikipedia over whether the original cartoons should be published in the
article, made available via thumbnails, or merely linked. When editors involved in the article’s initial creation decided to
republish a large thumbnail version of the original cartoons at the top of the article, many
Wikipedia readers and editors objected to this as unnecessarily inflammatory. After a power
struggle among editor groups, those in favor of keeping the images at the top won the day,
claiming that a consensus decision had been reached (Morgan, Mason, & Nahon, 2011).
Conclusion
This chapter discussed and demonstrated the pervasiveness of the role played by politics in
social media. Social media politics represents conscious and frequent acts whereby multiple
actors exercise power to wrestle for competing claims on issues relevant to the governance
and use of social media, but also issues with political ramifications outside of the social
media realm.
Lately there has been an important discourse raising growing concerns that most
decisions related to social media algorithms are not only non-neutral, but heavily biased
politically, leading to the conclusion that “Cyberspace never was – and never could be –
independent from the governing institutions, economic structures, and culture and social
worlds that gave rise to it” (Kreiss, 2014, p. 133). This chapter attempts to add a further
refinement to this discourse by deconstructing the general and opaque notion of the algorithm and
focusing on the locus of the political process: whether it occurs while information flows or is
curated, or while the basic elements of a social media platform are architected (see Figure 3).
Manifestations of politics can be identified in social media architecture (on the platform and
networks structure levels) and dynamics (on the networks structure, information flows and
curated flows levels). In all instances, the product is the same: a constant attempt to regulate
human practices and norms, and moreover, to transform or reinforce the behavior,
preferences and values of individuals and groups according to the worldview and interests of
those in power.
Figure 3: Politics of Social Media Relationships
[Figure: the politics of dynamics (structure of networks, information flows, curated flows) acts on information, while the politics of architecture (structure of networks, platforms) acts on infrastructure; both shape behavior and preferences.]
The politics of platform architecture is different from the dynamic politics of
information flows. Whereas the former mainly involves infrastructure as a mechanism to
regulate behavior, the latter mainly involves the content of information flows. The legitimacy
of information to flow is derived mainly from users who share content. In contrast, platforms
receive their operational legitimacy from the false belief in procedural justice, the myth that
platforms are neutral.
For example, there is a difference between the politics around allowing nude
photos to be posted on a particular platform, and the politics of making such a photo go viral
or preventing it from being shared at all. It is a subtle but important difference. The politics of information flows entails a
greater number of stakeholders, and is more inclusive in terms of the power at play, since it
usually does not reside in a single platform. Deconstructing the politics of social media also
allows the discourse to move from examining specific dominant actors such as platforms
and governments to a more nuanced evaluation of interrelations that also involve users and
non-users.
However, identifying the political manifestations is only a small fraction of the story.
Most of the political processes occur without any serious public discussion about the values
at stake. Questions such as the right to be forgotten by Google; what governments may
collect about me on social media; or whether nude photos should be censored inevitably
receive different answers from different people. What holds for one community doesn’t hold
for another sharing the same network. This messy ecosystem creates a political vacuum in
which platforms get to play the police (sometimes against their will) while users look for
creative solutions to circumvent policing. Political struggles and actions are thus determined
without the thorough scrutiny of all relevant stakeholders, and are shaped instead by the powerful alone. Where
there is social media there is politics.
Acknowledgements
I would like to express deep gratitude to Ami Asher for helping me with the language editing
of this chapter. Also, I thank Shawn Walker and Alison Wohlers for providing comments on
earlier drafts of the chapter.
References
Adamic, L. A., & Glance, N. (2005). The political blogosphere and the 2004 U.S. election:
	divided they blog. In Proceedings of the 3rd international workshop on Link discovery
	(pp. 36–43). New York, NY, USA: ACM. http://doi.org/10.1145/1134271.1134277
Aral, S., Muchnik, L., & Sundararajan, A. (2009). Distinguishing influence-based
	contagion from homophily-driven diffusion in dynamic networks. Proceedings of
	the National Academy of Sciences, 106(51), 21544–21549.
	http://doi.org/10.1073/pnas.0908800106
Bachrach, P., & Baratz, M. S. (1962). Two Faces of Power. The American Political Science
Review, 56(4), 947–952. http://doi.org/10.2307/1952796
Barabasi, A. L. (2003). Linked. New York: Plume.
Barzilai-Nahon, K. (2008). Toward a theory of network gatekeeping: A framework for
exploring information control. Journal of the American Society for Information
Science and Technology, 59(9), 1493–1512.
Barzilai-Nahon, K. (2009). Gatekeeping: A Critical Review. Annual Review of Information
Science and Technology (ARIST), 43, 433–479.
Benkler, Y. (2006). The wealth of networks: how social production transforms markets
and freedom. New Haven: Yale University Press.
Bond, R. M., Fariss, C. J., Jones, J. J., Kramer, A. D. I., Marlow, C., Settle, J. E., & Fowler, J. H.
(2012). A 61-million-person experiment in social influence and political
mobilization. Nature, 489(7415), 295–298. http://doi.org/10.1038/nature11421
Bourdieu, P. (1977). Outline of a Theory of Practice. (R. Nice, Trans.). Cambridge
University Press.
boyd, danah, & Crawford, K. (2012). Critical Questions for Big Data. Information,
	Communication & Society, 15(5), 662–679.
	http://doi.org/10.1080/1369118X.2012.678878
Bruns, A., & Moe, H. (2014). Structural layers of communication on Twitter. In K. Weller,
	A. Bruns, J. Burgess, M. Mahrt, & C. Puschmann (Eds.), Twitter and Society (Vol. 89,
	pp. 15–28). New York: Peter Lang. Retrieved from http://eprints.qut.edu.au/66324/
Castells, M. (2009). Communication power. Oxford University Press, USA.
Centola, D., González-Avella, J. C., Eguíluz, V. M., & San Miguel, M. (2007). Homophily,
Cultural Drift, and the Co-Evolution of Cultural Groups. Journal of Conflict
Resolution, 51(6), 905 –929. http://doi.org/10.1177/0022002707307632
Chadwick, A. (2013). The Hybrid Media System: Politics and Power. Oxford University
Press.
Constine, J. (2014). Why Is Facebook Page Reach Decreasing? More Competition And
	Limited Attention. Retrieved from
	http://social.techcrunch.com/2014/04/03/the-filtered-feed-problem/
Costolo, D. (2014, August 20). We have been and are actively suspending accounts as we
discover them related to this graphic imagery. Retrieved from
https://twitter.com/dickc/status/502005459067625473.
Crawford, K. (2013, April). The Hidden Biases in Big Data. Harvard Business Review.
Retrieved from https://hbr.org/2013/04/the-hidden-biases-in-big-data
Crawford, K., & Gillespie, T. (2014). What is a flag for? Social media reporting tools and
the vocabulary of complaint. New Media & Society, 1461444814543163.
http://doi.org/10.1177/1461444814543163
Dahl, R. A. (1957). The concept of power. Behavioral Science, 2(3), 201–215.
http://doi.org/10.1002/bs.3830020303
Drezner, D., & Farrell, H. (2008). The power and politics of blogs. Public Choice, 134(1–2), 15–30. http://doi.org/10.1007/s11127-007-9198-1
Durkheim, E. (1982). Rules of Sociological Method. Free Press.
Elmer, G. (2010). Exclusionary Rules? The Politics of Protocols. In A. Chadwick & P. N.
Howard (Eds.), Routledge Handbook of Internet Politics (pp. 376–384). Taylor &
Francis.
Facebook: Community Standard Page. (n.d.). Retrieved from
	https://www.facebook.com/communitystandards
Foucault, M. (1978). Discipline & Punish: The Birth of the Prison (2nd Edition). Vintage.
Foucault, M. (1990). The History of Sexuality. (R. Hurley, Trans.) (Fifth or Later Edition).
Vintage.
Giddens, A. (1986). The Constitution of Society: Outline of the Theory of Structuration.
University of California Press.
Gillespie, T. (2010). The politics of “platforms.” New Media & Society, 12(3), 347–364.
http://doi.org/10.1177/1461444809342738
Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. Boczkowski, & K.
Foot (Eds.), Media Technologies (pp. 167–194). Cambridge, MA: MIT Press.
	Retrieved from https://www.academia.edu/2257984/The_Relevance_of_Algorithms
Halavais, A. (2008). Search Engine Society. Polity.
Introna, L. D. (2006). Maintaining the reversibility of foldings: Making the ethics
(politics) of information technology visible. Ethics and Information Technology,
9(1), 11–25. http://doi.org/10.1007/s10676-006-9133-z
Jenkins, H., Purushotma, R., Weigel, M., Clinton, K., & Robison, A. J. (2009). Confronting
the Challenges of Participatory Culture: Media Education for the 21st Century.
Cambridge, MA: The MIT Press.
Karpf, D. (2008). Measuring Influence in the Political Blogosphere: Who is Winning and
	How Can We Tell? George Washington University's Institute for Politics, Democracy
	and the Internet. Retrieved from
	http://www.the4dgroup.com/BAI/articles/PoliTechArticle.pdf
Kreiss, D. (2014). A Vision of and for the Networked World: John Perry Barlow’s A
	Declaration of the Independence of Cyberspace at Twenty. In J. Bennett & N. Strange
	(Eds.), Media Independence: Working with Freedom or Working for Free? (pp.
	117–138). Routledge.
Lazarsfeld, P., & Merton, R. (1954). Friendship as a social process: a substantive and
methodological analysis. In M. Berger (Ed.), Freedom and Control in Modern
Societies (Vol. 18, pp. 18 – 66). New York: Van Nostrand.
Lievrouw, L. A. (2009). New Media, Mediation, and Communication Study. Information,
	Communication & Society, 12(3), 303–325.
	http://doi.org/10.1080/13691180802660651
Liu, Y., Gummadi, K. P., Krishnamurthy, B., & Mislove, A. (2011). Analyzing Facebook
Privacy Settings: User Expectations vs. Reality. In Proceedings of the 2011 ACM
SIGCOMM Conference on Internet Measurement Conference (pp. 61–70). New
York, NY, USA: ACM. http://doi.org/10.1145/2068816.2068823
Lukes, S. (2005). Power: A Radical View (2nd ed.). Palgrave Macmillan.
McPherson, M., Smith-Lovin, L., & Cook, J. M. (2001). Birds of a feather: Homophily in
social networks. Annual Review of Sociology, 27, 415–444.
Meyer, E. (2014, December 24). Inadvertent Algorithmic Cruelty. Retrieved from
	http://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-algorithmic-cruelty/
Mihailidis, P., & Cohen, J. N. (2013). Exploring curation as a core competency in digital
and media literacy education. Journal of Interactive Media in Education, 2013(1),
Art–2.
Morgan, J. T., Mason, R. M., & Nahon, K. (2011). Lifting the veil: the expression of values
in online communities. In Proceedings of the 2011 iConference (pp. 8–15). ACM.
Retrieved from http://dl.acm.org/citation.cfm?id=1940763
Nahon, K. (2011). Network Fuzziness of Inclusion-Exclusion [forthcoming].
International Journal of Communication.
Nahon, K., & Hemsley, J. (2013). Going Viral (1 edition). s.l.: Polity.
Nahon, K., Hemsley, J., Walker, S., & Hussain, M. (2011). Fifteen Minutes of Fame: The
Power of Blogs in the Lifecycle of Viral Political Information. Policy & Internet,
3(1), 6–33. http://doi.org/10.2202/1944-2866.1108
Pariser, E. (2012). The Filter Bubble: How the New Personalized Web Is Changing What
We Read and How We Think (Reprint edition). New York, N.Y.: Penguin Books.
Polsby, N. W. (1963). Community Power and Political Theory. New Haven, CN: Yale
University Press.
Sagolla, D. (2009). 140 Characters: A Style Guide for the Short Form (1st ed.). Wiley.
Segev, E. (2010). Google and the Digital Divide: The Bias of Online Knowledge (1 edition).
Oxford: Chandos Publishing.
Shalizi, C. R., & Thomas, A. C. (2011). Homophily and contagion are generically
confounded in observational social network studies. Sociological Methods &
Research, 40(2), 211.
Shaw, A., & Hill, B. M. (2014). Laboratories of Oligarchy? How the Iron Law Extends to
	Peer Production. Journal of Communication, 64(2), 215–238.
	http://doi.org/10.1111/jcom.12082
Sunstein, C. R. (2009). Republic.com 2.0. Princeton, N.J.; Woodstock: Princeton University
Press.
Thorson, K., & Wells, C. (2015). How Gatekeeping Still Matters: Understanding Media
	Effects in an Era of Curated Flows. In T. P. Vos & F. Heinderyckx (Eds.), Gatekeeping
	in Transition (pp. 25–44). Retrieved from https://wordery.com/gatekeeping-intransition-tim-p-vos-9780415731614
Trottier, D., & Fuchs, C. (2014). Theorising Social Media, Politics and the State. In Social
Media, Politics and the State (pp. 3–38). New York: Routledge. Retrieved from
http://www.bokus.com/bok/9780415749091/social-media-politics-and-thestate/
Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational
	politics. First Monday, 19(7). Retrieved from
	http://www.firstmonday.dk/ojs/index.php/fm/article/view/4901
Wallsten, K. (2007). Agenda Setting and the Blogosphere: An Analysis of the
Relationship between Mainstream Media and Political Blogs. Review of Policy
Research, 24(6), 567–587. http://doi.org/10.1111/j.1541-1338.2007.00300.x
Wallsten, K. (2011). Beyond Agenda Setting: The Role of Political Blogs as Sources in
	Newspaper Coverage of Government. In 2011 44th Hawaii International Conference
	on System Sciences (HICSS) (pp. 1–10). http://doi.org/10.1109/HICSS.2011.80
Watts, D. J. (2004). Six Degrees: The Science of a Connected Age. W. W. Norton &
Company.
Weber, M. (1946). From Max Weber: Essays in Sociology. (H. H. Gerth & C. W. Mills, Eds. &
Trans.). Oxford University Press.
Wihbey, J. P. (2014). The Challenges of Democratizing News and Information:
Examining Data on Social Media, Viral Patterns and Digital Influence. Shorenstein
Center on Media, Politics and Public Policy Discussion Paper Series. Retrieved from
http://dash.harvard.edu/handle/1/12872220
Winner, L. (1996). Who will we be in cyberspace? The Information Society, 12(1), 63–72.
Woodly, D. (2008). New competencies in democratic communication? Blogs, agenda
setting and political participation. Public Choice, 134(1), 109–123.