Digital Democracy
Caleb T. Knox
Abstract
Democracies have long understood that political discourse is necessary both to protect a nation and to find the best solutions to its problems, but the means through which political speech is shared has fundamentally shifted as social media platforms have stepped into the role of the modern public square. The discussion that follows is not about freedom of speech at large, nor does it concern the rise of social media platforms in general; rather, it examines why political speech is important, how it has been affected by the modern means of discourse, and what the proper course of action is. This paper explores that development, beginning by outlining the role political speech plays for a free people. Second, it looks into how social media has reshaped political conversation, and finally, it ends with a discussion of what regulation would look like and how it can be done correctly to uphold a healthy mode of discussion.
Essential to upholding the American experiment is the right to speak freely of legislative
affairs without fear of state interference (Gamreklidze, 2015). With the advent of a technological
boom, the modern American most likely finds themselves hearing and discussing a majority of
political opinions on social media platforms (Shearer & Matsa, 2018). These expanded
infrastructures bring with them a whole new array of legal, ethical, and psychological questions.
Searching for the best way to go about working through these modern dilemmas requires
a nuanced solution that takes into account the complexities of each actor. Upholding the
blessings of free speech in the face of a new technology will require social media platforms, the
government, and the end-user to each play a part in protecting public discourse.
Political Speech
In Austin v. Michigan Chamber of Commerce (1990), the United States Supreme Court articulated that "the right to engage in political expression is fundamental to our constitutional system." Although not chronologically aligned, this reasoning is what brought Justice Brennan to conclude, in the landmark Supreme Court case of New York Times Co. v. Sullivan (1964), that there is a "profound national commitment that debate on public issues should be uninhibited, robust, and wide-open, and that it may well include vehement, caustic, and sometimes unpleasantly sharp attacks on government and public officials."
It is not self-evident why the courts consider the right to speak freely on political matters so important. For instance, two centuries of First Amendment jurisprudence have established that
there is communication that lies outside the protections the Constitution offers (Galloway, 1991).
Even within the bounds of protected speech, the degree to which the state can infringe is hierarchically oriented, placing the greatest amount of scrutiny on legislative attempts to control political speech.
The answer to the sacralization of political speech, at least in the eyes of the courts, finds
its origin embedded in America’s founding. The framers saw freedom of speech not solely as a
natural right, but also as a right that serves to uphold the political structure at
large, ensuring the survival of natural rights. In Cato’s Letters, a series of political commentaries
that spurred classical republicanism in Great Britain and the American colonies, the author outlines
that “Without freedom of thought there can be no such thing as wisdom; and no such thing as
publick liberty, without freedom of speech” (Eijnatten, 2011). This derives from free speech
holding an "enlightenment function," as outlined in Whitney v. California (1927), where Justice Brandeis wrote in concurrence that "freedom to think as you will and to speak as you think are means indispensable to the discovery and spread of political truth."
The primacy of freedom of speech stems from the idea that truth is necessary to discern
good ideas from the bad. The essence of political speech becomes finding the closest semblance to this truth, so a nation may be guided in making the necessary qualitative distinctions in an ever-changing world. Yet this search meets an immediate roadblock: nobody can know a priori what this truth is (Milton, 1644). Each person,
bound by biases and blindspots, is in a position removed from a greater view of the truth that can only be rectified through discussion with other subjective agents who see the world lay itself out in a different manner (Peterson, 2017). In an L.A. Times article from 1987, Howard Rosenberg
found himself reporting on the dangerous development of neo-Nazism when he pointed to the principle that our system "presupposes that right conclusions are more likely to be gathered out of a multitude of tongues, than through any kind of authoritative selection. To many this is, and always will be, folly; but we have staked upon it our all" (Rosenberg, 1987). To this end, political speech has to be understood not as a monologue, but as a continuous dialogue. Before going further, a definition is needed, for there is no way for a nation to censor, protect, or even address political speech without thoroughly
defining it. In Wells v. State of Illinois, Justice Barnes of the Court of Appeals of Indiana offered a legal definition: "Expressive activity (speech included (Duhaime Law Dictionary)) is political if its point is to comment on government action, including criticizing the conduct of an official acting under color of law." This definition can be applied to any issue that the government chooses to address.
One intricacy of defining political speech is its ever-widening scope. In his classic Democracy in America (1835), Alexis de Tocqueville argues that democratic nations have a
natural impetus toward centralizing power. Individualist democracies free people from the state
but also from bonds with one another, so when a problem presents itself outside the bounds of
the individual, the natural impulse is to outsource said problem to the state, seen as the abstracted
will of the individual. The government, therefore, becomes the primary centripetal force in a
large democratic nation, creating a positive feedback loop with a strong central power at its base.
Regardless of whether or not the solutions to those problems end up coming from central power, this process brings a steady flow of problems into the political sphere. David
Brooks, a social critic writing for The New York Times, reported earlier this year how this process has been amplified in the digital era. An upcoming generation of voters is forming its political consciousness from online voices, and it will see an expanding number of problems as political ones as it spends an increasing amount of time on these platforms (Bialik, 2018). So when defining political speech in keeping with Justice Barnes's definition provided in Wells v. State of Illinois, it is important to understand that online political discourse will increasingly mark a majority of national discourse in upcoming years as more issues are brought into the political sphere.
Writing for the majority, in the 2017 United States Supreme Court case of Packingham v.
North Carolina, now-retired Justice Anthony Kennedy “called the cyber age a revolution of
historic proportions” prophesying that “we cannot appreciate yet its full dimensions and vast
potential to alter how we think, express ourselves, and define who we want to be” as
“cyberspace” becomes “the most important place . . . for the exchange of views” (Hudson, 2019).
But bringing about this "modern Gutenberg Revolution" was not an isolated event of this past year; it was embedded in the development of social media's politicization (Peterson, 2017).
The social media of today finds itself in quite a different place than it was at its
conception (Haidt & Rose-Stockwell, 2019). Early internet infrastructures like Facebook and
Myspace appeared between 2002 and 2004, with the goal of “helping users connect with
friends." Come 2006, the introduction of Twitter to the market brought the timeline, and with it Facebook added the news feed, each designed to deliver a continuous diet of short updates. This effectively allowed any post, from any user, to spread like wildfire as long as it generated views.
By 2009, Twitter would update its airways again by adding the retweet, which "essentially enabled" content to spread across the network with a single click (Haidt & Rose-Stockwell, 2019).
Social media crossed the Rubicon in the early 2010s when Upworthy, among other
pioneers in the industry, began conducting a series of psychological tests, trying multiple variations of a headline to find the one that generated the most engagement. This brought the
advent of clickbait paired with images “tested and selected to make us click impulsively,” paving
the way for a corrupted mode of discourse (Haidt & Rose-Stockwell, 2019).
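The headline-testing practice described above can be illustrated with a small, hypothetical simulation: several variants of the same story are shown to samples of users, click-through rates are measured, and the most clickable variant wins, whatever its tone. The headlines and click probabilities below are invented for illustration, not drawn from any real test.

```python
# Hypothetical sketch of headline A/B testing: show each variant to a sample of
# users, measure click-through rate, and publish whichever variant "wins."
import random

def click_through_rate(clicks: int, impressions: int) -> float:
    return clicks / impressions if impressions else 0.0

def pick_winning_headline(variants, impressions_per_variant=1000):
    # In a real test, clicks come from live traffic; here we simulate them.
    results = {}
    for headline, true_click_prob in variants.items():
        clicks = sum(random.random() < true_click_prob for _ in range(impressions_per_variant))
        results[headline] = click_through_rate(clicks, impressions_per_variant)
    return max(results, key=results.get), results

winner, rates = pick_winning_headline({
    "Council votes on budget": 0.02,
    "You won't BELIEVE what the council just did": 0.08,
})
print(winner, rates)
```

The selection criterion is engagement alone, which is precisely how clickbait displaces more measured framing.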
By 2014 this technology was first being harnessed for political ends, with its ability to distribute political sentiments with international reach. In bypassing the top-down style that news media offers, social media delivers a more spontaneous and inclusive approach (Miller, 2019). Combined with the blessings of an expanded dialogue, social media has presented new problems to our Republic. In the sections below, some of these issues will be briefly outlined in their digital manifestations.
Political Polarization
Humans evolved in tribes, and that history inclines our conceptual framework to view the world in an "us vs. them" manner. Simply put, we see people either as part of our tribe or not (Goldberg,
2018). In Federalist No. 10 James Madison articulated the dangers of the modern tribe, the
“faction,” which “inflamed (men) with mutual animosity,” abandoning any sense of common
good. Madison saw the geographical diversity and multiplicity of associations as a check against faction.
But social media serves to exploit the inner barbarian in the modern man, flattening
regional differences (Merkovity, Imre, & Owen, n.d.), making it easier for any idea, no matter
how extreme, to find an ally, but more importantly to find a common enemy. In 2017, a group of social psychologists at NYU analyzed 500,000 tweets, finding that each "moral or emotional word" used in a tweet increased its engagement by an average of 20 percent.
On social media, posts that exhibit “indignant disagreement” received nearly twice as many
views, likes, and shares as any other category of content (Haidt & Rose-Stockwell, 2019).
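A rough sketch of the kind of text analysis that study describes is shown below. The word list, tweets, and engagement counts are placeholders rather than the study's actual materials, and the real research relied on far richer dictionaries and statistical models; the sketch only shows the basic idea of relating moral-emotional word counts to engagement.

```python
# Rough sketch of the kind of analysis described above: count "moral or emotional"
# words in each tweet and compare engagement by word count.
# The word list and example tweets are placeholders, not the study's materials.
MORAL_EMOTIONAL_WORDS = {"hate", "evil", "shame", "disgrace", "fight", "destroy"}

def moral_emotional_count(text: str) -> int:
    # Count how many words in the tweet appear in the moral-emotional lexicon.
    return sum(1 for word in text.lower().split() if word.strip(".,!?") in MORAL_EMOTIONAL_WORDS)

tweets = [
    ("The new policy takes effect next week.", 12),            # (text, retweets)
    ("This evil policy is a disgrace. Fight it!", 85),
    ("Officials shame themselves with this plan.", 40),
]

for text, retweets in tweets:
    print(moral_emotional_count(text), retweets, text)
```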
Over the course of a two-decade study, Pew Research has documented America's polarization. The pollsters, by asking people's opinions on ten political questions, were able to test "ideological consistency." In 1994, only 10% of Americans were either consistently conservative or consistently liberal, and this stayed relatively constant a decade later, at 11% in 2004. A leap occurred by 2014, after social media took on its developing role in our public forums; national ideological consistency increased to 21% ("Section 1: Growing," 2014). Among other effects, this has diminished moderate thought in our political discourse. In 1994 and 2004, about half (49%) of the population took an equal number of conservative and liberal positions; by 2014 that middle ground had shrunk considerably. Algorithms compound the problem, building a falsified construction of reality in which views contrary to the end-user's can be suppressed and taken off of their feed with ease (Bolter, 2019). Youtube's "associative linking," for instance, feeds confirmation bias to end-users, creating ideologues who are trained not to engage with opposing ideas, but instead to dismiss them outright.
Simplifying Dialogue
By establishing a political structure that rests the source of power in the people, liberalism depends upon an informed populace, one which understands that matters of policy entail complexities beyond party lines and do not present a clear differentiation between good and evil or right and wrong (Anderson & Enevoldsen, 2005). This was obvious to prior generations: the now famous Lincoln-Douglas debates, for instance, would begin each discussion with a one-hour opening statement, followed by an hour-and-a-half rebuttal, with a final thirty-minute rejoinder from the opening speaker (NPS, 2017). These longer forum discussions were seen as necessary to find the truth lying at the heart of political speech.
Modern technology, by the implied consent of the end user, has traded longer, nuanced
discussions for soundbites and 140-character tweets, and rather than generating a dialogue that
promotes thoughtful discussion it devolves into ideological reductions (Ebeling, 2017). More
than ever, individuals are relying on social media sites for their political news (Shearer & Matsa,
2018), but these individuals enter an environment which incentivizes ideological jabs over substantive argument.
Trump has been on the frontier of harnessing social media's ability to compress a message. In his first two years alone, he "tweeted more than 600 times about Russia and collusion, more than 400 times lamenting fake news, and more than 200 times each about Clinton and Obama. Often, the tweets carry a simple, emotional conclusion, such as 'No Collusion' or 'Just more Fake News'" (Bolter, 2019). Nor is this a one-sided partisan divide: the development of political correctness has weaponized words such as "hate speech," "equality," and "white supremacist" to call forth a message that lies latent in ideologically charged words
(Emmons & Wilson, 2019). Trump's base is aware of the message "Fake News" entails without its being fully articulated, just as the politically correct camp knows what calls for "equality" embody.
The majority of those participating in online political discussion are not searching for truth in an attempt to find the best route of action; rather, they are engaged in advocacy, with a priori ideological reductions designed to make simplified statements that simultaneously spark outrage and fit preconceived notions ("Section 5: Political," 2014). Among our leaders this phenomenon is not unique to Trump; it is the most ideological members of Congress who post the most on social media (Kessel, Hughes, & Messing, 2018). As I stated earlier, "political speech has to be understood not as a monologue, but as a continuous dialogue," but the modern forum for discussion increasingly rewards the monologue.
Distributing Misinformation
It is nearly impossible to quantify how much of our online political discourse consists of falsified information (Read, 2018). But as posts between laymen became a primary means of political communication, researchers found that the expanded discourse was not making people better educated on political questions. They concluded this is the result of the modern end-user's inability to distinguish "the signal from the noise" (Fabrega & Sajuria, 2013).
Our modern conception of fake news was generated in the digital environment, as a
personal opinion, grounded in no truth, was given the same look and feel as a story from The
Atlantic (Fabrega & Sajuria, 2013). Misinformation is disseminated by both private and public
actors, on a global scale (Omidyar, 2017). International entities from "Russian bots" to "Chinese click farms" have developed armies of online bots with the sole purpose of spreading misinformation.
used “social media in an unprecedented way in terms of not just political advertising, but more
disinformation" (Laslo, 2018). A study on the "Influence of fake news in Twitter during the 2016 US presidential election" (Bovet & Makse, 2019) found that 25% of information generated from American news outlets or users tagged in the United States was falsified or misleading. Domestic and international actors now hold the capacity to manipulate information at levels not possible in prior eras.
Social media infrastructures are receiving political pressure from separate schools of
thought to carry out contradictory roles. The open arena of information combined with a
polarized fervor has brought an increase in "hate speech." In response, the common approach among companies has been to suppress voices (Hudson, 2019). Operating in uncharted territory, infrastructures cast a wide net in an attempt to censor or dampen hateful speech on their platforms. Yet lacking a concrete legal definition, hate speech became, regardless of intent, a tool for suppressing not just hate speech, but political voices at large (Strossen, 2018). What follows is not meant to be a general discussion of the line between hate speech and political speech; rather, it points out how this question plays itself out on digital platforms.
There exists near universal agreement that some online speech can be destructive, but there is no agreement on its criteria or what it includes (Masnick, 2019). Sheryl Sandberg, the Chief Operating Officer of Facebook, has spoken of the difficulty of determining what types of speech should be removed.
Just recently, Twitter suspended a group of journalists for referencing the Pensacola shooter's manifesto. To justify the suspensions, Twitter claimed the journalists had posted the manifesto. However, the full manifesto was never posted; only one of the journalists posted a
screenshot of it (Slatz, 2019). PragerU, a right-wing political organization that speaks on controversial topics, had a series of its videos restricted on Youtube. After filing a complaint, they were told their "videos aren't appropriate for the younger audiences..." (YouTube continues, n.d.). At one point, Facebook flagged the Declaration of Independence as hate speech, with an automated note claiming the text "goes against our standards of hate speech" (Morton, 2018). On the other side of the ledger, researchers have documented a link between anti-refugee Facebook posts and attacks on refugees in Germany (Muller & Schwarz, 2017).
Domestically, prosecutors said the Charleston church shooter, who in 2015 opened fire on a
church, killing nine black clergy and worshippers, was in a “self-learning process” online. This
brought him to believe that the goal of white supremacy required violent action (Laub, 2019).
The Pittsburgh synagogue shooter was an end-user on Gab, a newer social media site claiming to be a "free speech alternative." Although it is not clear whether he was radicalized online, he did send a final message on the platform with the chilling remark: "Screw your optics, I'm going in" (Roose, 2018).
These six examples show that two statements are simultaneously true. 1) "No principled distinction can be drawn between public hate speech and other forms of political expression" (Heyman, 2009), and private regulation of hate speech will therefore continue to infiltrate political discourse, hindering our search for truth (Leetaru, 2019). 2) Without a means of regulating hateful content, online platforms can help incubate real-world violence.
In becoming “the most important place . . . for the exchange of views” (Packingham v.
North Carolina, 2017), social media infrastructures have been given the impossible task of
upholding First Amendment values and protecting its end users from potentially dangerous
content. These questions are not unique to social media, but the expanded dialogue these companies offer magnifies the issues at large. Within the friction between political
speech and hate speech, political discourse has been corrupted by the digital era.
In the eyes of the public, the wealth of services these corporations provide, and the ideals they profess, have been outweighed by the issues mentioned above, giving rise to unified negative sentiment from across the political spectrum, among both politicians and citizens, against social media companies (Masnick, 2019). These companies have recognized their responsibility in the modern cultural landscape as providers of "platforms for speech" (Brannon, 2019). The disconnect between these stakeholders becomes apparent on the legislative question: technology is developing faster than the law can keep up, and different coalitions are calling for modern infrastructures to take responsibility for different, often contradictory, roles.
This transitions into the next element of the paper; the significance of political speech has been
established, and the way it has changed with the advent of technology has been discussed. What is left to consider is this: what limits should be placed, and upon what actor should they fall?
Any discussion concerning free speech needs to include limits (Van Mill, 2018). The great free speech apologists, among them John Locke and John Stuart Mill, presupposed a certain degree of limits, legal or not, when defending the freedom to speak. The question for them was not whether limits exist, but who defines them and what they look like. The section that follows attempts to take a nuanced, multi-leveled look at how speech can best be regulated in a digital era to preserve a healthy mode of political discourse.
The common conception of free speech is grounded in the legal jurisprudence that developed through the twentieth century (Balkin, 2018). It is often referred to as a "dyadic model": on one side stands the state, which governs and regulates; on the other stand the speakers and publishers of content. But the twenty-first century, with its technological advancements, requires a new way of understanding free speech: a triangle. This triangle comprises the state at one corner, social media infrastructures at another, and the end-user at the third.
Jack Balkin, the scholar who developed this idea, is a Professor of Constitutional Law
and the First Amendment at Yale Law School. His theory, articulated as “Free Speech is a
Triangle,” (which will be laid out in greater detail below) provides the best framework to answer
questions surrounding how to limit political speech online. This updated mode of
conceptualizing free speech is the most realistic means through which any regulatory attempts
would occur.
Prior to the advent of social media, existing under the “Dyadic Model,” speech was
limited through what Balkin (2004) refers to as “old school regulation.” This means of regulation
aims directly “at speakers and publishers of content”, utilizing “traditional methods of
enforcement”, such as fines, imprisonment, and in extreme cases, violence or the threat of it; all
in an attempt to disincentivize speakers and publishers from certain actions. Updating this
method, the triangle model employs “new school regulation” which does not aim directly at
speakers or publishers, but instead the digital infrastructure they operate on.
The primary example of new school regulation is collateral censorship, defined as when
the “state targets entity A (social media for the purposes of this paper) to control the speech of
another entity, B (end-user for the purposes of this paper)… In effect, collateral censorship
attempts to harness a private organization to regulate speech on the state's behalf." Congress, through the use of collateral censorship in many forms, has private governance do its job for it.
Congress grants companies immunity under Section 230 of the Communications Decency Act, protecting them from liability for "decisions to host content created by others and for actions taken voluntarily and in good faith to restrict access to objectionable material" (Zeran v. Am. Online, 1997). This effectively gives companies free rein to host or censor material.
Showing its pragmatic effects, Jed Rubenfeld, a professor at Yale Law School, points out that if an article or advertisement in the New York Times libels you, you can sue the Times; if a post on Facebook does the same, Section 230 bars a suit against Facebook (Rubenfeld, 2019).
According to Professor Dawn Nunziato, Congress grants this immunity for two reasons: first, the state lacks the technological capacity to efficiently regulate the airways; second, the First Amendment restricts the state from limiting "harmful, offensive, and otherwise undesirable speech" (Rubenfeld, 2019). Binding platforms to that same guideline would stop any ability to interfere with content on the internet outside of the narrow categories of unprotected speech.
By enabling this freedom, infrastructures have grown into private bureaucratic states "almost by accident" (Balkin, 2018). Through the use of collateral censorship the state has outsourced this responsibility to private infrastructures, forcing upon them the creation of their own systems of governance. Any post today "exists in an architecture of privately owned websites, servers, routers, and backbones," binding its existence online to the rules of those private governments (Peters, 2017).
Although these companies may function like governments, they are not bound by the rule of law, the First Amendment, or due process (Regulating social, 2018). Infrastructures also lack transparency: Facebook's criteria for censoring content had to be leaked to the press (Hopkins, 2017). Furthermore, censorship occurs behind closed doors, carried out by a bureaucrat or an algorithm, and there is no judicial restraint checking a company's decision to remove content (Balkin, 2018). This becomes extremely problematic with Facebook alone flagging more than one million posts per day.
The first solution would be to mandate a system of judicial restraint upon infrastructures.
These companies would hold the same immunity and autonomy to regulate speech as they
please, but in doing so there has to be a process. End-users have to understand the rules which
guide their online existence and expect due process if they violate them (Balkin, 2018). Such rules will most likely not work if imposed politically; rather, they should come through voluntary adoption that preserves the norms of individual infrastructures and thereby ensures legitimacy (Masnick, 2019). The clearest existing model of what this would look like is the Manila Principles, a series of proposals developed by civil society organizations in 2015 to guide intermediary liability and content regulation. They are a roadmap, as previously stated, for rules that would be adopted voluntarily. They require, among other reforms, (1) "clear and public notice of the content-regulation policies companies actually employ; (2) an explanation and an effective right to be heard before content is removed; and (3) when this is impractical, an obligation to provide a post facto explanation" (Manila Principles, n.d.). A minimal sketch of such a workflow follows.
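Everything in the sketch below is hypothetical (the policy names, the function, the platform), but it illustrates the three requirements just listed: act only under a published policy, give notice and a chance to be heard before removal, and explain after the fact when prior notice is impractical.

```python
# Minimal sketch of a Manila-Principles-style takedown workflow (hypothetical API):
# (1) cite a published policy, (2) notify the user and allow an appeal before
# removal, (3) if emergency removal is unavoidable, explain after the fact.
from dataclasses import dataclass, field

PUBLISHED_POLICIES = {"incitement", "targeted-harassment"}  # publicly documented rules

@dataclass
class TakedownCase:
    post_id: str
    policy: str
    emergency: bool = False
    log: list = field(default_factory=list)

def process_takedown(case: TakedownCase, user_appealed: bool) -> str:
    if case.policy not in PUBLISHED_POLICIES:
        return "rejected: no published policy covers this content"                     # requirement (1)
    if case.emergency:
        case.log.append(f"post {case.post_id} removed; explanation sent post facto")   # requirement (3)
        return "removed with post facto explanation"
    case.log.append(f"user notified: post {case.post_id} flagged under '{case.policy}'")  # requirement (2)
    if user_appealed:
        case.log.append("appeal heard before any removal")
        return "pending human review"
    return "removed after notice period"

print(process_takedown(TakedownCase("p1", "incitement"), user_appealed=True))
```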
By ensuring due process, companies could retain their freedom in regulating content while protecting the end-user and diminishing the chance of tyrannical or arbitrary limits (Leetaru, 2019). Judicial restraint would also actively serve in finding a line between political speech and hate speech, because companies would have to articulate rules that could be discussed and tested.
As explained in the triangle, most regulations from the government would be placed not
directly on the end-user, but instead on the company. The state needs to place content-neutral
regulations, defined as limits on “the time, place, and manner of speech in contrast to
content-based laws, which regulate speech based on content" (Hudson, n.d.). This means the state cannot force regulation of specific views (Rosenberger v. Rector and Visitors of University of Virginia, 1995), but it can apply broad regulations upon the manner in which content is displayed. In
the realm of political speech, this is a well documented mode of action from the state.
In the case of Ward v. Rock Against Racism (1989), the Supreme Court of the United
States adopted a three-pronged guide to approving state regulation of political speech (direct or
indirect): 1) “It must be content-neutral.” 2) “It must be narrowly tailored to serve a significant
governmental interest." 3) "It must leave open ample alternative channels for communicating the information."
This allows the government to regulate certain kinds of speech when the speech presents a significant problem, regardless of what viewpoint the speaker represents. Content-neutral regulations also halt the danger of legislatively suppressing voices based on their opinions (Hare & Weinstein, 2009). Aside from being beneficial, content-neutral regulations are most likely the only regulations the courts would uphold. A simple checklist rendering of the test is sketched below.
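Because the Ward test is a conjunctive, three-prong checklist, it can be expressed as a trivial decision procedure. The sketch is illustrative only; real constitutional analysis is obviously not mechanical, and the example regulation is hypothetical.

```python
# The Ward v. Rock Against Racism test, expressed as a simple checklist
# (a sketch for illustration; applying each prong requires legal judgment).
def passes_ward_test(content_neutral: bool,
                     narrowly_tailored_to_significant_interest: bool,
                     ample_alternative_channels: bool) -> bool:
    # A time/place/manner regulation must satisfy all three prongs.
    return (content_neutral
            and narrowly_tailored_to_significant_interest
            and ample_alternative_channels)

# Example: a rule limiting how often any account may post, regardless of viewpoint.
print(passes_ward_test(content_neutral=True,
                       narrowly_tailored_to_significant_interest=True,
                       ample_alternative_channels=True))   # True
```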
As discussed at the outset, any realistic solution will require a joint effort of the private and public sectors. Above I outlined what social media companies can do and what the state can do, but of greater importance is their combined effort.
Much of the conversation surrounding social media regulation can be boiled down to the same underlying concern: big structures. Some believe that social media has become too big and powerful
and must be regulated by the government. Others rebut this by outlining the dangers of the state,
another big power, interfering in political discourse. Both ideas recognize the problems of having
big structures regulating speech, and switching one for the other will not solve the fundamental
issue.
It has been established that social media platforms have been pushed into developing
massive systems of private governance to regulate the airways. This translates into a responsibility that leaves no time for due process, relies on thousands of low-paid workers or on AI to flag millions of posts daily, and has massive corporations applying one central set of rules to international networks (Hopkins, 2017). The best solution seems to be decentralizing social media platforms on two separate levels, through legislative means and market developments. As
Mike Masnick explained in his recent article, "Protocols, not platforms: A technological approach to free speech":
A protocol-based system, however, moves much of the decision making away from the
center and gives it to the ends of the network. Rather than relying on a single centralized
platform, with all of the internal biases and incentives that that entails, anyone would be
able to create their own set of rules—including which content they do not want to see. . . . (Masnick, 2019)
This breaks up social media companies within their own infrastructure, decentralizing decisions about what rules, due process, and norms ought to look like, as opposed to superimposing a vague definition from a central location that inevitably disenfranchises some group of users. A rough sketch of the idea follows.
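The rule functions and community names in this sketch are invented; the point is only that the shared stream stays open while each community, or each end-user, applies its own locally chosen filters rather than one central rule set.

```python
# Sketch of the "protocols, not platforms" idea: the network just delivers posts,
# and each community or end-user applies its own locally chosen rules.
def no_slurs(post: str) -> bool:
    return "slur" not in post.lower()          # stand-in for a real lexicon check

def long_form_only(post: str) -> bool:
    return len(post) >= 200                    # a community that wants essays, not jabs

COMMUNITY_RULES = {
    "open_town_square": [no_slurs],
    "long_form_debate": [no_slurs, long_form_only],
}

def filter_for(community: str, shared_stream: list[str]) -> list[str]:
    # Every community sees the same shared stream, filtered by its own rules.
    rules = COMMUNITY_RULES[community]
    return [post for post in shared_stream if all(rule(post) for rule in rules)]

stream = ["Short hot take!", "A longer, considered argument " + "x" * 220]
print(len(filter_for("open_town_square", stream)), len(filter_for("long_form_debate", stream)))
```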
The state. To help in this process, the government would need to impose pro-competition regulation. In the status quo, Facebook and the other major networks have
gridlocked the system pushing competition out of the picture (Griffith, 2017). Having a few
major platforms stalls innovation, makes us more susceptible to large scale interference from a
foreign power, and removes any friction from misinformation spreading at large. By introducing this regulation, more competition would enter the market, allowing norms to conform more to localized ideals rather than absolutizing one set of beliefs (Flock, 2017).
One example of this has been Jordan Peterson's creation of "Thinkspot." Concerned by the simplification of dialogue addressed in section two, his platform imposes a minimum character limit rather than a maximum, allowing for expanded conversation rather than ideological reduction.
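The mechanism is simple enough to sketch in a few lines; the 50-word threshold below is a placeholder for illustration, not Thinkspot's actual setting.

```python
# Tiny sketch of a minimum-length rule (the inverse of a character cap).
MIN_WORDS = 50   # placeholder threshold, not the platform's real value

def accept_post(text: str) -> bool:
    # Reject posts too short to carry an argument, rather than ones that are too long.
    return len(text.split()) >= MIN_WORDS

print(accept_post("Fake news!"))       # False: a jab, not an argument
print(accept_post("word " * 60))       # True: long enough to develop a point
```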
The infrastructures. Jack Dorsey, the founder and CEO of Twitter, announced on
December 11 through a thread of tweets that Twitter was looking into decentralizing their
platform. He admitted to the “new challenges centralized solutions are struggling to meet…
which is why the work must be done transparently in the open, not owned by any single private corporation."
Dorsey later explains that it is difficult to know what this will look like in practice, but Twitter has built a team called @bluesky to redesign how Twitter works. The decentralized plan laid out by Mike Masnick (which Dorsey referred to in the thread) is becoming a reality through market principles. Although this will most likely take time, especially relative to the pace of legislation, the market is moving in the direction outlined above.
Section two of this paper explored many of the problems of social media; several of them, political polarization for instance, could be curbed through decentralization. As explained, our
founders saw regionalism as a check to tyranny. Rather than forcing all players with different
values to enter into the same online norms, decentralized levels would allow for people to craft
their own regional identity. On the matter of disinformation, localities could establish a burden of
evidence to post content. Decentralizing allows problem solving to happen; there just needs to exist 1) inter-network competition, increasing the number of platforms available on the market, and 2) intra-network decentralization, letting communities of end-users set their own norms.
The end-user. De Tocqueville warned of the democratic citizen's natural inclination to look to centralized forces when problems present themselves (De Tocqueville, 1835). While it is the case that the issues explored in this paper have to be addressed in large part through policy, it does not matter how many regulations are laid out and in what fashion they exist if individuals are not fulfilling their expanded role as citizens of a republic (Knox, 2019). Each right exists only embedded inside the context of a responsibility, and free speech is no exception.
In 1644 the English poet John Milton wrote a letter to Parliament when they considered
licensing political and religious speech, explaining that the state is not needed to regulate speech
because the individual can do it for themselves. Free speech is built from the ground up,
beginning with the end-user, the speaker, operating as the fundamental unit of analysis from
which the whole structure is generated (Peterson, 2019). In the words of psychologist Carl Jung, "the psychology of the individual is reflected in the psychology of the nation. What the nation does is done by each individual, and so long as the individual continues to do it, the nation will do likewise. Only a change in the attitude of the individual can initiate a change in the psychology of the nation" (Carl Jung, n.d.).
Conclusion
Democracies have long understood the necessity of political discourse to protect a nation and to find the best solutions to its problems. But the means by which these topics have been
discussed have shifted, with social media platforms stepping into the role of the modern public
square. Although they have played an active role in expanding the political conversation to many, platforms have also played a large role in corrupting our discourse: polarizing debate, enabling foreign interference, and spreading misinformation, to name a few of the topics discussed. Limits are required, but what they should look like is not self-evident; any solution requires nuance and an understanding of the role all stakeholders play. By combining the efforts of social media companies, the state, and the individual, the digital public square can return to finding the best solutions to problems, rather than helping create them.
References
Applebaum, A. (2019, February 1). Regulate social media now. The future of democracy is at
https://www.anneapplebaum.com/2019/02/01/regulate-social-media-now-the-future-of-de
mocracy-is-at-stake/
Balkin, J. M. (2018). Free speech in the algorithmic society: Big data, private governance, and new school speech regulation. Retrieved from https://digitalcommons.law.yale.edu/fss_papers/5160
Balkin, J. M., (2018, May 28). Free Speech is a Triangle . Columbia Law Review, 2018,
Forthcoming; Yale Law School, Public Law Research Paper No. 640. Available at
SSRN: https://ssrn.com/abstract=3186205
Balkin, J. M. (2004). How Rights Change: Freedom of Speech in the Digital Era. Yale Law
School Legal Scholarship Repository Faculty Scholarship Series, 26( 00). Retrieved from
https://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?referer=https://www.google.co
m/&httpsredir=1&article=1241&context=fss_papers
Barnhart, B. (2019, August 13). Everything you need to know about social media algorithms.
https://sproutsocial.com/insights/social-media-algorithms/
Bialik, K. (2018, August 15). 14% of Americans have changed their mind about an issue because
of something they saw on social media. Retrieved from Pew Research website:
https://www.pewresearch.org/fact-tank/2018/08/15/14-of-americans-have-changed-their-
mind-about-an-issue-because-of-something-they-saw-on-social-media/
Bite-Sized Philosophy. (2017, March 7). Jordan Peterson - Liberals And Conservatives Need
https://www.youtube.com/watch?v=3Ho5VZp_ps4
Bolter, J. D. (2019, May 19). Social Media Is Ruining Political Discourse. Retrieved from The
Atlantic website:
https://www.theatlantic.com/technology/archive/2019/05/why-social-media-ruining-politi
cal-discourse/589108/
Bovet, A., & Makse, H. A. (2019, January 2). Influence of fake news in Twitter during the 2016 US presidential election. Retrieved from https://www.nature.com/articles/s41467-018-07761-2
Brannon, V. C. (2019, March 27). Congressional Research Service Report: Vol. R45650. Free
Speech and the Regulation of Social Media Content (Research Report No. R45650).
https://fas.org/sgp/crs/misc/R45650.pdf
Brown, R. H., & Irving, L. (1993, December). The Role of Telecommunications in Hate Crimes.
(NTIA) website:
https://www.ntia.doc.gov/legacy/reports/1993/TelecomHateCrimes1993.pdf
Buni, C., & Chemaly, S. (2014, April 13). The secret rules of the internet. Retrieved from The
Verge website:
https://www.theverge.com/2016/4/13/11387934/internet-moderator-history-youtube-face
book-reddit-censorship-free-speech
Carl Jung: Analytical psychology. (n.d.). Retrieved from King's Psychology Network website:
http://www.psyking.net/id160.htm
Clark, M. J. (n.d.). Village of Skokie v. Nat'l Socialist Party of America. Retrieved from Justia
Cliteur, P. B. (2001). Eric Voegelins critique of ideology and the functions of ideology under
https://sites01.lsu.edu/faculty/voegelin/wp-content/uploads/sites/80/2015/09/Cliteur.pdf
Confessore, N., & Rosenburg, M. (2018, November 17). Facebook fallout ruptures Democrats'
longtime alliance with Silicon Valley. Retrieved from The New York Times website:
https://www.nytimes.com/2018/11/17/technology/facebook-democrats-congress.html
Conger, K. (2019, October 30). Twitter Will Ban All Political Ads, C.E.O. Jack Dorsey Says.
https://www.nytimes.com/2019/10/30/technology/twitter-political-ads-ban.html
Cook, S. (2017, February 10). The First Amendment and what it means for free speech online.
https://www.comparitech.com/blog/vpn-privacy/the-first-amendment-what-it-means-free-
speech-online/
Cost, J. (2017, September 4). James Madison: Free Speech Rights Must Be Absolute, Nearly.
https://www.nationalreview.com/2017/09/james-madison-free-speech-rights-must-be-abs
olute-nearly/
East, S. (2016, August 1). Teens: This is how social media affects your brain. Retrieved from
Ebeling, R. M. (2017, August 11). Would-be tyrants capture language to control thought.
https://fee.org/articles/would-be-tyrants-capture-language-to-control-thought/
Histories Texts Cultures: Freedom of Speech The History of an Idea (p. 36). Lanham,
Histories Texts Cultures: Freedom of Speech The History of an Idea (p. 31). Lanham,
(El Ciudadano Imparcial, no. 5 (1813), 40; Orlando Pelayo Galindo, "La libertad de prensa: un
prensa en la Revolucion liberal, ed. Alberto Gil Novales (Madrid: Edit. Universidad
Emmons, L., & Wilson, B. (2019, June). The Guardian unwittingly gives Jordan Peterson's new
https://www.thepostmillennial.com/the-guardian-unwittingly-gives-jordan-petersons-new
-platform-free-advertising/
Fabrega, J., & Sajuria, J. (2013, August 6). The Emergence of Political Discourse on Digital
Networks: The Case of the Occupy Movement. Retrieved from arXiv (Cornell University)
website: https://arxiv.org/pdf/1308.1176.pdf
Federalist No. 51 (1788). (n.d.). Retrieved from The Bill of Rights Institute website:
https://billofrightsinstitute.org/founding-documents/primary-source-documents/the-federa
list-papers/federalist-papers-no-51/
Finck, M. (2019, January). Artificial intelligence and online hate speech. Retrieved from Centre
https://www.cerre.eu/sites/cerre/files/CERRE_Hate%20Speech%20and%20AI_IssuePape
r.pdf
Flock, E. (2017, August 18). Spotify has removed white power music from its platform. But it's
still available on dozens of other sites. Retrieved from PBS News Hour website:
https://www.pbs.org/newshour/arts/spotify-removed-white-power-music-platform-still-av
ailable-dozens-sites
Florida, R. (n.d.). Shift Power Back to the Local Level. Retrieved from Politico website:
https://www.politico.com/interactives/2019/how-to-fix-politics-in-america/polarization/s
hift-power-back-to-the-local-level/
Gadde, Vijaya. (2019, October 3). hi - here's our current definition: 1/ Ads that refer to an
election or a candidate, or 2/ Ads that advocate for or against legislative issues of national
importance (such as: climate change, healthcare, immigration, national security, taxes)
(Tweet). https://twitter.com/vijaya/status/1189664481263046656
Galloway, R. W. (1991, January). Santa Clara Law Review: Vol. 2. Basic Free Speech Analysis.
https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=1689&context=lawrevie
Galton, F. (n.d.). The Wisdom of Crowds (Vox Populi) by Francis Galton (Originally Published
https://www.all-about-psychology.com/the-wisdom-of-crowds.html
Gamreklidze, E. (2015, October 1). Political Speech Protection and the Supreme Court of the
https://www.natcom.org/communication-currents/political-speech-protection-and-suprem
e-court-united-states
Gohmert, L. (2018, December 20). [ Gohmert Introduces bill that removes liability protections
for social media companies that use algorithms to hide, promote, or filter user content].
https://gohmert.house.gov/news/documentsingle.aspx?DocumentID=398676
Griffith, E. (2017, October 25). Will Facebook kill all future Facebooks? Retrieved from Wired
website:
https://www.wired.com/story/facebooks-aggressive-moves-on-startups-threaten-innovatio
n/
Grimmelmann, J. (2015). The virtues of moderation. Yale Journal of Law and Technology,
https://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1110&context=yjolt
Guynn, J. (2019, April 10). Ted Cruz threatens to regulate Facebook, Google and Twitter over
https://www.usatoday.com/story/news/2019/04/10/ted-cruz-threatens-regulate-facebook-t
witter-over-alleged-bias/3423095002/
Rosenberg, H. (1987, November 24). Neo-Nazis cloud the Utah air: "Aryan Nations" to debut. Los Angeles Times.
Haidt, J. (n.d.). More Social Media Regulation. Retrieved from Politico website:
https://www.politico.com/interactives/2019/how-to-fix-politics-in-america/polarization/m
ore-social-media-regulation/
Haidt, J., & Rose-Stockwell, T. (2019, December). The Dark Psychology of Social Networks.
Hare, I., & Weinstein, J. (2009). General Introduction: Free Speech, Democracy, and the
Suppression of Extreme Speech Past and Present [Introduction]. In Extreme Speech and
Haun, W. J. (2017, September 18). The Natural Law of Free Speech. Retrieved from Law &
Heyman, S.J. (2009). Hate Speech, Public Discourse, and the First Amendment. In I. Hare & J.
Hopkins, N. (2017, May 21). Revealed: Facebook's internal rulebook on sex, terrorism and
https://www.theguardian.com/news/2017/may/21/revealed-facebook-internal-rulebook-sex-terrorism-violence
Hudson, D. L., Jr. (n.d.). In the Age of Social Media, Expand the Reach of the First Amendment.
https://www.americanbar.org/groups/crsj/publications/human_rights_magazine_home/the
-ongoing-challenge-to-define-free-speech/in-the-age-of-socia-media-first-amendment/
Hudson, D. L., Jr. (2019, April 1). Free speech or censorship? Social media litigation is a hot
http://www.abajournal.com/magazine/article/social-clashes-digital-free-speech
Hudson Jr, D. L. (n.d.). Content Based. Retrieved from The First Amendment Encyclopedia
website: https://mtsu.edu/first-amendment/article/935/content-based
Hudson Jr., D. L. (n.d.). Content Neutral. Retrieved from The First Amendment Encyclopedia
website: https://www.mtsu.edu/first-amendment/article/937/content-neutral
Intermediary Liability & Content Regulation. (n.d.). Retrieved from Global Network Initiative
website:
https://globalnetworkinitiative.org/policy-issues/intermediary-liability-content-regulation/
iqsquared. (2018, February 12). Brave new world vs nineteen eighty-four [Video file]. Retrieved
from
https://www.youtube.com/watch?v=31CcclqEiZw&hl=id&client=mv-google&gl=ID&ful
ldescription=1&app=desktop&persist_app=1
John, R. R. (2019). Freedom of expression in the digital age: A historian's perspective. Taylor &
https://www.tandfonline.com/doi/full/10.1080/23753234.2019.1565918
Juan de Olavarria, "Reflexiones a las Cortes" y ostros escritos politicos, ed. Claude Morange
Kessel, P. V., Hughes, A., & Messing, S. (2018, January 19). Very liberal or conservative
legislators most likely to share news on Facebook. Retrieved from Pew Research website:
https://www.pewresearch.org/fact-tank/2018/01/19/very-liberal-or-conservative-legislator
s-most-likely-to-share-news-on-facebook/
Knox, C. (2019). Paradox of liberalism [Unpublished manuscript].
Laslo, M. (2018, April 6). A conversation with Mark Warner: Russia, Facebook and the Trump
https://www.wvtf.org/post/conversation-mark-warner-russia-facebook-and-trump-campai
gn#stream/0
Laub, Z. (2019, June 7). Hate speech on social media: Global comparisons. Retrieved from
https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons
Leetaru, K. (2019, August 24). Social Media Platforms Will Increasingly Define 'Truth'.
https://www.forbes.com/sites/kalevleetaru/2019/08/24/social-media-platforms-will-increa
singly-define-truth/#7b4301496427
Lerer, L. (2019, November 21). Debate night: The 'on politics' breakdown. The New York Times
https://www.nytimes.com/2019/11/21/us/politics/democratic-debate-analysis.html?search
ResultPosition=6
The Lincoln-Douglas debates of 1858. (2017, February 16). Retrieved from National Park
Mackowiak, M. (2019, November 13). Competitors challenge Facebook's liberal bias. Retrieved
https://m.washingtontimes.com/news/2019/nov/13/facebooks-liberal-bias-challenged-co
mpetitors/
Mak, T. (2019, August 14). Senator pushes bill to curb 'exploitative and addictive' social media
https://www.npr.org/2019/08/14/750585438/senator-pushes-bill-to-curb-exploitative-and-
addictive-social-media-practices
Malik, M. (2011). Extreme Speech and Liberalism. In I. Hare (Author), Extreme Speech and
Manila Principles on intermediary liability. (n.d.). Retrieved from Manila Principles website:
https://www.manilaprinciples.org/
Masnick, M. (2019, August 21). Protocols, not platforms: A technological approach to free
website:
https://knightcolumbia.org/content/protocols-not-platforms-a-technological-approach-to-f
ree-speech
Mbongo, P. (2011). Hate Speech, Extreme Speech, and Collective Defamation in French Law. In
I. Hare (Author), Extreme Speech and Democracy (pp. 229-230) [PDF e-book]. Retrieved
from
https://ebookcentral-proquest-com.eztcc.vccs.edu:2443/lib/tidewater/reader.action?docID
=4700863&ppg=280
McColley, G. (1937). Smith College Studies in History: Vol. 22. The Defense of Galileo.
Merkovity, N., Imre, R., & Owen, S. (n.d.). Homogenizing social media – affect/effect and
https://www.academia.edu/27480991/Homogenizing_Social_Media_Affect_Effect_and_
Globalization_of_Media_and_the_Public_Sphere
Miller, S. (2019, November 5). The top trending featured news story on Twitter is a story no
journalists are really talking about and no major news network or outlet is covering.
There is a fundamental disconnect between media and audience and that's why
https://twitter.com/redsteeze/status/1191808759187685376
Milton, J. (2019). Areopagitica (W. Jonson, Comp.). Middletown, DE. (Original work published
1644)
Morton, V. (2018, July 4). Facebook flags Declaration of Independence as hate speech.
https://www.washingtontimes.com/news/2018/jul/4/facebook-flags-declaration-independ
ence-hate-speec/
Muller, K., & Schwarz, C. (2017, December). Fanning the Flames of Hate: Social Media and
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3082972
Omidyar, P. (2017, October 9). Pierre Omidyar: 6 ways social media has become a direct threat
https://www.washingtonpost.com/news/theworldpost/wp/2017/10/09/pierre-omidyar-6-w
ays-social-media-has-become-a-direct-threat-to-democracy/
O'Neill, K. F. (n.d.). Time, Place and Manner Restrictions. Retrieved from The First Amendment
Encyclopedia website:
https://mtsu.edu/first-amendment/article/1023/time-place-and-manner-restrictions
Ortner, D. (2019, August 12). Government regulation of social media would kill the internet —
https://thehill.com/opinion/technology/456900-government-regulation-of-social-media-w
ould-kill-the-internet-and-free
Our ongoing work to tackle hate [Blog post]. (2019, June 5). Retrieved from Youtube Official
Blog: https://youtube.googleblog.com/2019/06/our-ongoing-work-to-tackle-hate.html
Peters, J. (2017). The "Sovereigns of cyberspace" and state action: The First Amendment's
http://btlj.org/data/articles2017/vol32/32_2/peters_web.pdf
Peterson, J. B. (2017, January 16). 2017 maps of meaning 01: Context and background [Video
https://www.youtube.com/watch?v=I8Xc2_FtpHI&feature=youtu.be&list=PL22J3VaeA
BQAT-0aSPq-OKOpQlHyR4k5h&t=6401
Read, M. (2018, December 26). How much of the internet is fake? Turns out, a lot of it, actually.
http://nymag.com/intelligencer/2018/12/how-much-of-the-internet-is-fake.html
Regulating social media: we need a new model that protects free expression. (2018, April 25).
https://www.article19.org/resources/regulating-social-media-need-new-model-protects-fr
ee-expression/
Roose, K. (2018, October 28). On Gab, an extremist-friendly site, Pittsburgh shooting suspect
aired his hatred in full. Retrieved from The New York Times website:
https://www.nytimes.com/2018/10/28/us/gab-robert-bowers-pittsburgh-synagogue-shooti
ngs.html?auth=login-email&login=email
Rubenfeld, J. (2019, November 4). Are Facebook and Google State Actors? Retrieved from
Schulze, F., & Ssymank, P. W. (1910). Das Deutsche Studententum von den Aeltesten Zeiten bis
Sebastian, J. F. (2011). The Crisis of the Hispanic World: Tolerance and the Limits of Freedom
Cultures: Freedom of speech: The history of an idea (pp. 103-131). Lewisburg [Pa.]:
Section 5: Political engagement and activism. (2014, June 12). Retrieved from Pew Research
website:
https://www.people-press.org/2014/06/12/section-5-political-engagement-and-activism/
Section 1: Growing ideological consistency. (2014, June 12). Retrieved from Pew Research
website:
https://www.people-press.org/2014/06/12/political-polarization-in-the-american-public/
Shearer, E., & Matsa, K. E. (2018, September 10). News Use Across Social Media Platforms
2018. Retrieved from Pew Research Center Journalism and Media website:
https://www.journalism.org/2018/09/10/news-use-across-social-media-platforms-2018/
Slatz, A. (2019, December). Journalists suspended from Twitter for reporting on Pensacola
https://www.thepostmillennial.com/journalists-suspended-from-twitter-for-reporting-on-p
ensacola-shooters-motivation/
Strossen, N. (2018). Hate: Why we should resist it with free speech, not censorship. New York,
Sutton, R. (1953). The Phrase Libertas Philosophandi. Journal of the History of Ideas, 14(2),
310-316. doi:10.2307/2707480
Truth, D. O. (2017, December 7). Youtube is the modern day Gutenberg press- Jordan Peterson.
van Mill, D. (2018). Freedom of speech. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Summer 2018 ed.). Retrieved from https://plato.stanford.edu/archives/sum2018/entries/freedom-speech/
Weinstein, J. (2009). An Overview of American Free Speech Doctrine and its Application to
Extreme Speech. In I. Hare & J. Weinstein (Authors), Extreme speech and democracy
(pp. 81-91).
Wilson, B. (2019, October). Trudeau will censor your social media if reelected . Retrieved from
https://www.thepostmillennial.com/trudeau-will-censor-your-social-media-if-reelected/
Wong, J. C. (2019, March 11). #BreakUpBigTech: Elizabeth Warren says Facebook just proved
https://www.theguardian.com/us-news/2019/mar/11/elizabeth-warren-facebook-ads-brea
k-up-big-tech
YouTube bans 'malicious insults and veiled threats'. (2019, December 11). Retrieved from BBC
website: https://www.bbc.com/news/technology-50733180
YouTube continues to restrict many PragerU videos. Fight back. (n.d.). Retrieved from PragerU
website: https://www.prageru.com/petition/youtube/