Libraries and Information Science: Encouraging Discerning Minds
DISCERN Project to Address Social Media Manipulation in the Disinformation Age
Elizabeth A. Pallitto, PhD
Human Information Behavior
17:610:510
Professor E.E. Lawrence
December 12, 2019
word count: 4721
excluding Proposal, Table of Contents, Works Cited
Table of Contents
Framing the Issues
Research Question
Narrowing and Defining the User Population
Literature Review: Scope and Selection
Link to Concept Matrix
Literature Review A: Information Literacy, Anxiety and Poverty
Literature Review B: Psychological Models
Literature Review C: Social Media Data Analyses
Literature Review D: Epistemological Anxieties
Analysis: Solutions for Discerning Disinformation
Ethics of the LIS Professions and the Role of Bias
Conclusion: Restoring Faith in the Civic Sphere as Information Professionals
DISCERN Service Proposal
Appendix 1: How to Spot Fake News (IFLA)
Appendix 2: Misleading Twitter Post (SHEG)
Works Cited
Theoretical Works and Works from Other Disciplines Cited
DISCERN: Addressing Social Media Manipulation in the Disinformation Age
How Libraries and Information Science Encourage Discerning Minds among “Outsiders”
“Any semiotic system that can be used to tell the truth can also be used to lie.” (Umberto Eco, A Theory of Semiotics)
In 1789, Thomas Jefferson theorized that citizens of the new republic would have to be informed in order to participate in the civic experiment that was to become the United States. In a letter to Richard Price, Jefferson wrote,
“…[w]herever the people are well informed they can be trusted with their own government; that whenever things get so far wrong as to attract their notice, they may be relied on to set them to rights” (Jefferson to Richard Price, January 8, 1789).
[Concept matrix (full version at the link under “Link to Concept Matrix” below). Rows list the sources reviewed: Alvarez; Banks; Batchelor; Bessi & Ferrara; Buschman; Cooke; El Rayess et al.; Enli; Fallis; Grinberg et al.; Halford et al.; Jacobson; Jamieson; Knox; Lenker; Bovet & Makse; Sullivan; Walsh; Yang et al.; Zhang et al. (2018); Zhang et al.; ALA; IFLA; SHEG; Barclay (video). Columns mark the concepts each source treats: the 2016 election; information literacy; information poverty/avoidance; fake news; mis-/disinformation; Twitter; post-truth/post-fact/fact-checking; confirmation bias; filter bubble; echo chamber; clickbait/honeypot; algorithmic manipulation; bots/cyborgs; libraries (public, academic); information science. Legend: * = peer-reviewed; Î = information science; L = library/librarians.]
Essentially, Jefferson warned that democracy requires an informed electorate: we the people must hold the government accountable. But from the 2016 election to the present, agents of (dis)information have continued to sow discord and confusion among the electorate, using social media technology to propagate deliberate falsehoods. Though all are affected by this phenomenon, this project focuses upon two user groups: (a) information seekers who want to make informed voting decisions, and (b) those who have decided not to trust expert opinion and established sources, which they may perceive as elitist.
This proposal seeks to serve both groups, especially those who inhabit an echo chamber, whether through misplaced trust (Cooke, 2017a), belief echoes (Sullivan, 2019, citing Thorson, 2016), or the now-historical principle of least effort (Zipf, 1949). Their information-seeking is suffused with distrust, and their beliefs filter out avenues and sources of information normally considered trustworthy. Cynical about mainstream media, they are suspicious of an educated “elite” that is in part a media creation. Not necessarily poor economically, they are nevertheless characterized by information poverty in an era of information abundance.
The problem described above is laden with paradox: one can be affluent yet lack critical thinking; one can distrust “the liberal media” yet consume social media voraciously, a copia of information in new visual forms. Too many options can produce information avoidance, filtering, withdrawal, and information anxiety and overload, diagnosed by Bawden and Robinson (2009) as pathologies. “Alternative” media may seem to offer a response to these problems. My argument is that uncertainty, anxiety, and distrust of traditional media fed this population’s turn toward platforms they view as more “authentic” news channels.
Trump fanned the flames of this fire, tweeting negatively about “fake news” hundreds of times before the election. Stoking distrust of mainstream media, a disingenuous but effective move, fed the construction of the “outsider” persona for Trump and his supporters. The persona is of course an illusion: as Enli (2017) observes, “Trump’s Twitter campaign might critique the mainstream media, but is also in itself a mass media channel.” Trump correctly divined that his outsider status would resonate with voters left behind by the perceived elites; his promise to “drain the swamp” appealed to an anti-elitist, anti-professional element in the population’s psyche.
Narrowing and Defining the User Population: Serving Those Who Distrust Professionalism
With the increasing sophistication of disinformation warfare, how do we provide distrusting outsiders, the information-poor, and the less-educated or less critical members of the voting public with trustworthy information? How do we begin to propose an information service that would serve a broad spectrum of people, especially those susceptible to extreme right-wing bias in social media use? Finally, how do we keep our own biases in check when serving patrons who are decidedly mistrustful of established media outlets and emotionally attached to their preconceived notions?
In this study, I draw from information science and the related fields of political science, communications, and critical theory as applied to the social sciences. Using this interdisciplinary model, I propose a new category of information seekers. The group is “new” in the sense of unprecedented, the product of a new information landscape, but it builds upon prior models of information-seeking behavior, especially information poverty.
My research focuses upon a section of the population that draws critical attention but also misunderstanding at this cultural and historical moment: voters swayed to vote for Donald Trump over Hillary Clinton because of what they read on social media. (The disinformation-influenced 2016 election and its aftermath of distrust and dissonance still shape the present, with anxieties about trustworthy information leading into the 2020 election.) This “new” category includes self-styled “outsiders” who are information-poor, whether involuntarily, unwittingly allowing themselves to be disinformed by media manipulation, or voluntarily, through a lack of desire to seek out and verify authentic information sources.
These outsiders seem to prefer communication media that appeal to certain instincts: xenophobia, crowd mentality, and a principle of scarcity or gratification, to name a few. Enli (2017) provides another explanation as well as a working label for this population: “authentic outsiders.” Studying the Trump campaign’s Twitter feed, Enli found that its “political marketing” and “amateurism” appeared “authentic” to these users, giving Trump an advantage in tapping into this population. Nevertheless, the population was at first difficult to identify. A first question, addressed by the data science literature, is how to identify the group most susceptible to the manipulation of content on social media such as Twitter. Complicating the question further, these users include overlapping groups such as white males and southern evangelical conservatives. Self-identified conservatives, conservative evangelicals, and those with an extreme-right bias were easier to identify, and were in fact those most targeted with disinformation.
Chatman’s “The Impoverished Life-World of Outsiders” (1996) provides this paper’s central concept: information poverty, since explored by many, including Haider and Bawden (2007) and over half of the authors cited here. My users are not impoverished for lack of access to social and traditional media; indeed, they may not be aware of this poverty or its sources. Because social media algorithms are so sophisticated, bots can target the Facebook accounts of people already vulnerable to misinformation, people who have “liked” or “followed” accounts displaying an extreme bias that resembles, or even shapes, their own views. Disinformation creates information poverty. If the intent was to “sow discord and confusion,” then the agents of disinformation had succeeded well before the 2016 election (Jamieson, 2018; Zhang et al., 2018; Mueller, 2019).
Literature Review: Scope and Selection
Most of the articles (and one book) were chosen for their recent publication date: 2016 or later. Articles focused upon principles, such as Fallis (2015) on disinformation, constitute another category of sources, relevant despite preceding the 2016 election. Data analyses of the 2016 election suggest that the most effective cyber-disinformation efforts favored and furthered an extreme-right political bias (Bovet & Makse, 2019; Grinberg et al., 2019; Yang et al., 2019). “Canonical” sources in the information science profession are chosen for their relevance to the paper’s theoretical concerns; these include information poverty (Chatman, 1996) and information overload and anxiety (Bawden & Robinson, 2009). Psychological theories provide insight into the population’s information practices, ranging from Festinger (1957) to Lenker (2016). Bridging psychology and epistemology are sources such as Buschman (2017) and Sullivan (2019). An article by Kamal and Burkell (2011) predates 2016, but its account of epistemic incertitude and of the mechanisms by which people address internal and external uncertainty is timeless. Similarly, critical theory from the social sciences undergirds much of the thinking in this paper, particularly the practices of everyday life (de Certeau), structures of knowledge and power (Foucault), and discourse analysis (Bakhtin).
Link to Concept Matrix:
https://drive.google.com/file/d/1U2qNvlWZeXgNbmlyaOvaEEyQv5eO2VMB/view?usp=sharing
Literature Review, Part A: Information Literacy, Anxiety and Poverty
Information literacy and its opposite, information poverty, are central concepts in articles by Alvarez, Bessi and Ferrara, Banks, Buschman, Cooke, Fallis, Jacobson, Batchelor, Walsh, and Yang. The term “information literacy” was coined in 1974 by Paul Zurkowski, then president of the Information Industry Association (Barclay, 2017b). Walsh (2010) suggests teaching the tools to acquire knowledge rather than deciding what constitutes true knowledge; this argument resembles that of Knox (2017), who opposes censorship despite the potential civic cost of disinformation. As Valenza (2016) notes, freedom of the press is a basic American principle; with that freedom comes the responsibility of helping patrons develop discernment and critical thinking.
Information poverty, or the notion of knowledge gaps, is associated with Elfreda Chatman (1996). Chatman’s article “The Impoverished Life-World of Outsiders” provides this paper’s concept, but not its population. Early research on information poverty conflates the condition with socioeconomic poverty or “laziness,” and Buschman takes issue with Cooke’s (2017a) use of the term. The target population of this paper is difficult to categorize and generalize: they are information-poor partly because the information their bias attracts is generated by agents whose motives they do not perceive. Let us look at the history of the term and then suggest its evolution; information poverty is now bound up with information overload and anxiety.
An early definition by Childers and Post (1975) equates the information-poor with someone who watches a great deal of television, never reads books, and is locked into a deficient informal information network (Case & Given, 2007, p. 121). Today’s information-poor, however, have nearly unlimited information access on their phones, although they may not derive maximum benefit from those devices. Dervin (1989) and Haider and Bawden (2007) define information poverty in opposition to “elites,” a concept used differently in this paper. Wilson’s (1983) definition describes the information-poor as those whose world and information supply are small; today’s information-poor do not have that problem. In this paper, information poverty is a poverty of critical thinking, not of social class or economic level. Moved by fear or anger, the information-poor become targets for agents who abuse the power of information.
The research suggests that American voters use least-effort shortcuts in obtaining and processing information about political candidates and issues, and that correct information may not be retained accurately. Sullivan (2019) cites psychological studies by Flynn (2017), Thorson (2016), and Lewandowsky (2012) on how disproven or untrue information persists in memory. For example, even after an accusation against a politician is disproven, the reputational damage remains: the appearance of impropriety outlives the correction. These psychological insights help explain why the Trump campaign had such success with the “sound bite” approach of Twitter. Enli (2017) reports that a far greater percentage of Trump’s tweets were retweeted by “ordinary users” than Clinton’s; the data analyses reviewed here identify many of these “users” as bots.
Literature Review, Part B: Relevant Psychological Models of Information Behavior
Psychological models help explain the user population. Cognitive dissonance is the discomfort felt when new input cannot be reconciled with one’s prior beliefs (Festinger, 1957). Confirmation bias refers to the human need to confirm what we already believe to be true; Nickerson’s (1998) account of the phenomenon appears in eight of the sixteen main sources in this paper. Related concepts include information overload and anxiety (Bawden & Robinson, 2009).
Motivated reasoning, cited by Lenker (2016), is a related phenomenon among information seekers; the “cure” Lenker suggests is information literacy education. Librarians should have current training in human information behavior, since motivated reasoning strongly influences the processing of political information.
Bias is central both to the literature review and to the service proposal: Cooke (2017a), Bessi and Ferrara (2016), Batchelor (2017), Lenker (2016), and Bovet and Makse (2019) all deal with the concept in one form or another. These sources address both bias in the user population and the prevention of bias on the part of librarians, including this proposal’s author.
Disinformation. Cooke (2017a, article; 2017b, video) distinguishes between mis- and disinformation, the latter term denoting deliberate deception, often for political purposes. Fallis (2004) addresses the problem of accuracy and later refines the terminology and problems of disinformation (2015). Stephen Colbert is credited with coining the word truthiness (2005) to satirize confirmation bias, specifically George W. Bush’s subjective endorsement of Harriet Miers (“I know her heart”). The concept extends well beyond this context.
Literature Review, Part C: Social Media Data Analyses
Data analyses of the 2016 election continue to be written. The studies discussed below suggest that effective cyber-disinformation efforts favored and furthered an extreme-right political bias (Bovet & Makse, 2019; Grinberg et al., 2019). Yang et al. (2019) use RISP (risk information seeking and processing) data analysis to reach similar conclusions regarding the 2016 election and climate change. Statistical analyses by Grinberg et al. (2019) found that 27% of the U.S. population visited fake news sources before the 2016 election; that 1% of “individuals,” quite possibly bots, generated 80% of the fake-news tweets; and that the influenced population skewed older and more conservative.
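To make these concentration figures concrete, the following minimal Python sketch (illustrative only, with invented toy data; not Grinberg et al.’s actual pipeline) computes the share of tweets produced by the top 1% of accounts:

# Illustrative sketch; toy data, not Grinberg et al.'s method.
from collections import Counter

def top_share(tweet_counts, top_fraction=0.01):
    """Share of all tweets produced by the top `top_fraction` of accounts."""
    counts = sorted(tweet_counts.values(), reverse=True)
    k = max(1, int(len(counts) * top_fraction))  # size of the top group
    return sum(counts[:k]) / sum(counts)

# Hypothetical data: ten "supersharers" among 1,000 accounts.
data = Counter({f"user{i}": 1 for i in range(990)})
data.update({f"super{i}": 400 for i in range(10)})
print(f"Top 1% share: {top_share(data):.0%}")  # ~80% for this toy set

In this toy set, ten hyperactive accounts out of a thousand produce roughly 80% of the tweets, the same shape of skew that Grinberg et al. report.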
An onslaught of propaganda on social media (Twitter) was further spread by bot attacks prior to the 2016 election, allowing the information-poor to remain within an echo chamber or filter bubble of human and artificial intelligence; these terms, attributed to Pariser (2011), are now common in the information behavior lexicon and occur in half the articles reviewed here. Twitter posts by Americans, and by temporarily transplanted Russians, exponentially multiplied the conspiracy theories of the extreme right more than those of the left or far left (Bovet & Makse, 2019). Using “state-of-the-art social bot detection algorithms,” Bessi and Ferrara (2016) reach the same conclusion: U.S. civic discourse became distorted enough to influence the results of the 2016 election. They found that about 20% of the content was driven by bots, “algorithmically driven entities that appear as legitimate users.”
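As a rough illustration of the kinds of behavioral signals such detectors weigh, the hypothetical Python heuristic below scores an account on posting volume, account age, and follow patterns. Every feature and threshold here is invented for illustration; Bessi and Ferrara’s actual detection relies on supervised machine-learning classifiers trained on far more features.

from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float
    account_age_days: int
    followers: int
    following: int
    default_profile_image: bool

def naive_bot_score(a):
    """Return a 0-to-1 score; higher means more bot-like (illustrative only)."""
    score = 0.0
    if a.tweets_per_day > 50:                   # inhuman posting volume
        score += 0.35
    if a.account_age_days < 30:                 # freshly created account
        score += 0.25
    if a.following > 10 * max(a.followers, 1):  # mass-following behavior
        score += 0.25
    if a.default_profile_image:                 # no effort to personalize
        score += 0.15
    return score

suspect = Account(tweets_per_day=120, account_age_days=12,
                  followers=8, following=2000, default_profile_image=True)
print(f"bot score: {naive_bot_score(suspect):.2f}")  # 1.00 for this profile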
The data and analyses of Bovet and Makse, Bessi and Ferrara, Yang et al., and Zhang et al. point to the same conclusion: social media bots had the potential “to alter public opinion and endanger the integrity of the Presidential election,” as happened in 2016. In Cyberwar, Kathleen Hall Jamieson, a communication scholar who specializes in election analysis, supports this hypothesis with decades of experience and an overwhelming amount of data analysis. Like Bovet, Grinberg, Yang, and Zhang (et al.), she concludes that the Russian disinformation campaign, using human actors and bots, swung the election for Trump. Mayer (2018) synthesizes Jamieson’s thesis in a New Yorker article, where I first encountered a data-driven hypothesis of effective social media interference. In a paper presented at the 24th International World Wide Web Conference, Tambuscio et al. (2015) explore what makes hoaxes “go viral.” El Rayess et al. (2018) extend the discussion to similar strategies of political manipulation of social media in Lebanon.
The ability to disseminate false “news” exponentially using bots creates a brave new information landscape, posing different challenges to information seekers and upending information-seeking theories of past decades. Belkin’s ASK (anomalous states of knowledge) theory (1980), which seems to assume that seekers really want knowledge, might need modification, at least in light of the deliberately information-poor. Like Knox (2017), Frost-Arnold (2014) favors free speech; Frost-Arnold also offers examples of its misuse, such as a British blogger’s appropriation of a woman’s identity. Neither article offers a remedy for disinformation’s current challenges to civic literacy in the U.S. and the U.K. People in both countries were stunned by the results of their respective 2016 votes, forced to grapple with what the unseen hand of disinformation had wrought.
Literature Review, Part D: Epistemological Anxieties
Disinformation is not new, as presentations and articles on fake news often note. Cooke (2017b) and Barclay (2017b) offer examples of propaganda that predate the internet, and even the printing press. Renaissance “self-fashioning” (Greenblatt, 1980), especially among rulers and patrons of the arts, includes image management, employing literature, art, architecture, and other forms of pageantry in the service of power (Muir, 1979). “Propaganda,” or the dissemination of information, comes from propaganda fide, “spreading the faith,” the charge given in a 1659 document of the Roman congregation responsible for propagating the faith. Propaganda acquired more sinister connotations when its results were violent, as in the Inquisition’s punishment of heretics, literally those whose belief choices lay outside the official version of truth. Similarly, after eliminating critics of their regime, Hitler’s National Socialists installed their own editors and writers at newspapers as information gatekeepers. A forerunner of “fake news” was popularized by the Nazis: Lügenpresse, “lying press,” meaning any publication that departed from the party line.
Image management, misinformation, propaganda: none of this is new. What is new is the unique, technologically enabled damage to democracy in the U.S. and U.K., a warning of how ill-prepared we are to combat online disinformation in 2020. As Sullivan (2019) puts it, “The problem is not new, but there is something new about it” (1147). As Ali Velshi (2017) puts it, “The problem is that in the era of fake news, people begin thinking that the real news is fake.” Velshi discusses the social problems of fake Twitter posts overtaking real tweets by roughly 25%, and of high-quality graphics leading people to trust malicious sources.
To understand the human side of the equation, much of the social science literature is useful for combating disinformation. Kamal and Burkell (2011) distinguish between epistemic incertitude and the psychological mechanisms by which people address uncertainty, internal and external. Although their argument predates 2016, it applies to today’s proliferated disinformation: the line between internal and external uncertainty has blurred. As the child of a friend expresses the problem, “I can’t know.” When trust in established journalism erodes and one can no longer distinguish between real and false information, the result is confusion and cognitive dissonance, posing risks to individuals’ well-being as well as to the body politic.
Although this paper’s scope does not encompass subjects such as climate change or the vaccination “controversy,” two articles reviewed here touch upon them, adding further support to my claims about “cyberwars” influencing the 2016 election. Broniatowski et al. (2018) analyze the effects of amplification on Twitter by Russian agents, both automated and human. Published in the American Journal of Public Health and reported by the New York Times, PBS, and Foreign Policy, their findings show disinformation extending to the medical field, where the vaccine “controversy” literally endangers the health and well-being of Americans. Unlike the pre-election amplification by Twitter bots prejudicing voters against Clinton, bots originating in Russian intelligence amplified both sides of the vaccine debate. This sheds new light on the online vaccine forums studied by Hara and Sanfilippo (2017), where parents sharing information about vaccinations are vulnerable to deliberately malevolent actors. These “disrupters” could be human, or social media bots.
Like social media campaigns designed to influence elections, online disinformation campaigns about health issues erode social cohesion and generate confusion: allowing equal time to invalid, unscientific claims creates an illusion of objectivity, as in the climate example cited by Kamal and Burkell (2011), citing Boykoff and Boykoff (2004). Although scientific consensus exists on climate change, labelling it a “debate” legitimizes denial and worsens the problem. John Buschman, writing on November 8, 2016, warns of the dangers of a both-sides mentality (reminiscent of Trump’s 2017 “very fine people on both sides” remarks). Buschman sees such false equivalence as an irresponsible brand of neoliberalism that belies the core values and mission of libraries and librarians. I examine this principle further in the conclusion.
Solutions for Discerning Disinformation
The most common category among my research sources is librarians, from academic and public libraries, publishing in professional journals, most of them peer-reviewed. Articles by Alvarez (2016), Banks (2016), Berry (2016), and Valenza (2016) are valuable sources for a real-world perspective on the challenges librarians face with regard to disinformation. Sharing her realization of social media’s influence on voting decisions in the 2016 U.S. election, Alvarez begins with an anecdote, explains “clickbait,” and warns colleagues about our changing responsibilities in a post-truth era: librarians’ roles as “partners, educators, and community champions,” uniquely positioned to teach media and information literacy.
Arguing that Internet users need guidance in navigating this confusing terrain, librarians such as Alvarez, Barclay, Buschman, Cooke, Lenker, Walsh, and Sullivan offer perspectives ranging from positive to negative, all of which are helpful to this research.
Epistemological ethics are paramount in articles by Sullivan (2019) and Buschman (2017), both of whom apply critical theory to the problems of discerning truth in civic discourse. Recognizing the extreme-right bias dominating the cybersphere, Sullivan laments the “loss of control, context, and capacity” among information seekers.
Like Sullivan and Buschman, Berry (2016), in “The Misinformation Age,” faces the tide of disinformation with a somewhat pessimistic view of librarians’ task: “pitting that trickle of truth against the tides of bought and paid for propaganda and distortions.” Banks (2016) offers two concrete suggestions. First, libraries can offer workshops much like the one proposed in this paper; one such event, “Storytelling without Borders,” has journalists offer students their fact-finding and fact-checking expertise, since librarians and journalists are “natural allies.” Second, he recommends The Trust Project, based at Santa Clara University’s Markkula Center for Applied Ethics, an initiative promoting ethical journalism, as an antidote to “The Misinformation Age.”
Donald Barclay, an academic librarian at UC Merced, gives a video presentation (2017a) that begins with a historical perspective, reminding us that in fake news, truth and lies are mixed to make deception more effective. He explains propaganda and clickbait and offers solutions for academic librarians, some of which are used in the DISCERN proposal. Like Berry, Barclay appreciates the enormity of the Goliath against which the David of librarians and libraries is pitted: “using a water pistol to fight a fire” (2017a, 26:14; qtd. in Sullivan, 2019). Nevertheless, the solutions he proposes, to his students and to patrons in general, are all included in my proposal: Snopes, PolitiFact, IFLA, and the ACRL Framework for Information Literacy.
Ethics of the LIS Professions and Efforts to Eliminate Bias
While this writer has strong political views, elsewhere expressed in poetry and academic prose, those views are confined to the term paper portion of this project. The service proposal, on the other hand, is designed to keep such biases in check: we are bound to transparency, service, and “objectivity.” While it is ethical to uncover, identify, and debunk disinformation, the information professional has an equal responsibility to eliminate bias at both extremes of the ideological spectrum. The Stanford History Education Group’s source-verification exercise is a valuable example: students are taught to distinguish news from sponsored content. The exercise also teaches the distinction between the ethics of gun control and statistics manipulated in the service of that cause (a MoveOn Twitter post citing a survey of gun owners). Information seekers and students can learn basic verification of Twitter posts, but it remains difficult to distinguish those one disagrees with from bad actors or bots.
The problem with this solution, echoing Sullivan (2019, 1151), is that the research reviewed here demonstrates that the amplification of extremely biased “news sources” was exponentially greater on the extreme right than on the far left. For Buschman, the enemy is “neoliberalism,” an ideological stance that may not mean much to many librarians on the front lines. Writing on the day of the U.S. election, November 8, 2016, Buschman warns that “complete neutrality” in the face of deliberate deception and disinformation can be an abdication of responsibility to the ethics of the library and information science professions. Similarly, Lenker (2016) cites the ALA, ACRL, and IFLA for the profession’s ethical commitment to serve the entire population.
Public librarians, a main target group for the service proposal, need practical guidance and workable solutions. I agree with Buschman, however, that one cannot be completely objective; that not all sources are created equal; and that some “debates,” such as the reality of climate change, are not scientifically valid. We must find the golden mean: avoiding “paternalism” (Vedder, qtd. in Sullivan, 2019) on the one hand, while on the other eschewing complete equanimity in the face of dangerous disinformation. For example, in the recent televised impeachment hearings, members of Congress promoted conspiracy theories about Ukraine with no factual basis, theories that serve Kremlin interests. Dr. Fiona Hill’s expertise, however, might not speak to an information-poor media environment the same way as the table-pounding, SCIF-invading, made-for-TV dramatic antics of the more extreme Republicans.
As Professor Joyce Valenza (2016) reminds us, free speech and freedom of the press are fundamental rights. But as disinformation becomes more subtle and media manipulation harder to detect, we will increasingly need to serve patrons in the “age of misinformation.” We will need our complete arsenal in the ongoing “cyberwars”: media literacy; specialized technology for source verification; reference librarians’ skill at locating trustworthy sources; and the critical thinking that is the legacy of a humanistic education.
Conclusion: Restoring Faith in the Civic Sphere as Information Professionals
The literature above is but the tip of the iceberg, offering an overview of information science scholarship, current and suggested library practices, data analyses, and epistemological problems concerning disinformation in social media and irresponsible information sources. Media manipulation by malicious counter-knowledge in the current “misinformation age” (Berry, 2016) threatens not only democracy but also the existentialist ideal of an authentic existence. From the Cartesian “cogito, ergo sum” to the legitimation crisis envisioned by the philosopher Jürgen Habermas, our ability to reason and think critically is supposed to ensure our existence as free human beings. Disinformation’s threat to critical thinking undermines Jefferson’s hope for a responsible, informed citizenry and thus endangers our nation.
The academic librarian John Buschman, seriously grappling with the ethics and the semiotics of information, made this pronouncement on the state of our democracy after the 2016 election: “November 8, 2016 […] brought the underside of neoliberal developments and our self-contained media-technology bubbles—and our profession’s bad faith—into full view.” Buschman’s call to the library profession echoes Jefferson, with a postmodern ring of Habermas (1966): “only in an emancipated society, which had realized the autonomy of its members, would communication have developed into that free dialogue of all with all which we always hold up as the very paradigm of a mutually formed self-identity, as well as the ideal of true consensus” (284). Though difficult to imagine at this hyper-polarized cultural moment, “free dialogue of all with all” could describe the civic forum of the social media landscape: “free,” as in the image of the Wild West. We are free to argue, to debate, to choose to be informed or misinformed citizens. As information professionals, we have the opportunity and the duty to contribute our knowledge, our critical thinking, and our technological strengths. The future success or failure of the American experiment in democracy will, to some extent, depend upon these abilities.
Two-page flyer to be circulated through the local public library’s announcements and website
Proposed Public Library Service:
DISCERN (Disinformation/Information Critical Evaluation of Real News)
Online and in-person: free lectures (several sessions at different times) and Community events with refreshments to draw a diverse cross-section of people
Points to be included in the DISCERN workshop:
The history of the concept of fake news: “Lügenpresse” context and definition (see term paper)
Consider the source: a “friend” sends you a video, a Facebook or Twitter post with a link.
Are you reading/viewing content from a known news media source? Which one?
Opinion Spectrum: show bias chart ranging from extreme right bias to the left equivalent.
Author/authority: The word “author” is related to the word “authority.” Who created this content? If no author or creator is identified, the content is less trustworthy. What expertise does the content creator have?
Audience: Can you determine the audience for which the content is intended?
Clickbait: definition. Some people run websites with untrue, junk “news” to make money from the advertising revenue. These sites peddle conspiracy theories such as Pizzagate.
C.R.A.A.P. test: Currency, Relevance, Authority, Accuracy, Purpose (Meriam Library, CSU Chico)
Deepfakes and doctored media can look real. 1) Facebook fact-checking: UK Conservatives ran ads with an altered BBC headline; statistics were distorted in favor of Nigel Farage of the UK Independence Party (UKIP). https://www.reuters.com/article/idUSKCN1VZ00Z 2) https://www.theguardian.com/politics/2019/dec/01/facebook-bans-tories-distorted-party-advert-with-laura-kuenssberg (The Guardian: Facebook bans Tories’ “distorted” party advert with Laura Kuenssberg)
Exercise: similar to the Stanford History Education Group source-verification exercise
Examples: Fukushima flowers, NRA post, Stanford History Education Group, The Denver Guardian, http://abcnews.com.co/
Google or Google Scholar: useful, but fact-check your source. The first hit in a search can be the least accurate if it is the most well-funded. A Google search for “Dr. Martin Luther King Jr.” once returned a website based on racist lies about Dr. King. Think: why would someone do this?
Omission: What is missing? The author’s name, date, data (evidence or sources), or publication information? Is the content sponsored by a business, social, cultural, or religious organization?
P.O.V.: What is the point of view of the content (article, video, post, cartoon, graphic)? Does the content appeal to emotions (fear) or impulses (hatred)? Has it been distorted? Are you sure?
Photographs: right-click a photo to copy its “image address”; the URL tells you something about the photograph’s origin. Is the URL very long? Is it from a legitimate website (e.g., one flagged safe by a tool such as Norton’s check mark)? See the domain-check sketch at the end of this list.
Reference Desk: Library staff can recommend materials that are accessible and yet trustworthy.
Twitter: Is there a verification check mark next to the name? Does the post appear out of context in your Twitter feed?
TED talks: seek out experts in the field and read what they have written or listen to TED talks. You may object that expert opinion is difficult to understand, but good speakers explain clearly.
Bias is relative: read sources with which you disagree. Practice forming an argument, based upon evidence and data, to express your viewpoint. Avoid disparaging the other side; don’t judge people based upon their looks; avoid arguments based on emotion, abuse, or name-calling.
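For workshop facilitators, here is a minimal Python sketch of the lookalike-domain check suggested in the URL tips above. The allowlist and heuristics are invented for illustration; real verification should rely on curated lists and the fact-checking sites below.

from urllib.parse import urlparse

KNOWN = {"abcnews.go.com", "bbc.com", "nytimes.com"}  # illustrative allowlist

def domain_report(url):
    host = urlparse(url).hostname or ""
    if host in KNOWN or any(host.endswith("." + k) for k in KNOWN):
        return f"{host}: matches a known outlet"
    # Flag hosts that embed a famous brand name but are not that domain,
    # e.g., the fake "abcnews.com.co" cited in the examples above.
    if any(k.split(".")[0] in host for k in KNOWN):
        return f"{host}: LOOKALIKE? embeds a known brand but is not that domain"
    return f"{host}: unknown source; verify independently"

print(domain_report("http://abcnews.com.co/"))   # flags the lookalike
print(domain_report("https://abcnews.go.com/"))  # matches a known outlet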
Fact checking sites: (descriptions from Batchelor, 2017)
• FactCheck – A nonprofit, nonpartisan website from the Annenberg Public Policy Center of the University of Pennsylvania primarily focused on misstatements made by public officials. Entries include documentation of sources and detailed explanations.
• Politifact – A Pulitzer Prize-winning website operated by the Tampa Bay Times with original statements from public figures, evaluations of the statements with supporting documentation and ratings with a “Truth-O-Meter” that range from “True” to “Pants on Fire”.
• Snopes – A nonpartisan website with a focus on debunking urban legends, online rumors and other questionable/unproven statements. Original documentation provided whenever available.
• Washington Post Factchecker – A fact-checking website created and managed by award-winning veteran journalist Glenn Kessler.
Adjusting the DISCERN service for age and education, and taking it on the road:
Provide outreach “to local organizations and at events,” and partner “with businesses and schools” (Alvarez, 2016)
“To become critical consumers of media information, users should question the recency or date of the information (or lack thereof), carefully examine the site’s URL, consider the language being used (e.g., language that is dramatic, inflammatory, or absolute), consider the plausibility of the information, and consider the reputation and leanings of the website providing the information (e.g., The Onion is a known satire site, […] even if the headlines and content seem plausible). Another question to consider is whether the information is reported elsewhere online (i.e., triangulating information).” (Cooke, 2017a).
Ethos: These additional sources of professional ethics provide justification for the DISCERN proposal and further considerations as to the content of information literacy training.
ALA (2005) Resolution on Disinformation, Media Manipulation and the Destruction of Public Information (2004-2005 ALA CD #64):
“Whereas inaccurate information, distortions of truth, deliberate deceptions, excessive limitations on access and the removal or destruction of information in the public domain are anathema to the ethics of librarianship and to the functioning of a healthy democracy…”
• The American Library Association’s “Core Values of Librarianship.”
• The International Federation of Library Associations and Institutions (IFLA), “Beacons of the Information Society: The Alexandria Proclamation on Information Literacy and Lifelong Learning.”
• The Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education.
• United States President Barack Obama’s proclamation declaring October 2009 “National Information Literacy Awareness Month.”
(The last three sources are suggested in Lenker, 2016).
Appendix 1: Illusion of Choice (Ideological Spectrum)
Appendix 2: Misleading Twitter Post (Stanford History Education Group)
Works Cited
Alvarez, B., (2016). Public libraries in the age of fake news. Public Libraries 55(6): 24–27.
American Library Association (ALA) (2005). Resolution on Disinformation, Media Manipulation & the Destruction of Public Information. Chicago: ALA. http://www.ala.org/aboutala/sites/ala.org.aboutala/files/content/governance/policymanual/updatedpolicymanual/ocrpdfofprm/52–8disinformation.pdf
American Library Association (2017) Resolution on Access to Accurate Information. http://www.ala.org/advocacy/intfreedom/statementspols/ifresolutions/accurateinformation
Banks, M. (2016). Fighting fake news: How libraries can lead the way on media literacy. American Libraries Magazine, Dec. 27. https://americanlibrariesmagazine.org/2016/12/27/fighting-fake-news/
Barclay, D.A. (2017a). The role of academic libraries in an era of fake news, alternative facts, and information overload. Presentation at the Spring 2017 Coalition for Networked Information meeting, 3–4 April, Albuquerque, New Mexico. https://www.youtube.com/watch?v=ClFBqk3Dvek (accessed 3 November 2019).
--. (2017b). The challenge facing libraries in an era of fake news. The Conversation, 4 January 2017. https://theconversation.com/the-challenge-facing-libraries-in-an-era-of-fake-news-70828
Batchelor, O. (2017). Getting out the truth: The role of libraries in fighting fake news. Reference Services Review, 45(2), 143–148. https://doi.org/10.1108/RSR-03-2017-0006
Bates, M.J. (1989). The design of browsing and berrypicking techniques for the online search interface. Online Review, 13(5), 407–424. https://pages.gseis.ucla.edu/faculty/bates/berrypicking.html
Bawden, D., & Robinson, L. (2009). The dark side of information: overload, anxiety and other paradoxes and pathologies. Journal of Information Science, 35(2), 180–191. https://doi.org/10.1177/0165551508095781
Belkin, N. (1980). Anomalous states of knowledge as a basis for information retrieval. The Canadian Journal of Information Science, 5, 133-143.
Berry, J.N. (2016). The misinformation age. Library Journal 141(14): 10.
Bessi, A. & Ferrara, E. (2016). “Social bots distort the 2016 U.S. Presidential election online discussion.” First Monday 21(11). doi: https://doi.org/10.5210/fm.v21i11.7090
Bovet, A., & Makse, H.A. (2019). Influence of fake news in Twitter during the 2016 US presidential election. Nature Communications, 10, 7. https://doi.org/10.1038/s41467-018-07761-2
Broniatowski, D.A., Jamison, A.M., Qi, S., AlKulaib, L., Chen, T., Benton, A., … Dredze, M. (2018). Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. American Journal of Public Health, 108(10), 1378–1384. https://doi.org/10.2105/AJPH.2018.304567
Buschman, J. (2017). November 8, 2016: Core values, bad faith, and democracy. The Library Quarterly, 87(3), 277–286. https://doi.org/10.1086/692305
Case, D., & Given, L. (2007). Looking for information: A survey of research on information seeking, needs, and behavior (2nd ed.). Amsterdam: Elsevier/Academic Press.
Chatman, E.A. (1996). The impoverished life-world of outsiders. Journal of the American Society for Information Science, 47(3), 193-206.
*Cooke, N. (2017a). Posttruth, truthiness, and alternative facts: Information behavior and critical information consumption for a new age. Library Quarterly, 87(3), 211–221.
*Cooke, N. (2017b). Post Truth: Fake News and a New Era of Information Literacy. Webinar. https://youtu.be/7Wp4eZr0d7g
El Rayess, M., Chebl, C., Mhanna, J., & Hage, R. (2018). Fake news judgement. Reference Services Review, 46(1), 146–149. https://doi.org/10.1108/RSR-07-2017-0027
Enli, G. (2017). Twitter as arena for the authentic outsider: exploring the social media campaigns of Trump and Clinton in the 2016 US presidential election. European Journal of Communication, 32(1), 50–61. https://doi.org/10.1177/0267323116682802
*Fallis, D. (2015). What Is Disinformation? Library Trends, Volume 63, No. 3, Winter 2015, pp. 401-426. JHU Press. https://doi.org/10.1353/lib.2015.0014. Accessed 3 Dec 2019.
-- (2004) On Verifying the Accuracy of Information: Philosophical Perspectives. Library Trends, Vol.52, No. 3, Winter ’04, pp. 463–487. Accessed 3 Dec 2019.
Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford UP.
Frost-Arnold, K. (2014). Trustworthiness and truth: The epistemic pitfalls of internet accountability. Episteme, 11(1), 63-81
Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., Lazer, D., (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science (New York, N.Y.), 363(6425), 374–378. https://doi.org/10.1126/science.aau2706
Hara, N., & Sanfilippo, M. (2017). Analysis of roles in engaging contentious online discussions in science. (Report). Journal of the Association for Information Science and Technology, 68(8), 1953–1966. https://doi.org/10.1002/asi.23850
*International Federation of Library Associations and Institutions (IFLA) (2017). How to spot fake news [digital image]. https://www.ifla.org/files/assets/hq/topics/info-society/images/how_to_spot_fake_news.pdf (accessed 1 October 2017).
*Jacobson, L. (2017). The smell test: In the era of fake news, librarians are our best hope. School Library Journal 63(1): 24–29.
Jamieson, K.H. (2018). Cyberwar: How Russian hackers and trolls helped elect a president. New York: Oxford University Press.
Jefferson, T[homas]. Thomas Jefferson to Richard Price, Jan. 8, 1789. https://www.loc.gov/item/mtjbib004021/
Kamal, A., & Burkell, J. (2011). Addressing uncertainty: When information is not enough. The Canadian Journal of Information and Library Science, 35(4), 384-396.
*Knox, E.J.M. (2017). Opposing censorship in difficult times. Library Quarterly 87(3): 268–276.
Lee, K., Eoff, B.D., & Caverlee, J. (2011). Seven months with the devils: A long-term study of content polluters on Twitter. In Proceedings of the 5th International AAAI Conference on Weblogs and Social Media, 185–192. AAAI.
*Lenker, M. (2016) Motivated reasoning, political information, and information literacy education. portal: Libraries and the Academy 16(3): 511–528.
Mayer, J., (2018). How Russia Helped Swing the Election for Trump. New Yorker. https://www.newyorker.com/magazine/2018/10/01/how-russia-helped-to-swing-the-election-for-trump
Mueller, R. (2019). Report on the investigation into Russian interference in the 2016 presidential election: Submitted pursuant to 28 C.F.R. § 600.8(c) (U.S. Government official edition). Washington, DC: U.S. Department of Justice.
Pew Research Center (2019). Like Americans overall, U.S. Catholics are sharply divided by party. https://www.pewresearch.org/fact-tank/2019/01/24/like-americans-overall-u-s-catholics-are-sharply-divided-by-party/
PolitiFact (2019). In context: Trump’s “very fine people on both sides” remarks. https://www.politifact.com/truth-o-meter/article/2019/apr/26/context-trumps-very-fine-people-both-sides-remarks/
Savolainen, R. (2006). Information use as gap-bridging: The viewpoint of sense-making methodology. Journal of the American Society for Information Science and Technology, 57(8), 1116–1125.
Soll, J. (2016), “The long and brutal history of fake news”, Politico, 18 December, available at: www.politico.com/magazine/story/2016/12/fake-news-history-long-violent-214535
Stanford History Education Group (2016). Evaluating information: The cornerstone of civic online reasoning. Stanford Graduate School of Education report. http://ow.ly/DBIS3077Qg2
*Sullivan, M. (2019). Why librarians can’t fight fake news. Journal of Librarianship and Information Science, 51(4), 1146–1156. https://doi.org/10.1177/0961000618764258
Tambuscio, M., Ruffo, G., Flammini, A., et al. (2015). Fact-checking effect on viral hoaxes: A model of misinformation spread in social networks. In Proceedings of the 24th International Conference on World Wide Web (Florence, 18–22 May), 977–982. Geneva: International WWW Conference Steering Committee.
The Trust Project, initiative of Markkula Center for Applied Ethics at Santa Clara (Calif.) University. https://thetrustproject.org/
*Valenza, J. (2016). Truth, truthiness, triangulation and the librarian way: A news literacy toolkit for a “post-truth” world. School Library Journal, NeverEndingSearch blog, 26 November. http://blogs.slj.com/neverendingsearch/2016/11/26/truth-truthiness-triangulation-and-the-librarian-way-a-news-literacy-toolkit-for-a-post-truth-world/
Velshi, A. (2017). How fake news grows in a post-fact world. TEDxQueensU, 9 March 2017. https://youtu.be/nkAUqQZCyrM
*Walsh, J. (2010) Librarians and controlling disinformation: Is multi-literacy instruction the answer? Library Review 59(7): 498–511.
Yang, J., Chu, H., & Kahlor, L. (2019). Fearful Conservatives, Angry Liberals: Information Processing Related to the 2016 Presidential Election and Climate Change. Journalism & Mass Communication Quarterly, 96(3), 742–766. https://doi.org/10.1177/1077699018811089
Zhang, Y., Wells, C., Wang, S., & Rohe, K. (2018). Attention and amplification in the hybrid media system: The composition and activity of Donald Trump’s Twitter following during the 2016 presidential election. New Media & Society, 20(9), 3161–3182. https://doi.org/10.1177/1461444817744390
Zipf, G. (1949). Human behavior and the principle of least effort. New York, Hafner Pub. Co.
Theoretical Works Cited and Works from Other Disciplines
Bakhtin, M. (1981). Discourse in the Novel. Bloomington: Indiana University Press.
De Certeau, M. (1984). The practice of everyday life. Berkeley: University of California Press.
Deleuze, G., & Guattari, F. (1983). Anti-Oedipus: Capitalism and schizophrenia. Minneapolis: University of Minnesota Press.
Foucault, M., & Gordon, C. (1980). Power/knowledge: selected interviews and other writings, 1972-1977. Brighton, Sussex: Harvester Press.
Greenblatt, S. (1980). Renaissance self-fashioning: from More to Shakespeare. Chicago: University of Chicago Press.
Muir, E. (1979). Images of power: art and pageantry in Renaissance Venice. American Historical Review, 84.