Providence Lawsuit Over Social Media



UNITED STATES DISTRICT COURT


DISTRICT OF RHODE ISLAND

CITY OF PROVIDENCE,

                              Plaintiff,

v.

META PLATFORMS, INC., FACEBOOK
HOLDINGS, LLC, FACEBOOK
OPERATIONS, LLC, META PAYMENTS
INC., META PLATFORMS
TECHNOLOGIES, LLC, INSTAGRAM,
LLC, SICULUS, INC., SNAP INC., TIKTOK
INC., TIKTOK LTD., TIKTOK LLC,
BYTEDANCE INC., BYTEDANCE LTD.,
ALPHABET INC., XXVI HOLDINGS INC.,
GOOGLE LLC, and YOUTUBE, LLC,

                              Defendants.

Civil Action No. 1:23-cv-170

DEMAND FOR JURY TRIAL

TABLE OF CONTENTS

I. INTRODUCTION .............................................................................................................. 1
II. JURISDICTION AND VENUE ......................................................................................... 5
III. PARTIES ......................................................................................................................... 5
Plaintiff ................................................................................................................... 5
Meta Defendants ..................................................................................................... 6
Snap Defendant ....................................................................................................... 8
TikTok Defendants ................................................................................................. 8
YouTube Defendants .............................................................................................. 9
IV. FACTUAL BACKGROUND ........................................................................................... 10
Meta’s Purposefully Manipulative and Addictive Social Media
Platform Design .................................................................................................... 11
1. Meta’s Facebook Platform ........................................................................ 13
2. Meta’s Instagram Platform ....................................................................... 20
YouTube’s Purposefully Manipulative and Addictive Design ......................... 26
Snapchat’s Purposefully Manipulative and Addictive Design .......................... 29
TikTok’s Purposefully Manipulative and Addictive Design ............................ 34
V. DEFENDANTS’ BUSINESS MODELS MAXIMIZE USER SCREEN
TIME, FUELING ADDICTION ...................................................................................... 38
VI. DEFENDANTS HAVE CHOSEN DESIGNS THAT ADDICT MINORS
TO THEIR PLATFORMS ................................................................................................ 41
VII. DEFENDANTS’ PLATFORMS CAUSE HARM TO MINORS .................................... 45
Defendants Encourage and Cause Increased Usage of Their Platforms
by Minors .............................................................................................................. 45
Minors’ Brains Are Particularly Susceptible to Manipulation
by Defendants ....................................................................................................... 48
Defendants Have Caused a Significant Mental Health Toll on Minors................ 51
Defendants’ Conduct Has Harmed the City of Providence and Its Schools ......... 53
VIII. THE COMMUNICATIONS DECENCY ACT ALLOWS COMPUTER SERVICE
COMPANIES TO LIMIT HARMFUL CONTENT ......................................................... 57
IX. CLAIMS FOR RELIEF .................................................................................................... 58
X. REQUEST FOR RELIEF ................................................................................................. 63
XI. JURY TRIAL DEMANDED ............................................................................................ 63


Plaintiff City of Providence, Rhode Island brings this action against the following Defendant

Social Media Companies and their affiliates and subsidiaries: Meta Platforms, Inc.; Facebook

Holdings, LLC; Facebook Operations, LLC; (collectively referred to as “Facebook”); Meta

Payments Inc.; Meta Platforms Technologies, LLC; Instagram, LLC (“Instagram”); Siculus, Inc.

(collectively, Facebook, Instagram, and Siculus Inc. are referred to as “Meta”); Snap Inc.

(“Snapchat”); TikTok Inc.; TikTok Ltd.; TikTok LLC; ByteDance Inc.; ByteDance Ltd.

(collectively, the TikTok and ByteDance entities are referred to as “TikTok”); Alphabet Inc.;

XXVI Holdings Inc.; Google LLC; and YouTube, LLC (collectively, the Alphabet, XXVI,

Google, and YouTube entities are referred to as “YouTube”) (collectively, “Defendants” or

“Defendant Social Media Companies”).

I. INTRODUCTION

1. Today’s youth are experiencing a mental health crisis fueled by the Defendant

Social Media Companies, who are ruthlessly seeking to maximize profits at any cost and with

callous disregard for the harm that their platforms cause to minors’ mental and behavioral health.

2. Over the last two decades, across the country, including within the City of

Providence, an astounding number of adolescents have begun to suffer from poor mental health

and behavioral disorders. Indeed, according to the CDC, “[b]etween 2007 and 2018, the national

suicide rate among persons aged 10-24 increased 57.4%.”1

3. In 2021, the U.S. Surgeon General issued an advisory alerting the public that

“[r]ecent national surveys of young people have shown alarming increases in the prevalence of

certain mental health challenges” and that “[i]n 2019 one in three high school students and half of

1 Sally C. Curtin, M.A., State Suicide Rates Among Adolescents and Young Adults Aged 10-24: United States, 2000-2018, CDC (Sept. 11, 2020), https://www.cdc.gov/nchs/data/nvsr/nvsr69/nvsr-69-11-508.pdf (last visited Apr. 11, 2023).

female students reported persistent feelings of sadness or hopelessness, an overall increase of 40%

from 2009.”2

4. The current state of youth mental health has led the American Academy of

Pediatrics, the American Academy of Child and Adolescent Psychiatry, and the Children’s

Hospital Association to declare a national emergency.3

5. These dramatic increases in youth mental health deterioration have directly

coincided with the growth of Defendants’ social media platforms, which have exploited the

vulnerable brains of youth, hooking tens of millions of students across the country into feedback

loops, which Defendants know will lead to excessive use (and abuse) of social media.

6. According to research, 90% of children ages 13-17 use social media.4 Even younger

children also regularly use social media, with studies showing 38% of children ages 8-12 use social

media,5 49% of children ages 10-12 use social media, and 32% of children ages 7-9 use social

media.6

7. As a means of increasing revenue as much as possible, Defendants deliberately

design and operate their platforms to maximize users’ screen time. Defendants accomplish this by

building features intended to exploit human psychology, using complex algorithms driven by

advanced artificial intelligence and machine-learning systems.
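For illustration only, the engagement-maximizing ranking described above can be reduced to a short sketch. All names, fields, and weights below are hypothetical and are not drawn from any Defendant’s actual code.

```python
# Hypothetical sketch of an engagement-maximizing feed ranker.
# All names and fields are invented for illustration; this is not
# any Defendant's actual code.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_prob: float     # output of a hypothetical ML model
    predicted_dwell_seconds: float  # likewise hypothetical

def engagement_score(post: Post) -> float:
    # The sole objective is expected attention captured; user
    # well-being appears nowhere in the formula.
    return post.predicted_click_prob * post.predicted_dwell_seconds

def rank_feed(candidates: list[Post], k: int = 20) -> list[Post]:
    # The feed is simply the k highest-scoring items.
    return sorted(candidates, key=engagement_score, reverse=True)[:k]
```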

2 Protecting Youth Mental Health, U.S. Surgeon General Advisory (Apr. 11, 2023), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf.
3 AAP-AACAP-CHA Declaration of a National Emergency in Child and Adolescent Mental Health, Am. Acad. Pediatrics (Oct. 19, 2021), https://www.aap.org/en/advocacy/child-and-adolescent-healthy-mental-development/aap-aacap-cha-declaration-of-a-national-emergency-in-child-and-adolescent-mental-health/.
4 Social Media and Teens, Am. Acad. Child & Adolescent Psychiatry (Mar. 2018), https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Families/FFF-Guide/Social-Media-and-Teens-100.aspx.
5 Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021 at 5, Common Sense Media (2022), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf.
6 Sharing Too Soon? Children and Social Media Apps, C.S. Mott Children’s Hosp. Univ. Mich. Health (Oct. 18, 2021), https://mottpoll.org/sites/default/files/documents/101821_SocialMedia.pdf.


8. As part of this quest to maximize revenue, Defendants exploit adolescent users’

still-developing decision-making capacity, impulse control, emotional maturity, and poor

psychological resiliency.

9. Nobody except the Defendant Social Media Companies could have foreseen this

unprecedented youth mental health crisis. Despite knowing that minors are much more likely to

sustain serious psychological harm through social media use than adults, Defendants knowingly

seek to grow the use of their platforms by minors through designs, algorithms, and policies that

promote addiction, compulsive use, and other severe mental harm—and by thwarting the ability

of parents to keep their children safe and healthy by supervising and limiting social media use.

10. The results of Defendants’ actions have a known effect on minors’ mental health

as “[t]here is a substantial link to depression, and that link tends to be stronger among girls” and

“[t]he same is true for self-harm. . . . ‘The more hours a day she spends using social media, the

more likely she is to engage in self-harm behaviors – the link is there for boys as well.’”7 Moreover,

“[m]ost of the large studies show that heavy users of social media are about twice as likely to be

depressed as light users.”8

11. In fact, the Surgeon General has linked social media’s relentless focus on profits

over safety to the youth mental health crisis, stating that social media’s “[b]usiness models are

often built around maximizing user engagement as opposed to safeguarding users’ health and

ensuring that users engage with one another in safe and healthy ways.”9 President Biden

reiterated this conclusion, asserting in his 2023 State of the Union address that “social media

companies” must be held “accountable for the experiment they are running on our children for

7 Jennifer A. Kingson, Social media’s effects on teen mental health comes into focus, Axios (Jan. 11, 2023), https://www.axios.com/2023/01/11/social-media-children-teenagers-mental-health-tiktok-meta-facebook-snapchat.
8 Id.
9 Protecting Youth Mental Health, supra note 2.


profit.”10 There is no question that children are facing a growing mental health crisis of epidemic

proportions. And there is similarly no question that Defendants are the cause of this crisis.

12. Plaintiff seeks to hold Defendants responsible for the harm caused by their conduct,

which preys on their youngest and most vulnerable users.

13. Local communities and public schools are on the front lines of the unfair fight

between large corporate Defendants preying on America’s youth and communities attempting to

address the ongoing youth mental health crisis.

14. Indeed, local communities and schools, including Plaintiff and Plaintiff’s schools,

are forced to spend increasing resources as the first responders in an attempt to alleviate the rising

rates of suicidal ideation, depression, anxiety, and other tragic indices of this public health crisis

in minors.

15. This youth mental health crisis is infecting all aspects of education – students

experience record rates of anxiety, depression, and other mental health issues because of

Defendants’ intentional conduct. And these students perform worse in school, are less likely to

attend school, and are more likely to engage in substance use and to act out, all of which directly

affects Plaintiff’s schools’ ability to fulfill their educational mission.

16. That is why Plaintiff’s schools, as part of their mission, and like many schools in

the country, provide mental health services to their students. Plaintiff needs more funding to meet

the ever-growing needs of youth facing this mental health crisis.

17. Plaintiff requires funding to develop a long-term plan to deal with the mental health

crisis and address the record rates of depression, anxiety, suicidal ideation, and the other tragic

10 Remarks of President Joe Biden – State of the Union Address as Prepared for Delivery, White House (Feb. 7, 2023), https://www.whitehouse.gov/briefing-room/speeches-remarks/2023/02/07/remarks-of-president-joe-biden-state-of-the-union-address-as-prepared-for-delivery/.


byproducts caused by Defendants. Plaintiff also needs Defendants to be held responsible for their

continued use of algorithms, social media features, and policies that target minors and that

Defendants know are a driving force behind the current mental health crisis for minors in Plaintiff’s

community and public school system, and for Defendants to cease targeting youths for profit.

II. JURISDICTION AND VENUE

18. This Court has jurisdiction pursuant to 28 U.S.C. § 1332(a) because the amount in

controversy exceeds $75,000.00, and Plaintiff and Defendants are residents and citizens of

different states.

19. The Court has personal jurisdiction over Defendants because they do business in

the District of Rhode Island and have sufficient minimum contacts with the District. Defendants

intentionally avail themselves of the markets in this State through the promotion, marketing, and

operation of their platforms at issue in this lawsuit in Rhode Island, and by retaining the profits

and proceeds from these activities, rendering the exercise of jurisdiction by this Court permissible

under Rhode Island law and the United States Constitution.

20. Venue in this Court is proper under 28 U.S.C. §§ 1391(b)(2) and (3) because a

substantial part of the events or omissions giving rise to the claims at issue in this Complaint arose

in this District and Defendants are subject to the Court’s personal jurisdiction with respect to this

action.

III. PARTIES

Plaintiff

21. Plaintiff City of Providence, Rhode Island (“Plaintiff”) is a municipal corporation

with a principal address of 25 Dorrance Street, Providence, Rhode Island.

22. Plaintiff’s public schools have an enrollment of over 21,000 students.


23. The City of Providence provides significant funding for its public schools. For

example, the City of Providence budgeted approximately $134,897,350 for its public schools

(which was an increase of approximately $4.8 million) in Fiscal Year 2022, $130,046,611 in Fiscal

Year 2021, and $130,046,611 in Fiscal Year 2020 (which was an increase of approximately $1.5

million).

Meta Defendants

24. Defendant Meta Platforms, Inc. (“Meta”), formerly known as Facebook, Inc., is a

Delaware corporation with its principal place of business in Menlo Park, California.

25. Defendant Meta develops and maintains social media platforms, communication

platforms, and electronic devices that are widely available to users throughout the United States.

The platforms developed and maintained by Meta include Facebook (including its self-titled app,

Marketplace, and Workplace), Messenger (including Messenger Kids), Instagram, and a line of

electronic virtual reality devices and services called Meta Quest (collectively, “Meta platforms”).

26. Meta transacts or has transacted business in this District and throughout the United

States. At all times material to this Complaint, acting alone or in concert with its subsidiaries

(identified below), Meta has advertised, marketed, and distributed the Meta platforms to

consumers throughout the United States. At all times material to this Complaint, Meta formulated,

directed, controlled, had the authority to control, or participated in the acts and practices set forth

in this Complaint.

27. Defendant Meta’s subsidiaries include: Facebook Holdings, LLC; Facebook

Operations, LLC; Meta Payments, Inc.; Meta Platforms Technologies, LLC; Instagram, LLC; and

Siculus, Inc.

28. Defendant Facebook Holdings, LLC (“Facebook Holdings”) was organized under

the laws of the state of Delaware on March 11, 2020, and is a wholly owned subsidiary of Meta


Platforms, Inc. Facebook Holdings is primarily a holding company for entities involved in Meta’s

supporting and international endeavors, and its principal place of business is in Menlo Park,

California. Defendant Meta is the sole member of Facebook Holdings.

29. Defendant Facebook Operations, LLC (“Facebook Operations”) was organized

under the laws of the state of Delaware on January 8, 2010, and is a wholly owned subsidiary of

Meta Platforms, Inc. The principal place of business of Facebook Operations is in Menlo Park,

California. Defendant Meta is the sole member of Facebook Operations.

30. Defendant Meta Payments, Inc. (“Meta Payments”) was incorporated in Florida on

December 10, 2010, as Facebook Payments Inc. In July 2022, the entity’s name was amended to

Meta Payments, Inc. Meta Payments is a wholly owned subsidiary of Meta Platforms, Inc. Meta

Payments manages, secures, and processes payments made through Meta, among other activities,

and its principal place of business is in Menlo Park, California.

31. Defendant Meta Platforms Technologies, LLC (“Meta Technologies”) was

organized under the laws of the state of Delaware as “Oculus VR, LLC” on March 21, 2014, and

acquired by Meta on March 25, 2014. In November 2018, the entity’s name was amended to

Facebook Technologies, LLC. In June 2022, the entity’s name was amended again, this time to

Meta Platforms Technologies, LLC. Meta Technologies develops Meta’s virtual and augmented

reality technology, such as the Meta Quest line of services, among other technologies related to

Meta’s platforms, and its principal place of business is in Menlo Park, California. Defendant Meta

is the sole member of Meta Technologies.

32. Defendant Instagram, LLC (“Instagram”) was founded by Kevin Systrom and Mike

Krieger in October 2010. In April 2012, Meta purchased the company for approximately $1 billion.

Meta reformed the limited liability company under the laws of the state of Delaware on April 7,


2012, and the company’s principal place of business is in Menlo Park, California. Defendant Meta

is the sole member of Instagram.

33. Defendant Siculus, Inc. (“Siculus”) was incorporated in Delaware on October 19,

2011. Siculus is a wholly owned subsidiary of Meta, which supports Meta platforms by

constructing data facilities and other projects. Siculus’s principal place of business is in Menlo

Park, California.

Snap Defendant

34. Defendant Snap Inc. (“Snap”) is a Delaware corporation with its principal place of

business in Santa Monica, California. Snap transacts or has transacted business in this District and

throughout the United States. At all times material to this Complaint, acting alone or in concert

with others, Snap has advertised, marketed, and distributed the Snapchat social media platform to

consumers throughout the United States. At all times material to this Complaint, Snap formulated,

directed, controlled, had the authority to control, or participated in the acts and practices set forth

in this Complaint.

TikTok Defendants

35. Defendant TikTok Ltd. wholly owns its subsidiary Defendant TikTok LLC

(“TikTok LLC”) which is, and at all relevant times was, a Delaware limited liability company.

36. TikTok LLC wholly owns its subsidiary Defendant TikTok Inc. f/k/a Musical.ly,

Inc. (“TikTok Inc.”).

37. TikTok Inc. was incorporated in California on April 30, 2015, with its principal

place of business in Culver City, California. TikTok Inc. transacts or has transacted business in

this District and throughout the United States. At all times material to this Complaint, acting alone

or in concert with others, TikTok Inc. has advertised, marketed, and distributed the TikTok social

media platform to consumers throughout the United States. At all times material to this Complaint,


acting alone or in concert with ByteDance Inc., TikTok Inc. formulated, directed, controlled, had

the authority to control, or participated in the acts and practices set forth in this Complaint.

38. Defendant ByteDance Ltd. (“ByteDance Ltd.”) is a global company incorporated

in the Cayman Islands. Its principal place of business is in Beijing, China. ByteDance Ltd. also

maintains offices in the United States, Singapore, India, and the United Kingdom, among other

locations. ByteDance Ltd. wholly owns its subsidiary Defendant ByteDance Inc.

39. ByteDance Inc. (“ByteDance”) is a Delaware corporation with its principal place

of business in Mountain View, California. ByteDance transacts or has transacted business in this

District and throughout the United States. At all times material to this Complaint, acting alone or

in concert with others, ByteDance has advertised, marketed, and distributed the TikTok social

media platform to consumers throughout the United States. At all times material to this Complaint,

acting alone or in concert with TikTok Inc., ByteDance formulated, directed, controlled, had the

authority to control, or participated in the acts and practices set forth in this Complaint.

YouTube Defendants

40. Defendant Alphabet Inc. is a Delaware corporation with its principal place of

business in Mountain View, California. Alphabet Inc. is the sole stockholder of XXVI Holdings

Inc.

41. Defendant XXVI Holdings Inc. is a Delaware corporation with its principal place

of business in Mountain View, California. XXVI Holdings, Inc. is a wholly owned subsidiary of

Alphabet Inc. and the managing member of Google LLC (“Google”).

42. Defendant Google LLC is a limited liability company organized under the laws of

the state of Delaware, and its principal place of business is in Mountain View, California. Google

LLC is a wholly owned subsidiary of XXVI Holdings Inc., and the managing member of YouTube,

LLC. Google LLC transacts or has transacted business in this District and throughout the United


States. At all times material to this Complaint, acting alone or in concert with others, Google LLC

has advertised, marketed, and distributed its YouTube video sharing platform to consumers

throughout the United States. At all times material to this Complaint, acting alone or in concert

with YouTube, LLC, Google LLC formulated, directed, controlled, had the authority to control,

or participated in the acts and practices set forth in this Complaint.

43. Defendant YouTube, LLC is a limited liability company organized under the laws

of the state of Delaware, and its principal place of business is in San Bruno, California. YouTube,

LLC is a wholly owned subsidiary of Google LLC. YouTube, LLC transacts or has transacted

business in this District and throughout the United States. At all times material to this Complaint,

acting alone or in concert with Defendant Google LLC, YouTube, LLC has advertised, marketed,

and distributed its YouTube social media platform to consumers throughout the United States. At

all times material to this Complaint, acting alone or in concert with Google LLC, YouTube, LLC

formulated, directed, controlled, had the authority to control, or participated in the acts and

practices set forth in this Complaint.

IV. FACTUAL BACKGROUND

44. Social media platforms have grown exponentially over the past decade, from

millions to billions of users, and are reaping billions of dollars in profit through advertising.

45. America’s youth are particularly prevalent users of social media and are,

accordingly, seen as Defendants’ most valuable commodities. This is borne out by minors’ access

to and use of social media, as 95% of teenagers aged 13-17 have cellphones11 and 90% use social

media.12 Studies also show that even younger children have widespread social media use.13

11 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
12 Social Media and Teens, supra note 4.
13 Sharing Too Soon? Children and Social Media Apps, C.S. Mott Children’s Hosp. Univ. Mich. Health (Oct. 18, 2021), https://mottpoll.org/sites/default/files/documents/101821_SocialMedia.pdf.


46. To obtain such a virulent spread across America’s youth, each platform utilizes

intentional design choices and operational methods to keep its audience captive once on the

platform. Researchers studying the effect social media has on the brain have shown that social

media exploits “the same neural circuitry” as “gambling and recreational drugs to keep consumers

using their products as much as possible.”14

Meta’s Purposefully Manipulative and Addictive Social Media Platform Design

47. Defendant Meta operates multiple social media platforms, including Facebook and

Instagram, which substantially contribute to the ongoing youth mental health crisis.

48. Meta’s platforms are designed to create dangerous experiences for users where they

are systematically exposed to tailored content that is designed to promote repetitive and addictive

use of their social media platforms. This experience is designed to promote maximum user

engagement with the platform regardless of the consequences this engagement has on the user’s

mental health and overall well-being, rendering Defendants’ social media platforms inherently

dangerous, particularly when used by teens and children.

49. Both the Facebook and Instagram platforms push a “feed” to users. But,

importantly, the “feed” does not consist only of material posted by accounts that the user has

chosen to follow. Rather, it also includes substantial volumes of material specifically selected and

promoted by Meta, as well as advertising. Meta’s algorithms and designs drive the experience the

user has, not the user’s choices.
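As a minimal illustrative sketch of the kind of feed assembly this paragraph describes (hypothetical names; not Meta’s actual code), platform-selected recommendations and advertising can be interleaved into the posts a user actually subscribed to:

```python
# Hypothetical sketch: a "feed" mixing three pools of content, only
# one of which the user chose. Not Meta's actual code.
def assemble_feed(followed_posts, recommended_posts, ads,
                  ad_slot_every=5):
    merged = []
    rec_iter = iter(recommended_posts)
    for post in followed_posts:
        merged.append(post)
        extra = next(rec_iter, None)  # platform-selected content is
        if extra is not None:         # injected regardless of choice
            merged.append(extra)
    feed = []
    for i, item in enumerate(merged, start=1):
        feed.append(item)
        if ads and i % ad_slot_every == 0:
            feed.append(ads.pop(0))   # ads at a platform-chosen cadence
    return feed
```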

50. In order to maximize user engagement, and thereby maximize its profits obtained

through advertising revenue made from showing ads to users, Meta promotes harmful and/or

14 Jena Hilliard et al., Social Media Addiction, Addiction Ctr. (Apr. 3, 2023), https://www.addictioncenter.com/drugs/social-media-addiction/#:~:text=Due%20to%20the%20effect%20that,when%20taking%20an%20addictive%20substance.


unhealthy user experiences. Meta is aware of these inherently dangerous features and has

repeatedly decided against changing them and/or implementing readily available and relatively

inexpensive safety measures, for the stated purpose of ensuring continued growth, engagement,

and revenue increase. Meta’s own analysis recognizes simple solutions could be effective but

Meta has chosen not to implement known tools that could alleviate the harm caused by its social

media platforms.15

51. Moreover, Meta fails to take basic precautions that could protect the hundreds of

thousands of minors using its social media platforms. For instance, Meta allows individuals to

have multiple accounts, does not verify users’ ages or identities, and does not confirm the authenticity

of email addresses. These failures exacerbate the harm Meta’s social media platforms cause by

making it impossible to avoid unwanted interactions. Users can open accounts as fast as those

accounts can be blocked and when coupled with the excessive and addictive usage habits Meta’s

social media platforms promote among teens, these features create a perfect storm for depression,

anxiety, and suicide and self-harm.

52. Additionally, Meta knows that underage users are on its platforms and has

deliberately designed its platforms to evade parental authority and consent, including but not

limited to Meta’s failure to verify age and identity, provision of multiple accounts, marketing

aimed at informing minors that they can open multiple accounts, failure to provide a point of contact

for parents to notify Meta of lack of consent, marketing aimed at children that encourages children

to use Meta’s social media platforms without consent, and multiple other features and conduct by

15 Teen Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the U.S. at 40, WSJ (Sept. 29, 2021), https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf.


Meta aimed at ensuring young users have a means to access Meta’s social media platforms no

matter the circumstances.

53. These practices, in addition to the harmful design choices Meta has made for its

platforms as detailed below, have caused serious harm to the children and teens in Plaintiff’s

community and attending Plaintiff’s public schools and have caused Plaintiff to expend substantial

resources to address the growing mental health challenges faced by the minors it serves.

1. Meta’s Facebook Platform

54. Facebook is an online social network that is part of the Meta Platforms.

55. Facebook is currently the largest social network in the world with approximately 2

billion daily users as of 2022.16

56. When it was founded in 2004, only students at certain colleges and universities

could use the social media platform—and verification of college enrollment was required to access

the platform.

57. In 2005, Facebook expanded and became accessible to students at twenty-one

universities in the United Kingdom and others around the world. Meta then launched a high school

version of Facebook, which Meta CEO and majority shareholder, Mark Zuckerberg, referred to as

the next logical step. Even then, however, high school networks required an invitation to join.

58. Facebook later expanded eligibility to employees of several companies, including

Apple Inc. and Microsoft. On December 11, 2005, Facebook added universities in Australia and

New Zealand to its network and, in September 2006, Facebook opened itself up to everyone. While

Facebook claimed that it was open only to persons aged 13 and older and with a valid email

address, on information and belief, Facebook decided not to actually require verification

16 Stacy Jo Dixon, Number of Daily Active Facebook Users Worldwide as of 4th Quarter 2022 (in Millions), Statista (Feb. 13, 2023), https://www.statista.com/statistics/346167/facebook-global-dau/.


of age and/or identity and did not verify user email addresses, so that underage users could literally

enter nonsense email addresses and still be provided by Meta with access to a Facebook account.

59. Meta’s history of starting as a closed platform limited to only specific users over

the age of 18, and then over the age of 13, demonstrates that Meta knows how to implement design

features meant to restrict access to persons above a certain age. Nonetheless, Meta made a deliberate

decision in 2006 to distribute its platform to everyone in the world with internet access, including

minors, without considering the consequences of this affirmative choice.

60. At the same time that Facebook was expanding its platform, it also implemented a

series of changes to its design meant to increase user engagement with the platform and promote

platform growth, including: (1) launching the “like” button; (2) creating a direct messaging

service; (3) changing its newsfeed algorithm to determine what content to show users; and (4)

creating content for users to post on its platform.

61. These design changes, while profit maximizing for Facebook, had damaging

consequences for Facebook’s young users.

62. For example, Facebook’s implementation of a feed and the “like” button increases

social comparison pressure, and publicly visible social metrics, such as “likes,” turn social

interaction into a competition, in ways that are particularly harmful to adolescents. Moreover,

adolescents’ under-developed ability to self-regulate means they are particularly vulnerable to the

dopamine spikes caused by these external stimuli that activate the brain’s reward system.17 Meta

is well aware of the harm of its “like” feature, as indicated by the public statements of the

17 Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022), https://doi.org/10.1186/s40359-022-00990-7.


employees involved in creating the feature, who decried it and discussed its harms after leaving the

company.18

63. Additionally, Facebook’s messaging system is harmful to underage users as it

creates damaging and dangerous interactions for minors. In the direct messaging application,

which is integrated into Facebook, minors can encounter unwanted and harmful interactions such

as bullying and sexual exploitation. Additionally, through the message application, children are

vulnerable to unsupervised interactions with adults who may be predators or other bad actors.

64. Facebook has also designed and implemented harmful algorithms. For instance,

Facebook uses algorithmic data mining that is able to pair users with whatever experience will

maximize their engagement with its platform. Facebook directs users to the experiences that will

maximize their engagement even if it exposes children and teens to harmful and destructive

content. For instance, the algorithm may tailor a Facebook feed for a young girl with body

image issues to show pro-anorexia content or may tailor a feed for a child struggling with

depression to show pro-suicide content. Facebook has also designed and implemented a group

recommendation algorithm that directs all users, including minors, to harmful groups that they

would not have been exposed to but for Facebook’s programming choices.
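A minimal sketch of the feedback loop this paragraph alleges, with hypothetical names (not Facebook’s algorithm): whatever topic a user lingers on is weighted up and served back, with no check on whether the topic is healthy for that user.

```python
# Hypothetical sketch of a naive engagement loop; not Facebook's code.
from collections import Counter

def update_interest_profile(profile: Counter, viewed_topic: str,
                            dwell_seconds: float) -> None:
    # Dwell time feeds straight back into topic weights; there is no
    # safety check on what the topic is or who the user is.
    profile[viewed_topic] += dwell_seconds

def next_topics(profile: Counter, n: int = 3) -> list[str]:
    return [topic for topic, _ in profile.most_common(n)]

# A user who lingers on one sensitive topic is shown ever more of it.
profile: Counter = Counter()
for _ in range(10):
    update_interest_profile(profile, "sensitive-topic", 30.0)
update_interest_profile(profile, "sports", 5.0)
print(next_topics(profile))  # ['sensitive-topic', 'sports']
```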

65. Facebook also creates content, including images and GIFs, and licenses thousands

of hours of music, for users to use in videos they share on Facebook. These features are designed

to encourage minor users to post on Facebook and are meant to increase users’ engagement on

Facebook.

66. In addition to these harmful in-app features, Facebook uses push notifications and

emails, which create addictive behavior. These features compel individuals to return to

18 See, e.g., Paul Lewis, ‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia, The Guardian (Oct. 6, 2017), https://www.theguardian.com/technology/2017/oct/05/smartphone-addiction-silicon-valley-dystopia.


Facebook’s platform. Moreover, Facebook sends these notifications regardless of the time of day

which means minors receive these notifications even at night when they should be sleeping or

when they are in school.
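The time-of-day allegation can be made concrete with a sketch (hypothetical names; not Facebook’s code). The harm alleged is the absence of a quiet-hours guard like the one below.

```python
# Hypothetical sketch; not Facebook's code. The Complaint alleges
# notifications fire at all hours, i.e., respect_quiet_hours=False.
from datetime import datetime, time

QUIET_START, QUIET_END = time(21, 0), time(7, 0)  # e.g., school nights

def should_send(now: datetime, respect_quiet_hours: bool) -> bool:
    if not respect_quiet_hours:
        return True  # fire at any hour, overnight or during school
    t = now.time()
    in_quiet_hours = t >= QUIET_START or t < QUIET_END
    return not in_quiet_hours
```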

67. Facebook’s privacy settings and default profile settings are also harmful to minors.

An individual’s Facebook profile may be public or private. On a public profile, any Facebook user

can view the photos, videos, and other content posted by the user with a public profile, whereas, on

a private profile, the user’s content may only be viewed by the user’s followers, which the user

must approve. Critically, Facebook chose to make user profiles public by default for many years.

This allowed all users to see and interact with underage users with public profiles by default. Even

now, while Facebook claims that it is defaulting certain categories of users into private profiles,

it does not limit who may change their setting to public. Once a user changes their settings to

public, Facebook again allows all users to message and interact with any user, even if the user is a

minor.
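The default-visibility allegation comes down to a single configuration choice, sketched below with hypothetical field names (not Facebook’s code).

```python
# Hypothetical sketch; not Facebook's code. One default decides
# whether every new minor's posts are visible to strangers.
from dataclasses import dataclass

@dataclass
class ProfileSettings:
    is_public: bool = True  # the alleged long-standing default

def settings_for_new_user(age: int) -> ProfileSettings:
    # A safety-first alternative (which the Complaint says was not
    # adopted) would be: ProfileSettings(is_public=(age >= 18)).
    return ProfileSettings()  # public by default, regardless of age
```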

68. Permitting underage users to have public profiles exposes these young users to a

wide swath of other users on the Facebook platform. Disastrously for children and teens, many of

these potential connections are harmful.

69. Ultimately, the totality of the Facebook experience, created by the feature and

design choices made by Facebook, leads to addictive, compulsive, and excessive use by minors

and Facebook knows that youths are particularly vulnerable to these outcomes.

70. Despite knowing about these outcomes, Facebook has chosen to continue to

implement these addictive design features because they are good for Facebook’s business

advancement.


71. Indeed, these design choices were deliberately made to maximize user engagement.

As Sean Parker, Meta’s first President, explained in a 2017 interview:

The thought process that went into building these applications, Facebook
being the first of them, to really understand it was all about: “How do we
consume as much of your time and conscious attention as possible?” And
that means that we need to sort of give you a little dopamine hit every once
in a while, because someone liked or commented on a photo or a post or
whatever. And that's going to get you to contribute more content, and that’s
going to get you, you know, more likes and comments. It’s a social-
validation feedback loop . . . exactly the kind of thing that a hacker
like myself would come up with, because you’re exploiting a vulnerability
in human psychology. The inventors, creators — it’s me, it’s Mark
[Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people —
understood this consciously. And we did it anyway.

72. “God only knows what it’s doing to our children’s brains,” Mr. Parker also

remarked in the same interview.

73. Similarly, in testimony before Congress in September 2020, Tim Kendall,

Facebook’s first director of monetization, reaffirmed that Meta had chosen to design its platforms

in ways it knew to be dangerous in order to maximize engagement. Mr. Kendall explained:

We sought to mine as much human attention as possible and turn it into
historically unprecedented profits. To do this, we didn’t simply create
something useful and fun; we took a page from Big Tobacco’s playbook,
working to make our offering addictive at the outset….

The next page in Big Tobacco’s playbook was to add bronchodilators to
cigarettes. This allowed the smoke to get in contact with more surface area
of the lungs. Allowing for misinformation, conspiracy theories, and fake
news to flourish were Facebook’s bronchodilators.

But that incendiary content wasn’t enough. Tobacco companies then added
ammonia to cigarettes to increase the speed with which nicotine traveled to
the brain. Facebook’s ability to deliver this incendiary content to the right
person, at the right time, in the exact right way—through their algorithms—
that is their ammonia. And we now know it fosters tribalism and division.

Social media preys on the most primal parts of your brain, it provokes, it
shocks, and it enrages. . . .

Facebook and their cohorts worship at the altar of engagement and cast
other concerns aside, raising the voices of division, anger, hate, and
misinformation to drown out the voices of truth, justice, morality, and
peace.19

74. Mr. Parker and Mr. Kendall’s confessions stand in stark contrast with Meta’s public

statements about the safety of its platforms made at the time it was expanding the reach of its

platform and the design elements meant to prolong interaction. Throughout its changes, re-designs,

and launches, Facebook founder and CEO, Mark Zuckerberg, made public statements promising

the public that safety was Meta’s top priority. For example, in February of 2017, Mr. Zuckerberg

posted on his personal Facebook page a statement titled “Building Global Community,” in which

he talked at length about how Meta is focused on safety, how it intends to use its artificial

intelligence to the fullest to keep users safe, and how amazing Facebook is for bringing

communities together, promoting critically important social groups, and other statements that were

untrue and profoundly dangerous, given what was actually happening at Facebook and what Mr.

Zuckerberg knew about the harms his platforms were causing American youth including those in

Plaintiff’s community and schools.

75. In fact, despite these public facing reassurances of Facebook’s safety, in 2017, Meta

employees were internally reporting to management that Facebook was causing harmful

dependencies. Worse, despite these known harmful dependencies, Meta was also already

marketing to children under 13, despite clear legal mandates that it could not allow children under

13 on its platform. And Meta leadership, including Mr. Zuckerberg himself, actively rejected

proposed re-designs intended to minimize the harms to child and teen users, like the youth in

Plaintiff’s community and schools.

19 Testimony of Tim Kendall, House Committee on Energy & Commerce (Sept. 24, 2020), https://nsarchive.gwu.edu/sites/default/files/documents/023.pdf.


76. Faced with these undeniably harmful consequences of its design choices, internally,

Meta employees have offered countless suggestions and recommendations as to design changes

Meta could make to protect its users from the harms Meta causes. Yet, over and over again, Meta

leadership has declined, delayed, or outright ignored the vast majority of those suggestions in favor of its own

financial and growth-related interests.

77. The reason for Meta’s resistance to changing the harmful nature of Facebook’s

platform is simple: it directly profits from the time, attention, and data its young users provide it

and from the content it encourages young users to post to Facebook.

78. Tellingly, Meta has conducted studies relating to social comparison harms on its

platforms titled: “Social Comparison: Topics, Celebrities, Like Counts, Selfies” and “Appearance-

Based Social Comparison on Instagram.” These studies have shown that the toxic stew of design

features Meta purposely embedded in its platforms cause intense mental harms to young people.

Yet Meta has continued to systemically implement and continue to use these harmful design

features.

79. While Meta knows that it is harming its young users, leadership has decided to

prioritize ever increasing profits over the health and well-being of its minor users.

80. This fact was detailed by the testimony of former Facebook employee Frances Haugen

before Congress. Ms. Haugen testified: “The company’s leadership knows ways to make Facebook

and Instagram safer and won’t make the necessary changes because they have put their immense

profits before people.”20 Haugen continued, “[Facebook’s] profit optimizing machine is generating

self-harm and self-hate—especially for vulnerable groups, like teenage girls. These problems have

20 Frances Haugen, Statement before United States Senate Committee on Commerce, Science and Transportation, Sub-Committee on Consumer Protection, Product Safety, and Data Security (Oct. 4, 2021), https://www.commerce.senate.gov/services/files/FC8A558E-824E-4914-BEDB-3A7B1190BD49.


been confirmed repeatedly by Facebook’s own internal research.”21 As emphasized by Haugen,

“Facebook became a $1 trillion company by paying for its profits with our safety, including the

safety of our children.”22 Haugen’s explosive testimony and documentary evidence have

unfortunately not changed this conduct.

81. Because Meta continues to prioritize its own bottom line over mental health,

thousands of American children and teens, including those in Plaintiff’s community and schools,

continue to use Facebook daily and suffer exposure to its harmful user experience.

2. Meta’s Instagram Platform

82. Instagram is a social media platform that was launched in October 2010 to feature

photos taken on mobile devices. Instagram was acquired by Facebook for $1 billion in April 2012.

83. After Facebook acquired Instagram, it underwent tremendous growth driven by

Facebook’s implementation of design and developmental changes meant to increase user engagement.

These changes to the design and development of Instagram were undertaken without regard to their

impact on children and teen users of the social media platform.

84. Many of the design features implemented on Instagram are the same ones made on

Facebook. These include: (1) a “like” button; (2) direct messaging; (3) an algorithm designed to

determine what content to show users; and (4) creating content for users to post on its platform.

As described above, these features are meant to encourage compulsive and addictive use of the

social media platform leading to negative outcomes for child and teen users.

85. The “like” button, direct messaging, and content creation

features on Instagram all work like their Facebook counterparts and have the same corresponding harms

for adolescent users as described above. Similarly, like Facebook, Instagram permits minors to have public

21 Id.
22 Id. (emphasis in original).


accounts which, for the reasons discussed above, poses known harms to the children and teen users of

Instagram.

86. Push notifications in Instagram also cause the same harms as Facebook’s push

notification features, while also imposing additional harms through their uniquely problematic design. In the

case of Instagram, Defendant Meta collects individualized data—not just about the user, but also

about the user’s friends and contacts—and then crafts notifications, decides notification frequency,

and notifies users via text and email using this data. Meta’s notifications to individual Instagram

users are specifically designed to, and do, prompt them to open Instagram and view what Instagram

selected, increasing sessions, and resulting in greater profits to Instagram irrespective of users’

health or wellbeing.

87. As recent leaks of internal documents show, Instagram’s feed algorithm also poses a unique

harm to Instagram’s teen users, of which Instagram is well aware. Meta exerts control over a user’s

Instagram “feed,” including through certain ranking mechanisms, escalation loops, and/or

promotion of advertising and posts specifically selected and promoted by Meta based on, among

other things, its ongoing planning, assessment, and prioritization of the types of information most

likely to increase engagement. In the case of certain user groups, like teens, this control translates

to Meta’s deliberate and repeated promotion of harmful and unhealthy online experiences, which

Meta knows is causing harm to minor users.

88. Instagram also has a search feature called “Explore,” where a user is shown an

endless feed selected by an algorithm designed by Meta based upon the users’ demographics and

prior activity in the application. The feed is not based on the user’s searches or requests. Instead,

Meta crafts the feed via its algorithms (which Meta in turn programs to increase engagement and

in other ways Meta knows to be harmful to users, but more profitable to Meta), as well as paid


advertisements created with Meta’s assistance or approval, and the like. Indeed, Meta’s internal

analyses indicate that the “Explore” feature is amongst the most damaging features for teens:

[Internal Meta study excerpt reproduced as an image in the original filing.]
See Teen Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the U.S., supra note 15, at 31.

89. These leaked reports from Meta also show that it is aware that Instagram is causing

serious mental health problems for its teen users and has even categorized the various harms this

social media platform inflicts on its teen users.23

23 Teen Mental Health Deep Dive, Instagram, https://s.wsj.net/public/resources/documents/teen-mental-health-deep-dive.pdf.

[Internal Meta study excerpts reproduced as images in the original filing.]
See supra, “Teen Mental Health Deep Dive,” pp. 18, 27, 32 (examples only).

90. The Instagram platform also has features known as “Reels” and “Stories,” which

promote the use of short videos and temporary posts, respectively. These features were developed

to appeal to teens and Meta knows that the features coalesce into a toxic environment that is

addictive and harmful:

[Internal Meta study excerpts reproduced as images in the original filing.]

See Teen Girls Body Image and Social Comparison on Instagram, supra note 15, at 33-34.

91. Yet, despite its knowledge of these harmful consequences of Instagram’s

algorithms, Meta refused to implement changes that could protect the mental health of its young

users, instead, once again, prioritizing profits over the well-being of its children and teen users.

Indeed, during Meta’s 2022 Third Quarter Earnings Conference call, Mark Zuckerberg touted that

increasing time spent on the Company’s platforms is one of the metrics it follows, noting that reels


are “‘incremental’ for users spending time on Instagram” and that “‘[t]he trends look good here,

and we believe that we’re gaining time spent share on competitors like TikTok.’”24

92. Meta consciously disregards any duty to protect minors in Plaintiff’s community

and schools.

YouTube’s Purposefully Manipulative and Addictive Design

93. YouTube is the second-most visited website on the internet. Astoundingly, surveys

by the Pew Research Center in 2022 found that 95% of American teenagers used YouTube and

that one in five American teenagers reported that they used YouTube almost constantly.25

94. YouTube earns the bulk of its revenue through advertisements. In order to

maximize the profits earned through these revenues, YouTube has a design that allows it to embed

targeted advertising directly into the video clips that its users watch, as well as promote featured

content.26

95. Individuals can create channels on YouTube where they post content they create.

When these channel owners cross a certain viewership threshold, they can elect to monetize the

channel by delivering advertisements to viewers. The revenue earned from advertisements on these

channels is shared between the channel owner and YouTube. YouTube also offers systems,

policies, and features to encourage creators to post more and earn rewards that can be converted

into cash.
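The shared-revenue incentive can be shown with simple arithmetic. The 55/45 creator/platform split below is the commonly reported YouTube Partner Program ratio, used here only as an assumption.

```python
# Illustrative arithmetic; the 55% creator share is an assumption
# based on the commonly reported Partner Program split.
def split_ad_revenue(gross: float, creator_share: float = 0.55):
    creator = gross * creator_share
    return creator, gross - creator

creator, platform = split_ad_revenue(1000.00)
print(f"creator ${creator:.2f}, platform ${platform:.2f}")
# creator $550.00, platform $450.00 -- both sides earn more as
# viewing (and thus ad delivery) increases.
```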

96. YouTube has two types of advertising that its channel owners can use. The first is

contextual advertising, which is informed by the particular channel or video. The second is

24 Kate Duffy, Mark Zuckerberg says Instagram Reels are booming despite celebrities such as Kim Kardashian and Kylie Jenner slamming the app for being like TikTok, Yahoo! Finance (Oct. 27, 2022), https://finance.yahoo.com/news/mark-zuckerberg-says-instagram-reels-121319465.html.
25 Emily Vogels et al., supra note 11.
26 Andrew Beattie, How YouTube Makes Money Off Videos, Investopedia (Oct. 31, 2021), https://www.investopedia.com/articles/personal-finance/053015/how-youtube-makes-money-videos.asp.


behavioral advertising, which is informed by the behavior of the device owner as tracked across

different websites, apps, and devices. YouTube defaults to behavioral advertising and, while it

technically permits channel owners to turn off default behavioral advertising and serve instead

contextual advertising that does not track viewers, almost no channel owners make this choice.

This is likely because channel owners receive warnings that disabling behavioral advertising can

significantly reduce their channel’s revenue.
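A sketch of the two ad modes and the default this paragraph describes, with hypothetical names (not YouTube’s API):

```python
# Hypothetical sketch; not YouTube's API or code.
from dataclasses import dataclass

@dataclass
class ChannelAdConfig:
    # Behavioral (per-viewer, cross-site) targeting is the default;
    # the owner must opt out, at a warned revenue cost.
    behavioral_ads: bool = True

def pick_ad(config: ChannelAdConfig, viewer_history: list[str],
            video_topic: str) -> str:
    if config.behavioral_ads and viewer_history:
        # Target the person, using what was tracked about this
        # viewer across websites, apps, and devices.
        return f"ad-for:{viewer_history[-1]}"
    # Target the content: only the video being watched matters.
    return f"ad-about:{video_topic}"
```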

97. YouTube’s advertising strategy is effective, as demonstrated by the fact that it

generated total advertising revenues of $28.8 billion and $29.2 billion in fiscal years 2021 and

2022 respectively.

98. In order to maximize its advertising profits, YouTube has designed and

implemented algorithms meant to create compulsive, addictive use of its platform by minors and

to push users into dangerous “rabbit hole” experiences.

99. While YouTube has refused to publicly disclose its algorithms, its VP of

Engineering has described the algorithm it uses in broad terms as follows:

To provide such custom curation, our recommendation system doesn’t
operate off of a ‘recipe book’ of what to do. It’s constantly evolving,
learning every day from over 80 billion pieces of information we call
signals. That’s why providing more transparency isn’t as simple as listing
a formula for recommendations, but involves understanding all the data that
feeds into our system. A number of signals build on each other to help
inform our system about what you find satisfying: clicks, watchtime, survey
responses, sharing, likes, and dislikes.27
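Reading that description literally, the system reduces to a weighted combination of per-user signals. The sketch below uses the six signals named in the quote; the weights are invented for illustration and are not YouTube’s.

```python
# Hypothetical sketch. Signal names come from the quoted description;
# the weights are invented and are not YouTube's.
SIGNAL_WEIGHTS = {
    "click": 1.0,
    "watch_seconds": 0.05,
    "survey_satisfaction": 2.0,
    "share": 3.0,
    "like": 1.5,
    "dislike": -2.0,
}

def recommendation_score(signals: dict[str, float]) -> float:
    # "A number of signals build on each other": here, a simple
    # weighted sum over whatever signals were observed.
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * value
               for name, value in signals.items())
```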

100. Whatever the particulars of YouTube’s algorithms, it is well aware of the fact that

the algorithms on its platform promote and amplify violent and harmful experiences.

27 Cristos Goodrow, On YouTube’s recommendation system, Inside YouTube (Sept. 15, 2021), https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/.


101. Internally, YouTube employees have notified leadership of these issues in the

YouTube algorithm and, each time such notice is provided, they are told by YouTube

leadership, “Don’t rock the boat.”28 According to individuals within YouTube:

The company spent years chasing one business goal above others:
‘Engagement,’ a measure of the views, time spent and interactions with
online videos. Conversations with over twenty people who work at, or
recently left, YouTube reveal a corporate leadership unable or unwilling to
act on these internal alarms for fear of throttling engagement.

Id.

102. Because YouTube concluded in 2012 that the more people watched the more ads it

could sell, it set a company-wide goal to reach one billion hours of viewing a day, and rewrote its

recommendation engine to maximize for that goal. Id. In order to achieve this goal, YouTube re-

designed itself to maximize addiction and stayed the course on programming its algorithm to

prioritize engagement over user safety, despite its knowledge that such programming was harming

a significant number of its users – including children and teens.
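The alleged 2012 rewrite can be pictured as a one-line change of objective, sketched here with hypothetical fields (not YouTube’s engine): from predicting clicks to maximizing expected minutes watched.

```python
# Hypothetical sketch of an objective swap; not YouTube's engine.
def score_by_clicks(video: dict) -> float:
    return video["predicted_click_prob"]

def score_by_watch_time(video: dict) -> float:
    # Optimizing expected minutes watched, in service of a
    # company-wide viewing-hours goal.
    return video["predicted_click_prob"] * video["predicted_watch_minutes"]

def recommend(candidates: list[dict], scorer) -> dict:
    return max(candidates, key=scorer)
```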

103. Moreover, YouTube’s algorithm-based experience is fundamental to its functionality.

Indeed, “YouTube has described its recommendation system as artificial intelligence that is

constantly learning which suggestions will keep users watching. These recommendations, it says,

drive 70 percent of views, but the company does not reveal details of how the system makes its

choices.”29

28 Mark Bergen, YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2, 2019, 5:00 am), https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant.
29 Max Fisher & Amanda Taub, On YouTube’s Digital Playground, an Open Gate for Pedophiles, N.Y. Times (June 3, 2019), https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html.


104. YouTube’s algorithm drives users towards the content they should watch next; thus, users’

experience on the platform is driven by the algorithm and not users’ choices. Accordingly, YouTube drives

young users towards content they would not have been exposed to but for YouTube’s design choices.

105. YouTube knows that underage users are on its YouTube platform and has

deliberately designed its platform in a manner intended to evade parental authority and consent.

106. YouTube is used by many millions of minors every day, including students in Plaintiff’s community and schools, who have become addicted to it and suffer other severe mental harms as a result of how YouTube has designed, set up, and operates its platform and features.

107. YouTube consciously disregards any duty to protect minors in Plaintiff’s

community and schools.

Snapchat’s Purposefully Manipulative and Addictive Design

108. Snapchat was founded in 2011 and quickly became an incredibly popular social media app among U.S. teens, with 59% of children between the ages of 13 and 17 reporting using it.30

109. Snapchat started as a photo and short video sharing social media application that allows users to form groups and share posts, or “Snaps,” that disappear after being viewed by the recipients—and became well known for this self-destructing message feature. However, Snapchat quickly evolved from there, as its leadership made design changes and rapidly developed new features that were intended to, and did, increase Snapchat’s popularity among teen users.

30 Emily Vogels et al., supra note 11.


110. Among the features added by Snapchat were video capabilities, stories, and chat features. These features evolved and grew to include various proprietary features, including “Our Story,” Geofilters and Community Geofilters, and Snapcash.

111. Snapchat monetized its user base by generating revenue through advertisements served to its users.

112. Snapchat estimates that it has tens of millions of teen users. Against this backdrop,

Snapchat has numerous algorithmic features that promote dangerous use of its platform by

teenagers.

113. One of these features is “Quick Add,” which sends messages to users suggesting they should “friend” another user on Snapchat. These Snap-initiated messages result in exposure to harmful contacts, bullying, and dangerous predators, and are designed to reinforce addiction and increase the odds of maintaining more users for longer.
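
For illustration, a minimal sketch of how a “Quick Add”-style suggestion could be generated, assuming candidates are ranked by mutual-friend count; Snap has not disclosed Quick Add’s actual criteria, and all names and data below are hypothetical:

    # Hypothetical sketch of a friend-suggestion heuristic; ranking by
    # mutual-contact count is an assumption, not Snap's disclosed method.
    from collections import Counter

    def quick_add_candidates(user: str, friends: dict[str, set[str]],
                             top_n: int = 3) -> list[str]:
        """Suggest non-friends who share the most mutual friends with `user`."""
        mutual_counts: Counter[str] = Counter()
        for friend in friends.get(user, set()):
            for candidate in friends.get(friend, set()):
                if candidate != user and candidate not in friends[user]:
                    mutual_counts[candidate] += 1
        return [name for name, _ in mutual_counts.most_common(top_n)]

    graph = {
        "alice": {"bob", "carol"},
        "bob": {"alice", "dave"},
        "carol": {"alice", "dave", "erin"},
        "dave": {"bob", "carol"},
        "erin": {"carol"},
    }
    print(quick_add_candidates("alice", graph))  # ['dave', 'erin']

The alleged danger is visible even in this toy version: the heuristic surfaces strangers to a user based solely on graph proximity, with no assessment of whether the suggested contact is safe for a minor.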

114. Snapchat also generates an “Explore” feed that pushes out a never-ending stream of videos. These features are designed by Snap to grab and keep users’ attention for as long as possible each day, and have led many people, from psychologists to government officials, to describe Snapchat as “dangerously addictive.”

115. Snapchat also offers several unique messaging and data features. It is perhaps most famous for its self-destructing message design feature, which appeals to minors and makes it more difficult for parents to monitor their children’s social media activity. This is an inherently dangerous feature because it both encourages and allows minor users to exchange harmful, illegal, and sexually explicit images with adults, and provides those same adults with an efficient vehicle to recruit victims. Snapchat is a go-to application for sexual predators because of this feature.


116. For years, Snapchat has received reports of child abuse and bullying occurring through its platform and because of its features.31 Despite these alarming reports, Snapchat continues to use and promote these features so as not to decrease the popularity of its platform.

117. The disappearing photo feature is also dangerous because it does not actually work as advertised. Indeed, while teens rely on Snap’s representations that photos they take and send will disappear, recipients are actually able to save those photos – and then often use a photo a teen took, under the assumption that it would be automatically deleted, to bully, exploit, and/or sexually abuse the teen who took it.

118. In 2014, Snapchat added “Stories” and “Chat” features that allowed users to post longer stories that could be viewed by users outside the user’s friends. On information and belief, during the relevant time period, Snapchat’s algorithmic ranking of Stories ensured that Stories from accounts that a user interacted with the most appeared at the top of the user’s Stories.

119. Snapchat also allows users to enable the sharing of their location through a tool called Snap Map, which allows the users’ followers (and the public, for Snaps submitted by the users) to see the user’s location on a map. At all times relevant, this feature was available to all users, including minors. This is an inherently dangerous feature, which serves no practical purpose – but that does provide strangers and predators with access to the location of minor victims. This feature has directly contributed to stalking and other physical harms and assaults perpetrated on minors, and these are harms known to Snapchat.

120. Snap also has a “My Eyes Only” functionality that encourages and enables minor

users to hide harmful material from parents by allowing them to hide material in a special tab that

31 Zak Doffman, Snapchat Has Become A ‘Haven for Child Abuse’ with its ‘Self-Destructing Messages’, Forbes (May 26, 2019, 5:13 am), https://www.forbes.com/sites/zakdoffman/2019/05/26/snapchats-self-destructing-messages-have-created-a-haven-for-child-abuse/.


requires a passcode, and where material cannot be recovered – even by Snapchat itself – without the correct passcode. The material self-destructs if a user attempts to access the hidden folder with the wrong code. My Eyes Only allows Snapchat’s young users to hide potentially harmful material from parents and/or the legal owners of the devices used to access Snap.

121. On information and belief, Snapchat’s disappearing messages are harmful for this reason as well. Snapchat has possession, custody, or control of that data, and knows that it will be relevant and material in the event of litigation, but has designed its technologies – i.e., its advertised “disappearing” functionality, which suggests that Snap itself no longer has access to such data – in a manner that frustrates and actively prevents parents from monitoring the activity of their underage children on Snapchat. These are seriously harmful features, which Snapchat should be required to remedy immediately.

122. Like Meta and TikTok, Snapchat also sends push notifications and emails to encourage addictive behavior and to increase use of Snapchat. Snapchat’s communications are triggered and based upon information Snapchat collects from and about its users, and Snapchat “pushes” these communications to teen users in excessive numbers and at disruptive times of day. These notifications are specifically designed to, and do, prompt teen users to open Snapchat, increasing sessions and resulting in greater profits to Snapchat. Even the format of these notifications has been designed to pull users back on to the social media platform—irrespective of a user’s health or wellbeing.

123. Snapchat also features a series of rewards, including trophies, streaks, and other signals of social recognition similar to the “likes” metrics available across other platforms. These features are designed to encourage users to share their videos and posts with the public. Moreover, they are designed to be addictive, and to encourage greater use of Snapchat without regard to any other content or third-party communication. While the names of these various metrics have changed over time, and certain metrics have been phased out and others phased in, their core collective function—rewarding harmful over-engagement with Snapchat—has never changed.

124. These features serve no purpose other than creating dependencies on Snapchat by

children and teens, which dependencies in turn cause sleep deprivation, anxiety, depression, anger,

shame, interpersonal conflicts, and other serious harms to mental and physical health.

125. Snapchat incorporates several other features that serve no functional purpose, but that do make Snapchat more appealing to children and teens (i.e., avatars, emojis, and games), while simultaneously using known mechanisms to addict those same children and teens (i.e., streaks and trophies offering unknown rewards). These features and the ones discussed above were particularly addictive to youths and were targeted to underage users.

126. The Snap Streak feature is unique to Snapchat and is one of the most – if not the most – addictive features available, especially to teenagers. Snap Streaks provide a measure of a user’s interaction with another user, in the form of a symbol representing the user’s consistent engagement with the other user’s posts over a certain amount of time. If the user fails to engage with future messages from that user fast enough, Snap removes this symbol from both users’ profiles. Because of Snapchat’s successful efforts to integrate itself into the lives of American children, this creates social pressure to engage with Snapchat—or risk the humiliation of losing Snap Streaks. Snapchat has known for years that Snap Streak is addictive and generates compulsive use of the app in minors, yet it continues to provide and promote that feature to teens and children.
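
A minimal sketch of the streak mechanic alleged here follows; the 24-hour deadline and reset behavior are assumptions drawn from public descriptions of Snap Streaks, not Snap’s actual implementation:

    # Illustrative streak mechanic: keep exchanging Snaps within the
    # window or the counter resets. The window length is an assumption.
    from datetime import datetime, timedelta

    WINDOW = timedelta(hours=24)

    class Streak:
        def __init__(self) -> None:
            self.count = 0
            self.last_exchange: datetime | None = None

        def record_exchange(self, now: datetime) -> None:
            """Both users snapped each other; extend or reset the streak."""
            if self.last_exchange is None or now - self.last_exchange <= WINDOW:
                self.count += 1      # streak continues; the symbol stays
            else:
                self.count = 1       # deadline missed; the streak resets
            self.last_exchange = now

    streak = Streak()
    start = datetime(2023, 4, 1, 12, 0)
    streak.record_exchange(start)
    streak.record_exchange(start + timedelta(hours=20))  # within the window
    streak.record_exchange(start + timedelta(hours=50))  # missed: reset
    print(streak.count)  # 1

The design choice at issue is the reset: because missing a single deadline erases accumulated “progress,” the mechanic converts a running count into a daily obligation.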

127. Snapchat has also developed images for users to decorate the pictures or videos they post; Lenses, which are augmented reality-based special effects and sounds for users to apply to pictures and videos they post on Snapchat; and World Lenses, which augment the environment around posts. Snapchat also has acquired publication rights to music, audio, and video content that its minor users can incorporate in the pictures and videos they post on Snapchat.

128. These images, Lenses, and licensed audio and video content supplied and created by Snapchat frequently make a material contribution to the creation or development of the minor user’s Snapchat posts. Indeed, in many cases, the only content in a user’s Snapchat post is images, Lenses, and licensed audio and video content.

129. Snap consciously disregards any duty to protect minors in Plaintiff’s community

and schools.

TikTok’s Purposefully Manipulative and Addictive Design

130. TikTok was launched in or around 2017 for iOS and Android in most markets

outside of mainland China and became available worldwide after merging with another Chinese

social media service, Musical.ly, on August 2, 2018.

131. TikTok is a video sharing social media application where users create, share, and view short video clips. TikTok hosts a variety of short-form user videos, in genres like pranks, stunts, tricks, jokes, dance, and entertainment, with durations from fifteen seconds to ten minutes.

132. Users who open the TikTok application are automatically shown an endless stream of videos selected by an algorithm developed by TikTok to shape the user experience on the “For You” page based upon the user’s demographics, likes, and prior activity on the app.

133. TikTok, just like the other Defendants, has designed its algorithms to addict users and cause them to spend as much time on the application as possible, including through advanced analytics that create a variable reward system tailored to users’ viewing habits and interests.


134. A leaked internal TikTok document titled “TikTok Algo 101”32 was created by

TikTok’s engineering team in Beijing and offers details about both the app’s mathematical core

and insight into the company’s understanding of human nature. The document explains that in the

pursuit of the company’s “ultimate goal” of adding daily active users, it has chosen to optimize for

two closely related metrics in the stream of videos it serves: “retention”— that is, whether a user

comes back— and “time spent.” The document offers a rough equation for how videos are scored,

in which a prediction driven by machine learning and actual user behavior are summed up for each

of three bits of data: likes, comments and playtime, as well as an indication that the video has

been played.
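
Consistent with the rough equation reproduced in the cited article, the scoring can be rendered for illustration approximately as follows, where each P term is a machine-learned prediction of a behavior, each V term the value assigned to that behavior, and E_playtime an estimate of watch duration; the production model itself has not been made public:

    score = P_like × V_like + P_comment × V_comment + E_playtime × V_playtime + P_play × V_play

Videos with the highest aggregate scores are the ones surfaced on a user’s “For You” page; every term in the sum rewards engagement.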

135. Moreover, an article by the New York Times explained how TikTok markets itself as an “artificial intelligence company.” “The most obvious clue is right there when you open the

app: the first thing you see isn’t a feed of your friends, but a page called ‘For You.’ It’s an

algorithmic feed based on videos you’ve interacted with, or even just watched. It never runs out of

material. It is not, unless you train it to be, full of people you know, or things you’ve explicitly told

it you want to see. It’s full of things that you seem to have demonstrated you want to watch, no matter

what you actually say you want to watch. Imagine a version of Facebook that was able to fill your

feed before you’d friended a single person. That’s TikTok.”33 Another article by the New York

Times confirms, “The FYP algorithm is TikTok’s secret sauce, and a big part of what makes it so

accurate is ByteDance’s global reach. Every swipe, tap and video viewed by TikTok users around

32 Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021), https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.
33 John Herrman, How TikTok is Rewriting the World, N.Y. Times (Mar. 10, 2019), https://www.nytimes.com/2019/03/10/style/what-is-tik-tok.html.


the world—billions and billions of data points a day—is fed into giant databases, which are then

used to train artificial intelligence to predict which videos will keep users’ attention.”34

136. TikTok also features and promotes various “challenges,” where users film themselves engaging in behavior that mimics and “one-ups” other users posting videos related to a particular challenge. TikTok promotes users creating and posting videos of challenges identified by a system of hashtags that are promoted within TikTok’s search feature.

137. TikTok’s app and algorithm have created an environment in which TikTok

“challenges” are widely promoted and result in maximum user engagement and participation, thus

financially benefiting TikTok. At the same time, TikTok “challenges” involve users filming

themselves engaging in behavior that is routinely dangerous or risky.

138. TikTok’s algorithm presents these often-dangerous “challenges” to users on their

FYP and encourages users to create, share, and participate in the “challenge.”

139. These are just some examples of how TikTok operates to generate profit, at the

expense of the health and well-being of its users, particularly its child and teen users.

140. Until mid-2021, TikTok also, by default, made all user profiles “public,” meaning that strangers, often adults, could view and message underage users of the TikTok app. This also meant that those strangers could then contact children directly.

141. Like Meta and Snapchat, TikTok also sends push notifications and emails to

encourage addictive behavior in minors and to increase minor use of TikTok. TikTok’s

communications are triggered and based upon information TikTok collects from and about its

users, and TikTok “pushes” these communications to teen users in excessive numbers and at

disruptive times of day. These notifications are specifically designed to, and do, prompt them to

34 Kevin Roose, Is TikTok a Good Buy? It Depends on What’s Included, N.Y. Times (Aug. 5, 2020), https://www.nytimes.com/2020/08/05/technology/tiktok-deal-algorithm.html.


open TikTok, increasing sessions, and resulting in greater profits to TikTok. Even the format of

these notifications has been designed to pull users back on to the social media platform—

irrespective of a user’s health or wellbeing.

142. Despite this, TikTok markets itself as a family-friendly social media application,

and markets to children and teens.

143. TikTok exclusively controls and operates the TikTok platform for profit and, like Instagram and Snapchat, generates advertising revenue by maximizing the amount of time users spend on its platform. Accordingly, while TikTok purports to have a minimum age requirement of 13 years old, it does little to verify user age or enforce its age limitations despite knowledge that underage use is widespread.

144. TikTok does not seek parental consent for underage users or provide warnings or

adequate controls that would allow parents to monitor and limit the use of TikTok by their children.

TikTok does not verify user age, enabling and encouraging teens and children to open TikTok

accounts, providing any age they want, without parental knowledge or consent.

145. Further, based on TikTok data leaked to the New York Times, internal TikTok documents show that the number of daily U.S. users in July of 2020 estimated by TikTok to be 14 or younger—a whopping 18 million—was almost as large as the number of over-14 users, which was around 20 million. The rest of TikTok’s U.S. users were classified as being “of unknown age.”35

146. On information and belief, a substantial percentage of TikTok’s U.S. users are age

13 or younger.

147. Like the other Defendants, TikTok has tried to boost engagement and keep young

users hooked to its social media platform by any means necessary.

35 Raymond Zhong & Sheera Frenkel, A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions, N.Y. Times (Aug. 14, 2020), https://www.nytimes.com/2020/08/14/technology/tiktok-underage-users-ftc.html.


148. TikTok has also developed memes and other images for users to apply to the images and videos they post on TikTok. TikTok also has acquired publication rights to music that its users can incorporate in the pictures and videos they post on TikTok. TikTok knows that it is harming teens, yet consistently opts to prioritize profit over the health and well-being of its youth and teen users—the millions of youth and teen users who continue to use its inherently dangerous and harmful social media platform every single day.

149. Indeed, as evidenced by ByteDance’s Chinese version of TikTok (Douyin),

protections are available.36

150. TikTok consciously disregards any duty to protect minors in Plaintiff’s community

and schools.

V. DEFENDANTS’ BUSINESS MODELS MAXIMIZE USER SCREEN TIME, FUELING ADDICTION

151. Defendants all have the same goal: to fuel addiction by maximizing user screen time, including usage by minors, in order to drive advertising revenues. Defendants receive revenue from advertisers who pay a premium to target advertisements that exploit specific user behaviors and demographic groups, including youth—one of the most profitable target audiences.

152. While Defendants advertise their platforms as “free” because they do not charge their

users, they generate massive revenues by finding unique and increasingly dangerous ways to capture

user attention and target advertisements to their minor users. In fact, Defendants’ revenue is

directly linked to the level of user engagement and the amount of time users spend on their

platforms, which directly correlates with the number of advertisements that can be shown to each

36 China: Children given daily time limit on Douyin – its version of TikTok, BBC (Sept. 20, 2021), https://www.bbc.com/news/technology-58625934.


minor user. For example, Defendants use changing rewards that are designed to prompt minor

users to engage with their social media platforms in excessive and dangerous ways.

153. Defendants know, or in the exercise of ordinary care should know, that their designs have created extreme and addictive usage by their minor users, and Defendants knowingly or purposefully designed their platforms to encourage such addictive behaviors. For example, all the achievements and trophies in Snapchat are unknown to users until they are unlocked. Snap has stated that “[y]ou don’t even know about the achievement until you unlock it.” This design is akin to a slot machine, but marketed toward minor users, who are even more susceptible than gambling addicts to the variable reward and reminder system designed by Snapchat. The system is designed to reward increasingly extreme behavior and usage by minors.

154. Similarly, Facebook and Instagram, like Snapchat and TikTok, are designed around

a series of features that seek to exploit minor users’ susceptibility to persuasive design and

unlimited accumulation of unpredictable and uncertain rewards, including “likes” and “followers.”

This design is unreasonably dangerous to the mental well-being of minors’ developing minds, and

has resulted in mental health disorders in youths in local communities, like the City of Providence,

and schools.

155. Defendants have reportedly employed thousands of psychologists and engineers to

help make their platforms maximally addictive. For example, Instagram’s “pull-to-refresh” is based

on how slot machines operate. It creates an endless feed experience, designed to manipulate brain

chemistry and prevent natural end points that would otherwise encourage minor users to move on

to other activities.37

37 Daniel Kruger, Ph.D., M.S., Social Media Copies Gambling Methods ‘to create psychological cravings’, University of Michigan Institute for Healthcare Policy & Innovation (May 8, 2018), https://ihpi.umich.edu/news/social-media-copies-gambling-methods-create-psychological-cravings.


156. Rather than warning users or parents of the addictive design of their social media

platforms, Defendants actively conceal the dangerous and addictive nature of their platforms,

lulling minor users and parents into a false sense of security. This includes consistently playing

down their platforms’ negative effects on children in public statements and advertising, making

false or materially misleading statements concerning safety, and refusing to make their research

public or available to academics and lawmakers who request it.

157. Defendants have repeatedly represented to the public and the government that their

platforms are safe and not addictive.

158. For example, YouTube represents that it enforces its “Community Guidelines using

a combination of human reviewers and machine learning,” and that its policies “aim to make

YouTube a safer community ….” TikTok represents in its community guidelines that its priority is

“safety, diversity, inclusion, and authenticity,” and Snap’s Terms of Service claim, “We try hard

to keep our Services a safe place for all users.”

159. Defendants know that their platforms are designed to be, and are, addictive, and

that millions of minor users are addicted and/or engaging in excessive and risky use, leading to

mental health issues. Defendants also know, or in the exercise of reasonable care should know,

that their social media platforms are unreasonably dangerous to the mental well-being of minor

users’ developing minds.

160. Yet Defendants continue to engineer their platforms to keep users, and particularly

minors, engaged longer and more frequently. This “engineered addiction” includes features like

bottomless scrolling, tagging, notifications, and live stories.

161. Defendants also exploit minor users’ susceptibility to persuasive design and

unlimited accumulation of unpredictable and uncertain rewards (like “likes,” “followers,” “views,”


“streaks,” and “trophies”). As recently detailed by the FTC, these design features are manipulative “dark patterns” that deceive minor users into staying on the application and steer excessive usage of social media platforms.38

162. Defendants purposefully engineer addiction by minors to exploit this advertising

demographic and reap significant profits in advertising revenue.

VI. DEFENDANTS HAVE CHOSEN DESIGNS THAT ADDICT MINORS TO THEIR PLATFORMS

163. Defendants have intentionally designed their social media platforms to maximize

minors’ screen time, using complex algorithms and other features designed to exploit human

psychology and driven by the most advanced computer algorithms and artificial intelligence

available.

164. In particular, Defendants willfully, recklessly, and intentionally designed their

social media platforms to exploit minors in order to extend and expand their users’ engagement

with their products. Defendants target youth to exploit the still-developing brains of children

through the use of artificial intelligence, machine learning, and complicated algorithms to promote

extreme utilization. Defendants calibrate and optimize these addiction methods on a continual

basis with one goal: to maximize revenues.

165. Defendants designed and have progressively modified their platforms to promote

problematic and excessive use that they know is indicative of addictive and self-destructive use by

minors. For example, Defendants use information “feeds” that deliver personalized content,

including photos, videos, and other promoted subject matter to promote maximum engagement by

their minor users in an endless cycle.

38 FTC Staff Report, Bringing Dark Patterns to Light, FTC Bureau of Consumer Protection (Sept. 2022), https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf.


166. YouTube, Facebook, Instagram, and TikTok all use complex algorithms to optimize

user interaction through these never-ending “feeds.” The endless cycle has been described by

psychologists as a “flow state” that distorts a user’s ability to perceive time.39 Former Google design

ethicist Tristan Harris has described this endless cycle as being intentionally designed to eliminate any

reason to pause or discontinue using the platform by replacing the traditional close-ended experience of

consuming media with an infinite one.40
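
A minimal sketch of the design being described, assuming a paginated recommendation service that always returns more items (all function and variable names below are hypothetical):

    # Sketch of an endless, algorithmically refilled feed. The point is that
    # fetch_next_page() never signals "done," so a session has no natural end.
    import itertools

    def fetch_next_page(page: int, page_size: int = 5) -> list[str]:
        # A real service would rank fresh recommendations; this stub
        # simply fabricates an inexhaustible supply.
        start = page * page_size
        return [f"recommended_post_{i}" for i in range(start, start + page_size)]

    def scroll_session(max_items: int) -> list[str]:
        """Simulate scrolling: the feed refills before the user reaches
        the bottom, removing any natural stopping cue."""
        seen: list[str] = []
        for page in itertools.count():
            seen.extend(fetch_next_page(page))
            if len(seen) >= max_items:  # only the simulation bounds the loop
                break
        return seen

    print(len(scroll_session(12)))  # 15 -- the feed itself never ends

Contrast a finite design, which would eventually return an empty page and give the user a reason to stop; the allegation is that Defendants deliberately chose the infinite variant.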

167. Defendants know that algorithm-controlled feeds promote unlimited “scrolling”—

a type of use that studies have identified as detrimental to users’ mental health. Defendants promote

this use because it allows them to display more advertisements and obtain more revenue from each

minor user.

168. Meta’s, YouTube’s, and TikTok’s algorithm-controlled features are designed to deliver the experiences most likely to increase user engagement, which often means experiences that Defendants know to be harmful to their minor users. This includes experiences that minors would otherwise never have but for Defendants’ sorting, prioritizing, and/or affirmative pushing of such experiences to minors’ accounts.

169. In the words of one high-level departing Meta employee:

39 Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022), https://doi.org/10.1186/s40359-022-00990-7.
40 Von Tristan Harris, The Slot Machine in Your Pocket, Spiegel International (July 27, 2016), https://www.spiegel.de/international/zeitgeist/smartphone-addiction-is-part-of-the-design-a-1104237.html.


“Why We Build Feeds” (Oct. 4, 2019), at 1.41

170. The addictive nature of Meta, YouTube, Snap, and TikTok’s platforms and the

complex and psychologically manipulative design of their algorithms are unknown to ordinary

consumers, particularly minors.

171. Instead of disclosing the addictive and harmful nature of their social media

platforms, Defendants go to significant lengths to prevent transparency by making public

statements about the safety of their platforms that are not true and posing as “free” platforms.

172. Meta, YouTube, and TikTok’s algorithms adapt to promote whatever user

experiences will trigger minor users’ engagement and maximize their screen time. Once a minor

user engages with abusive, harmful, or destructive experiences, Defendants’ algorithms will direct

the minor user to experiences that are progressively more abusive, harmful, and destructive to

maximize the user’s screen time.

173. For example, Defendants manipulate human psychology using the same techniques

deployed by casinos and slot machines, including by the use of intermittent variable rewards

(“IVR”), which work by tapping into the human reward pathway that regulates dopamine

41 https://www.documentcloud.org/documents/21600853-tier1_rank_exp_1019.


production. Through IVR, Defendants space out dopamine-triggering stimuli by staggering user rewards on their platforms. IVR causes users to engage with the platform again and again, not disengaging because there is an anticipatory “hit” of dopamine right around the corner. As a result, minor users anticipate the next “hit” of dopamine, creating the same craving effect experienced by gambling addicts.
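
For illustration, an intermittent variable reward schedule can be sketched in a few lines; the probability and reward names below are assumptions chosen to show the hit-or-miss pattern, not any Defendant’s actual parameters:

    # Sketch of an intermittent variable reward (IVR) schedule: each check
    # unpredictably yields a social reward or nothing. Values are illustrative.
    import random

    def check_notifications(p_reward: float = 0.3) -> str | None:
        """Each app open either surfaces a reward or comes up empty."""
        if random.random() < p_reward:
            return random.choice(["new like", "new follower", "friend snapped you"])
        return None  # nothing this time -- but maybe next time

    random.seed(7)
    for opening in range(1, 9):
        result = check_notifications()
        print(f"open #{opening}: {result or 'nothing'}")

As with a slot machine, it is the unpredictable alternation of hit and miss, not the rewards themselves, that sustains the anticipatory checking the complaint describes.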

174. Defendants such as Instagram intentionally delay the loading time of content with each scroll or refresh to mimic the same anticipatory dopamine reward that casinos use. Defendants also knowingly manipulate the reward pathways of users by giving a carefully engineered moment to build anticipation, just like the spinning of the reels or the shuffling of the cards in a casino. Whenever user content receives a “like” or a “heart,” Defendants’ platforms provide a reward to minors for their use, leading to additional use and addiction.
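
A sketch of the alleged loading-delay technique follows, with the delay length an assumed value for demonstration:

    # Sketch of an engineered "anticipation" pause: content is held back
    # briefly even when it is already available. The delay is illustrative.
    import time

    def pull_to_refresh(new_items: list[str],
                        suspense_seconds: float = 1.2) -> list[str]:
        """Hold the spinner before the reveal, even if loading is complete."""
        time.sleep(suspense_seconds)  # the engineered pause
        return new_items

    print(pull_to_refresh(["post_a", "post_b"]))

The delay serves no technical purpose; on the complaint’s theory, its only function is to reproduce the suspense of a slot machine’s spinning reels.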

175. Defendants deliberately and intentionally employ these schemes to encourage use

by teens and children, who are especially vulnerable due to their developing brains. Young brains

are especially susceptible to Defendants’ strategies and manipulation. Between the ages of 10 and

12, children undergo fundamental changes in their neurological reward pathways that promote

extra dopamine and oxytocin rewards for socially advantageous behavior such as admiration,

attention, and approval from others.42 Unlike adults with a more developed prefrontal cortex to

regulate emotions and who have a more developed sense of self, developing adolescents have a

limited capacity to resist emotional and social pressures and regulate impulses. As a result, they

seek admiration, attention, and approval from others with much more persistence than adults.

176. Defendants have knowingly, deliberately, and intentionally designed and managed

their social media platforms in a way that endangers minors. Although Defendants know that their

42 Zara Abrams, Why young brains are especially vulnerable to social media, Am. Psych. Ass’n (Aug. 25, 2022), https://www.apa.org/news/apa/2022/social-media-children-teens.


social media platforms harm minors, they continue to develop increasingly sophisticated

technologies and methods to ensure minors are engaged with their platforms more intensively and

for longer periods.

177. Defendants’ efforts have been successful, as minors increasingly spend time on

social media platforms. Not only are adolescents particularly vulnerable to Defendants’

psychological manipulations and resulting addiction, but they are also at much greater risk of

developing mental disorders as a result of their social media usage driven by Defendants.

VII. DEFENDANTS’ PLATFORMS CAUSE HARM TO MINORS

Defendants Encourage and Cause Increased Usage of Their Platforms by Minors

178. The addiction created by Defendants and the resulting compulsion to use social media have negative mental health consequences for minors. Defendants have driven this compulsion and have contributed to the increased mental health disorders experienced by minors, including anxiety, depression, eating disorders, and suicide.

179. Defendants were the cause of the current youth mental health crisis long before the

pandemic. But the crisis worsened during the pandemic, as school children spent more and more

time online and increased their exposure to Defendants’ platforms and manipulations. During

remote learning and as students returned to school, there was an increased incidence of students

paying attention to Defendants’ social media products while in class in lieu of being mentally

present in the classroom.

180. Prior to the pandemic, American adolescents increasingly used social media for

substantial portions of their day, increasing usage by 3% per year for tweens and 11% per year for


teens.43 Following 2019, social media usage increased at a 17% per year pace, with an average of

8 hours and 39 minutes of daily usage by teens in 2021.44 Reviews of global studies confirm drastic

increases in screen usage as social interactions shifted online.45

181. A comparison of Pew Research studies from 2014-2015 to studies in 2022 confirms

a stark increase in the proportion of time spent online by teens:46

43 The Common Sense Census: Media Use by Tweens and Teens, Common Sense (2021), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf.
44 Id.
45 A review of 46 research studies globally found that on average screen time for children and adolescents increased by 52% (84 minutes per day) during the pandemic. See Sheri Madigan et al., Assessment of Changes in Child and Adolescent Screen Time During the Covid-19 Pandemic, JAMA Pediatrics (Nov. 7, 2022), https://jamanetwork.com/journals/jamapediatrics/fullarticle/2798256.
46 Emily Vogels et al., supra note 11.


182. As part of that extreme growth in overall social media usage, social media

dependence increased, with 54% of teens saying it would be hard to give up social media.47

183. Confirming this explosion of social media usage by youths, a significant proportion

of teens report using Defendants’ social media platforms almost constantly.

184. The specific mix of platform use is based on current trends, with TikTok not even registering in 2018 surveys but ranking as the second-most-used platform in 2022.48

47 Id.

185. Notably, increased usage has led teens to report a decreasing enjoyment of social

media.49

Minors’ Brains Are Particularly Susceptible to Manipulation by Defendants

186. In February 2023, during the Senate Judiciary Committee’s hearing on social

media’s impact on child, teen, and adolescents’ mental health, Senator Richard Blumenthal stated

that America is in the midst of “a public health emergency egregiously and knowingly exacerbated

48 See Monica Anderson & Jingjing Jiang, Teens, Social Media and Technology 2018, Pew Research Center (May 31, 2018), https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/.
49 See Victoria Rideout et al., 2021 The Common Sense Census: Media Use by Tweens and Teens at Table D, Common Sense (Mar. 9, 2022), https://www.commonsensemedia.org/research/the-common-sense-census-media-use-by-tweens-and-teens-2021.


by Big Tech. Aggravated by toxic content on eating disorders, bullying, even suicide. Driven by

Big Tech’s black box algorithms, leading children down dark rabbit holes.”50

187. During adolescence, the brain is still developing, and this developmental period is associated with psychosocial immaturity. Numerous studies have concluded that excessive use of social media can have a detrimental effect on the mental well-being of youth and teens. Social media use, especially compulsive or excessive use, can result in various psychological disorders, including behavioral, developmental, emotional, and mental disorders, such as anxiety, depression, thoughts of suicide, and eating disorders.

188. This is exacerbated because teens’ brains are not yet fully developed in regions

related to risk evaluation, emotional regulation, and impulse control. MRI studies have shown that

the prefrontal cortex is one of the last regions of the brain to mature. The frontal lobes—and, in

particular, the prefrontal cortex—of the brain play an essential part in higher-order cognitive

functions, impulse control, and executive decision-making. These regions of the brain are central

to the process of planning and decision-making, including the evaluation of future consequences

and the weighing of risk and reward. They are also essential to the ability to control emotions and

inhibit impulses. During childhood and adolescence, the brain undergoes myelination, the process

through which the neural pathways connecting different parts of the brain become insulated with

white fatty tissue called myelin. The brain also undergoes “pruning”—the paring off of unused

synapses, leading to more efficient neural connections. Through myelination and pruning, the

brain’s frontal lobes change to help the brain work faster and more efficiently, improving

50 Press Release, Blumenthal Calls on Congress to Pass Kids Online Safety Legislation During Senate Judiciary Committee Hearing, Sen. Richard Blumenthal (Feb. 14, 2023), https://www.blumenthal.senate.gov/newsroom/press/release/blumenthal-calls-on-congress-to-pass-kids-online-safety-legislation-during-senate-judiciary-committee-hearing.


the “executive” functions of the frontal lobes, including impulse control and risk evaluation. This

shift in the brain’s composition continues throughout adolescence and into young adulthood.

189. For minors, important aspects of brain maturation remain incomplete, including

those associated with executive functions and emotion and cognition. These parts of the brain that

are critical for control of impulses, emotions, and mature decision-making are still developing in

teens. Defendants’ social media platforms are designed to exploit minors’ diminished decision-

making capacity, impulse control, emotional immaturity, and lack of psychological resiliency.

Studies have found that immature brains may not possess sufficient self-control to deal with the

overwhelming aspects of social media and may lead minors to engage in addictive usage patterns,

which Defendants encourage.51

190. Defendants know, or in the exercise of reasonable care should know, that because

their minor users’ frontal lobes are not fully developed, those users experience enhanced dopamine

responses to stimuli on Defendants’ social media platforms and are much more likely to become

addicted to Defendants’ platforms. Minors also exercise poor judgment in their social media

activity and act impulsively in response to negative social media encounters.

191. Defendants also know, or in the exercise of reasonable care should know, that minor

users of their social media platforms are much more likely to sustain serious physical and

psychological harm through their social media use than adult users. Nevertheless, Defendants

knowingly designed their social media platforms to be addictive to minors and failed to include in

their platform designs any safeguards to account for and ameliorate the psychosocial immaturity of

their minor users.

51 Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (Nov. 28, 2022), https://doi.org/10.1186/s40359-022-00990-7.


Defendants Have Caused a Significant Mental Health Toll on Minors

192. A 2018 study on the effect of screen time associated with the use of electronic devices showed that staring at electronic screens is not healthy for teens.52 The study concluded that: (i) “[a]fter 1 h[our]/day of use, more hours of daily screen time were associated with lower psychological well-being, including less curiosity, lower self-control, more distractibility, more difficulty making friends, less emotional stability, being more difficult to care for, and inability to finish tasks”; and (ii) among teens aged 14 to 17, high users of screens (7+ hours/day) were more than twice as likely as low users of screens (1 hour/day) to ever have been diagnosed with depression or anxiety, ever have been treated by a mental health professional, or to have taken medication for a psychological or behavioral issue in the last 12 months.53

193. A 2021 United States Surgeon General advisory report states that mental, emotional, developmental, and/or behavioral disorders are common among American children, with one in five children between the ages of 3 and 17 suffering from one or more of these disorders.54 Further, “[f]rom 2009 to 2019, the share of high school students who reported persistent feelings of sadness or hopelessness increased by 40%[,]” the share “seriously considering attempting suicide” increased by 36%, and the share creating a suicide plan increased by 44%.55

52 Jean M. Twenge & W. Keith Campbell, Associations between screen time and lower psychological well-being among children and adolescents: Evidence from a population-based study, 12 Preventive Med. Rep. 271-83 (Oct. 18, 2018), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6214874/.
53 Id.
54 U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic, U.S. Dep’t of Health & Human Services (Dec. 7, 2021), https://public3.pagefreezer.com/browse/HHS.gov/30-12-2021T15:27/https://www.hhs.gov/about/news/2021/12/07/us-surgeon-general-issues-advisory-on-youth-mental-health-crisis-further-exposed-by-covid-19-pandemic.html.
55 Id.


194. Cyberbullying and abusive behavior are also prevalent on Defendants’ social media platforms. A Pew Research study from 2018 determined that a majority of teenagers have experienced at least one of the following forms of abusive behavior online: (1) name-calling (42%); (2) spreading false rumors (32%); (3) receiving unsolicited explicit images (25%); (4) having their activities and whereabouts tracked by someone other than a parent (21%); (5) someone making physical threats (16%); and (6) having explicit images of them shared without their consent (7%). The survey found that 90% of teens believe online harassment is a problem for people their age, and 63% identify it as a “major problem.”56

195. Usage of Defendants’ social media platforms can also lead to disordered eating and

sleep disturbances. A 2019 NIH study found a correlation between a greater number of social

media accounts, as well as a greater amount of daily time spent on social media, specifically

Snapchat and Instagram, and a higher incidence of eating disorders among young girls.57 Another

study of young adults found “consistent, substantial, and progressive associations between SM

[social media] use and sleep disturbance,” which “has important clinical implications for the health

and well-being of young adults.”58

196. In fact, numerous studies show that adolescents are susceptible to mental health

problems and psychological disorders linked to social media usage, including depression and

56 Monica Anderson, A Majority of Teens Have Experienced Some Form of Cyberbullying, Pew Research Center (Sept. 27, 2018), https://www.pewresearch.org/internet/2018/09/27/a-majority-of-teens-have-experienced-some-form-of-cyberbullying/.
57 Simon M. Wilksch et al., The relationship between social media use and disordered eating in young adolescents, 53 Int’l J. Eating Disorders at 96-106 (Jan. 2020), https://pubmed.ncbi.nlm.nih.gov/31797420/.
58 Jessica C. Levenson et al., The Association between Social Media Use and Sleep Disturbance Among Young Adults, 85 Preventive Med. 36-41 (Apr. 2016), https://www.sciencedirect.com/science/article/abs/pii/S0091743516000025.


suicide.59 In addition, there is “evidence that technology-based social comparison and feedback-

seeking behaviors may be associated with depressive symptoms among adolescents.”60

197. This youth mental health crisis has been caused by Defendants’ willful, reckless,

intentional, and/or negligent conduct. Defendants designed and marketed their social media

platforms to addict youth so that they could profit. Defendants did this because they knew youth

are a particularly profitable target audience, and thus much of Defendants’ advertising efforts and

revenues are focused on and derived from minors.

Defendants’ Conduct Has Harmed the City of Providence and Its Schools

198. Defendants’ fueling of addiction and social media usage has led to a mental health

crisis that has placed severe burdens and financial strains on local communities, including the City

of Providence, and its schools.

199. Social media has drastically changed the high school and middle school experience

of students across the nation. As succinctly reported, “The youth mental health crisis is not getting

better, and schools are increasingly being pressed into service as first responders amid rising rates

of suicidal ideation, overdoses and gun violence.”61

200. Youth mental health services are provided through most local communities, like

the City of Providence, and public schools, like Plaintiff’s schools.

201. As one of the primary providers of these services, public schools have been

inundated by the increase in youth mental health issues, many associated with increased social

59 See, e.g., Jean M. Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psych. Sci. 3-17 (Nov. 14, 2017), https://doi.org/10.1177/2167702617723376.
60 Jacqueline Nesi & Mitchell J. Prinstein, Using Social Media for Social Comparison and Feedback-Seeking: Gender and Popularity Moderate Associations with Depressive Symptoms, 43 J. Abnormal Child Psych. 1427-38 (Nov. 2015), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5985443/.
61 Sabrina Moreno, The Funding Cliff for Student Mental Health, Axios (Feb. 2, 2023), https://www.axios.com/2023/02/02/funding-cliff-student-mental-health.


media use. In the 2021-22 school year, nearly all public schools reported providing mental health services to their students: 97% of public schools now provide mental health services in some form, with over two-thirds of schools reporting an increase in students seeking such services since 2020.62 Only 12% of schools strongly agreed they could effectively provide mental health services to students in need, and just over half believed they could effectively provide these services.

Among the factors listed by those that did not strongly agree, 61% stated they have “insufficient

mental health professional staff coverage to manage caseload[,]” 57% stated they have “inadequate

access to licensed mental health professionals[,]” and 48% stated they have “inadequate

funding.”63

202. Schools are struggling not only to provide students with mental health services but

also to deliver an adequate education because of the youth mental health crisis driven by

Defendants. Students in grades 6-12 identify depression, stress, and anxiety as the most prevalent

obstacles to learning.64 Most middle school and high school students also fail to get enough sleep

on school nights, which contributes to poor academic performance.65 These reported mental health

outcomes are the most common symptoms of excessive social media use.

203. Schools also report substantial increases in tardiness, skipping class, bullying,

fighting, threats, use of electronic devices during class, and other classroom disruptions.

62 Press Release, Roughly Half of Public Schools Report That They Can Effectively Provide Mental Health Services to All Students in Need, National Center for Education Statistics (May 31, 2022), https://nces.ed.gov/whatsnew/press_releases/05_31_2022_2.asp.
63 Results from the April 2022 School Pulse Panel, U.S. Dep’t of Educ., Institute of Education Sciences (Apr. 12, 2022), https://ies.ed.gov/schoolsurvey/spp/SPP_April_Infographic_Mental_Health_and_Well_Being.pdf. See also https://ies.ed.gov/schoolsurvey/spp/ (“2022 School Pulse Panel”).
64 Insights From the Student Experience, Part I: Emotional and Mental Health, at 2-3, YouthTruth (2022), https://youthtruthsurvey.org/wp-content/uploads/2022/10/YouthTruth_EMH_102622.pdf.
65 Anne G. Wheaton et al., Short Sleep Duration Among Middle School and High School Students-United States, 2015, 67(3) Morbidity & Mortality Wkly. Rpt. 85-90 (Jan. 26, 2018), https://www.cdc.gov/mmwr/volumes/67/wr/mm6703a1.htm.


204. Local communities and schools do not have sufficient funding or mental health staff

to address today’s youth mental health crisis.

205. Given this shortage and the extent of the youth mental health crisis, the number of

teens and adolescents waiting in emergency rooms for mental health treatment for suicide

nationwide has tripled from 2019 to 2021.66

66 Stephen Stock et al., Children languish in emergency rooms awaiting mental health care, CBS News (Feb. 27, 2023, 8:02 am), https://www.cbsnews.com/news/emergency-rooms-children-mental-health/.


206. Local communities and school districts have borne increased costs and expenses in

response to the youth mental health crisis fueled by Defendants, including costs associated with:

• hiring additional mental health staff (41% of public schools added staff to focus on student
mental health);67

• developing additional mental health resources (46% of public schools created or expanded
mental health programs for students, 27% added student classes on social, emotional, and
mental health and 25% offered guest speakers for students on mental health);68

• training teachers to help students with their mental health (56% of public schools offered
professional development to teachers on helping students with mental health);69

• increasing and hiring additional personnel for disciplinary services in response to increased
bullying and harassment over social media;

• addressing property damage caused by students acting out because of mental, social, and
emotional problems;

• diverting time and resources from educational instruction to notify parents and guardians
of students’ behavioral issues and attendance;

• investigating and responding to threats made against schools and students over social
media; and

• updating student handbooks and policies to address use of Defendants’ platforms.

207. The City of Providence has been directly harmed by the mental health crisis among

youth in its community and schools, which has been caused by Defendants.

208. Plaintiff’s community and schools have been directly impacted by the mental health

crisis among youth and have been tasked with addressing the surge in mental, emotional, and social

issues among this population. Plaintiff’s young residents and students confront mental health

issues now more than ever before due to their excessive social media usage.

67 Nirmita Panchal et al., The Landscape of School-Based Mental Health Services, Kaiser Family Foundation (Sept. 6, 2022), https://www.kff.org/other/issue-brief/the-landscape-of-school-based-mental-health-services/.
68 Id.
69 Id.


209. Plaintiff has borne the cost of the increased need for youth mental health services.

210. To address the decline in minors’ mental, emotional, and social health and resulting

misconduct, Plaintiff has been forced to expend resources that would otherwise be used to deliver

education and other community services.

211. Plaintiff requires funding to address the mental health crisis and nuisance

Defendants have created and continue to exacerbate with their design of, and engineered addiction

to, their social media platforms.

VIII. THE COMMUNICATIONS DECENCY ACT ALLOWS COMPUTER SERVICE COMPANIES TO LIMIT HARMFUL CONTENT

212. As indicated by its title, the express purpose of the Communications Decency Act,

47 U.S.C. §230(c), is “[p]rotection [of] ‘Good Samaritan’ blocking and screening of offensive

material.”

213. The Act states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. §230(c)(1).

214. The Act also asserts that providers or users may not be held liable for actions taken

“to restrict access to or availability of material” or to provide others with the means to “restrict

access” to material “that the provider or user considers to be obscene, lewd, lascivious, filthy,

excessively violent, harassing, or otherwise objectionable, whether or not such material is

constitutionally protected.” 47 U.S.C. §230(c)(2)(A).

215. These provisions of the Act are intended to protect those attempting to restrict the deluge of harmful content on internet platforms and cannot possibly serve to shield Defendants from purposefully designing their social media platforms in ways that deliberately harm adolescents.


216. Plaintiff expressly disavows any claim or allegation that attempts to hold

Defendants liable as the publisher or speaker of any information provided by third parties within

the plain meaning of the Communications Decency Act and as interpreted by applicable law.

217. Plaintiff’s claims stem from Defendants’ conduct as the designers and marketers of

social media platforms that have harmed children and created an undeniable mental health crisis

among America’s youth.

218. The nature of Defendants’ businesses centers around the use of design choices that

encourage users to spend excessive amounts of time on their platforms without regard to the

devastating consequences that this has on the mental health and well-being of America’s youth.

Plaintiff’s claims are predicated on this conduct.

IX. CLAIMS FOR RELIEF

COUNT ONE - PUBLIC NUISANCE

219. Plaintiff repeats, reasserts, and incorporates the allegations contained above as if

fully set forth herein.

220. A public nuisance is an unreasonable interference with a right common to the

general public.

221. In addition, a public nuisance is behavior that unreasonably interferes with the

health, safety, peace, comfort, or convenience of the general community.

222. Plaintiff and the minors in its community and students in its schools have a right to be free from conduct that endangers their health, comfort, and safety. Yet Defendants have engaged in conduct that endangers or injures the health, comfort, and safety of minors in Plaintiff’s community and the students in Plaintiff’s schools through design and operational choices that have maximized profit over the well-being of their minor users, as well as by explicitly targeting tween- and teenage children for monetary purposes.


223. Each Defendant has created or assisted in the creation of a condition that is injurious

to the health and safety of Plaintiff and minors in its community and schools and interferes with

the comfortable enjoyment of life and property of entire communities and/or neighborhoods.

224. Defendants’ conduct has caused increased mental health issues and a severe

disruption of the public peace, order, and safety, including disruption of schools, classes, the

learning environment, and the community at large. Defendants’ conduct is ongoing and continues

to produce permanent and long-lasting damage.

225. The health, comfort, and safety of the minors in Plaintiff’s community and students

in Plaintiff’s schools, including those who use, have used, or will use Defendants’ social media

platforms, as well as those affected by users of these social media platforms, are matters of

substantial public interest and of legitimate concern to Plaintiff.

226. Defendants’ conduct has impacted and continues to impact Plaintiff and is likely to

continue causing significant harm to Plaintiff and minors in Plaintiff’s community and schools.

227. But for Defendants’ actions, there is no doubt that mental health issues in minors would not be as widespread as they are today, and the massive epidemic of youth and teen depression, anxiety, and self-harm that currently exists would have been averted.

228. Logic, common sense, justice, policy, and precedent indicate that Defendants’ conduct has caused the damage and harm complained of herein. Defendants knew or reasonably should have known that their behavior regarding the risks and benefits of the use of Defendants’ social media platforms was causing harm. Thus, the public nuisance caused by Defendants to Plaintiff was reasonably foreseeable, including the financial and economic losses incurred by Plaintiff.


229. Defendants’ actions were, at the very least, a substantial factor in the youth mental

health epidemic and in the public health crisis in communities and schools, including in the City

of Providence.

230. Defendants’ conduct in creating and maintaining the public nuisance was neither fully regulated nor required by any federal or Rhode Island law.

231. The public nuisance alleged herein can be abated, and further recurrence of such harm and inconvenience can be prevented.

232. Plaintiff has been, and continues to be, directly and proximately injured by

Defendants’ actions in creating a public nuisance.

233. Plaintiff suffered special injuries distinguishable from those suffered by the general

public.

234. Defendants’ conduct was accompanied by wanton, willful, and reckless disregard

of persons who foreseeably might be harmed by their acts and omissions.

COUNT TWO - NEGLIGENCE

235. Plaintiff repeats, reasserts, and incorporates the allegations contained above as if

fully set forth herein.

236. Each Defendant failed to act as a reasonably prudent person would under circumstances in which it was offering social media platforms to children.

237. Each Defendant owed a duty of care to minors and to Plaintiff, including because

each Defendant knew or foreseeably should have known that its conduct in designing, setting up,

promoting, managing, and operating its social media platforms would inflict severe mental harms

on minors, including school children, throughout the country, which Plaintiff would have to

address.


238. Each Defendant also owed a duty to advertise its social media platforms in a truthful and non-misleading way and to monitor and report suspicious behavior or harmful activities.

239. Defendants further had a duty to provide accurate, true, and correct information about the risks of minors using Defendants’ social media platforms, and appropriate, complete, and accurate warnings about the potential adverse effects of extended social media use, in particular of the social media content Defendants directed to minor users via their algorithms.

240. Each Defendant has breached, and continues to breach, its duties of care owed to

minors and to Plaintiff through its affirmative malfeasance, actions, business decisions, and

policies in the development, setup, management, maintenance, operation, marketing, advertising,

promotion, supervision, and control of its respective platforms.

241. As alleged above, each Defendant knew or, in the exercise of reasonable care, should have known of the hazards and dangers of Defendants’ platforms, specifically the addictive, compulsive, and repetitive use of Defendants’ platforms, which foreseeably can lead to a cascade of negative effects, including but not limited to dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risk behavior, exposure to predators, sexual exploitation, suicidal ideation, and profound mental health issues for minors, including but not limited to depression, body dysmorphia, anxiety, self-harm, insomnia, eating disorders, death, and other harmful effects.

242. Each Defendant also knew or, in the exercise of reasonable care, should have known that parents and minor users of Defendants’ social media platforms were unaware of the risks, and the magnitude of the risks, associated with the use of the platforms, including but not limited to the risks of extended social media use and the likelihood that algorithm-based recommendations would expose adolescent users to content that is violent or sexual or that encourages self-harm, among other things.

243. Each Defendant further knew or, in the exercise of reasonable care, should have known that its conduct violated the duty of care owed to minors and to Plaintiff, including the duty to provide true and correct information concerning the risks of using Defendants’ platforms and appropriate, complete, and accurate warnings concerning the potential adverse effects of using the social media platforms.

244. Each Defendant also knew or, in the exercise of reasonable care, should have known that its conduct could be remedied and abated.

245. Defendants, by action and inaction, representation, and omission, breached their

duties of reasonable care, failed to exercise ordinary care, and failed to act as reasonably careful

persons and/or companies would act under the circumstances in the design, research, development,

testing, marketing, supply, promotion, advertisement, operation, and distribution of their social

media platforms, in that Defendants designed, researched, developed, tested, marketed, supplied,

promoted, advertised, operated, and distributed social media platforms that Defendants knew or

had reason to know would negatively impact the mental health of minor users, and failed to prevent

or adequately warn of these risks and injuries.

246. As a direct and proximate result of each Defendant’s unreasonable and negligent conduct, Plaintiff has suffered and will continue to suffer harm, and is entitled to damages in an amount to be determined at trial.

247. Defendants made conscious decisions not to warn or inform the public, including

Plaintiff and minors in Plaintiff’s community and schools, even as the evidence mounted of the

severe harms Defendants’ platforms were inflicting on the nation’s children.


X. REQUEST FOR RELIEF

WHEREFORE, Plaintiff demands judgment as follows:

A. That the acts alleged above be adjudged and decreed to have created, or been a substantial factor in creating, a public nuisance;

C. A judgment against Defendants for the compensatory and punitive damages

sustained by Plaintiff, and for any additional damages, penalties, and other monetary relief

provided by applicable law;

D. An order providing injunctive and other equitable relief as necessary to protect the

interests of Plaintiff and minors in its community and schools;

E. An award of prejudgment and post-judgment interest as provided by law, with such interest awarded at the highest legal rate from and after service of this Complaint;

F. The costs of this suit, including reasonable attorneys’ fees; and

G. Such other and further relief as the Court deems just and proper.

XI. JURY TRIAL DEMANDED

Plaintiff requests a jury trial, under Federal Rule of Civil Procedure 38, on all claims so

triable.

DATED: April 28, 2023 Respectfully submitted,

/s/ Jeffrey B. Pine


Jeffrey B. Pine (R.I. Bar No. 2278)
LYNCH AND PINE
One Park Row, 5th Floor
Providence, RI 02903
(401) 274-3306
[email protected]

and


/s/ Patrick C. Lynch


Patrick C. Lynch (R.I. Bar No. 4867)
LYNCH AND PINE
One Park Row, 5th Floor
Providence, RI 02903
(401) 274-3306
[email protected]

Joseph H. Meltzer
Melissa L. Troutner
Tyler S. Graden
Jordan E. Jacobson
KESSLER TOPAZ
MELTZER & CHECK, LLP
280 King of Prussia Road
Radnor, PA 19087
Telephone: (610) 667-7706
[email protected]
[email protected]
[email protected]
[email protected]

