Final Rough Draft - IRR - Social Media
In February 2024, New York City Mayor Eric Adams announced a lawsuit against several social media companies for their platforms' harmful effects on youth. "Our city is built on innovation and technology, but many social media platforms end up endangering our children's mental health, promoting addiction, and encouraging unsafe behavior" (1). The lawsuit aims to hold the companies behind these platforms accountable for their algorithms and for the impact those algorithms have on young users' health.
As social media platforms grow, so do mental and physical health problems among younger generations. Over the last decade, apps like Instagram have become a primary source of insecurity bred by unrealistic standards (Chacón & Langlais). Adolescents, especially preteens and young teenagers, are exposed to heavily photoshopped pictures and videos, which takes a toll on their mental health and pushes them to harm themselves in pursuit of whatever social standards are prevalent at the time. Both men and women cling to these standards through social comparison, which causes a multitude of mental and physical issues (Halliwell & Harvey). Young, easily influenced people develop insecurities that they attempt to eliminate by posting revealing photos, adopting unhealthy habits such as eating disorders or overworking, and commenting on or insulting their peers online. Among the adverse effects of social media use, some of the most documented relate to adolescents' risky behavior, cyberbullying, and eating disorders (Reid & Weigle). In addition to developing self-destructive habits, young people project their fears onto strangers online, which allows the cycle of abuse to spread and continue. While cyberbullying over body image is a problem for both genders, studies show that it is most common among women: simple engagement with an attractive peer on social media, such as viewing image content and commenting on photos, may subsequently increase body dissatisfaction among adult women (Chacón & Langlais). Due to this cyberbullying and heightened insecurity, rates of suicide and self-harm, particularly among women, have risen steadily. For teenage girls, rates of hospitalization for self-harm have climbed since 2010 in all eleven countries with available data, by an average of 143% (The Economist 2023). Researchers predict that, without change, these rates will continue to grow.
With the prevalence of editing in social media comes a desire among users to change their real appearance to match the filters on these apps. This is known as 'Snapchat dysmorphia,' in which people seek plastic surgery to make their features resemble in-app filters (6). Facial adjustment algorithms can offer a way to improve people's satisfaction with their photos; however, they may also normalize certain facial features and proportions (6). Algorithms that make subtle adjustments may be more socially acceptable than those that create dramatic differences between photo and reality (6). This affects women aged 18-35 the most, as they represent one of the largest, most engaged social media groups in the world (7). Facebook, a very popular social media platform, has been shown to be used to promote health and dietary behaviors among women over 18 (7). In 2020, some social media apps were known to spread misinformation about COVID-19 vaccines and nutrition. This happened because users can use hashtags to extend their content's reach beyond their existing followers, allowing it to be categorized and discovered through in-app searches (7). The problem is that this creates a never-ending stream of personalized content, much of which the user has no desire to see. The consequence is that any young person can be exposed to overwhelming, upsetting content that they find hard to break away from (8). To demonstrate this, a research center created a TikTok account posing as a 13-year-old and found that pro-suicide content was recommended within 2.6 minutes of opening the app, and pro-eating-disorder content within 8 minutes. This is extremely dangerous, as it can lead children to adopt these habits and harm themselves.
During the COVID-19 pandemic in 2020, the short-form video app TikTok gained popularity among young teens and adults. The interconnection of communities across the globe proved therapeutic for its teenage users, positively impacting their mental health in a time of severe isolation (9). Companies like TikTok use their algorithms to recommend videos based on a user's interests, liked videos, searches, and viewing history. Many teenage users even rely on TikTok as their main source of news (9). But when do these algorithms cross the line? Supporting the research of Reid and Weigle, one study concluded that, "Social media algorithms that push extreme content to vulnerable youth are linked to an increase in mental health problems for adolescents, including poor body image, eating disorders, and suicidality" (10). Children's For You pages are being flooded with content that promotes underweight, extremely thin models (10), leading to mental health concerns. With the overconsumption of media on smartphones, users' social, academic, and psychological well-being is negatively affected (11). Social media users can be blinded by the content they see, believing that others are living happier and healthier lives, which in turn leads to an increase in depression (11). One research study examined the correlation between nomophobia, or 'no mobile phone phobia' (12), and depression levels. Of the participants, 71.7% were women and 28.3% were men, and 31.5% of them reported moderate depression (12). Among those with severe nomophobia, 91.7% reported using their devices to check social media (12). Social media users, mainly young teens and adults, are shaping their beliefs about what their lives should look like, be like, and feel like based on what is posted on social media, even though most of the time this media is edited.
For example, a girl named Alexis Spence created an account on Instagram, one of Meta's platforms. She created the account without her parents' knowledge at the age of eleven, and just three years later she was hospitalized for the eating disorder anorexia nervosa, as well as for depression and anxiety, according to an article by Nancy Costello and her co-authors (13). Alexis and her parents claimed that her Instagram accounts, and the content fed to her by the company's algorithms, were to blame. The documented case study of Spence v. Meta states on page three, "In Meta's own words, it created a 'perfect storm' of addiction, social comparison, and exposure to incredibly harmful content and product features, then operated its algorithms to push and promote harmful content via Alexis' Feed, Explore, Stories and Reels feature" (14). The effect these platforms have on youth is not something done by mistake. Despite this, however, accountability has yet to be taken. Furthermore, it is incredibly difficult to bring disputes against these companies, as they are shielded by legal protections that challengers risk violating were they to act. The First Amendment is a main contributor to this issue: because it guarantees freedom of speech, that protection could be violated should accountability be disputed with these companies (14). Beyond being protected under the First Amendment, these companies have gained the power to outright refuse accountability. "Just last month, newly released internal documents revealed that Meta executives refused to take action after learning that their algorithms connected children with potential child predators [and that children] were receiving sexually abusive content from adults on their platforms – each day" (15). As social media companies and CEOs continue to shirk their responsibilities, others, such as Senator Marsha Blackburn and Mayor Adams, are working to find ways to stop these harms from reaching tweens, teenagers, and even adults.
Conclusion:
In June 2024, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act was passed by the New York State Senate and Assembly (13). As it stands, the bill awaits the Governor's approval. Until laws can be passed to legally hold social media companies accountable, more must be done as a society to reach a better solution. "This article advocates for state legislation requiring social media companies to conduct periodic algorithm risk audits that measure the incidence of harm inflicted on young users. Such risk audits should be conducted by independent third parties, and results should be publicly disclosed" (13), one author writes, presenting a possible solution for making social media a safer place for communities to gather and for content to be shared, building on the federal Children's Online Privacy Protection Act. Only time will tell what the ever-changing landscape of social media will look like in the future; for now, social media will continue to influence its users' health and character.
(1) The Official Website of New York City. (2024, February 14). Mayor Adams announces lawsuit against social media companies fueling nationwide youth mental health crisis. https://www.nyc.gov/office-of-the-mayor/news/125-24/mayor-adams-lawsuit-against-social-media-companies-fueling-nationwide-youth-mental-health#/0
(6) Wang, X., & Guo, Y. (2023). Motivations on TikTok addiction: The moderating role of algorithm awareness on young people. El Profesional de La Información, 32(4), 1–10. https://doi.org/10.3145/epi.2023.jul.11
(7) Costello, N., Sutton, R., Jones, M., Almassian, M., Raffoul, A., Ojumu, O., Salvia, M., Santoso, M., Kavanaugh, J. R., & Austin, S. B. (2023). Algorithms, addiction, and adolescent mental health: An interdisciplinary study to inform state-level policy action to protect youth from the dangers of social media. American Journal of Law & Medicine, 49(2/3), 135–172. https://doi.org/10.1017/amj.2023.25
(9) Bhattacharya, S., Bashar, M. A., Srivastava, A., & Singh, A. (2019). NOMOPHOBIA: NO MObile PHone PhoBIA. Journal of Family Medicine and Primary Care, 8(4), 1297–1300. https://doi.org/10.4103/jfmpc.jfmpc_71_19
(10) Fried, O., Jacobs, J., Finkelstein, A., & Agrawala, M. (2020). Editing self-image. Communications of the ACM, 63(3), 70–79. https://doi.org/10.1145/3326601
(13) The New York State Senate. (2024, June 6). First-in-nation legislation limiting social media algorithmic reach passes Senate. https://www.nysenate.gov/newsroom/press-releases/2024/first-nation-legislation-limiting-social-media-algorithmic-reach#:~:text=The%20Stop%20Addictive%20Feeds%20Exploitation,unless%20they%20receive%20parental%20consent.