SaferInternet Special Report
Supporting Safer
Digital Spaces
Suzie Dunn, Tracy Vaillancourt
and Heather Brittain
About CIGI
The Centre for International Governance Innovation (CIGI) is an independent, non-
partisan think tank whose peer-reviewed research and trusted analysis influence
policy makers to innovate. Our global network of multidisciplinary researchers and
strategic partnerships provide policy solutions for the digital era with one goal: to
improve people’s lives everywhere. Headquartered in Waterloo, Canada, CIGI
has received support from the Government of Canada, the Government of Ontario
and founder Jim Balsillie.
Credits
Managing Director and General Counsel Aaron Shull
Director, Program Management Dianna English
Manager, Government Affairs and Partnerships Liliana Araujo
Research Associate Kailee Hilt
Senior Publications Editor Jennifer Goyder
Publications Editor Susan Bubak
Graphic Designer Abhilasha Dewan
The opinions expressed in this publication are those of the authors and do not
necessarily reflect the views of the Centre for International Governance Innovation
or its Board of Directors.
This work was carried out with the aid of a grant from the International
Development Research Centre, Ottawa, Canada.
The views expressed herein do not necessarily represent those of IDRC or its Board
of Governors.
Contents

Introduction
Methodology
Works Cited
Appendix
About the Project
Supporting a Safer Internet is a multi-year research project conducted in partnership with the International Development Research
Centre (IDRC). This project explores the prevalence and
impacts of technology-facilitated gender-based violence
(TFGBV) experienced by women, transgender, gender
non-conforming and gender-diverse people, as well as
technology-facilitated violence (TFV) against LGBTQ+
individuals.
As part of the project, an international survey was conducted by Ipsos on behalf of
the Centre for International Governance Innovation (CIGI). The survey results provide
valuable insight on people’s experiences with online harms in 18 different countries,
with a specific focus on the Global South. From cyberstalking, impersonation and the
non-consensual distribution of intimate images, to organized networked harassment,
TFGBV causes serious harm and silences the voices of women, gender-diverse people
and LGBTQ+ individuals in digital spaces. Fear of TFGBV leads to digital exclusion and
propagates systemic inequalities. To address these emerging challenges, the survey results,
papers and the Supporting Safer Digital Spaces report from this project aim to inform policy
recommendations and navigate shared governance issues that are integral to designing
responses to TFGBV — whether that be through the regulation of online social media
platforms, educational programming or legal recourse.
This project was assisted by an expert advisory committee made up of Chenai Chair (Mozilla Foundation), Jan Moolman (Association for Progressive Communications), Anja Kovacs (independent researcher and consultant, previously at Internet Democracy Project), María Paz Canales (Global Partners Digital, previously at Derechos Digitales) and Ruhiya Seward (IDRC).
In addition to this report and an annotated bibliography, the following papers have been
published as part of this project:
• Michelle Bordachar, Nonhlanhla Chanza, Kailee Hilt, J. Carlos Lara, Emma Monteiro
and Grace Mutung’u, Non-Consensual Intimate Image Distribution: The Legal Landscape in
Kenya, Chile and South Africa, Supporting a Safer Internet Paper No. 2 (2021)
Publications, multimedia, country data reports and opinions related to the project can be
found on CIGI’s website: www.cigionline.org/activities/supporting-safer-internet/.
Foreword
Various forms of digital technology are being used to inflict significant
harms online. This is a pervasive issue in online interactions, in
particular with regard to technology-facilitated gender-based violence
(TFGBV) and technology-facilitated violence (TFV) against LGBTQ+
people. This modern form of violence perpetuates gender inequality
and discrimination against LGBTQ+ people and has significant impacts
on its targets, including silencing women’s and LGBTQ+ persons’
voices online.
On the occasion of International Women’s Day (March 8) in 2020, the Centre for
International Governance Innovation (CIGI) announced that it had received a grant from
the International Development Research Centre (IDRC) for a project titled Supporting a
Safer Internet. The project was officially launched with the Honourable Karina Gould, then
Canada’s minister of international development, who said: “This is an important project to
arm governments, NGOs and private sectors including social media entities with this data
to design effective responses for a safer online world.”
The project achieved three objectives: conducting an international survey on online harms,
convening experts and scholars to prioritize policy-related inputs, and producing analysis
of TFGBV in the Global South. The survey was conducted in 18 countries and key findings
include the international prevalence of online harms, as well as the impacts on the mental
health, safety and freedom of expression of women and LGBTQ+ people.
The project has published several research papers and opinion pieces on TFGBV and its
impact on women and LGBTQ+ individuals globally. The papers discuss the various forms
of online harm and their harmful effects on victims/survivors.
CIGI has also hosted virtual events and discussions to explore the issue and propose
solutions. This CIGI special report summarizes the quantitative data collected on people’s
experiences with and opinions of online harms, with a particular focus on the ways that
a person’s gender identity, gender expression and/or sexual orientation impact their
experiences with online harms, and provides policy recommendations for governments,
technology companies, academics, researchers and civil society organizations.
The data for each country is presented with highlights in an individual “data report,”
organized by survey question to make it easy to identify the key findings. The SPSS file of
data is also being made publicly available to inspire and inform current and future research.
The aim is to promote transparency and accountability in research and contribute to a safer
and more equitable online environment for vulnerable and marginalized populations.
I would like to express gratitude to everyone who contributed to the development of this
special report. First and foremost, the project owes a tremendous debt of gratitude to Suzie
Dunn, the primary author of this report. Her extensive knowledge and expertise on TFGBV
have been invaluable in helping guide this project and in authoring this comprehensive
report.
Liliana Araujo showed exceptional ability in managing the project and collaborating with
internal and external stakeholders. Her leadership and guidance have been critical in
bringing this report to fruition.
I extend CIGI’s appreciation to the members of the project steering committee, María
Paz Canales, Chenai Chair, Anja Kovacs and Jan Moolman, for their valuable insights and
contributions to this report, and to Tracy Vaillancourt and Heather Brittain for their double-
hatted role in authoring the report with Suzie and for their advice and counsel on research
and statistical methodology along the way. The steering committee’s expertise in the fields
of human rights, technology and gender-based violence online was instrumental in shaping
the report’s recommendations and conclusions.
Additional external experts provided insightful commentary on sections of earlier drafts of
this report, including members of the Association for Progressive Communications: Nicola
Henry, Tigist Hussen, Rosel Kim and Molly Reynolds.
The Ipsos team, Sean Simpson and Sanyam Sethi, contributed to the design, execution,
analysis and delivery of the quantitative research exercises. Their expertise and dedication
have been invaluable.
I am also grateful to the rest of the CIGI team, Anne Blayney, Susan Bubak, Sara Daas,
Michael Den Tandt, Abhilasha Dewan, Dianna English, Niyosha Freydooni, Jennifer Goyder,
Andrea Harding, Trevor Hunsberger, Tim Lewis, Rebecca MacIntyre, Rohinton P. Medhora,
Emma Monteiro, Kate Pearce, Paul Samson, Lynn Schellenberg, Spencer Tripp, Som Tsoi,
Claire van Nierop, Yang Wang and John Xu, for their support and collaboration throughout
the project. A special thank you goes to Kailee Hilt, who played an essential role in this
project by writing the annotated bibliography and by being a key figure in the creation of
each of the country data reports.
Our colleagues at Global Affairs Canada have been instrumental in providing advice and
support throughout this process.
This report would not have been possible without the contributions of all these individuals.
I am deeply appreciative of their hard work and dedication.
TFGBV is a complex issue that requires a multifaceted approach that involves governments,
civil society organizations and technology companies. Social media companies need to
be more responsive to the needs of people experiencing violence and provide meaningful
support for those abused on their platforms. Front-line anti-violence organizations require
increased resources and support to provide adequate intervention strategies, while
governments should ensure that there are practical and accessible avenues for those
targeted by TFGBV to get the support they need and to hold perpetrators accountable for
their actions.
Finally, I would like to express my sincere gratitude to IDRC for its partnership and support
throughout this project. I am particularly grateful to Ruhiya Seward, whose idea sparked
the project and who has been a vital driving force behind its success. Her insights, guidance
and dedication have been critical to the project’s progress and impact. I am honoured to
have worked with such a committed and inspiring partner and look forward to continuing
our collaboration with IDRC on future initiatives.
I hope that this report will contribute to the ongoing efforts to address TFGBV and make the
internet a safer space for all.
Aaron Shull
Managing Director and General Counsel, CIGI
Expert Advisory Committee
This project would not be possible without the decades of work done by feminist, LGBTQ+ advocates, digital rights groups and anti-violence organizations that brought this issue to light. Civil society organizations are often at the forefront of intersectional feminist policy making and public education on gender-based violence and violence against LGBTQ+ people. It can take years of dedicated work by these organizations before governments react to them, social norms begin to change, and laws and resources are developed. This remains true with TFGBV and TFV against LGBTQ+ people, and the authors are indebted to that history and the work done before this time.

Although this report will not be able to acknowledge the wide array of organizations and researchers that have contributed to the history of TFGBV research and advocacy, this section will highlight the work of the project’s expert advisory committee and organizations they have been a part of in addressing TFGBV and TFV against LGBTQ+ people. Special thanks must be given to the members of the expert advisory committee — their expertise was foundational to the development of the survey and is relied on extensively in this report.

Association for Progressive Communications (APC)

…Technology? Exploring the Connections between Information Communication Technologies (ICT) and Violence Against Women (VAW).”6 Jan Moolman, who worked for APC’s Women’s Rights Programme as a senior project coordinator on online gender-based violence (OGBV) and currently represents APC on the steering committee of the advisory group of the Global Partnership for Action on Gender-Based Online Harassment and Abuse, attended her first Internet Governance Forum in 2009, where she noted the lack of global and feminist perspectives on internet governance issues.7 APC was committed to bringing these perspectives to the field of information technology. In 2011, APC published the paper “Voices from Digital Spaces: Technology Related Violence against Women,” which outlined this issue and the role of various stakeholders in responding to it.8 Since that time, APC has published numerous studies, reports and articles addressing TFGBV worldwide.9 There is now a robust collection of researchers, policy makers, advocates and educators working on this topic around the world.

Take Back the Tech

Take Back the Tech was established by APC in 2006.10 It shares information on how people can use technology to end gender-based violence,
…Streets? Ours! Witness Silencing. Occupy. Create,” which focused on the history of women and technology and denouncing TFGBV.13 Its work brings much-needed education to the world about TFGBV.

Feminist Principles of the Internet

In 2014, APC organized a Global Meeting on Gender Sexuality and the Internet in Malaysia to discuss the rights of women, gender-diverse people and LGBTQ+ people on the internet. Fifty participants
…Gender-Based Violence Online in Bulgaria (BlueLink Foundation); Anti-rights Discourse in Brazilian Social Media: Digital Networks, Violence and Sex Politics (Latin American Center on Sexuality and Human Rights); and Alternate Realities, Alternate Internets: African Feminist Research for a Feminist Internet (Pollicy).

Derechos Digitales

Derechos Digitales is a Latin American digital rights organization that was founded in 2005.29 It promotes human rights in the digital sphere, specifically focusing on freedom of expression, privacy and data collection, and copyright and access to information. It provides legal and technical research, education and policy advocacy on digital rights. Derechos Digitales approaches gender as a cross-cutting theme in its work. Its team is gender balanced and they have policies in place to secure diversity and gender approach in their research and staff work.

Derechos Digitales is very concerned with the reproduction of offline exclusions and inequalities in the digital world and has been working to advance opportunities for historically marginalized groups to enjoy their rights online. In 2017, Derechos Digitales produced their landmark report Latin America in a Glimpse, which focused on mapping and providing visibility to gender information and communication technology initiatives in the region.30 This work helped the organization communicate the challenges that women who actively participate in the internet ecosystem in Latin America confront daily as a continuum of the violence they suffer in the physical space in their activism.

…security from a feminist perspective.31 It has also concentrated efforts in supporting local organizations in responding to gender gaps and online gender violence in the region through its Rapid Response Fund for the Protection of Digital Rights in Latin America.32

Derechos Digitales’s work has been focused on demonstrating the ways that the data, algorithms and protocols that the internet is built over are not neutral in terms of gender. It has published about the gendered impacts of the deployment of identity systems in Latin America,33 the impacts on freedom of expression from a gender perspective,34 crimes against intimacy,35 anonymity and encryption and TFGBV,36 and it has proposed a feminist framework for artificial intelligence (AI).37 Finally, Derechos Digitales has dedicated efforts to making sure Latin American women’s voices are considered in the development of internet protocols, working jointly with other organizations to influence the Internet Engineering Task Force for the development of standards that consider human rights and feminist principles.38

Internet Democracy Project

The Internet Democracy Project, based in New Delhi, India, was established in 2011.39 Its work aims to use the Indian context as a starting point, then develops what it has learned to apply to a wider context. It is “working towards realising feminist visions of the digital in society by exploring & addressing power imbalances in the areas of norms, governance & infrastructure.”40 Its work addresses a wide variety of topics related to women, gender, sexuality and the internet, including TFGBV. It has applied a feminist perspective to data collection,41
sexuality online,42 surveillance,43 online harassment and the law,44 verbal online abuse45 and other issues connected to gender and the internet. In addition to research, the Internet Democracy Project engages in policy development and has released policy briefs on topics such as feminist principles on consent in data governance46 and online violence against women.47 It has also organized multiple meetings and conferences on gender and the internet, along with Point of View, including Porn. Panic. Ban: A Conversation on Sexual Expression, Pornography, Sexual Exploitation, Consent;48 My Troll, Our Troll? Moving beyond Individual Action and towards Structural Change against Online Abuse;49 and Imagine a Feminist Internet South Asia.50 These conferences bring people together to discuss issues such as women’s sexuality, sexual expression, privacy and TFGBV.

Chenai Chair (Various Organizations)

Chenai Chair has worked at the intersection of digital technology and gender, assessing the impact of technology on society. Her work draws on principles of feminism to assess digital technology. Chenai was a Mozilla 2019/2020 Tech Policy fellow.51 She developed a feminist project focused on privacy, data protection and AI known as “My Data Rights (Africa).”52 Her research includes a project that examined the ways that African feminists in Malawi, South Africa, Zambia and Zimbabwe engage on issues to do with gender, privacy and data, including resisting digital rights violations.53 Chenai has further developed a research project aimed at understanding the Southern African Development Community’s model laws framing of gender and sex life and African feminist resistance to extractive data practices. Her work is available on mydatarights.africa.

Chenai led the development of research and writing of the 2020 report Women’s Rights Online: Closing the digital gender gap for a more equal world for the World Wide Web Foundation.54 This report provided a global snapshot of the state of digital gender inequality, focusing on Colombia, Ghana, Uganda and Indonesia. It found that even where women are closing the gap on basic internet access, they face a multitude of additional barriers to using the internet and fully participating online.

Chenai also developed research projects that sought to provide evidence to bridge the digital divide and to understand the experiences of young people accessing the internet in Africa while at Research ICT Africa.55 She has worked in collaboration with Pollicy on Afrofeminist Data Futures, a project that “seeks to better understand how feminist movements in sub-Saharan Africa can be empowered through the production, sharing and use of gender data, and how this knowledge can be translated into actionable recommendations for private technology companies in terms of how they share non-commercial datasets.”56 She has also worked on a project, Engine Room, that seeks to understand the lived experiences of people using digital ID systems in mostly marginalized communities in Bangladesh, Ethiopia, Nigeria, Zimbabwe and Thailand.57 Chenai is currently a senior program officer at the Mozilla Foundation leading the development of Mozilla’s Africa Innovation Mradi and Common Voice programmatic work.58
39 See https://internetdemocracy.in/.
42 See Bhandari and Kovacs (2021); https://internetdemocracy.in/events/imagine-a-feminist-internet-research-policy-and-practice-in-south-asia.
48 See https://internetdemocracy.in/events/porn-panic-ban.
49 See https://internetdemocracy.in/events/my-troll-our-troll.
50 See https://internetdemocracy.in/events/imagine-a-feminist-internet-research-policy-and-practice-in-south-asia.
51 See https://foundation.mozilla.org/en/blog/authors/chenai-chair-33/.
52 See https://mydatarights.africa/.
53 See https://mydatarights.africa/projects/.
54 See https://webfoundation.org/research/womens-rights-online-2020/.
55 See https://researchictafrica.net/author/chenai-chair/.
56 See https://pollicy.org/projects/afro-feminist-data-futures/.
57 See https://digitalid.theengineroom.org/assets/pdfs/200123_FINAL_TER_Digital_ID_Report+Annexes_English_Interactive.pdf.
58 See https://foundation.mozilla.org/en/blog/going-far-together-mozillas-africa-innovation-mradi-focus/.
International Development Research Centre

The International Development Research Centre (IDRC) was established by an act of Canada’s Parliament in 1970, and functions as part of Canada’s international aid envelope.59 IDRC champions and funds research and innovation within and alongside partners in the Global South to drive global change — investing in high-quality research, sharing knowledge with researchers and policy makers for greater uptake and use, and mobilizing global alliances to build a more sustainable and inclusive world.

…is to enable development, private sector and government stakeholders to use this research and data to improve their responses to TFV and hate speech. Another objective is to ensure that scholars, advocates and researchers in the Global South — equipped with their findings and policy ideas — have a voice at the table when laws and regulatory measures are discussed. The hope is that this rich body of research from Global South experts will have a substantive and long-term impact in national and international policy spaces on the equitable and fair governance of the digital public sphere.
59 See www.idrc.ca/en/about-idrc.
60 See https://firn.genderit.org/.
61 See www.idrc.ca/en/project/understanding-digital-access-and-use-global-south.
62 See www.idrc.ca/en/project/recognize-resist-remedy-research-project-combat-gender-based-hate-speech-against-women.
63 See www.idrc.ca/en/project/making-feminist-internet-research-network.
64 See www.idrc.ca/en/project/advancing-research-feminist-artificial-intelligence-advance-gender-equality-and-inclusion.
65 See www.idrc.ca/en/project/data-inclusive-democratic-and-feminist-development-shaping-global-research-agenda.
66 See https://citizenlab.ca/2021/08/no-access-lgbtiq-website-censorship-in-six-countries/.
67 See www.idrc.ca/en/project/supporting-safer-internet-2-global-survey-tech-facilitated-gender-based-violence.
68 See https://freedomonlinecoalition.com.
Introduction
Digital spaces, such as social media platforms and instant messaging
via text or apps, can be incredibly uplifting places where women and
LGBTQ+ people go to find information, build community and gather
support. These tools are used to create and maintain valuable allies,
friendships and other caring relationships.
They are essential for democratic discussion, advocacy, creativity and education. They have been used by these communities in creative ways to build movements and create resistance against the discriminatory status quo. These digital spaces must be protected and nurtured so that all people can benefit from them.

Unfortunately, the current digital landscape is one where the cruellest voices often dominate and discriminatory hierarchies are reinforced through negative engagement in digital spaces, preventing women and LGBTQ+69 people from participating freely, safely and authentically in them. The data discussed in this report affirms that this unhealthy digital environment exists, and that women and LGBTQ+ people suffer disproportionately. The data shows that people are often specifically targeted because they are members of equity-seeking groups, that there is a lack of effective resources and supports available for all people being harmed online, and that far too many people are suffering in silence. A digital world in which people are discriminatorily targeted because of their gender, sexual orientation and other intersecting aspects of their identity without meaningful redress can never fulfill the true potential of digital spaces that is afforded with equitable inclusion.

This report will provide background information on technology-facilitated gender-based violence (TFGBV) and technology-facilitated violence (TFV) against LGBTQ+ people by summarizing some of the existing research on this topic. It will then present quantitative data collected on people’s experiences with, and opinions of, 13 forms of online harm that have been recognized as common forms of TFGBV and TFV against LGBTQ+ people. The survey discussed in this report specifically examines people’s online experiences. Many forms of TFGBV involve modern digital technologies that are not connected to the internet, such as cellphones that are not internet connected, GPS location tracking devices or cameras that are not connected to the internet to perpetrate voyeurism. However, the data collected for this report focuses solely on online experiences and will therefore use the terms online harms and online gender-based violence (OGBV) when discussing the data collected as it focuses specifically on online experiences. OGBV is a subset of the larger issue of TFGBV.

Although the analysis of this data is focused on the experiences of women and LGBTQ+ people, this survey collected data from people of all genders and sexual orientations. Data was collected from cis and trans women and men, gender non-conforming, agender, non-binary people and people of other gender identities, as well as gay, lesbian, bisexual, heterosexual and other sexual orientations.

This report includes data on the type of online harms participants experienced, how harmful they thought these forms of online harm were, how they were impacted by their experiences with being harmed online, how people responded to online harms, and what resources and supports they used and thought might help people targeted by OGBV. The data is used to examine the influence of gender identity, gender expression and sexual orientation, in particular, on these types of online harm.

The report is divided into three main sections. The first section focuses on the experiences and impacts of online harms on victims/survivors. The second section focuses on the resources and supports that victims/survivors have used and would like to see. These two sections begin with background information on TFGBV, followed by detailed descriptions of the survey results, and conclude with summaries on how gender and sexuality are reflected in the data to expose what represents OGBV and online violence against LGBTQ+ people. The final section includes a list of recommendations for governments, technology companies, academics, researchers and civil society organizations on how they can contribute to addressing and ending TFV.

This report aims to centre on the experiences of people from the Global South and includes data from many countries in the Global South. The authors would like to recognize that the term “Global South” is a contested term, and that this terminology is not the sole nor universally
accepted term used to describe regions of the world that include countries that are systematically less economically and politically advantaged. Some prefer to call these regions the “majority world.” Given the report’s focus on countries in the Global South, it was decided to use the term Global South, while recognizing the term’s limitations.

Data was collected from 18,149 people of all genders in 18 countries (Algeria, Argentina, Brazil, Canada, Chile, China, Colombia, Ecuador, France, Germany, India, Jordan, Kenya, Saudi Arabia, South Africa, Tunisia, the United Arab Emirates [UAE] and the United States). Participants in Algeria, Jordan, Saudi Arabia, Tunisia and the UAE were not asked to report their sexual orientation or diverse gender identity due to safety and legal limitations in those countries.

The purpose of this report is to centralize the experiences of women and LGBTQ+ people with TFGBV and TFV against LGBTQ+ people in the Global South. As noted in the earlier section on the project’s steering committee, civil society organizations and researchers in the Global South have long been the thought leaders on these issues, having engaged in research, advocacy and education on this issue for decades. The data in this report hopes to supplement and build on that existing research, as well as on research and data from other regions. People in the Global South have unique and culturally specific needs that must be addressed by TFV research, laws and policies, including technology companies’ policies. Countries in the Global South often have fewer resources to address TFV, may have challenges with the rule of law and struggle to get technology companies to recognize and act on the contextual, linguistic and cultural needs of people in their regions. As such, it is critical that more attention be brought to the experience of those living in the Global South.

In addition to a focus on countries of the Global South, this report examines the experiences of women and LGBTQ+ people with online harms in particular. Gender-based violence and violence against LGBTQ+ people are rampant in digital spaces. TFV against these groups violates their human rights and negatively impacts their experiences in their overlapping physical and digital worlds. Cis women and girls, transgender people, gender non-conforming, agender and non-binary people, as well as bisexual, lesbian and gay people, are discriminatorily targeted with TFV because of their gender, gender identity, gender expression and sexual orientation, leaving many of them feeling unsafe in digital spaces and in the physical world, with many facing discriminatory violence against them. Their digital and physical experiences are inseparable in the modern world. Negative digital experiences will inevitably impact the physical experiences of those targeted, causing mental distress, impacting their general feelings of safety and, in some cases, leading to physical violence. Conversely, negative experiences in the physical world will be shared and reflected in digital spaces and impact how people engage in these spaces.

Previous research has shown that women and LGBTQ+ people are targeted by abusers online because of their gender and sexual orientation, and that these groups are uniquely vulnerable to the impacts of certain forms of TFV, such as the non-consensual distribution of intimate images (NCDII)
and stalking.70 For example, some cyberstalking apps are marketed as tools to spy on current and ex-intimate partners and can be used as tools to commit gender-based violence in intimate partner relationships (Parsons et al. 2019), “revenge porn” websites predominantly host sexual images of women shared without their consent (Henry and Flynn 2019), and women and girls more commonly have their devices monitored and controlled by male family members (Udwadia and Grewal 2019). In recent years, hate groups have increasingly targeted LGBTQ+ people both online and off. Misogynistic, transphobic and homophobic groups, and their influential leaders, provoke networked harassment against women and LGBTQ+ people (Yahaya and Iyer 2022; Curlew and Monaghan 2019), along with anyone who does not fit within sexist, homophobic and transphobic discriminatory norms or those who dare to advocate for gender equality or LGBTQ+ rights (Posetti 2017; Palumbo and Sienra 2017). These are just a few ways that technology is used to harm women and LGBTQ+ people in digital spaces. It is critical to recognize that gender and sexuality are not the only identity factors that make women and LGBTQ+ people vulnerable online. Women and LGBTQ+ people who are Black or Indigenous, are people of colour, have disabilities or are discriminated against because of their ethnicity or religion face compounding harms related to their intersecting social locations (United Nations Human Rights Council 2018). The experiences of women and LGBTQ+ people, including those with these intersecting identities, will be reflected in the findings from this survey, wherever possible.
Key Findings in Brief
The results of the survey will not be surprising to anyone who has spent time online. Research shows that TFV and online harms are widespread. The data demonstrates the disproportionate negative impact of online harms on women and LGBTQ+ people:

• Almost 60 percent (59.7 percent) of all participants had experienced at least one of the 13 forms of online harm surveyed.

• Transgender and gender-diverse people reported the highest proportion of incidents experienced, with cis women reporting slightly higher proportions of incidents of online harm compared to cis men.

• Although men and women reported relatively similar numbers of incidents of online harm in several categories, women were much more likely to report a serious impact from online harms compared to men.

• LGBTQ+ people were much more likely to report a serious impact from online harms compared to heterosexual people.

• Women were much more likely to rate the various forms of online harm as harmful compared to men.

Women reported similar or higher proportions of incidents of online harm in many categories compared to men; however, when asked what their general opinions were on various forms of online harm, women consistently rated almost all forms of online harm as more harmful than men, which reflects much of the research showing that women are more negatively impacted by online harms than men. Surprisingly, transgender and gender-diverse people generally rated most forms of online harm as less harmful than men and women, even though as individuals they reported proportionately more incidents of harm and more serious impacts than most other groups. This may be due to a normalizing effect, where some people who experience TFV more regularly and do not find support from society about the harms that they experience may start to downplay its overall effects because the experience is so common and is regularly dismissed by the general public. The data indicated similar results in young people, who, like transgender and gender-diverse people, experienced a higher prevalence and more negative impacts of online harms, but also rated many categories of online harms as less harmful generally. The harms faced by transgender, gender-diverse and young people may be downplayed by society in ways that impact their overall conceptions of these harms. This potential normalization of TFV among those groups that are most impacted is a disturbing trend.

Survey participants were aware of the disproportionate challenges that women and LGBTQ+ people face in digital spaces. A significantly higher proportion of participants recognized that OGBV was a serious issue for women and LGBTQ+ people compared to men. When participants were asked who OGBV was a big problem for:

• 46.5 percent reported that it was a very big problem for LGBTQ+ people;

• 44.3 percent reported that it was a very big problem for women; and

• 22.7 percent reported that it was a very big problem for men.

Gender differences were also apparent in who perpetrated the various forms of online harm. The data shows that men’s behaviour in digital spaces contributes to much of the most harmful forms, including OGBV and online violence against LGBTQ+ people. A high proportion of participants reported that men were the perpetrators of the most serious incidents of TFV they experienced:

• Close to half of all participants (49.7 percent) reported that a man perpetrated the most serious digital attack they personally experienced; a smaller percentage (18.9 percent) reported that a woman was the perpetrator.

• More than half of women (57.7 percent) and transgender and gender-diverse people (51.6 percent) reported that it was a man who targeted them, compared to 42.9 percent of men.

• Almost one-quarter of participants (24.8 percent) could not identify the gender of the person (for example, when the person used an anonymous user profile that did not indicate their gender).
• A very small percentage (1.1 percent) of participants reported a person of an “other” gender was the perpetrator.71

The identity of an individual played an important role in why they were targeted. Of the most serious incidents of online harm experienced, most participants reported that they were targeted because of their gender identity, gender expression, sexual orientation, race, religion or disability:

• Transgender and gender-diverse people (31.8 percent) and women (29.8 percent) were more likely to report they were targeted because of their gender identity than men (16.0 percent); lesbian, gay, bisexual and other sexualities (LGB+) people (27.8 percent) were more likely to be targeted because of their gender identity than heterosexual people (23.0 percent).

• Transgender and gender-diverse people (24.0 percent) were more likely to report they were targeted because of their gender expression than men (8.6 percent) and women (8.2 percent), as were LGB+ people (17.8 percent) compared to heterosexual people (7.8 percent).

• LGB+ people were more likely to report they were targeted (42.7 percent) because of their sexual orientation than heterosexual people (6.6 percent).

Additionally, the data showed that people are struggling to talk to others about experiencing online harms and to find effective support and resources. Very few spoke to anyone about their experience. Of those that did reach out for help, few formal mechanisms were rated as “very effective,” showing that there is a long way to go in creating and improving support for victims/survivors of TFV. This issue is particularly relevant in the Global South, where there are often fewer laws related to TFV in place, there may be challenges with the rule of law and there are fewer resources available for victims/survivors of TFV. Among the most serious incidents of online harms:

• Almost 40 percent (39.6 percent) of people did not reach out to anyone for help, not even friends or family.

• Very few (10.1 percent or less) sought support from social media companies, government services, including the police, or civil society organizations.

This data demonstrates that online harms are a rampant and serious issue that needs more attention, and that particular attention needs to be paid to the experiences of women and members of the LGBTQ+ community, who are more significantly impacted by TFGBV.
Methodology
This report selected a broad range of countries, focusing mainly on countries located in the Global South, to provide diverse representation of people who have experienced online harms internationally and to create data in regions where data collection on online harms is sparse or non-existent. Data was collected from 18,149 people of all genders in 18 countries: Algeria, Argentina, Brazil, Canada, Chile, China, Colombia, Ecuador, France, Germany, India, Jordan, Kenya, Saudi Arabia, South Africa, Tunisia, the UAE and the United States. Approximately 1,000 people per country were surveyed, primarily through online surveys. In countries with lower internet penetration (Algeria, Brazil, Colombia, India, Kenya, South Africa and Tunisia), in-person and telephone interviews were conducted as well as online surveys.

Categorization of Gender

…this category, rather than the larger category of cis women and men, as they face unique forms of discrimination that are aligned with those faced by gender non-conforming, agender and non-binary people. As such, statistics that refer to “transgender and gender-diverse people” should be considered as inclusive of transgender, gender non-conforming, agender and non-binary people as well as anyone outside of the cisgender binary.

The authors recognize that a person’s gender identity can include identities other than those listed here, such as two-spirit, agender, genderqueer, non-binary or other gender categories. “Gender diverse” is being used as an umbrella term in this report to capture any gender identity outside of the cisgender binary; however, the authors acknowledge that this term may be underinclusive for some and overinclusive for others. No term can accurately capture the complexity of all gender identities.72
…by transgender men and gender-diverse people, such as transphobic attacks.

The authors chose to categorize gender this way to highlight the experiences of transgender and gender-diverse people, rather than to systemically exclude transgender people within the larger categories of men and women. This was done because the small number of transgender women, men and gender-diverse people did not allow for a statistically significant analysis of those categories individually. Categorizing gender identity in these ways means that the important data on transgender and gender-diverse people features definitively in the results, allowing for analysis on the specific harms transgender and gender-diverse people face in digital spaces, which were essential to feature.

Sexual Orientation Data Categorization

When analyzing the data, sexual orientation was categorized in the following way: any participant who selected non-heterosexual options (lesbian, gay or another sexual orientation) or multiple sexual orientations (heterosexual and another option, or options beyond heterosexual) were categorized as LGB+ (lesbian, gay, bisexual or other sexual orientation); any participant who only selected heterosexual was categorized as heterosexual. Of those who were asked and reported their sexual orientation, 92 percent identified as heterosexual, and 8.0 percent identified as LGB+.

The authors would like to recognize that there is a wide diversity of ways that people define their sexual orientation, such as queer, pansexual, two-spirit, demisexual and many more. The term “LGB+” was chosen as an umbrella term for this report; however, the authors recognize that this is a simplified term that does not fully capture the breadth of people’s diverse sexual orientations.

Analytic Strategy to Examine Intersectionality

Intersectionality of gender and sexuality was examined with multi-way frequency analysis (MFA). This nonparametric analysis is similar to an analysis of variance for categorical variables, which compares observed and expected frequencies (Tabachnick and Fidell 2019). Inadequate expected cell frequencies (i.e., 20 percent of cells under five) (ibid.) can influence the results of MFA. To address this issue, the authors used gender diversity (transgender and gender diverse, women and men) rather than an interaction between biological sex and gender identity. In these analyses, form of online harm (experienced or not) was conceptualized as the dependent variable. Independent variables included gender diversity (transgender and gender diverse, women and men) and sexual orientation (LGB+ and heterosexual), yielding expected cell frequencies that exceeded five in all cases. Traditionally, MFA is used to create a model by testing the higher-order associations (for example, gender diversity by sexual orientation by online harm) followed by all two-way, then one-way associations. Non-statistically significant associations were eliminated from the model. Since the authors were not interested in establishing a model, analyses were restricted to the examination of variations in experiences of online violence as a function of gender diversity and/or sexual orientation (following procedures by Vaillancourt et al. 2021). A statistically significant three-way interaction was considered evidence of intersectionality, and results reported accordingly. Proportions were further examined using chi-square tests of association and differences were assessed using the z-test for column proportions. MFA was also used to examine the effects of intersectionality on the impacts of online harms as well as perceptions of harmfulness. The McNemar’s test was used to compare the paired proportions of participants reporting who OGBV was a big problem for. Because the analysis did not control for multiple comparisons (Hsu 1996) using a false discovery rate procedure such as Benjamini-Hochberg (Benjamini and Hochberg 1995), the probability of committing false statistical inferences was increased.

The data set and SPSS syntax used to generate the statistics in this report are available upon reasonable request to the authors. Some data have been suppressed to ensure that participants cannot be identified if data with small sample sizes are combined. The full results of inferential statistics can be found in the Appendix.
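To make the analytic steps described above concrete, the following is a minimal sketch, in Python, of the kinds of categorical tests listed (the sexual orientation coding rule, a chi-square test of association, a z-test for column proportions and McNemar’s test on paired proportions). The report’s analyses were run in SPSS; the data frame, column names and values below are hypothetical placeholders rather than the project’s actual variables.

import pandas as pd
from scipy.stats import chi2_contingency
from statsmodels.stats.contingency_tables import mcnemar
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical survey extract: one row per respondent (placeholder values only).
df = pd.DataFrame({
    "gender_diversity": ["woman", "man", "trans_or_gender_diverse", "woman", "man", "woman"],
    "orientation_raw": ["heterosexual", "heterosexual", "gay", "bisexual", "heterosexual", "heterosexual"],
    "harm_experienced": [1, 0, 1, 1, 0, 1],        # any of the 13 forms of online harm
    "ogbv_big_problem_women": [1, 1, 1, 0, 1, 1],  # rated OGBV a very big problem for women
    "ogbv_big_problem_men": [0, 1, 0, 0, 1, 0],    # rated OGBV a very big problem for men
})

# Coding rule described above: only participants who selected solely
# "heterosexual" are coded heterosexual; every other selection is LGB+.
df["orientation"] = df["orientation_raw"].where(df["orientation_raw"] == "heterosexual", "LGB+")

# Chi-square test of association between gender diversity and experiencing harm,
# reporting the expected cell frequencies that the MFA adequacy check refers to.
table = pd.crosstab(df["gender_diversity"], df["harm_experienced"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, smallest expected cell={expected.min():.2f}")

# z-test for column proportions, comparing the share of women and men reporting harm.
counts = table.loc[["woman", "man"], 1].to_numpy()
nobs = table.loc[["woman", "man"]].sum(axis=1).to_numpy()
z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z={z_stat:.2f}, p={p_value:.3f}")

# McNemar's test on paired proportions: the same respondents rating OGBV a
# very big problem for women versus for men.
paired = pd.crosstab(df["ogbv_big_problem_women"], df["ogbv_big_problem_men"])
print(mcnemar(paired, exact=True))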
Limitations

The following limitations need to be considered when interpreting the findings in this report. First, convenience samples were used so the findings may not generalize to the population or certain subgroups. Second, the data was collected during the COVID-19 pandemic, a time when TFGBV increased worldwide (Kraicer 2020). Thus, the issues highlighted in this report may be more pronounced than those documented before the pandemic. Third, missing data for some countries is high, which can impact statistical inferences. In particular, the sample sizes for LGB+ and transgender and gender-diverse people were proportionately lower than those for heterosexual and gender binary people, respectively. Therefore, reported percentages for LGB+ and transgender and gender-diverse groups, as well as cross-tabulations, may have a larger margin of error and be less reliable than those reported for heterosexual people and men and women. Fourth, there may be error induced by the coding of gender in this report as some transgender people may have identified with gender-matching categories. For these reasons, some of the raw percentage differences involving any of these groups may be larger than those not involving these groups, yet not statistically significantly different. Fifth, the data is cross-sectional, which precludes comments about causation. Sixth, it is possible that responses of “Prefer not to answer” and “Don’t know/not sure,” which were treated as missing in data analyses, were systematically missing. In this report, missing data mechanisms were not examined. Failure to examine and manage underlying patterns of missingness, in conjunction with a per analysis listwise deletion analytic strategy, may lead to bias in estimates.
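As a small illustration of the missing-data caveat above, the following sketch (again in Python, with hypothetical column names and values rather than the project’s actual processing pipeline) shows how non-substantive responses can be recoded to missing and how their distribution across groups might be inspected before relying on listwise deletion.

import numpy as np
import pandas as pd

# Hypothetical extract containing the two non-substantive response options.
df = pd.DataFrame({
    "country": ["Kenya", "Canada", "Brazil", "Kenya", "Chile", "Canada"],
    "sexual_orientation": ["heterosexual", "Prefer not to answer", "gay",
                           "Don't know/not sure", "heterosexual", "heterosexual"],
    "harm_experienced": ["Yes", "No", "Prefer not to answer", "Yes", "No", "Yes"],
})

NON_RESPONSES = ["Prefer not to answer", "Don't know/not sure"]
cleaned = df.replace(NON_RESPONSES, np.nan)

# If the share of non-responses differs sharply across questions or groups, the
# data are unlikely to be missing completely at random, and per-analysis
# listwise deletion (dropping incomplete cases) can bias the estimates.
print(cleaned.isna().mean())
print(cleaned.groupby("country")["sexual_orientation"].apply(lambda s: s.isna().mean()))

# Listwise deletion: keep only complete cases for the variables in a given analysis.
complete_cases = cleaned.dropna(subset=["sexual_orientation", "harm_experienced"])
print(len(complete_cases), "of", len(df), "respondents retained")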
Section I
TFV: Incidents
and Impacts
The following section discusses TFV and the broader influence of gender
and sexual orientation on the 13 types of online harm participants in
the global survey were asked about. It then reviews the number of
incidents participants reported, the impact of the online harm and how
harmful they found each behaviour to be.
Survey participants were asked whether they had experienced any of these 13 forms of online harm:
• Blackmailed online (e.g., someone threatening to post private information about them unless they did something in return, including sextortion)

• Lies posted online about them (defamation)
As the focus of this research was to look at the influence of gender and sexual orientation on these online
harms, these specific forms of harm were selected as they are commonly identified as forms of TFGBV and
TFV experienced by LGBTQ+ people in previous research (Iyer, Nyamwire and Nabulega 2020; Amnesty
International 2018; Goulds et al. 2020; GenderIT 2018b; Van Der Wilk 2018).73
Introduction: TFV

In The Emerald International Handbook of Technology-Facilitated Violence and Abuse, Jane Bailey, Nicola Henry and Asher Flynn define technology-facilitated violence and abuse as “an umbrella term used to describe the use of digital technologies to perpetrate interpersonal harassment, abuse, and violence” (Bailey, Henry and Flynn 2021, 1). It includes technology-facilitated behaviour such as hate speech, trolling, image-based sexual abuse, threats, doxing and stalking. TFV can happen to anyone, regardless of their gender, sexual orientation or other social locations (Dunn 2020a). It can be used to cause generalized harm to individuals but can also cause specific systemic discriminatory harms against equity-seeking groups and individuals, such as women and LGBTQ+ people.

For example, organizations such as Pollicy, Musawah, the Internet Democracy Project and APC describe how certain conservative political, community and religious leaders in the Global South reinforce patriarchal and heteronormative notions online by disparaging and threatening people they do not approve of, such as feminists, members of the LGBTQ+ community or racial, religious and ethnic minorities (Yahaya and Iyer 2022; Kovacs, Padte and SV 2013; Palumbo and Sienra 2017). This can lead to additional TFV by other community members who are influenced and emboldened by their leaders’ actions to further harass the people or groups online, thus reinforcing the discrimination on a grander scale. The spread of negative ideas about women and LGBTQ+ people and their communities legitimizes technology-facilitated and physical violence against them. Similarly, in the Global North, there has been an increase in alt-right groups that endorse racist, misogynistic, anti-feminist, transphobic, homophobic, Islamophobic and anti-Semitic views (McGinley 2022; Sugiura 2021; Conway, Scrivens and Macnair 2019). When a specific woman or LGBTQ+ person is named by an influential member or group of the alt-right, it can lead to sustained harms against that person, including TFV that causes risks to their safety (Curlew and Monaghan 2019; Brown, Sanderson and Silva Ortega 2022). These discriminatory beliefs are fuelled in online spaces and have been linked to mass murders motivated by racism, homophobia and misogyny (McGinley 2022; Baele, Brace and Coan 2019; Silva and Greene-Colozzi 2019).

International human rights organizations, such as the United Nations, recognize that certain groups of people experience systemic discrimination in societies at large that violates their human rights, and that these discriminatory practices have moved into digital spaces, including gender-based violence (UN Women 2023b; United Nations Human Rights Council 2018; Coombs 2021). Various experts and bodies within the United Nations have recognized that people can be discriminated against based on gender (UN Committee on the Elimination of Discrimination against Women 2017; United Nations General Assembly 2018), sexual orientation (United Nations General Assembly 2018), race or ethnicity,74 religion,75 age,76 disability77 and other equality-based identity factors. TFV can be used as a tool to reinforce any of these existing discriminatory power structures, which legitimize sexism, homophobia, transphobia, racism, colonialism, casteism, religious discrimination and others. This discrimination leads to inequality and violence against these groups.

74 International Convention on the Elimination of All Forms of Racial Discrimination, 7 March 1966, 660 UNTS 195 (entered into force 4 January 1969).
75 United Nations Declaration on the Elimination of All Forms of Intolerance and of Discrimination Based on Religion or Belief, GA Res 36/55, 36th Sess, UN Doc A/RES/36/55 (1981).
76 Convention on the Rights of the Child, 20 November 1989, 1577 UNTS 3 (entered into force 2 September 1990).
77 Convention on the Rights of Persons with Disabilities, 13 December 2006, 2515 UNTS 3 (entered into force 3 May 2008).
What this means is that when TFV is used as a tool of oppression against equity-seeking groups, it has a larger systemic impact compared to other forms of TFV. It is used to maintain discriminatory social hierarchies and cause real harms, including individual and systemic violence, to these groups. No form of TFV should be minimized, as all forms of TFV can cause real harms to the people targeted; however, this report seeks to highlight some of the ways that TFV is used as an oppressive tool against groups facing systemic discrimination, with a particular focus on women and LGBTQ+ people.

Various bodies and rapporteurs at the United Nations have acknowledged that women and LGBTQ+ people face discrimination because of their gender identity, gender expression and sexual orientation and are at a heightened risk of violence because of this. For example, in the case of women, the UN Declaration on the Elimination of Violence Against Women (article 1) defines violence against women as any act “that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or private life.”78 Recently, this has been recognized to include gender-based violence in digital spaces (UN Women 2023b).

78 Declaration on the Elimination of Violence against Women, GA Res 48/104, UNGAOR, 48th Sess, UN Doc A/RES/48/104 (1993) at 2.

In 2018, the United Nations released the report of Dubravka Šimonović, the Special Rapporteur on violence against women, its causes and consequences, on online violence against women and girls from a human rights perspective (United Nations Human Rights Council 2018). It noted that “groups of women, such as women human rights defenders, women in politics, including parliamentarians, journalists, bloggers, young women, women belonging to ethnic minorities and [I]ndigenous women, lesbian, bisexual, and transgender women, women with disabilities and women from marginalized groups are particularly targeted by [TFGBV]” (ibid., para. 28). Soon after this report was released, the world was faced with the COVID-19 pandemic, which moved much of the world online and TFGBV became more widespread (Kraicer 2020). In 2023, the Commission on the Status of Women expressed its deep concern about “the magnitude of various forms of violence, including gender-based violence that occurs through or is amplified by technology and the significant physical, sexual, psychological, social, political and economic harm it causes to women and girls, throughout their life course, infringing on their rights and freedoms, in particular for those in public life” (UN Women 2023b, para. 53).

It is important to recognize that gender-based violence goes beyond violence directed at cis women and girls (Dunn 2020a). As noted by the Women’s Legal Education and Action Fund (LEAF), TFGBV is also aimed at transgender, gender-nonconforming, agender and gender-diverse people because of their gender identity and expression (Khoo 2021). TFGBV includes forms of violence involving the use of digital technology that are aimed at people because of their gender, gender identity or expression. It also includes types of TFV that are disproportionately targeted at gender-marginalized people, such as sexual violence, or cause them disproportionate harm, such as NCDII (Powell and Henry 2017).

Additionally, members of the LGBTQ+ community are especially at risk of TFV due to discrimination against them. In 2018, the United Nations adopted the report of the Independent Expert on protection against violence and discrimination based on sexual orientation and gender identity, Victor Madrigal-Borloz, which recognized violence and discrimination “on the basis of sexual orientation and gender identity and, in particular, their intensity and scope. Gender identity refers to each person’s deeply felt internal and individual experience of gender, which may or may not correspond with the sex assigned at birth, including the personal sense of the body (which may involve, if freely chosen, modification of bodily appearance or function by medical, surgical or other means) and other gender expressions, including dress, speech and mannerisms” (United Nations General
Assembly 2018). Transphobic and homophobic ideas that purport that there are strictly limited gender and sexual roles, including heteronormative roles, are used to condone violence against those who do not fit within these discriminatory norms (Aghtaie et al. 2018; Ashley 2018a; Ontario Human Rights Commission 2014). These views can be used to normalize and legitimize violence against LGBTQ+ people (Namaste 1996). As such, LGBTQ+ people face high rates of TFV and violence in the physical world (James et al. 2016; Brandwatch 2019). These discriminatory views have found their way onto digital spaces, where LGBTQ+ people are regularly targeted by online attackers.

It is critical to acknowledge that gender and sexual orientation are only two aspects of why someone may be targeted by TFV. The reasons why women and LGBTQ+ people experience violence and discrimination often intersect with additional identity factors. Intersectionality scholarship by Kimberlé Crenshaw (1991) and Patricia Hill Collins (1990) notes that a person’s gender cannot be separated from other aspects of their identity such as their race, ability, religion, Indigeneity and sexual orientation. For example, women from racial or ethnic minorities experience discrimination (Anwer 2022) in ways that are different from women from the dominant ethnic or racial group (Amnesty International 2018). These intersecting social locations play an important role in how and why people are targeted by TFV.

…terms online harms and OGBV or online violence against LGBTQ+ people when analyzing the results from the survey as the data specifically examines online experiences.
15
Background: Gendered Digital Divide

Before discussing the various forms of TFGBV addressed in this report, it is important to recognize how the gendered digital divide contributes to gender inequality in digital spaces, including TFV. The gendered digital divide is most pronounced in the Global South. According to the Office of the United Nations High Commissioner for Human Rights (2021), women and girls make up the majority of the 3.7 billion people who remain unconnected to the internet worldwide, which reflects the state of gender discrimination globally. The International Telecommunication Union (ITU) reported that in 2020, only 19 percent of women in the least developed countries had used the internet compared to 86 percent in the Global North (in 2019), and, in 2022, 57 percent of women used the internet globally compared to 62 percent of men.79 The ITU identifies four main categories of the global digital gender divide:

• a gap in access and use of the internet;

• a gap in digital skills and use of digital tools;

• a gap in participation in science, technology, engineering and math (STEM) fields; and

• a gap in tech sector leadership and entrepreneurship.80

Additionally, women are excluded from the economic and social benefits afforded by digital technologies and experience increasing rates of TFGBV that are inadequately addressed by states and technology companies, which may result in women self-censoring online due to safety concerns (Arimatsu 2019).

Substantive equal access to the internet and digital devices is essential to achieve gender equality and to give more women and girls power in the technological world. As stated by the World Wide Web Foundation, "Women's equal access to new technologies and their meaningful participation on and through the web is a critical component of women's rights and equality in a digital world. Access to the internet can support women to have a voice in spaces where this was previously denied, challenge gender norms, use information, participate in political and associational networks, and increase their economic independence" (Sambuli, Brandusescu and Brudvig 2018).

Mobile Ownership and Internet Access Gap

A 2022 report by GSMA showed that there was a 16 percent gender gap in the use of mobile phones in the Global South in 2021, with the widest gap in South Asia and Sub-Saharan Africa (Shanahan 2022). A 2018 report by Giorgia Barboni et al. found that in India, 67 percent of men own mobile phones compared to 33 percent of Indian women. The ITU (2022) notes that the gender divide for internet use is also wider in several African and Arab countries. Women in Africa are less likely to have access to the internet and have lower rates of social media use.

79 See www.itu.int/en/mediacentre/backgrounders/Pages/bridging-the-gender-divide.aspx.

80 Ibid.
According to the EQUALS Research Group, the digital gender gap is in part due to the high cost of access to technology, women's limited access to economic resources, a lack of digital skills, safety risks and socio-cultural barriers that hinder women's participation (Sey and Hafkin 2019). This divide is further amplified for rural women in the Global South (African Declaration 2015; Sanya 2013) — this rural/urban digital divide remains true for rural and Indigenous communities in the Global North, including in Canada (McMahon, Lahache and Whiteduck 2015; Bailey and Shayan 2016). Interestingly, Alison Gillwald (2018) noted that in some Global South countries, such as India and Bangladesh, the digital gender divide for mobile phone access was more pronounced compared to countries such as Ghana and Kenya, which have similar gross national income per capita, suggesting that affordability and the wealth of a country are not the primary factors in explaining the gendered digital divide. Other gendered aspects may be at play, and these cultural factors can negatively influence women's and girls' access to digital tools and spaces.

Patriarchal Control and Access to Devices and the Internet

Women in the Global South often have less access than men to devices such as mobile phones and to the internet, since the men in the family are likely to be given priority in accessing these things (Villamil 2022). However, physical access to a device is not the only gendered limitation. Several authors have noted that when women do have access to devices or the internet, it is common for their use to be monitored by male family members, limiting their freedom of use (Badran 2019; Jamil 2021; Philip 2018). In addition, public spaces for accessing the internet, such as internet cafés, are male dominated in certain countries and can be less welcoming to women, further limiting women's access to these technologies (ibid.). Digital surveillance of women can include the direct surveillance of their devices, as well as indirect monitoring of their public posts and activity on the internet (Odeh 2018). Surveillance of women by male family members and the community limits their ability to express themselves freely and communicate with whom they want. This is further amplified by the gendered critiques of women for having too many friends on social media, interacting with men online, and posting photos male family members do not approve of, as well as the disproportionate criticism of women who have public-facing social media profiles (Tyers-Chowdhury and Binder 2021).

Exclusion from STEM Education and Employment in the Technology Sector

A gendered divide is also seen in education and the technology industry. A policy brief by Women 20, a Group of Twenty (G20) engagement group, noted that even among G20 countries, where women and girls are more likely to have access to education, there are fewer women and girls being educated and employed in the information technology sector (Kuroda et al. 2019). Women lag behind in access to employment in that industry (Hupfer et al. 2021). As noted by the World Wide Web Foundation, women, girls and gendered bodies are significantly under-represented in the development of technology, governance and policy making (Sambuli, Brandusescu and Brudvig 2018). Further, gender discrimination and sexual harassment of women in the tech industry can limit their participation by making them unwelcome and unsafe in some of these spaces (Sey and Hafkin 2019). This lack of access, skills, safety and leadership positions for women in the digital sector contributes to broader gender inequality in the tech sector but is also reflected in inequality in digital spaces specifically (Chair, Brudvig and Cameron 2020). The gendered digital divide can lead to a lack of trust in the technology industry's ability and willingness to address women's safety. For example, a study by the World Wide Web Foundation on women's experiences using the internet in Colombia, Ghana, Uganda and Indonesia found that women were more concerned about their privacy online than men and that they have less trust in online companies to protect their privacy (ibid.).81

Lack of Attention to TFGBV and TFV against LGBTQ+ People

TikTok and Twitter have made public commitments to address TFGBV; however, addressing TFGBV does not seem to be a primary business priority for these companies, and there is a lack of representation of voices from the Global South in their decision-making processes. For example, APC has noted the lack of commitment from social media companies to adequately address TFGBV in Africa (Iyer, Nyamwire and Nabulega 2020).
Background: Forms of TFV

Although the primary purpose of conducting this survey was to use the data to examine TFGBV and TFV directed at LGBTQ+ people, it should be noted that the 13 types of online harms listed in the survey are not always forms of TFGBV or TFV directed at LGBTQ+ people.

The data from this report includes the experiences of all genders of people, including cis and trans men and women, and gender non-conforming, agender and non-binary people. Not all of their experiences will be forms of these types of gender- and sexual orientation-based harms. For example, if a heterosexual cis presenting man threatens another straight cis presenting man on a social media platform, this is not an example of TFGBV or TFV against LGBTQ+ people. The incident numbers in this report include all experiences with online harms and should be read with this in mind. For example, when asked about the most serious incident of online harm they experienced, male victims were most likely to report being targeted by another man, and only a small percentage of participants identified as LGBTQ+. As such, most incidents of online harm reported by men are not TFGBV/OGBV or TFV/online violence against LGBTQ+ people. When examining the number of incidents reported, this should be kept in mind, as the numbers represent more generalized experiences with online harms rather than TFGBV and TFV against LGBTQ+ people specifically.

Research on gender-based violence and violence based on sexual orientation looks at groups that are systemically marginalized because of their gender expression, gender identity or sexual orientation, such as women and LGBTQ+ people. In the following section, the influence of gender and sexual orientation on the 13 forms of online harm is discussed by examining previous research on these subjects to show the ways these factors contribute to the discrimination of women and LGBTQ+ people and to violence against them. Although the survey collected data on online harms experienced and perpetrated by all genders of people, this report is primarily interested in looking at individual and systemic harms impacting equity-seeking communities, particularly those discriminated against because of gender and sexual orientation, and will analyze the data with this focus.

Physical Threats

When threatened in digital spaces, women and LGBTQ+ people are more likely to receive threats of sexual violence, such as rape threats, than heterosexual men (Powell and Henry 2017). This is due, in part, to the gendered and sexual power dynamics between these groups, with heterosexual men situated in more socially powerful positions than women and LGBTQ+ people. However, women and LGBTQ+ people also face non-sexual physical threats, such as death threats, in digital spaces due to their gender and sexual orientation (Younes 2021).87 Threats against them are often related to their gender and sexual orientation through the inclusion of derogatory slurs shared alongside the threats of physical and sexual violence. Sexual and non-sexual online threats like these can be especially frightening and consequential for them because of the high rates of discrimination and sexual and physical violence women (World Health Organization 2021) and LGBTQ+ people88 are subjected to. Digital threats cause real fear among these groups and have been linked to physical violence (Manjoo 2012).

Research, including that done by GLAAD, has shown that many LGBTQ+ people have their physical safety and lives threatened online on a regular basis (National Coalition of Anti-Violence Programs 2016; Human Rights Watch 2019; GLAAD 2022). Digital threats against LGBTQ+ people and activists have led to physical attacks and deaths, including those committed by state actors (Gritten 2022).

87 See also http://webfoundation.org/docs/2020/03/WF_WAGGGS-Survey-1-pager-1.pdf.

88 See www.hrc.org/resources/sexual-assault-and-the-lgbt-community; www150.statcan.gc.ca/n1/daily-quotidien/200909/dq200909a-eng.htm.
In Latin American countries such as El Salvador, Guatemala and Honduras, LGBTQ+ people are at ongoing risk of physical violence and death (Ghoshal 2020). A report out of Latin America found that at least 1,300 LGBTQ+ people were murdered in Latin America and the Caribbean in a five-year period, primarily in Colombia, Mexico and Honduras (Moloney 2019). In recent years in the United States, there have been multiple attacks on drag performers and mass shootings in gay clubs. In 2022, following a shooting in a gay nightclub in Colorado, GLAAD (2022) reported that a poll showed 48 percent of LGBTQ+ respondents fear for their personal safety because of the current transphobic and homophobic political climate, and 43 percent felt unsafe speaking about LGBTQ+ equality online using their real name. In Afghanistan, LGBTQ+ people have been criminalized, and association with the LGBTQ+ community online or in person can lead to serious social, physical and legal harms, including violence by the Taliban (Akbary 2022).

People in abusive intimate partner relationships may face digital threats or coercive control from their partners in digital spaces. Intimate partner violence (IPV) in heterosexual relationships is highly gendered, with many studies showing that women are most vulnerable to the negative impacts of IPV (Citron 2014; Aikenhead 2021). Digital threats and other controlling behaviour can occur during a relationship to maintain power and control over women, and after a relationship has ended to punish and harm women for leaving the relationship (Dragiewicz et al. 2018; Woodlock et al. 2020). Unfortunately, the use of technology to threaten women in intimate partner relationships is on the rise. An Australian study about the connection between technology and domestic violence found that anti-violence practitioners observed a 74.4 percent increase in the use of technology by abusive intimate partners to threaten women between 2015 and 2020. This is especially concerning, as the practitioners noted that a woman is 11.36 times more likely to be killed by her male partner if he has previously threatened her (Woodlock et al. 2020). Further, outside of an intimate partner relationship, men's feelings of sexual entitlement to cis and trans women can result in men threatening women who reject their sexual advances in digital spaces, as will be discussed in greater detail in the section below on unwanted communication. For example, a study in India and Pakistan found women received online threats when they did not respond to or rejected romantic advances from men (Vashistha et al. 2019). Additionally, those who advocate online for gender equality (Vasudevan 2018; Kovacs, Padte and SV 2013) are exposed to violence and threats to their safety in retaliation for their work: "attacks on women are gender-based and highly sexualized online and offline" (Khan 2021). TFV against women and LGBTQ+ journalists, human rights defenders and politicians will be discussed in more detail in a later section of this report.

Unsolicited Sexual Images

People of all genders and sexual orientations send unsolicited sexual images for a variety of reasons
and people have a variety of responses to receiving them (Oswald et al. 2020; Dietzel 2022). Research by Canadian scholar Christopher Dietzel (2022) and Australian scholars Anastasia Powell and Nicola Henry (2017) found that some unsolicited sexual images are wanted, particularly in the context of a sexual relationship, whereas others are interpreted as a form of harassment or abuse. Unsolicited images that are unwanted can be a form of image-based abuse (McGlynn and Rackley 2017). Unsolicited sexual images can be sent in a variety of contexts ranging from intimate partner relationships to complete strangers. For example, some unsolicited sexual images come in the form of spam or advertisements for sexual content and sexual services (Powell and Henry 2017).

Unsolicited images can be a form of TFGBV when they cause harm to the person receiving them. In an Egyptian study on TFV against women by Fatma Mohammed Hassan et al. (2020), unsolicited sexual images were one of the most common forms of TFGBV women experienced. The images in these cases were typically sent to women by an unknown person. In an international study on girls by Plan International, girls reported being sent unsolicited pornographic images in order to harass them (Goulds et al. 2020). An additional study on young adults in Sub-Saharan Africa found that receiving unwanted sexually explicit images was the most common form of TFV experienced by those surveyed (Makinde et al. 2021).

Clare McGlynn and Kelly Johnson's book Cyberflashing: Recognising Harms, Reforming Laws discusses the phenomenon of men who send unsolicited sexual images to women. Their study focused on women and girls in the United Kingdom, Ireland, the United States, Canada, Australia and Singapore (McGlynn and Johnson 2021). Their research focused specifically on men sending an image of a penis to women, which occurs in a variety of contexts. For example, a single picture may be sent to a single woman on a dating app, or multiple women may be sent the image using functions such as Apple's AirDrop to place the image on multiple women's phones (for example, groups of women collectively located in a public place such as public transit). McGlynn and Johnson found that "cyberflashing," the term they use for unsolicited sexual images, is often a form of gender-based harassment and that some women felt afraid, humiliated and violated by the act (ibid.).

Men's and women's reactions to unsolicited sexual images can be quite different. A study by Flora Oswald et al. (2020) found that women experienced more negative reactions to receiving these unsolicited sexual images than men. The intention behind why men and women send sexual images differs as well. Both women and men may send images to solicit sexual attention; however, men more commonly do it as a form of harassment. Oswald et al. (2020) found that men who engage in sending unsolicited sexual images can be motivated by misogyny and a desire to have power and control over women. In their research on unsolicited "dick pics," Rebecca Hayes and Molly Dragiewicz (2018) noted that there can be an element of aggrieved entitlement when men send these unsolicited sexual images to women.

In a Canadian study by Dietzel (2022) about unsolicited sexual images, gay and bisexual men considered sending unsolicited sexual images (such as "dick pics") more socially acceptable and less harmful among men who sleep with men than when heterosexual women received the same images. This was in part because the gendered power dynamics are very different when same-sex people share photos with each other compared to when men send images to women. However, some men do still find the images harassing. In another study, Dietzel (2021) found that the highly sexualized nature of men's dating and hook-up practices, along with gendered assumptions about men's desire for sexual activity, including the pressure to send and positively respond to sexual images and sexual advances, contributed to rape culture on dating and hook-up apps.

NCDII

Of all forms of TFV, one of the most researched topics is NCDII. Early research on this subject was focused on young people and the risks of "sexting" (Karaian 2012), but research has expanded to adults, including women, men and members of the LGBTQ+ community. NCDII is now considered a form of what McGlynn and Erika Rackley (2017) have called "image-based sexual abuse," which is a subset of TFGBV. NCDII can range from sharing pictures through texts to livestreaming sexual images, including sexual assault, onto public social media or pornography sites without consent. In South Korea, there is a disturbing trend of hidden cameras being used to non-consensually capture nude and sexual images of women in public bathrooms, change rooms and hotel rooms and distribute them on pornography sites and other places online (Aziz 2020; Ngyuen and Barr 2020). In some of the most severe cases of NCDII, images of women being sexually assaulted and raped have been livestreamed or posted online (Akhter 2018; Klein and Zaleski 2019; Oliver 2015).

Research has shown that women (Klein and Zaleski 2019) and members of the LGBTQ+ community (Waldman 2019) report higher pressure to share intimate images in digital spaces. In his research on gay online communities in the United States, Ari Ezra Waldman (2019) found that NCDII is more common in gay and bisexual communities, where there are heightened norms for disclosing intimate images, which can increase the risk of having those images shared without consent.
Men were more likely to take, share and threaten to share intimate images than women. LGB+ people were more likely to take, share and threaten to share intimate images than heterosexual people, which may be linked to the normalization of sexual image disclosure in those communities mentioned above. The researchers found that their data pointed "to a troubling trend where digital technologies are being used not only as a form of control, abuse and harassment, but as a further expression and consolidation of masculine entitlement and privilege, and as a tactic of sexuality-shaming women, women of colour or those identifying as lesbian, gay, bisexual, transgender, intersex or a non-binary gender" (ibid., 27).

In a study of Canadian NCDII criminal cases, Moira Aikenhead (2021) found that there was a "gendered double-standard" regarding the sharing of intimate images, with women being slut-shamed and blamed by some members of the public for taking the images in the first place. Offenders in these cases were cited as trying to humiliate the victim, and many images were posted on public websites such as dating or pornography websites. The vast majority of NCDII criminal cases in Canada involved female victims and male offenders.

Reports of NCDII increased during the COVID-19 pandemic. In Brazil, SaferNet reported that there was a 154.9 percent increase in cases of NCDII in April 2020 compared to April 2019 and that most of the victims reporting to them (70 percent) were women (Ramos 2020). The United Kingdom's Revenge Porn Helpline saw a spike in cases reported after the COVID-19 pandemic began (Ward 2021). Its report also showed a gendered difference in who was seeking help. Sixty-two percent of the 3,146 cases in 2020 were women and most perpetrators were men (84.5 percent). The quantity of images shared was also much higher for women than for men (ibid.). Victim blaming was a common theme across many countries' studies on NCDII (Sequera 2021; Ayres and Quevedo 2020; Giorgetti et al. 2016).

Additionally, the gendered nature of NCDII can be seen on websites and online groups that are dedicated to publishing "revenge porn" or collections of non-consensually shared intimate images (La Prensa 2020). Studies have shown that these public websites dedicated to publishing nude and sexual images without consent primarily focus on women (Slane and Langlois 2016). A study by Carolyn A. Uhl et al. (2018) found that 92 percent of the profiles on these sites were of women. A study of these sites by Henry and Flynn (2019) found that 85 percent of images on one website featuring 12,450 profiles were of women and that women's images were viewed more often than men's images, sometimes upwards of 100,000 times. Matthew Hall and Jeff Hearn (2019) examined the language used on these sites and found that "power, control and (hetero)sexuality were the main underlying themes," and that men were attempting to hurt or control the women in the images. According to Walter S. DeKeseredy and Martin D. Schwartz's (2016) male peer support theory, some men who non-consensually share sexual images in groups rely on patriarchal masculinity to justify their sexually abusive behaviour.
Blackmail

Gender and sexuality can play a significant role in blackmail online. GenderIT listed extortion as one of the 13 manifestations of TFGBV,89 and extortion was found to be a form of TFGBV in multiple studies, including those out of Palestine (Odeh 2018), Bangladesh (Akter 2018), Australia (Powell and Henry 2017), India and Pakistan (Vashistha et al. 2019), the United States (Lenhart, Ybarra and Price-Feeney 2016) and Europe (Council of Europe 2018). Blackmail is a particularly serious risk for many LGBTQ+ people, who may not share their sexual orientation and sex assigned at birth publicly due to privacy, safety and legal concerns.

LGBTQ+ people are at significant risk of harms if they are in a country where it is not safe to be LGBTQ+ publicly, such as in countries where same-sex relationships are criminalized and gendered dress codes are enforced.90 Homophobia, transphobia and violence toward LGBTQ+ people exist in all countries, but in countries where same-sex relationships are criminalized, the risk is especially high (Akbary 2022). The International Gay and Lesbian Human Rights Commission report Nowhere to Turn: Blackmail and Extortion of LGBT People in Sub-Saharan Africa detailed how LGBTQ+ people in some African countries face physical and legal risks if their sexual orientation is exposed online or in their communities. Authors Ryan Thoreson and Sam Cook (2011) noted that, "in places where it is illegal, stigmatizing, or dangerous to identify as LGBT or to engage in same-sex activity, keeping one's sexuality a secret may be, quite literally, a matter of life or death." The report cited studies from Botswana, Cameroon, Ghana, Malawi, Namibia, Nigeria and South Africa where LGBTQ+ people reported incidents of blackmail related to their sexual orientation and gender identity. Details of their sexual orientation were sometimes gathered from their communications on the internet and used to blackmail them by threatening to expose their sexual orientation to their families and communities. This risk is felt by LGBTQ+ people in many other countries. For example, in Brazil, a record number of LGBTQ+ people have been killed, so their privacy is particularly important to them to avoid violence and death (TGEU 2021; Trevisan 2018). Extortion against LGBTQ+ people can also occur on a larger scale. In 2021, hackers obtained access to an Israeli gay dating website and posted the data online after the company refused to pay a ransom (France 24 2021).

Sexual extortion is another common form of digital blackmail (Aziz 2020). Sometimes called "sextortion," it occurs when someone uses sexual images of another person to demand something from them, often additional sexual images or sexual contact, or to force someone to stay in an intimate relationship (Wittes et al. 2016; Wolak and Finkelhor 2016). Women and young people are at particular risk of sextortion; however, an American study on sextortion of adults during the COVID-19 pandemic found that men were increasingly targets of sextortion, along with Black and Indigenous women, and LGBTQ+ people (Eaton, Ramjee and Saunders 2022). Intimate images that were originally shared consensually, images that were taken without consent or images that were hacked or stolen can be used in sextortion. Women in Mexico reported an increase in the use of sexual images to extort and harm them during the early stage of the pandemic (El Heraldo 2020). Gendered extortion can occur in other contexts. Women in India and Pakistan reported being blackmailed by men who had their phone numbers (not sexual images) and threatened to publish their contact information and false information about them if they did not continue speaking with them (Vashistha et al. 2019).

89 See https://genderit.org/resources/13-manifestations-gender-based-violence-using-technology.

90 See www.ohchr.org/en/sexual-orientation-and-gender-identity/about-lgbti-people-and-human-rights.
In a US study of 152 sextortion offenders, Roberta O'Malley and Karen Holt (2022) defined four main types of sextortion: sextortion that targeted minors; sextortion involving a cybercrime (where images were hacked, stolen or obtained through deceit); sextortion conducted by a current or ex-intimate partner; and transnational sextortion. In cases where minors were targeted, which were the most common (52.6 percent) and often involved grooming the victims, 100 percent of the offenders were men, 71.3 percent of the victims were female, 88.8 percent of the victims were minors and 100 percent of the demands were sexual. In cybercrime cases (21.1 percent), 96.9 percent of the offenders were men, 93.8 percent of the victims were female, 28.1 percent of the victims were minors and 84.4 percent of the demands were sexual. In cases involving intimate partners (12.5 percent), 100 percent of the offenders were men, 94.7 percent of the victims were female, 26.3 percent of the victims were minors and 36.8 percent of the demands were sexual. In cases of transnational sextortion (11.2 percent), where victims are more commonly extorted for money, 58.5 percent of the perpetrators were men, 5.9 percent of the victims were female, 5.9 percent of the victims were minors and none of the demands were sexual (they were more often financial).

Repeated Unwanted Contact

Repeated unwanted contact, sexual or otherwise, can cause distress in some circumstances, can be intimidating in others or can be as serious as stalking, which can cause ongoing distress and fear and pose a legitimate risk to the target's physical safety.

Women and LGBTQ+ people face high levels of unwanted contact in the form of sexual harassment and unwanted requests for romantic and sexual encounters. A study on online sexual harassment in Bangladesh found that women commonly received unwanted sexual propositions and inquiries about dates (Nova et al. 2019). In other studies in India, Pakistan (Digital Rights Foundation 2017a) and Sub-Saharan Africa (Makinde et al. 2021), researchers found similar results, with women reporting that they were repeatedly contacted by people proposing or demanding a sexual relationship with them (Ramaseshan et al. 2019). Sexual harassment is reported as a form of TFGBV in the majority of the studies reviewed for this report, with women experiencing more sexual harassment than men (Powell and Henry 2015).

A 2016 American study on online harassment by the Pew Research Center found that men are more likely to receive threats of physical violence, but women are more likely to experience sexual harassment. In the study, people's gender, religious identity and sexual orientation were common reasons for the sexual harassment (Duggan 2017). In 2021, the Pew Research Center found women were more likely to be harassed and to say the reason they were harassed was because of their gender
(Vogels 2021; Udwadia and Grewal 2019). Transgender people, especially women, reported being sexually objectified and fetishized on dating apps and that they have been made to feel unsafe by that contact in those digital spaces (Albury et al. 2021). A qualitative study in India by Point of View found that LGBTQ+ people receive unwanted messages asking inappropriate questions about their bodies and demands for sex (Udwadia and Grewal 2019).

Further, repeated unwanted contact can amount to stalking. Stalking is one of the most serious forms of TFGBV — it is the repeated contact or surveillance by another person that causes a person to feel fearful. It can be related to IPV and men's feelings of entitlement toward women online (European Institute for Gender Equality 2017); however, women and LGBTQ+ individuals are also stalked by strangers and by people they know online due to their gender identity, sexual orientation and leadership positions (Curlew and Monaghan 2019). Stalking has been linked to in-person sexual and physical violence (FRA — European Union Agency for Fundamental Rights 2014; Sambasivan et al. 2019).

The Pew Research Center examined Americans' experiences with online harassment in 2016 and 2021 and found that women are more likely to be cyberstalked compared to men (Duggan 2017; Vogels 2021). Statistics Canada reported that women are more likely to be cyberstalked than men, a number that increases for young women (Burlock and Hudon 2018). In Malawi, women reported being stalked online as the most common form of TFGBV they faced, which made them feel unsafe, fearful, distressed or alarmed (Malanga 2021). Cyberstalking was also the most common form of TFGBV experienced by women in a study from Bangladesh, India and Pakistan, with 66 percent of women participants reporting being stalked online (Sambasivan et al. 2019). However, the gendered aspect of stalking was not consistent in all studies: a study in Sub-Saharan Africa did not find a gendered difference in stalking (Makinde et al. 2021).

Intimate partners are often the perpetrators of stalking. Research by Diana Freed et al. (2018) documented some of the ways current and ex-intimate partners use technology to stalk their targets.91 When a person is still in the relationship, the abusive partner may have physical access to the other person's device and can use that access for surveillance purposes. The abuser may also own the device, share an account with their partner and/or control access to their partner's device and its contents. This limits the person's freedoms and their ability to seek out help. Further, the abusive partner may give their shared child(ren) a device to stalk their current or ex-partner. For example, Apple AirTags have been used to track women when their male ex-partner places them in items belonging to their children (Cole 2022). In addition, former and current abusive partners may have access to or knowledge of their target's accounts, private information and photographs that can be used to facilitate the stalking. Delanie Woodlock (2017) reported that intimate partner stalking against women can lead to feelings of isolation, omnipresence and constant surveillance that can be extremely disruptive to their lives and cause ongoing fear.

More complex forms of technology have also been used to stalk intimate partners. Abusive partners may install spyware on their victim's phone to track them (Thomasen and Dunn 2021). This technology allows the abusive partner to monitor the activity of the other person, including their texts and online interactions, and, in some cases, can be used to turn on the person's microphone or camera to observe their activity. These apps have been marketed to facilitate gender-based stalking. As noted by the Citizen Lab, stalkerware has been marketed as an intimate partner tracking app,
and many comments on apps sold to track intimate partners are about tracking women in particular (Khoo, Robertson and Deibert 2019). Smart home technology, such as alarm systems and listening devices (Lo 2021), and drones (Thomasen 2018) have also been used to stalk and harass women.

Organized groups can also be engaged in stalking women and LGBTQ+ people. For example, research by Abigail Curlew and Jeffrey Monaghan (2019) described a website dedicated to stalking and sharing private information about transgender people, in particular women and neurodivergent people. According to Curlew and Monaghan, this site uses crowdsourcing to collect information to create "dossiers" on the website that purposely misgenders transgender people and posts their pre-transition photos and deadnames, along with discriminatory commentary about them. In 2022, actors from this website targeted a transgender Canadian woman, Clara Sorrenti. They engaged in an organized, hate-filled online harassment campaign against her, including doxing her. Someone made a false report to the police — a practice known as swatting (Khoo 2021) — that she had killed her mother and was going to go to city hall and kill cisgendered people (Farokhmanesh 2022). Armed police showed up at her house to arrest her. After users from this website identified her location, Sorrenti fled the country for her safety.

Unauthorized Access

Unauthorized access to a person's personal devices or online accounts is linked to stalking, harassment and NCDII. A 2020 study by the BC Society of Transition Houses demonstrated how commonly victims of gender-based violence experience this type of behaviour. It found that 85.29 percent of victim service workers had worked with women who had their social media platforms hacked and monitored by an abuser, 79.41 percent had worked with women whose mobile phone was hacked and monitored by an abuser, and 75.47 percent had worked with women whose email had been hacked and monitored by an abuser.

Karen Levy and Bruce Schneier (2020) have noted that intimate partners have unique access to the personal information of their partners, including knowing their passwords or the information needed to gain access to their passwords, which allows them to access a person's account without consent. Once access is gained, this information can be used to track a person's communication and whereabouts. In an Australian study by Heather Douglas, Bridget A. Harris and Molly Dragiewicz (2019) about women's experiences with technology and domestic violence, some participants reported that their abusive partners maintained unauthorized access to their accounts and at times changed their passwords, so they no longer had access to their accounts. In other cases, keyloggers were installed on women's devices, allowing their abusive partners to access their passwords and communication. Hacking has also been used as a technique to obtain sexual photos of people to extort them (O'Malley and Holt 2022). Many women are forced to provide access to their accounts and devices by male partners or family members, as discussed in the next section.

Outside of intimate partnerships, human rights defenders and women's and LGBTQ+ organizations are also at risk of unauthorized access by abusive individuals who oppose their work (Acoso 2020).

Monitored, Tracked or Spied On

Many women and LGBTQ+ people have their devices and accounts monitored by family members, current or former intimate partners and malicious actors. In particular, in countries where same-sex marriage is illegal, as mentioned above, social and state surveillance of women and LGBTQ+ people puts them at risk of violence and persecution (Akbary 2022).

As noted in the section on unauthorized access, intimate partner monitoring is a common problem. A study in Brazil noted that among young people, adolescent boys and young men monitoring, tracking and stalking their girlfriends on mobile phones was a normalized practice (Lopes Gomes Pinto Ferreira 2021). In some countries, there is also a significant amount of monitoring done by family members. In Pakistan, it is common for women's phones to be controlled and monitored by male family members (Jamil 2021). Gender inequalities, in particular in conservative and religious families, contribute to this practice. In a study in Sub-Saharan Africa, 18 percent of the study's participants reported being spied on with a camera or listening device, or tracked using a location tracker, such as GPS; however, information on who was spying on them was not collected (Makinde et al. 2021, 95). That research noted that in Uganda, two women were killed by their partners after the partners allegedly found romantic messages from someone else on their phones (ibid., 87). In Pakistan, four women were killed when a video appeared online showing them clapping and singing at a wedding (Aziz 2020, 34).

Research by the Internet Democracy Project found that many of the concerns expressed by conservative leaders were focused on controlling women's sexual and romantic choices (ibid.). However, these bans limit women's autonomy in many ways beyond their romantic and sexual choices.
Doxing is often used to increase harassment against a person by providing additional ways for harassers to contact them, and it can cause increased fear for a person's physical safety when their physical location is exposed online (Dunn and Petricone-Westwood 2018). In a study of Australian and UK adults, Henry et al. (2020, 27) found that in cases of NCDII, a person's identifying information, such as their name, contact information and social media accounts, was often posted along with the intimate image, encouraging additional harassment against them. As noted above, LGBTQ+ individuals are often doxed when people are trying to expose their sexual orientation and gender identity in harmful ways.

Doxing has a gendered element to it. A study involving women and men in Canada, Finland, Germany, Switzerland and the United States found that women were doxed in relation to being outspoken in male-dominated digital spaces (Eckert and Metzger-Riftkin 2020). In a study about Muslim women human rights defenders, several had their personal information doxed (Yahaya and Iyer 2022). Some women choose to communicate and do advocacy work online anonymously due to the risks associated with it, so keeping their personal information private is an important safety practice.

Networked Harassment

Alice E. Marwick and Robyn Caplan (2018) describe networked harassment as a form of collective online harassment that originates from a network of people with a shared agenda or world view. As will be discussed in another section of this report, women and transgender and gender-diverse journalists (Posetti et al. 2021), human rights defenders (Van Der Wilk 2018), politicians (Dhrodia 2018), and public figures (Gurumurthy and Dasarathy 2022; Marwick 2017) are subject to significant networked harassment. Investigative journalists such as Rana Ayyub from India (United Nations 2018) and Maria Ressa from the Philippines (Posetti 2017) have faced large-scale gender-based attacks, including threats of death and sexual violence. Hashtags such as #ArrestMariaRessa and a sexual deepfake of Ayyub spread across WhatsApp and Twitter were used to drive harassment toward them. Hashtags, derogatory sexist comments and sexual threats have been used against journalists in Latin America as well (Cuellar and Chaher 2020). Many of these women journalists were targeted with networked harassment for criticizing their governments or discussing feminist issues.

Marwick and Caplan's (2018) article on networked harassment focuses on gender-based harassment against women originating from what is known as the "manosphere." Proponents of the manosphere blame feminism for what they perceive as a negative shift in society that no longer embraces patriarchal and heteronormative ideals. One of the first large-scale online harassment campaigns, #Gamergate, was conducted against several female gamers by male gamers who felt that the industry was threatened by these women's engagement in gaming. The campaign against these women went on for nearly a decade.

The manosphere is often connected with the alt-right in the West. Members of the alt-right espouse white nationalism, homophobia, transphobia and misogyny and have a particular dislike of feminists (Massanari 2018). Alt-right figures with a large following can drive significant gender-based violence toward particular women when they criticize those women on their platforms (Brown, Sanderson and Ortega 2022). These disinformation campaigns will be discussed in further detail below.

False Information

Lies and disinformation campaigns about individual women and women as a group are used to reinforce sexist gender norms. In 2023, the United Nations' 67th Commission on the Status of Women noted that "the way many digital platforms are designed, maintained and governed has given
rise to disinformation, misinformation and hate speech, which can undermine the fulfilment of women's and girls' rights, including the right to freedom of opinion and expression and to participate in all spheres of public life" (UN Women 2023a, para. 40).

A Wilson Center report showed that gendered and sexualized disinformation online is a unique form of gendered abuse that involves sexist, racist, transphobic and sexual narratives, with sexual narratives being the most common (Jankowicz et al. 2021). In its research, the Wilson Center found that racialized women faced intersectional attacks that targeted their race and gender. In another study, Sarah Sobieraj (2020) found that women who were in male-dominated fields such as politics and/or spoke about feminist issues were particularly targeted — again, racialized women experienced some of the most severe attacks. Demos reported that gendered disinformation is used to silence influential women in digital spaces (Judson et al. 2020). It applies sexist norms to these women and spreads lies about them, including doctored sexual images of them. Research by Samantha Bradshaw and Amélie Henle (2021) shows how gendered disinformation campaigns against feminism and women's rights were orchestrated by state-sponsored accounts from Iran, Russia and Venezuela, with high-profile feminists commonly targeted.

Similar disinformation campaigns are made against LGBTQ+ people (Strand and Svensson 2021). These campaigns often falsely claim that LGBTQ+ people are a threat to children because they are sexual predators and that their "gender ideology" is a threat to the social fabric. In one study, this behaviour was found to be particularly common in the Philippines and Poland. In Brazil, anti-feminist, anti-LGBTQ+ and anti-human rights campaigns have led to violence against these groups (Sívori and Zilli 2022).

Defamatory and false information about women's and LGBTQ+ people's sexual practices is typically used to discredit them (Bartow 2009). In 2020, 165 Pakistani women journalists released a statement that said they were discredited by political parties and opponents, some of whom suggested the women journalists had personal relations with politicians of other parties and accused them of taking bribes to promote political agendas.92 A major problem for women in Asia was the dissemination of false information, including reports of men falsely claiming women were sex workers (Aziz 2020). In some cases, abusers have made fake websites that spread lies about their ex-partner in order to ruin their reputations online (Dunn 2020b). More details on the ways that fake information harms women and LGBTQ+ people will be discussed in the following sections on impersonation and identity-based harms.

Impersonation

Impersonation, such as the use of fake profiles, is used as a form of gender-based violence and violence against LGBTQ+ people (Aziz 2020). Fake profiles can be used by abusive intimate partners to gain information about their ex-partner by posing as them and communicating with family members, friends or co-workers to get them to disclose information (Cox 2014). In an international study on TFGBV against women, The Economist Intelligence Unit reported that 63 percent of participants said they had been impersonated online.93 In a study from South Asia, 15 percent of cis and trans women participants had been impersonated; it was more common among those who were lower income, younger or sexual minorities, and fake sexual images of women were created and posted on fake profiles of them (Sambasivan et al. 2019).

Fake profiles have been used to humiliate women and LGBTQ+ people by posting inappropriate content from a fake profile featuring them (Waldman 2019; Dunn 2020b). Some people have hacked into other people's profiles and manipulated their existing profiles, while others have created new fake profiles. These fake profiles have included sexual content that suggested the people were engaged in sexual activities that they had not engaged in or that they were interested in such activity. This type of impersonation can lead to physical harm. A study on criminal NCDII cases in Canada by Aikenhead (2021) found that fake online profiles suggesting that the person is available for unwanted sexual encounters, including "rape fantasies" and escort services, have led to ongoing unwanted messages requesting sexual contact, as well as physical and sexual assaults.

92 See https://docs.google.com/document/d/1DD8BQ53noKO6zHy-gysGnFjeKT4ride4uYtQsNNRYoc/edit.

93 See https://onlineviolencewomen.eiu.com/.
In the United States, a gay man created multiple fake profiles of his ex on Grindr, leading to hundreds of men unexpectedly coming to his ex-partner's home and workplace demanding sex with him (Goldberg 2019).

Fake profiles can also be used to make it seem that a person is saying something that they would not have said online in order to smear their reputation (Dunn 2020a). For example, in the case of investigative journalist Rana Ayyub mentioned earlier, a fake Twitter profile of her was used to say she supported child rape and hated Indians, which contributed to the networked harassment against her (Citron 2019).

Identity-Based Harassment and Discrimination

Digital identity-based harassment and discrimination occurs when a person is targeted because they are a member of an equity-seeking group. They face attacks directly because they are a woman, an LGBTQ+ person, religious, disabled or a member of an ethnic group, or Black, Indigenous or a person of colour.

Gender identity is one reason why a person can be discriminatorily targeted by TFV. Women are targeted because of their gender more than men, and patriarchal norms are reinforced online in these attacks (Vasudevan 2018). As with all forms of gender-based violence, gender inequality, misogyny and patriarchy are at the root of much TFGBV against women (Aziz 2020). Gendered attacks also focus on women's other identity factors such as sexual orientation, gender identity, gender expression, race and class (Iyer, Nyamwire and Nabulega 2020). In India, the colour of a woman's skin and her caste can alter the type of TFV she faces. As noted by Kiruba Munusamy (2018), "unlike online violence that privileged women face which are most often only sexual, the violence that underprivileged outcaste, dark-skinned, minority women experience are intersectional, extreme, unique and invariably high as they are hateful and identity-based aiming to defame, humiliate, delegitimise or undermine an individual." In her book Misogynoir Transformed: Black Women's Digital Resistance, Moya Bailey (2021) discusses the way stereotypes such as the Jezebel, mammy or Sapphire are evoked online to dehumanize Black women. She calls the form of discrimination Black women face online "misogynoir," which she defines as "the anti-Black racist misogyny that Black women experience, particularly in US visual and digital culture" (ibid., 1).

Other identity factors can intersect with women's identities to impact the TFV they experience. Shia Muslim women in Pakistan are targeted because of their gender and their religion (Anwer 2022). As noted by Pollicy and Musawah, Muslim women human rights defenders who push back against patriarchal norms face unique risks related to their religion when they engage online (Yahaya and Iyer 2022). Outspoken women in Egypt have been targeted online because of their religion, gender, culture and race (Sallam 2018). Culturally and linguistically diverse women in Australia reported specific threats related to their social location, including threats of deportation for those without citizenship and honour killings (Louie 2021; eSafety Commissioner 2019). Indigenous women in Canada, who have some of the highest reports of gender-based violence in the country, experience forms of TFV, such as human trafficking and online hate, that are connected to sexist and colonial oppression against them (Bailey and Shayan 2016). These examples demonstrate the intersectionality of identity-based harassment and discrimination in digital spaces.

Within the LGBTQ+ community, people are targeted because of their sexual orientation, gender identity and gender expression. According to Sobieraj (2017), women of colour and LGBTQ+ people are exposed to racist, transphobic and homophobic slurs online related to their intersecting identities. Attacks are focused on their physical appearance, sexual orientation and sexual activity. These attacks also challenge their capacity to be in leadership roles and threaten physical and sexual violence.
A study by Brandwatch (2019) analyzed 10 million posts from the United Kingdom and the United States about transgender people, finding a significant number of transphobic posts. The study found that transgender people face daily attacks and comments on their timelines, including comments that are linked to their race. In the most severe cases, there were calls for transgender genocide. Brandwatch found that abusive content often spiked when laws and policies about trans rights were proposed by governments. In Pakistan, for example, despite the introduction of a transgender rights bill in 2018, transgender people faced death threats, and at least 20 transgender people were murdered in 2021 (Zaman 2022).

Online platforms themselves can contribute to the marginalization of LGBTQ+ people. On many digital platforms, there are limited gender options that do not allow transgender or gender-diverse people to properly express their gender identity, which can cause them harm (Lui, Singh and Giuga, forthcoming 2023). Research by Kath Albury et al. (2021) found problems with the structure of dating apps. Most dating apps are not designed to be inclusive or safe for transgender people, and often do not even provide a space for them to authentically define their gender identity. Florence Ashley's (2018b) research discusses the abuse that transgender people face, including accusations of gender fraud.

As noted earlier, LGBTQ+ people are targeted online because of their sexual orientation. A study by the Australian eSafety Commissioner found that LGBTQ+ people experienced more than double the rate of hate speech (30 percent) compared to the national population (14 percent). A report by ADL (the Anti-Defamation League) Center for Technology and Society (2021) found that 64 percent of LGBTQ+ respondents reported experiencing online harassment compared to 41 percent of the general population. A study of people in Australia, New Zealand and the United Kingdom showed that LGBTQ+ people experience more negative impacts from online harassment than heterosexual people (Powell et al. 2020). Their intersecting identities can influence the discrimination they face. Research by Andrew Farrell (2021) has shown that Indigenous LGBTQ+ people can face discrimination on dating apps based on their Indigenous identity.

In countries where same-sex relationships are criminalized or considered socially unacceptable, LGBTQ+ people can be monitored, blackmailed and harassed online (Thoreson and Cook 2011). Some have been prosecuted by the state and even killed (Human Rights Watch 2020; Sallam 2018; Gritten 2022). Privacy online is therefore extremely important to the LGBTQ+ community. Rebecca Ryakitimbo (2018) has written about the importance of data and privacy in Tanzania, where the government has created a task force to identify digital content about gay people in order to prosecute them.

As mentioned in the networked harassment section, there are growing online movements against feminists, LGBTQ+ people and human rights defenders more generally, where their identities are attacked as a group. This has been seen in Brazil, where discriminatory views have been advanced by the public and political leaders, including the country's previous president, Jair Bolsonaro, and adopted by large portions of the broader population (Sívori and Zilli 2022). These online movements have been linked to the deaths of human rights defenders, including a Black bisexual councilwoman, Marielle Franco (Kaul 2021; Judson 2021).

Within the Western manosphere, there are groups of men, called incels (a term meaning "involuntary celibate"), who feel entitled to sex with women and who organize online against women's rights to sexual autonomy. According to Stephane J. Baele, Lewys Brace and Travis G. Coan (2019), incels are largely groups of men, linked to men's rights activism (Boyd and Sheehy 2016) and the manosphere (McCulloch et al. 2019; Guy 2021), who hold an extremist world view, believing they are entitled to sex with women and supporting
patriarchal monogamy. Many hold particular hatred for racialized women and feminists. Ann McGinley (2022) reported that the most extreme incels advocate for the torture, rape and murder of women. This group’s ideology and influence have been linked to violent incidents such as the 2018 van attack in Toronto, Canada, that killed 10 people, eight of whom were women, and injured others.

because of content that was posted about them online by abusers, it can be expensive for victims/survivors to replace devices or accounts that have been compromised, additional security tools may be needed, and ongoing mental stressors can negatively impact a person’s professional capacity (Jane 2018; Citron 2014). These economic impacts add to the already unequal economic position many women and LGBTQ+ people face.
in the United States by the Pew Research Center showed that young people were at higher risks of TFV, including threats for young men, and stalking and sexual harassment for young women (ibid.). A study from India showed that young people are at an increased risk of TFV, with significant impact on their mental health and well-being (Maurya et al. 2022). The US research institute Data & Society also found that young people were more likely to experience harassment and abuse online (Lenhart et al. 2016). Similar results were found in Australia (Powell and Henry 2015). A Statistics Canada study showed that young women were more likely to be stalked online in Canada than older women (Burlock and Hudon 2018).

Additional reports that focus on young people’s experiences show high rates of TFV among that age group. An international survey by the World Wide Web Foundation and the World Association of Girl Guides and Girl Scouts reported that 52 percent of the young women and girls they surveyed faced online abuse.95 Plan International also found that significant proportions of girls reported receiving threats and harassment online, with more than half of the girls surveyed reporting being harassed and abused online (Goulds et al. 2020). It is clear that special attention needs to be given to young people when considering how to address TFV.

Journalists (2017), Reporters Without Borders (2018a; 2018b), International Women’s Media Foundation (Barton and Storm 2014), Media Matters for Democracy (Lodhi 2018) and TrollBusters (Ferrier 2018) are all examples of organizations that have reported on this problematic trend.

A 2020 survey of more than 900 journalists in 125 countries by UNESCO and the International Center for Journalists showed that 73 percent of women journalists had experienced some form of TFV related to their work (Posetti et al. 2020). Twenty-five percent had been threatened physically and 18 percent had been threatened with sexual violence, including 13 percent receiving threats against people who are close to them. Eight percent were doxed. Certain topics appeared to generate higher levels of attacks: gender (47 percent), politics and elections (44 percent), and human rights and social policy (31 percent). Seventeen percent reported feeling physically unsafe due to the TFV. Many reacted by self-censoring what they discussed on social media (30 percent). Four percent quit their jobs due to the TFV they experienced. Another report by the same organizations stated that some women journalists were told to “toughen up” and learn to deal with the attacks against them as part of their job (Posetti and Shabbir 2021).
being harassed online. Online abuse included doxing, sexist name calling, physical threats, networked harassment and censorship. Eleven of the reporters addressed in their study had been killed in relation to their work, including journalists in Afghanistan, India, Iraq and Mexico. This abuse causes significant challenges for women journalists who may struggle with whether to stay in the profession or not due to safety concerns. Silvio Waisbord (2020, 1033) has reported that journalists who have physical markers that identify them as being part of an equity-seeking group such as “gender, race, ethnicity, sexuality, and religion” may be at increased risk.

Women politicians are also exposed to higher rates of TFV and abuse in general (Inter-Parliamentary Union 2016). The Amnesty International (2018) report #ToxicTwitter studied the abusive tweets women politicians, activists and writers faced on Twitter. The report found that platforms such as Twitter are important spaces for women’s voices to be heard, but that Twitter could be a toxic place for women. For example, among women members of Parliament (MPs) in the United Kingdom, one Black female MP, Diane Abbott, received almost half of the abusive tweets targeted against women MPs during the period the report reviewed. Women wanted to be on Twitter but found that speaking about gender, race and politics could trigger abuse on that platform, which led many to self-censor.

The #ToxicTwitter report found that women’s rights activists were also targeted with abuse on Twitter. Women who spoke up about anti-Black racism, reproductive rights and gender issues were threatened online (ibid.). Attacks often zeroed in on the woman’s other identifying factors such as her race, sexual orientation, gender identity, disability or religion. Activists across the globe face this type of TFV. APC,96 the World Wide Web Foundation,97 the Middle East Institute,98 GenderIT99 and IT for Change100 have all reported on activists’ experiences of TFV online. Those advocating for feminist values,101 gender equality,102 reproductive rights,103 sexual expression and LGBTQ+ rights,104 and against sexual violence,105 are exposed to TFV related to their work.

LGBTQ+ people who are advancing their rights have had their events attacked online and offline (GLAAD 2022) and have faced political persecution (Human Rights Watch 2020). There has been increasing hostility from alt-right groups toward LGBTQ+ people. In Europe, the Office of the Council of Europe Commissioner for Human Rights (2021) found that there was a “sharp increase” in hostility toward LGBTQ+ human rights defenders and LGBTQ+ people more generally. As noted earlier in this report, LGBTQ+ activists who defend their rights in certain countries have the violence they face legitimized by society and the state.

96 See Palumbo and Sienra (2017).

102 See Kovacs, Padte and SV (2013).
[Callout statistic: __ percent of those who reported experiencing at least one of the forms of online harm identified social media as the platform where it occurred.106]

106 It should be noted that gender identity and sexual orientation were not asked in certain countries for legal and safety reasons (Algeria, Jordan, Saudi Arabia, Tunisia and the UAE).
Survey Results: Commonality and Response to Incidents of Online Harm

The following sections detail the rates of incidents of online harm people experienced, and their various responses.

When interpreting the data on incident reporting, it is important to recognize that the severity and level of harm experienced can vary widely under each of these categories. For example, if someone was repeatedly contacted by someone they did not want to be contacted by, it could be distressing, such as a person not taking the hint that another person no longer wants to be contacted by them, but not cause the recipient significant harm. In contrast, it could be a very serious form of violence, such as an ex-intimate partner relentlessly stalking their ex-partner, sending threats and causing significant fear with the communication. As such, the incident reporting should be interpreted with this nuance in mind. Further, people may have different reactions to various types of behaviour and different perceptions of their degree of harm.

Because people have such a wide range of reactions to these various types of behaviour, incident reporting alone does not necessarily get to the heart of the actual harms experienced — the harmfulness of the types of online harm must also be considered. The following section details the prevalence of each form of online harm, followed by the actual impact of the TFV and the general perceptions of harmfulness reported. A later section considers participants’ reports on aspects of their most serious incidents of online harm, where additional information about the harmfulness of these types of behaviour is discussed, as well as the influence of gender identity, gender expression and sexual orientation.
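The distinction drawn here, between how often a behaviour is reported and how harmful it is experienced or perceived to be, can be illustrated with a brief, purely hypothetical sketch. The report does not publish its analysis code; the column names and the tiny inline dataset below are invented for illustration only, and the sketch simply shows how prevalence, reported impact and perceived harmfulness could be summarized separately from respondent-level survey data.

# Purely illustrative sketch; not the report's analysis code.
# Column names and the tiny inline dataset are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "gender": ["woman", "man", "woman", "trans_gd", "man", "woman"],
    "orientation": ["heterosexual", "LGB+", "heterosexual", "LGB+", "heterosexual", "LGB+"],
    "unwanted_contact": [1, 0, 1, 1, 0, 1],    # 1 = experienced the harm
    "impact_rating": [4, 2, 5, 5, 1, 4],       # self-rated impact, 1 (none) to 5 (very negative)
    "harmfulness_rating": [5, 3, 4, 5, 2, 5],  # perceived harmfulness of the behaviour, 1 to 5
})

# 1. Prevalence: how many respondents report the behaviour at all.
prevalence = df.groupby("gender")["unwanted_contact"].mean()

# 2. Reported impact: how those who experienced it rate its effect on them.
impact = df.loc[df["unwanted_contact"] == 1].groupby("gender")["impact_rating"].mean()

# 3. Perceived harmfulness: how all respondents rate the behaviour in general.
harmfulness = df.groupby("orientation")["harmfulness_rating"].mean()

print(prevalence, impact, harmfulness, sep="\n\n")

Keeping the three summaries separate mirrors the structure of the sections that follow: prevalence first, then reported impacts, then perceptions of harmfulness.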
Survey Results: Experiences with Any Form of Online Harm

Among the survey participants, 59.7 percent had experienced at least one form of online harm. The most common form of online harm experienced by participants was being repeatedly contacted by someone they do not want to be contacted by (37.7 percent), followed by having unwanted sexual images sent to them (28.1 percent); having someone access their devices or social media accounts without permission (24.5 percent); being called discriminatory names or derogatory cultural terms (19.8 percent); having lies posted about them online (17.8 percent); being impersonated online (16.5 percent); experiencing harassment because of their gender,
race, sexual orientation, gender expression or other marginalizing factor (16.3 percent); being monitored, tracked or spied on online (14.7 percent); being doxed (14.7 percent); being blackmailed online (12.1 percent); experiencing networked harassment (11.8 percent); being physically threatened (11.7 percent); and having their nude or sexual images shared or shown to someone else or posted online without permission (7.6 percent).

A higher proportion of transgender and gender-diverse people reported experiencing any form of online harm (67.8 percent) than women (59.9 percent) and men (57.0 percent). A higher proportion of LGB+ people reported experiencing any form of online harm (75.8 percent) than heterosexual people (57.2 percent). A higher proportion of LGB+ transgender and gender-diverse individuals reported experiencing any form of online harm (87.7 percent) than LGB+ women (76.7 percent) and LGB+ men (72.6 percent), who reported similar proportions. A higher proportion of heterosexual women reported experiencing online harm (58.6 percent) than heterosexual men (55.7 percent).

The following 13 forms of online harm are listed in order from the most commonly experienced overall to the least commonly experienced overall. Where there are statistically significant differences between gender and sexual orientation, they are noted.
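The group comparisons flagged in the subsections below rest on tests of whether two proportions differ. The report does not specify the exact procedure used in this section, so the following is only an illustrative sketch: a two-proportion z-test (here via statsmodels) run on hypothetical counts, not the survey’s actual raw numbers.

# Illustrative sketch only: the survey's statistical procedure is not
# specified here, and the counts below are hypothetical, not survey data.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical: respondents reporting a given harm, by group.
count = [480, 320]    # e.g., women and men who reported the harm
nobs = [1250, 1020]   # respondents in each group

z_stat, p_value = proportions_ztest(count, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (conventionally < 0.05) would be read as a statistically
# significant difference between the two groups' proportions.

In practice, survey weights, corrections for multiple comparisons and small cell sizes (for example, transgender and gender-diverse respondents in some countries) would all affect how such comparisons are run and interpreted.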
Repeated Unwanted Contact

In total, 37.7 percent of participants reported being repeatedly contacted by someone they do not want to be contacted by. A higher proportion of transgender and gender-diverse people (40.3 percent) and women (39.4 percent) reported being repeatedly contacted by someone they do not want to be contacted by than men (31.3 percent). A higher proportion of LGB+ people reported being repeatedly contacted by someone they do not want to be contacted by (46.3 percent) than heterosexual people (34.6 percent).

Unsolicited Sexual Images

Being sent unwanted sexual images was reported by 28.1 percent of participants. The proportion of transgender and gender-diverse people (31.1 percent) and women (28.9 percent) who reported having unwanted sexual images sent to them did not statistically differ; however, a higher proportion of women and transgender and gender-diverse people reported this type of abuse than men (22.8 percent). A higher proportion of LGB+ people reported having unwanted sexual images sent to them (40.1 percent) than heterosexual people (24.8 percent).

Unauthorized Access

Someone accessing their devices or social media accounts without permission was reported by 24.5 percent of all participants. A higher proportion of LGB+ people reported unauthorized access (32.8 percent) than heterosexual people (24.1 percent). There was no statistical difference between genders.

Discrimination

Among all participants, 19.8 percent reported being called discriminatory names or having derogatory cultural terms stated about them. A higher proportion of transgender and gender-diverse people reported being called discriminatory names or having derogatory cultural terms stated about them (30.6 percent) than men (18.8 percent) and women (17.8 percent), who did not statistically differ in their proportions. A higher proportion of LGB+ people reported being called discriminatory names or having derogatory cultural terms stated about them (36.6 percent) than heterosexual people (17.0 percent).

False Information

Having lies posted about them online was reported by 17.8 percent of all participants. A
higher proportion of transgender and gender-diverse people reported having lies spread about them (30.1 percent) than men (19.8 percent) and women (16.5 percent). A higher proportion of LGB+ people reported having lies spread about them (29.3 percent) than heterosexual people (17.4 percent). A higher proportion of LGB+ transgender and gender-diverse people reported having lies spread about them (41.8 percent) than LGB+ men (25.5 percent) and women (30.9 percent), who reported similar proportions. A higher proportion of heterosexual transgender and gender-diverse people (24.0 percent) and heterosexual men (19.3 percent), who reported similar proportions, reported having lies spread about them than heterosexual women (15.3 percent).

Impersonation

on online (24.0 percent) than men (14.6 percent) and women (12.5 percent). A higher proportion of LGB+ people reported being monitored, tracked or spied on online (18.6 percent) than heterosexual people (13.3 percent).

A higher proportion of heterosexual transgender and gender-diverse people (21.3 percent) reported being monitored, tracked or spied on online than heterosexual men (14.5 percent) and heterosexual women (11.9 percent; heterosexual men > women). A higher proportion of LGB+ transgender and gender-diverse people (29.1 percent) reported being monitored, tracked or spied on online than LGB+ men (15.5 percent) but were not different from LGB+ women (19.7 percent). A higher proportion of LGB+ women reported being monitored, tracked or spied on online (19.7 percent) than heterosexual women (11.9 percent).
being blackmailed online (23.1 percent) than men (12.7 percent) and women (10.1 percent). LGB+ people were more likely to report being blackmailed online (18.6 percent) than those identifying as heterosexual (11.0 percent).

Networked Harassment

Experiencing networked harassment was reported by 11.8 percent of all participants. A higher proportion of transgender and gender-diverse people reported experiencing networked harassment (27.8 percent) than men (11.5 percent) and women (9.3 percent). A higher proportion of LGB+ people reported experiencing networked harassment (19.6 percent) than heterosexual individuals (9.9 percent). There was an interaction between gender and sexual orientation: the effect of sexual orientation held for women (LGB+ = 19.4 percent; heterosexual = 8.5 percent) and men (LGB+ = 18.1 percent; heterosexual = 11.0 percent), but the proportion of transgender and gender-diverse people reporting networked harassment did not vary by sexual orientation (LGB+ = 29.9 percent; heterosexual = 26.7 percent). A higher proportion of LGB+ transgender and gender-diverse people reported experiencing networked harassment (29.9 percent) than LGB+ women (19.4 percent) and LGB+ men (18.1 percent), who were equally likely to report this type of abuse. A higher proportion of heterosexual transgender and gender-diverse people reported experiencing networked harassment (26.7 percent) than men (11.0 percent) and women (8.5 percent).
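The interaction reported above, where the gap between LGB+ and heterosexual respondents appears for women and men but not for transgender and gender-diverse respondents, is the kind of pattern that is often probed with a model that includes an interaction term. The report does not state how this was tested; the sketch below, using simulated data and hypothetical variable names, shows one standard approach, a logistic regression with a gender-by-sexual-orientation interaction.

# Illustrative sketch only: simulated data and hypothetical variable names.
# The report does not state how the gender-by-orientation interaction was tested.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "gender": rng.choice(["woman", "man", "trans_gd"], size=n, p=[0.49, 0.48, 0.03]),
    "orientation": rng.choice(["heterosexual", "LGB+"], size=n, p=[0.92, 0.08]),
})

# Simulate an outcome whose orientation effect differs by gender,
# echoing the pattern described in the text.
p_harassed = (
    0.10
    + 0.08 * ((df["orientation"] == "LGB+") & (df["gender"] != "trans_gd"))
    + 0.17 * (df["gender"] == "trans_gd")
)
df["networked_harassment"] = rng.binomial(1, p_harassed)

# The interaction term lets the effect of orientation vary across genders.
model = smf.logit("networked_harassment ~ C(gender) * C(orientation)", data=df).fit(disp=False)
print(model.summary())

The coefficients on the interaction terms indicate whether the effect of sexual orientation differs across gender categories; comparing this model against one without the interaction is a common way to test that formally.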
and LGB+ men (21.6 percent) who reported similar proportions. A statistically higher proportion of heterosexual transgender and gender-diverse people (18.3 percent) and men (13.1 percent), who reported similar proportions, reported being threatened than women (10.0 percent).

NCDII

Having personal nude or sexual images of them shared or shown to someone else or posted online without permission was reported by 7.6 percent of all participants. A higher proportion of transgender and gender-diverse people reported having personal nude or sexual images of them shared or shown to someone else or posted online without permission (19.2 percent) than men (8.4 percent) and women (6.7 percent). A higher proportion of LGB+ people reported having personal nude or sexual images of them shared or shown to someone else or posted online without permission (16.6 percent) than heterosexual people (7.0 percent).

Survey Results: Reported Impacts of Online Harms

Participants who had experienced at least one form of online harm were asked to rate what impact online harms had on them personally. Because participants reported experiencing multiple forms of online harm, their responses are not separated into individual types of online harm but reflect their general experience.
Table 1: Impacts of Online Harms

Personal Reputation

Nearly one-quarter of all participants who experienced one of the forms of online harm identified (24.7 percent) reported a very negative impact on their personal reputation; however, there was no significant difference in the negative impact

Freedom to Express Political or Personal Views

Of all the participants who reported experiencing one of the forms of online harm identified, 21.7 percent said that online harms very negatively
impacted their freedom to express their political and personal views. A higher proportion of LGB+ people reported online harms very negatively impacted their freedom to express their political and personal views (25.5 percent) than heterosexual people (19.5 percent). There was no significant difference between genders.

A higher proportion of LGB+ people reported that online harms very negatively impacted their mental health (35.8 percent) than heterosexual people (24.7 percent).

Physical Safety

Among participants who had experienced one of the forms of online harm identified, 19.3 percent reported that online harms very negatively impacted their personal safety. A higher proportion of transgender and gender-diverse people (24.4 percent) and women (20.7 percent) reported that online harms very negatively impacted their personal safety compared to men (16.3 percent). A higher proportion of LGB+ people reported that online harms very negatively impacted their personal safety (24.2 percent) than heterosexual people (17.9 percent). A higher proportion of LGB+ women reported that online harms very negatively impacted their personal safety (27.8 percent) than heterosexual women (19.8 percent). A higher proportion of LGB+ men reported that online harms very negatively impacted their personal safety (22.0 percent) than heterosexual men (15.5 percent). A higher proportion of heterosexual transgender and gender-diverse people reported that online harms very negatively impacted their personal safety (28.9 percent) than heterosexual women (19.8 percent) and men (15.5 percent). LGB+ people
impacted their desire to live (22.9 percent) than heterosexual people (14.1 percent).

Sexual Autonomy and Freedom

Among participants who reported experiencing one of the forms of online harm identified, 16.2 percent stated that online harms very negatively impacted their sexual autonomy and freedom. A higher proportion of transgender and gender-diverse people reported it very negatively impacted their sexual autonomy and freedom (28.4 percent) than women (16.8 percent) and men (14.6 percent). A higher proportion of LGB+ people reported online harms very negatively impacted their sexual autonomy and freedom (25.1 percent) than heterosexual people (14.9 percent). A higher proportion of LGB+ women reported online harms very negatively impacted their sexual autonomy and freedom (21.5 percent) than heterosexual women (16.4 percent). A higher proportion of LGB+ men (26.9 percent) reported online harms very negatively impacted their sexual autonomy and freedom than heterosexual men (13.0 percent). A statistically similar proportion of LGB+ transgender and gender-diverse people reported online harms very negatively impacted their sexual autonomy and freedom (35.4 percent) as heterosexual transgender and gender-diverse people (23.8 percent). A statistically higher proportion of heterosexual transgender and gender-diverse people (23.8 percent) and heterosexual women (16.4 percent), who reported similar proportions, reported that online harms very negatively impacted their sexual autonomy compared to heterosexual men (13.0 percent).

Survey Results: Actions Taken

Respondents who had experienced some form of online harm were asked what actions they took in response. There were relatively few statistically significant differences between the gender and sexual orientation of individuals and the actions they took in response to online harms. As such, the statistics below are inclusive of all participants of the survey, regardless of gender or sexual orientation.107
In response to an incident of online harm, the largest proportions of people blocked or muted someone (51.7 percent), changed their privacy settings (37.6 percent), took a break from social media (26.7 percent), or deleted or deactivated their social media account (25.2 percent).

The next most common responses included people who reported that they changed their contact information (24.2 percent), stopped or reduced posting on a certain platform (23.2 percent), stopped posting about a certain issue (19.8 percent), changed their profile information (18.6 percent), searched for content about themselves online (15.5 percent), or changed their behaviour in a relationship (14.5 percent).

Lower proportions of people acted differently to protect their safety: respondents reported they changed the route they normally walk (14.4 percent); avoided social occasions or events (13.9 percent); replaced their devices with a new one (11.3 percent); stopped participating online altogether (10.8 percent); changed part of their identity, such as how they look or their legal name (8.0 percent); bought something to add to their security (7.1 percent); took time off school or work (6.8 percent); or moved to another address (5.9 percent).

“None of the above” was the response given by 10.5 percent of respondents.

Survey Results: Perceptions of Harmfulness of Online Harms

The survey also asked people about their general perceptions of how big a problem OGBV was for various groups of people in their country. They were also asked to rate how harmful they thought each of the 13 different forms of online harm were. All participants — those who had experienced online harms and those who had not — were asked this question.

107 It should be noted that data on gender identity and sexual orientation was not collected in all countries.
[Callout statistic: close to __ percent of participants reported that OGBV was a very big problem for LGBTQ+ people.108]

108 The percentages of gender in the brackets are lower than the total number because some people did not select their gender identity in the survey, as there was an option of “Prefer not to answer,” and others were not asked based on survey limitations in specific countries.
Table 2: Perceptions of the Harmfulness of Online Harms
people (59.2 percent). A similar proportion of LGB+ women (73.0 percent) and LGB+ men (70.4 percent) reported threats as extremely harmful, which was a higher proportion than LGB+ transgender and gender-diverse people (52.5 percent). A statistically similar proportion of heterosexual transgender and gender-diverse people reported threats as extremely harmful (59.2 percent) as LGB+ transgender and gender-diverse people (52.5 percent). Finally, a higher proportion of heterosexual people reported threats as extremely harmful (74.7 percent) than LGB+ people (70.2 percent).

Blackmail

Being blackmailed online was perceived as extremely harmful by 73.5 percent of all participants. A higher proportion of women reported being blackmailed as extremely harmful (77.8 percent) than men (67.4 percent) and transgender and gender-diverse people (58.3 percent; women > men > transgender and gender-diverse people). A higher proportion of heterosexual women reported being blackmailed as extremely harmful (78.5 percent) than LGB+ women (71.4 percent). A similar proportion of LGB+ men reported being blackmailed as extremely harmful (70.0 percent) as heterosexual men (67.4 percent). A similar proportion of LGB+ transgender and gender-diverse people (61.0 percent) and heterosexual transgender and gender-diverse people (58.3 percent) reported being blackmailed as extremely harmful. A similar proportion of heterosexual people (72.9 percent) and LGB+ people (69.8 percent) reported being blackmailed as extremely harmful. A higher proportion of heterosexual women reported being blackmailed online as extremely harmful (78.5 percent) than heterosexual men (67.4 percent) and heterosexual transgender and gender-diverse people (58.3 percent). A similar proportion of LGB+ women reported being blackmailed online as extremely harmful (71.4 percent) as LGB+ men (70.4 percent) and LGB+ transgender and gender-diverse people (61.0 percent).

Impersonation

Among all participants, 69.5 percent reported being impersonated online as extremely harmful. A higher proportion of women reported being impersonated online as extremely harmful (72.5 percent) than men (65.8 percent) and transgender and gender-diverse people (55.6 percent; women > men > transgender and gender-diverse people). A higher proportion of heterosexual women reported being impersonated online as extremely harmful (73.6 percent) than LGB+ women (61.2 percent). A similar proportion of LGB+ men reported being impersonated online as extremely harmful (65.6 percent) as heterosexual men (66.1 percent). A similar proportion of LGB+ transgender and gender-diverse people reported being impersonated online as extremely harmful (57.0 percent) as heterosexual transgender and gender-diverse people (55.8 percent). A higher proportion of heterosexual people reported being impersonated online as extremely harmful (69.7 percent) than LGB+ people (62.8 percent). A higher proportion of heterosexual women reported being impersonated online as extremely harmful (73.6 percent) than heterosexual men (66.1 percent) and heterosexual transgender and gender-diverse people (55.8 percent).

Networked Harassment

Networked harassment was rated as extremely harmful by 68.1 percent of all participants. Women were more likely to report networked harassment as extremely harmful (74.0 percent) than men (62.0 percent) and transgender and gender-diverse people (58.3 percent), who reported statistically similar proportions. There was no significant difference between LGB+ and heterosexual people.

Unauthorized Access

Sixty-eight percent of all participants perceived unauthorized access to their devices or social media accounts as extremely harmful. A higher proportion of women reported that unauthorized access to their devices or social media accounts was extremely harmful (70.5 percent) than men (62.5 percent) and transgender and gender-diverse people (52.3 percent; women > men > transgender and gender-diverse people). A higher proportion of heterosexual women reported that unauthorized access to their devices or social media accounts was extremely harmful (71.3 percent) than LGB+ women (60.0 percent). A similar proportion of LGB+ men reported that unauthorized access to their devices or social media accounts was extremely harmful (66.5 percent) as heterosexual men (62.4 percent). A statistically similar proportion of LGB+ transgender and gender-diverse people reported that unauthorized access to their devices or social media accounts was extremely harmful (59.5 percent) as heterosexual transgender and gender-diverse
people (50.7 percent). A higher proportion of heterosexual women (71.3 percent) reported that unauthorized access to their devices or social media was extremely harmful than heterosexual men (62.4 percent) and heterosexual transgender and gender-diverse people (50.7 percent). A statistically similar proportion of LGB+ women (60.0 percent) perceived unauthorized access to their devices or social media as extremely harmful as LGB+ men (66.5 percent) and LGB+ transgender and gender-diverse people (59.5 percent). A similar proportion of heterosexual people (66.8 percent) and LGB+ people (63.1 percent) reported that unauthorized access to their devices or social media accounts was extremely harmful.

Monitored, Tracked or Spied On

Among all participants, 66.9 percent considered being monitored, tracked or spied on online extremely harmful. A higher proportion of women reported being monitored, tracked or spied on online as extremely harmful (71.5 percent) than men (60.6 percent) and transgender and gender-diverse people (53.6 percent; women > men > transgender and gender-diverse people). A higher proportion of heterosexual women reported being monitored, tracked or spied on online as extremely harmful (72.4 percent) than LGB+ women (65.6 percent). A similar proportion of LGB+ men reported being monitored, tracked or spied on online as extremely harmful (63.2 percent) as heterosexual men (60.6 percent). A statistically similar proportion of LGB+ transgender and gender-diverse people reported being monitored, tracked or spied on online as extremely harmful (58.5 percent) as heterosexual transgender and gender-diverse people (52.1 percent). A similar proportion of heterosexual people (66.5 percent) and LGB+ people (63.9 percent) reported being monitored, tracked or spied on online as extremely harmful. A higher proportion of heterosexual women reported being monitored, tracked or spied on online as extremely harmful (72.4 percent) than heterosexual men (60.6 percent) and heterosexual transgender and gender-diverse people (52.1 percent). Statistically similar proportions of LGB+ women reported being monitored, tracked or spied on online as extremely harmful (65.6 percent) as LGB+ men (63.2 percent), and LGB+ transgender and gender-diverse people (58.5 percent) reported being monitored, tracked or spied on online as extremely harmful.

Doxing

Having their personal contact information or their address posted online without permission (doxing) was perceived as extremely harmful by 65.4 percent of all participants. A higher proportion of women reported doxing as extremely harmful (70.2 percent) than men (59.9 percent) and transgender and gender-diverse people (53.6 percent), whose proportions were statistically similar. A similar proportion of heterosexual women and LGB+ women reported doxing as extremely harmful. A higher proportion of LGB+ men reported doxing as extremely harmful (65.3 percent) than heterosexual men (59.8 percent). A higher proportion of heterosexual women (70.4 percent) reported doxing as extremely harmful than heterosexual men (59.8 percent) and heterosexual transgender and gender-diverse people (51.7 percent), who reported similar proportions. No difference was found in the proportions of LGB+ transgender and gender-diverse people and heterosexual transgender and gender-diverse people who reported doxing as extremely harmful. Similar proportions of LGB+ women, men and transgender and gender-diverse people reported doxing as extremely harmful. Similar proportions of heterosexual people and LGB+ people reported that doxing was extremely harmful.

Untrue Information

Sixty-five percent of all participants reported having lies posted about them online as extremely harmful. Women were more likely to report having lies posted about them online as extremely harmful (67.9 percent) than men (59.3 percent) and transgender and gender-diverse people (53.0 percent). A higher proportion of heterosexual women reported having lies posted about them online was extremely harmful (68.5 percent) than LGB+ women (58.3 percent). A similar proportion of LGB+ men reported that having lies posted about them online was extremely harmful (60.2 percent) as heterosexual men (59.2 percent). A statistically similar proportion of heterosexual transgender and gender-diverse people reported that having lies posted about them online was extremely harmful (57.2 percent) as LGB+ transgender and gender-diverse people (48.7 percent). A higher proportion of heterosexual people reported that having lies posted about them online was extremely harmful (64.0 percent) than LGB+ people (58.3 percent). A higher proportion of heterosexual women reported
having lies posted about them online as extremely harmful (68.5 percent) than heterosexual men (59.2 percent) and heterosexual transgender and gender-diverse people (57.2 percent), who reported a similar proportion.

Unsolicited Sexual Images

The same proportion (65.0 percent) of all participants reported unwanted sexual images sent to them as extremely harmful. A higher proportion of women reported that receiving unsolicited sexual images was extremely harmful (70.4 percent) than men (54.9 percent) and transgender and gender-diverse people (53.2 percent; women > men > transgender and gender-diverse people). A higher proportion of heterosexual people reported that receiving unsolicited sexual images was extremely harmful (63.5 percent) than LGB+ people (54.6 percent).

Identity-Based Harassment

Of all participants, 64.6 percent reported experiencing harassment online because of their gender, race, sexual orientation, disability, gender expression or other marginalizing factors as extremely harmful. Women were more likely to report that identity-based harassment was extremely harmful (68.4 percent) than men (55.4 percent) and transgender and gender-diverse people (48.9 percent; women > men > transgender and gender-diverse people). There was no significant difference between LGB+ and heterosexual people.

Discrimination

Of all participants, 60.5 percent reported being called discriminatory or derogatory names online as extremely harmful. A higher proportion of women reported being called discriminatory or derogatory names online as extremely harmful (64.9 percent) than men (51.2 percent) and transgender and gender-diverse people (48.1 percent), whose proportions were similar. There was no significant difference between LGB+ people and heterosexual people.

Repeated Unwanted Contact

Half of all participants (49.9 percent) reported repeated unwanted contact as extremely harmful. Women were more likely to report repeated unwanted contact as extremely harmful (55.6 percent) than transgender and gender-diverse people (43.8 percent) and men (43.4 percent), who reported similar proportions. More heterosexual women reported repeated unwanted contact as extremely harmful (56.5 percent) than LGB+ women (46.2 percent). Similar numbers of LGB+ men reported repeated unwanted contact as extremely harmful (42.2 percent) as heterosexual men (43.6 percent). A statistically similar proportion of heterosexual transgender and gender-diverse people reported repeated unwanted contact as extremely harmful (49.0 percent) as LGB+ transgender and gender-diverse people (39.7 percent). Heterosexual people were more likely to report repeated unwanted contact as extremely harmful (50.2 percent) than LGB+ people (43.8 percent).

Survey Results: Young People (Aged 25 and Under)

Close to one-quarter of participants (23.7 percent) were young people (aged 25 years and under; 16–25 years) and 76.3 percent were older adults (over the age of 25; 26–74 years). A higher proportion of young people aged 25 and under reported having personally experienced at least one type of harm listed (68.5 percent) than people over the age of 25 (56.9 percent), and reported that the attack had a very negative impact on their personal life (in all categories other than employment, freedom to express political or personal views, and personal reputation, where there was no difference between the two age categories). A higher proportion of young people reported they had been targeted because of identity factors, including gender identity (27.5 percent versus 23.4 percent), gender expression (10.2 percent versus 7.5 percent), age (17.9 percent versus 11.8 percent) and sexual orientation (8.8 percent versus 6.4 percent), than older people. Similar proportions of younger and older people reported being targeted due to race/ethnicity, religion and disability. A lower proportion of young people rated each individual behaviour as harmful than older adults (in all categories other than non-consensual image sharing, receiving unsolicited sexual images and identity-based harassment, where there was no difference between the two age categories).
Survey Results: High-Profile People

Among all respondents, 12.4 percent can be considered high-profile people (identified as advocate/activist, journalist, social media influencer or politician). A higher proportion of high-profile people (77.2 percent) had personally experienced at least one form of online harm than non-high-profile people (57.2 percent). They were more likely to experience reputation and identity-based harms (60.3 percent versus 34.3 percent),109 coercion and harassment (64.4 percent versus 42.2 percent),110 privacy and security-based harms (54.6 percent versus 31.5 percent),111 and sexual harms (45.3 percent versus 27.1 percent)112 than those who would not be considered high-profile people.

Survey Results: Most Serious Incident

Participants who had experienced at least one form of online harm were asked to consider the most serious incident that they experienced. Because many harms intersect in online attacks (for example, nude photos of someone posted along with derogatory comments, threats and their address, combining several forms of online harm), participants were only asked a generalized question about the most serious incident they experienced.

In their most serious incidents, a higher proportion of transgender and gender-diverse people experienced chronic attacks (25.5 percent) than men (14.3 percent) and women (13.7 percent). A higher proportion of LGB+ people experienced chronic attacks (19.3 percent) than heterosexual people (13.5 percent). Chronic attacks included all events that happened monthly, weekly or daily.

Reason for Being Targeted

Gender Identity

Of those who reported on their gender identity, 50.4 percent identified as women, 47.5 percent identified as men and 2.0 percent identified as transgender and/or gender diverse.

When considering the most serious incident of online harm they had experienced, 24.5 percent of all participants identified that their gender identity was the reason they were targeted. A higher proportion of transgender and gender-diverse people (31.8 percent) and women (29.8 percent), who reported similar proportions, reported their gender identity as the reason they were targeted than men (16.0 percent). A higher proportion of LGB+ people (28.7 percent) reported their gender identity as the reason they were targeted than heterosexual people (23.0 percent).

Gender Expression

When considering the most serious incident of online harm they had experienced, 8.3 percent
reported similar proportions, reported their race/ethnicity as the reason they were targeted than women (9.7 percent).

The original survey included a category for participants to identify their race and ethnicity. However, racial and ethnic data was not collected in most countries by those collecting the data for this report.113 The authors of this report were not made aware of this until after the data collection was complete and, as such, were unable to provide international statistics on racial or ethnic minorities impacted by online harms.

Age

When considering the most serious incident of online harm they had experienced, 13.5 percent of all participants identified that their age was the reason they were targeted. There were no significant differences by sexual orientation or gender. Almost one-quarter of participants were under the age of 25 (23.7 percent); 69.4 percent were between the ages of 24 and 64; and 6.9 percent were over the age of 65.

Sexual Orientation

Of those participants who were asked about their sexual orientation and reported it, 92.0 percent identified as heterosexual, and 8.0 percent identified as LGB+.114

When considering the most serious incident of online harm they had experienced, 7.0 percent of all participants identified that their sexual orientation was the reason they were targeted (8.0 percent of participants who reported their sexual orientation identified as LGB+). A higher proportion of transgender and gender-diverse people (25.7 percent) reported their sexual orientation as the reason they were targeted than men (12.2 percent) and women (7.9 percent). A higher proportion of LGB+ people (42.7 percent) reported their sexual orientation as the reason they were targeted than heterosexual people (6.6 percent). This was true across gender: 53.3 percent of LGB+ men compared to 7.8 percent of heterosexual men; 42.3 percent of LGB+ transgender and gender-diverse people compared to 11.3 percent of heterosexual transgender and gender-diverse people; and 32.6 percent of LGB+ women compared to 5.4 percent of heterosexual women. A higher proportion of heterosexual transgender and gender-diverse people (11.2 percent) and heterosexual men (7.8 percent) reported their sexual orientation as the reason they were targeted than heterosexual women (5.4 percent).

Religion

When considering the most serious incident of online harm they had experienced, 12.1 percent of participants identified that their religion was the reason they were targeted. Higher proportions of transgender and gender-diverse people (14.1 percent) and men (13.9 percent), who reported similar proportions, reported their religion as the reason they were targeted than women (7.7 percent). There were no significant differences by sexual orientation.

113 Data on race and ethnicity was only collected in Canada, Jordan, Saudi Arabia, the UAE and the United States. Ipsos, which conducted the data collection, provided the following statement: “For the race/ethnicity, as mentioned before, for the purposes of development of the demographic questions for each country, standard Ipsos demographic questions used in global studies were referenced as a starting point, and further adaptations were made based on the needs of this survey. At the time of survey development (2020), it was not common to ask race/ethnicity questions in many countries. Therefore, based on the advice of the in-country experts, race/ethnicity questions were only asked in countries where it was not considered sensitive and/or offensive. Over the last year, growing awareness and focus on Diversity, Equity and Inclusion initiatives have meant that the collection of race and ethnicity information has become more common and acceptable than it was a few years ago. As a result, this information can be collected in many more countries than it was acceptable when the survey was developed and fielded (2020–2021).”

114 For safety reasons, participants in Algeria, Jordan, Saudi Arabia, Tunisia and the UAE were not asked about their sexual orientation.
Table 3: Religious Affiliation

Religion (proportion of participants):
Sunni Muslim: 24.5%
Catholic: 24.4%
Protestant or Evangelical: 10.3%
Atheist: 9.4%
Hindu: 5.5%
Another form of Christian: 5.4%
Spiritual but not religious: 5.3%
Agnostic: 4.3%
Another form of Muslim: 0.9%
Jehovah’s Witness: 0.7%
Shi’a Muslim: 0.7%
Jewish: 0.5%
Confucianism: 0.4%
Sikh: 0.2%
Mormon: 0.2%
Prefer not to answer: 3.9%

Disability

When considering the most serious incident of online harm they had experienced, 3.5 percent of all participants identified that their disability was the reason they were targeted (11.0 percent of individuals identified as having a disability). Higher proportions of transgender and gender-diverse people (7.0 percent) and men (5.4 percent), who reported similar proportions, reported their disability as the reason they were targeted than women (2.7 percent). There were no significant differences by sexual orientation.

No Identity Factor

For 37.4 percent of participants, none of the identity factors listed were the reason they thought they were targeted.

Person Causing the Harm

Considering the most serious incident of online harm they had experienced, most people (64.1 percent) reported that the person who targeted them was unknown to them or a distant but identifiable person: the person was someone they had never met (32.1 percent), an anonymous person (27.2 percent), a person whose identity could not be determined (11.1 percent), a random group of people (9.6 percent), a member of an identifiable online group (7.0 percent), or a politician or public authority figure (2.8 percent). The next most common group was people who are known to the person (21.3 percent), but not in a close relationship, such as a co-worker (9.0 percent), another student (8.9 percent), a client/customer (4.7 percent) or a teacher (3.1 percent). Among participants, 4.5 percent said it was another person not listed.

Gender of Person Causing the Harm

Do Not Know the Gender

Of the most serious incidents of online harm reported, 24.8 percent reported that they did not know the gender of the person who targeted them.

Men

Men were, by far, the most common gender of person instigating the most serious incidents of online harm. Of the most serious incidents of online harm reported, 49.7 percent reported that it was a man who targeted them, the highest percentage of all gender categories. Specifically, 57.7 percent of women, 51.6 percent of transgender and gender-
diverse people and 42.9 percent of men reported that it was a man who targeted them.

Women

Of the most serious incidents of online harm reported, 18.9 percent reported that it was a woman who targeted them. Specifically, 23.1 percent of men, 25.8 percent of transgender and gender-diverse people and 18.1 percent of women reported that a woman attacked them.

Other Gender

Of the most serious incidents of online harm reported, 1.1 percent reported that it was a person of a gender other than man or woman who targeted them.115

Experiences with Online Harms in Relation to Gender and Sexual Orientation

TFV is a widespread problem internationally. These data show that most people surveyed reported that they had experienced at least one form of online harm (59.7 percent).

Transgender and gender-diverse people were particularly vulnerable to networked harassment (27.8 percent); having their intimate images shared without consent (19.2 percent); being threatened (28.1 percent); being called discriminatory names or having derogatory cultural terms stated about them (30.6 percent); as well as being targeted because of their gender, race, sexual orientation, disability, gender expression or other marginalizing factor (33.9 percent). They reported close to double the rates of these types of harms compared to men and women in several categories.

The increased visibility of and hostility toward transgender and gender-diverse people have been reported in previous research, and the survey data reflects the heightened experiences of discrimination that these groups face in the digital and physical world (GLAAD 2022). They were also much more likely to be monitored, tracked or spied on online. This matches up with previous research that shows that some individuals and groups online are actively seeking to bring negative attention to members of the LGBTQ+ population that can put them at risk of online and offline harms (Curlew and Monaghan 2019). This data demonstrates the need for supports and education to be specifically aimed at preventing online harms against LGBTQ+ people as they are proportionately the most targeted group.
In terms of the actual impact of online harms on transgender and gender-diverse people, they faced some of the most negative impacts to their mental health (29.8 percent), ability to focus (26.4 percent), physical safety (24.4 percent), desire to live (29.6 percent), employment and business (28.8 percent) and sexual autonomy (28.4 percent), compared to the other gender categories. The impact on their desire to live is particularly concerning because rates of suicide among transgender and gender-diverse people are much higher than in the average population (Virupaksha, Muralidhar and Ramakrishna 2016; Bauer et al. 2013), and this data shows that online harms can affect their desire to live. The increased risks to physical safety are equally concerning. Transgender and gender-diverse people face disproportionately high rates of physical attacks (Ghoshal 2020), and online harms contribute to their already precarious sense of safety in the world. The data on employment and business is relevant as well: many transgender people face barriers in securing employment because of discrimination against them (Trans PULSE 2011; Hébert et al. 2022). Online harms may include doxing or shaming transgender people online in ways that impact their ability to maintain employment and live freely and safely. Discrimination against transgender people is also associated with their ability to find sexual and romantic partners and live with sexual autonomy (Scheim and Bauer 2019; Ashley 2018b). These challenges were reflected in the data, which showed a higher proportion of transgender people’s sexual autonomy being impacted by online harms.

When considering the most serious incident of online harm experienced, transgender and gender-diverse people were the most likely to experience chronic attacks that occurred monthly, weekly or daily (25.5 percent). This trend has been shown in previous research describing organized disinformation campaigns and organized harassment of transgender and gender-diverse individuals (Curlew and Monaghan 2019). These relentless forms of online attacks disrupt the lives of transgender and gender-diverse people, who deserve to be able to exist authentically and safely in digital spaces.

Transgender and gender-diverse people were most likely to report being targeted because of their gender identity (31.8 percent), gender expression (24.0 percent), religion (14.1 percent) or disability (7.0 percent), and were among the most likely groups to report being targeted due to their race or ethnicity (17.0 percent) or sexual orientation (25.7 percent). In nearly all identity factors, it was shown that transgender and gender-diverse people are discriminatorily targeted, negatively affecting their human rights.

LGB+ People

LGB+ people also reported a higher proportion of incidents of online harm (75.8 percent) compared to heterosexual people (57.2 percent). LGB+ people also reported higher rates of online harm in many categories, including threats (25.5 percent); unwanted contact (46.3 percent); blackmail (18.6 percent); unauthorized access to their devices and accounts (32.8 percent); being called discriminatory names or having derogatory cultural terms stated about them (36.6 percent); untrue information being posted about them (29.3 percent); being harassed online because of their gender, race, sexual orientation, disability, gender expression or other marginalizing factors (36.3 percent); and being monitored, tracked or spied on online (18.6 percent). Like transgender and gender-diverse people, LGB+ people continue to be discriminated against globally (eSafety Commissioner 2021), including through disinformation campaigns online (Strand and Svensson 2021).
This data shows that online harm is a contributing factor to the inequality LGB+ people face on a regular basis.

LGB+ people reported some of the most negative effects on their mental health (35.8 percent), their freedom to express political or personal views (25.5 percent), their ability to focus (22.9 percent), their close relationships (22 percent), their physical safety (24.2 percent), their desire to live (22.9 percent) and their sexual autonomy (25.1 percent). The high rates of impacts in these categories are extremely concerning. Online harms undermine LGB+ people's ability to live freely. Their close relationships and sexual autonomy can already be constrained by homophobic views that limit their freedoms, which, when amplified online, compound the negative impacts on LGB+ people. Their physical safety can be at risk because of homophobic laws in some countries and discriminatory social norms held by certain groups (Human Rights Watch 2020; Sallam 2018; Gritten 2022), and this data shows that their physical safety and their desire to live are worsened because of online harms.

LGB+ people (19.3 percent) were more likely than heterosexual people (13.5 percent) to face chronic attacks that happened a few times, monthly, weekly or daily. As organized online attacks against LGB+ people become more common, the online harms against LGB+ people become more relentless and difficult to escape. Efforts to end online harms must focus on the needs of this particular community.

LGB+ people reported high rates of being targeted for their gender identity (28.7 percent), gender expression (17.8 percent) and sexual orientation (42.7 percent) compared to heterosexual people. This reflects the research discussed above showing that a person's marginalized identity factors are often directly linked to the form and substance of online harms, with abusers purposely using discriminatory language related to a person's identity.

Women

Overall, women reported a slightly higher prevalence of any form of online harm (59.9 percent) than men (57.0 percent). In the two most common categories of online harm, repeated unwanted contact and unsolicited sexual images, women reported a significantly higher proportion of incidents (39.4 percent and 28.9 percent, respectively) than men (31.3 percent and 22.8 percent, respectively). Men reported similar or slightly higher proportions of incidents in the other forms of online harm. Among all genders, men were by far the most common perpetrators of online harms, in particular when it was a woman who was targeted.

Despite having similar numbers of incidents of online harms in many categories, women reported higher rates of negative impacts in almost all categories compared to men. They reported higher levels of impact to mental health (29.4 percent versus 21.8 percent for men), ability to engage freely online (22.9 percent versus 18.6 percent), ability to focus (19.8 percent versus 16.3 percent), physical
safety (20.7 percent versus 16.3 percent), desire to live (15.8 percent versus 13.6 percent) and sexual autonomy (16.8 percent versus 14.6 percent). Men only ranked higher than women in the negative impact on their employment and business (17.5 percent for men versus 15.9 percent for women, although this difference was not statistically significant). This reflects previous studies on online harassment and intimate image sharing, which show that while men may report similar or higher levels of TFV, the impact on women is worse (Vogels 2021; Henry et al. 2020). This shows the gendered inequality women face, as they experience increased harms when targeted by TFV. TFGBV is a serious concern for women, who increasingly feel unwelcome in digital spaces due to increased sexism and violent threats against them. This compounds the discrimination they experience in the physical world, amplifying discriminatory norms and making them feel increasingly unsafe.

Women were much more likely to report being targeted because of their gender identity (29.8 percent) than men (16.0 percent). They reported lower numbers in other categories such as race, religion and disability than men, which could suggest that many women believe they are primarily targeted because of their gender identity, even though aspects of a woman's identity such as her race and sexual orientation have been shown to be intersecting factors in why a woman might be attacked online.

Identity-Based Discrimination and Intersectionality

Regardless of gender, most participants reported at least one identity factor as the reason that they were targeted with the most serious form of online harm they experienced. Only 37.4 percent said that no identity factor was the reason that they were targeted. Gender identity, gender expression, race, ethnicity, age, sexual orientation, religion and disability were listed as reasons that people were attacked in the majority of the most serious incidents of online harm. Factors that have been recognized to relate to human rights abuses, such as attacks on people because of their marginalized identity factors, including gender, sexual orientation, race, religion and other factors, are directly linked to the majority of online harms.

Public Perception

A public perception that women and LGBTQ+ people have more negative experiences with online harms was apparent in the data. When asked about their perceptions of OGBV specifically, respondents were much more likely to report that it was a problem for LGBTQ+ people (46.5 percent) and women (44.3 percent) than for men (22.7 percent). Around twice as many people thought that the current online atmosphere was more negative for women and LGBTQ+ people. As witnesses to OGBV, the public reports a higher negative impact for groups marginalized by their gender and sexual orientation.

When asked about their general perceptions of online harms, there was a difference among men, women and LGBTQ+ people. Despite reporting similar or higher proportions of incidents of online harm in many categories, men consistently rated almost all forms of online harm as less harmful than women did, which reflects much of the research discussed above showing that women are more negatively impacted by TFV than men. Surprisingly, transgender and gender-diverse people rated most forms of online harm as less harmful than men and women did, even though as individuals they reported more incidents of harm and more serious impacts than most other groups. This unusual contrast may be due to a normalizing effect, where people who experience online harms more regularly may start to downplay their overall effects because the experience is so common and because they receive so little public support for the harms they experience. The data from this survey shows a similar pattern with young people, who experience higher proportions of online harms and are personally more negatively impacted by most categories of online harms, yet also rate them as less harmful than older populations do. Some consistency of this pattern was also found with
LGB+ people, where a similar or smaller proportion of LGB+ respondents rated some forms of online harm as less harmful compared to heterosexual respondents, despite experiencing a higher prevalence of online harm in several categories. The harms faced by transgender, gender-diverse, LGB+ and young people may be downplayed by the larger society in ways that shape their general conceptions of these harms. This potential normalization of TFV among the groups that are most impacted is a disturbing trend.

Perpetration

Of the most serious incidents of online harm reported, 24.8 percent of respondents did not know the gender of the person who targeted them. Among those who could identify the gender of their perpetrator, however, gender figured prominently. Men stood out as the gender most often identified as causing the most harmful incidents of online harm. Almost half of people (49.7 percent) reported that men caused the most serious incident of online harm they experienced, the highest percentage in all gender categories. The gender of the target also showed a gendered pattern, with women (57.7 percent) and transgender and gender-diverse people (51.6 percent) reporting being targeted by men at higher numbers than men (42.9 percent), who still reported that men were the primary perpetrators of the most serious incident of online harm they experienced.

Women were much less likely to be the person causing the most serious harm. Among people who had experienced some form of online harm, 18.9 percent reported that a woman had been the perpetrator. Men (23.1 percent) and transgender and gender-diverse people (25.8 percent) were more likely to report that a woman had targeted them compared to women (18.1 percent).

Few transgender and gender-diverse people were reported as perpetrators. Only 1.1 percent of all participants who had experienced online harms reported that a person of an "other" gender targeted them.

Responding to TFGBV requires not only providing supports to those who are victims/survivors of TFGBV, but also changing the behaviour of those perpetrating the harms. The data from this survey demonstrates that men and boys are responsible for a significantly higher percentage of harms compared to trans and cis women and gender-diverse people. As such, supports to respond to TFGBV must include efforts to change the behaviour of men and boys online.
Section II
Supports and
Resources
The following section discusses various existing supports and
resources available to victims/survivors of TFV, as well as where
there are gaps and barriers in finding support. It then summarizes
survey participants’ perceptions of and experiences with accessing
these supports and resources.
Introduction: Supports and Resources

As the prevalence of TFV increases, victims/survivors need support and resources to help address and prevent the abuse they are facing. There is also a need for resources that address this issue systemically in order to eradicate it. These supports and resources can come in the form of content moderation on and by social media platforms, educational resources on TFV, technical solutions, governmental and non-governmental victim/survivor support programs, research, and robust and evidence-based laws and policies. Supports and resources should help a victim/survivor when they have been a target of TFV but should also be preventive in nature. Research and education can help shape the social norms of what is and is not appropriate behaviour in digital spaces and address the underlying discriminatory beliefs that fuel TFV.

To date, many victims/survivors of TFV report struggling to find adequate support when they are harmed by TFV. From a legal perspective, depending on the country a person lives in, there will be varying levels of criminal or civil laws that are applicable to TFV. However, even when relevant laws are in place, victims/survivors may face barriers in accessing those legal remedies, due to systemic bias and failures within the legal system, as well as challenges with affordability and other access to justice issues. Content moderation can be a helpful and time-sensitive tool for getting harmful content taken off websites and managing TFV, but how each company's terms of service are applied to complaints can be confusing, unclear and inconsistent. The types of behaviour and content that are forbidden on a platform can vary widely. Additionally, the rules might not be available in all languages and the policies might not be culturally relevant to people in the Global South.

To date, no social media company has come up with a sufficient system to address TFV and most do not provide adequate resources to tackle this issue, leaving many victims/survivors without redress. Governmental and non-governmental digital rights or victim service organizations have proven useful to victims/survivors in some cases, but they are few and far between in most countries. Education and support tools can also be a practical resource for people to learn about topics such as privacy, online safety, digital etiquette and what actions are available for people to respond to and prevent TFV, but there are relatively few governments and organizations directly providing this type of information in an accessible format. Finally, research can help identify trends and practices related to TFV and determine what actions are best suited to prevent and address TFV. In the past few years, there has been a great deal of research conducted on this subject, but more is needed, particularly in the Global South.

Background: Supports and Resources

Government Support

Governments must support efforts to end TFV. In some countries, governments are taking steps toward addressing this issue; however, in others, governments are working actively against the rights of women and LGBTQ+ people and are not taking TFV seriously. As noted above, some governments and leaders are even engaging in TFV themselves. In many countries, laws have been used to suppress women's and LGBTQ+ people's legitimate digital interactions, including their advocacy for human rights and sexual expression. For example, at the time this report was written, as Iranians protested for women's rights, the government implemented strict internet controls limiting protesters' abilities to communicate with each other and spread their message to the world (Green 2022). In several countries, obscenity and decency laws have been used to penalize women's online sexual expression (Global Information Society Watch 2017) or advocacy for sexual and reproductive rights (Palumbo and Sienra 2017). Governments must not limit women's and LGBTQ+ people's legitimate sexual expression and advocacy. Instead, they should be developing — and funding — human rights-based research and supports to end TFV.

Several countries have developed government supports for victims/survivors of TFV. Some countries have even created statutorily empowered bodies whose function is to address TFV. Others have provided government support for programs such as helplines for victims/survivors of TFV. Governments need to continue developing human rights-based supports and providing resources to those organizations and researchers that can best support victims/survivors of TFV.
Research by Pam Hrick (2021, 595) has shown that statutorily empowered bodies "have the potential to meaningfully further a survivor-centered approach to combating technology-facilitated violence against women — one that places their experiences, rights, wishes, and needs at its core." Hrick reviewed the work of the Australian eSafety Commissioner, New Zealand's Netsafe, and two Canadian bodies, the CyberScan unit in Nova Scotia and the Canadian Centre for Child Protection in Manitoba. She found that these bodies, while not perfect, provide victims/survivors with a variety of legal and non-legal options to address TFV. Hrick noted that these bodies demonstrate a commitment from governments that they are trying to take TFV seriously.

In Australia, the Office of the eSafety Commissioner provides direct supports to survivors/victims of TFV in getting content removed from the internet.116 It also conducts research; develops educational materials; and engages with social media, messaging, gaming and app services, and websites, to ensure those companies are working to keep Australians safe online. In New Zealand, Netsafe investigates complaints, provides mediation, liaises with social media companies to request the removal of harmful content and develops educational tools to inform New Zealanders about online safety.117 In the Canadian province of Nova Scotia, CyberScan is mandated to provide dispute-resolution services for victims/survivors, information on legal rights and education on TFV.118 Research by Alexa Dodge (2021) found that most people who use this service are interested in the non-legal technical and emotional supports, and are often able to resolve their issue without engaging in the legal system; however, some do seek legal information supports from CyberScan. In Manitoba, the Canadian Centre for Child Protection provides supports to people who have had their intimate images shared without consent.119

Other governments have provided supports to organizations assisting victims of TFV. In the United Kingdom, the government helps fund the Revenge Porn Helpline.120 Adults who have had their intimate images shared online without consent can call this helpline for assistance in getting the images removed.121 In South Korea, the Ministry for Gender Equality funds the Centre for Online Sexual Abuse (McGlynn, n.d.). In India, a women's helpline is available for women to make complaints, including those related to TFGBV (Kovacs 2017).

These types of supports are vital to victims/survivors of TFV, who deserve to have immediate and accessible government-backed help with legal and non-legal options to respond to their experiences. Governments play a key role in funding and supporting independent and civil society organizations and initiatives that provide human rights-based services and conduct research on TFV, in particular for equity-seeking groups.

International initiatives can also help curb TFV globally. In recent years, several international partnerships have been created to work toward a better understanding of TFGBV and to strategize how best to tackle the issue. The Global Partnership for Action on Gender-Based Online Harassment and Abuse was launched in 2022 (Crockett and Vogelstein 2022). The partnership includes Australia, Canada, Chile, Denmark, Iceland, Kenya, Mexico, New Zealand, South Korea, Sweden, the United Kingdom and the United States (Global Partnership for Action on Gender-Based Online Harassment and Abuse, forthcoming 2023). It "will bring together countries, international organizations, civil society, and the private sector to better prioritize, understand, prevent, and address the growing scourge of technology-facilitated gender-based violence" (US Department of State 2022). A Global Online Safety Regulators Network was established "with the aim of making sure the approach to online safety between countries is as consistent and coherent as possible."122 This network includes representatives from Australia, Fiji, Ireland and the United Kingdom. Global partnerships like these have the potential to be beneficial as they can share existing knowledge and help develop and advocate for laws and policies to address TFV.
Legal Responses

State recognition of the harms caused by TFV plays both an expressive (Citron 2009) and practical role (Franks 2015) in addressing TFV. When governments develop laws that prohibit TFV, it signals to the public the state's condemnation of these types of behaviour, and it also provides a legal avenue for victims/survivors to seek a remedy from the state.

Certain forms of TFV, such as NCDII, may require the creation of new laws, but in many jurisdictions, existing laws can already be applied to many forms of TFV (European Institute for Gender Equality 2022). TFV can be a new manifestation of harms that are already recognized by the state, and those laws should apply regardless of whether the harm occurred in a digital or physical space. As noted by Jane Bailey and Carissima Mathen (2019), in the Canadian context, existing criminal laws, such as harassment and extortion, can apply to forms of TFV, as well as specific laws such as criminal voyeurism or NCDII. Suzie Dunn and Alessia Petricone-Westwood (2018) found a similar trend in civil responses in Canada, where many existing civil laws could be applied to TFV, but additional civil statutes and torts that directly address TFV were also beneficial. However, TFV-specific laws are lacking in many countries (Machirori 2017) and there is significant under-reporting of these harms to legal authorities even when there are laws in place (Malanga 2021; Nwaodike and Naidoo 2020). Catherine Muya's (2021) research on TFGBV in Kenya found that the laws in that country needed to be revised to properly address TFGBV and that many women were left with no legal remedy due to the lack of legislation on the issue.

Several countries have created laws to address NCDII (Kamran and Ahmad 2021). In 2018, Natália Neris, Juliana Pacetta Ruiz and Mariana Giorgetti Valente conducted a comparative analysis of countries that have introduced such laws. At that time, they found that 11 countries had specific NCDII laws123 and 21 had general laws, such as laws against harassment, gender-based violence and domestic violence, that could apply,124 and several had bills and public policies in place.125 In 2009, the Philippines was one of the first countries to criminalize NCDII. Neris, Ruiz and Valente (2018) noted that of the countries that introduced NCDII laws, most had introduced criminal laws, but some did have civil laws to address NCDII. Research by Aikenhead (2018) on Canadian criminal cases involving NCDII and by Bailey and Mathen (2019) on those involving other forms of TFGBV shows that there is a significant gendered trend in these cases, with most victims being women and most offenders being men.

123 Australia, Canada, France, Israel, Japan, New Zealand, the Philippines, Scotland, Spain, the United Kingdom and the United States.

A study by Neris, Ruiz and Valente (2018) found that most NCDII laws did not address whether internet intermediaries could be held liable for the role they played in the dissemination of the images or require them to take action to get content removed. Only a few jurisdictions, such as Australia, included potential fines for companies that refused to remove content.

Even in countries that have developed laws to address TFV, many people report that there are barriers to getting an adequate legal remedy and believe that the legal system is failing women and LGBTQ+ victims/survivors of TFV. Law enforcement
officers have minimized gender-based violence (Mahmutović, vale and Laçí 2021) or approached cases in a "patriarchal-protectionist way" (Devika 2019, 12). Further, harms experienced in digital spaces may not be taken as seriously as physical violence by some police officers (Gurumurthy and Vasudevan 2018; Mahmutović, vale and Laçí 2021). Under-reporting of these harms was also common. For example, women in Ethiopia, Kenya, Senegal, South Africa and Uganda often did not report TFGBV to law enforcement, and when they did, some of their complaints were trivialized by law enforcement (Nwaodike and Naidoo 2020). Gender discrimination that minimizes violence against women and blames the victim was a common trend for people reporting TFGBV to police globally (Nguyen and Barr 2020; Dodge et al. 2019; Devika 2019; Sequera 2021). A lack of training (Segal 2015) and discriminatory responses by police (Powell and Henry 2016) were some of the reasons that some research found that law enforcement was failing victims of TFGBV (Machirori 2017). Some actors in the legal system may be lacking the skills needed to understand and properly address TFV and require additional training (Dunn and Aikenhead 2022). In the civil context, seeking a civil remedy may be unaffordable for some and the response may be too slow to provide an adequate remedy (Young and Laidlaw 2020; Nwaodike and Naidoo 2020). Further, some victims/survivors may be reluctant to report because of potential negative social consequences related to patriarchal and sexist norms in their community (Malanga 2020).

Additionally, in some countries, existing laws that regulate sexual expression, identity or orientation can actually hinder some victims'/survivors' ability to access justice, express their sexuality or engage in activism. In many countries, LGBTQ+ people are at risk because same-sex relationships are criminalized and gendered dress codes are enforced.126 In countries such as Japan, Malawi and Uganda, obscenity and anti-pornography laws criminalize some forms of sexual imagery, which, according to Neris, Ruiz and Valente (2018, 41), "raise questions about the risk of increasing the vulnerability of victims that may end-up being punished instead of protected." Sarai Chisala-Tempelhoff and Monica Twesiime Kirya (2016) report that anti-pornography laws and anti-obscenity laws are in place in Uganda and Malawi to regulate sexuality and control women's bodies and sexual expression. Some victims/survivors of NCDII have been charged under these laws when their images were shared without consent. In Canada, the United Kingdom and the United States, young people have been warned, and some have been criminally charged, with taking or sharing intimate images of themselves as a form of child sexual abuse material production, regardless of whether it was consensually made and shared or not, for example, when an older adolescent takes a nude photo of themselves and the image is never used in an exploitative manner (Hasinoff 2014; Karaian 2013; Miles 2020; Dodge 2021). These cases show that some existing laws have the potential to be used against people who have been victimized by TFV or can criminalize legitimate sexual expression.

126 See www.ohchr.org/en/sexual-orientation-and-gender-identity/about-lgbti-people-and-human-rights.

Another important legal issue to address is anonymity and privacy (Hernández 2017). At times, it can seem that there is a tension between protecting victims/survivors of TFV and protecting anonymity and privacy. However, anonymity and privacy are important factors in keeping people safe from TFV. Many women, LGBTQ+ people and human rights defenders communicate online using anonymity and encryption to protect themselves from harassment and abuse (Yahaya and Iyer 2022; Hernández 2017). As such, these are important aspects of digital communication to protect. Additionally, in legal cases involving private sexual content, victims/survivors need anonymity to bring their cases forward (McGlynn 2016). If there is a possibility that their name and the associated content could be shared publicly, victims/survivors may be reluctant to report out of fear that the content and details will be viewed and spread further.

Anonymity can create challenges for victims/survivors of TFV. Some victims/survivors may not know the identity of the person who has targeted them and may need support from law enforcement to find out (Dunn and Aikenhead 2022). However, when governments create laws that assist them in unveiling a person's identity in digital spaces, they must take into account the fact that anonymity and privacy are important factors related to online safety and freedom (Treuthart 2019). Any government powers that impact privacy and anonymity should be legally justified, limited and narrow. Human rights such as privacy must be taken into consideration. TFGBV should not
be co-opted as a reason to create overly broad government powers that can unjustly infringe on privacy and freedom of expression (Access Now 2021). As noted by Citizen Lab and the Canadian Internet Policy and Public Interest Clinic, any laws created that impact these issues must utilize a human rights-based approach to fairly balance people's right to privacy and expression with government interests such as public safety and national security (Gill, Israel and Parsons 2018).

Technology Companies

Technology companies play an essential role in preventing and responding to TFV. Their products and services are the very platforms and devices that host and facilitate TFV. The level of commitment these companies have to addressing TFV determines the safety and well-being of billions of people worldwide. At this time, many people question technology companies' commitment to properly addressing TFGBV and TFV against LGBTQ+ people and other equity-seeking groups. Many of these companies' track records are questionable at best. For example, at the time this report was written, Elon Musk, the current owner of Twitter, had recently allowed several banned misogynist, racist, violent and transphobic users back onto the platform in the name of free speech (Milmo 2022) and dissolved Twitter's Trust & Safety Council (Mehta 2022). Although this is a starker example of problematic choices by a social media company, most technology companies' corporate motives are geared toward encouraging user engagement to increase profits, and there is less incentive to provide robust content moderation and safe products, which can cut into their profit margins (Zuboff 2019; Goldberg 2019). Additionally, Pollicy has noted that social media companies' content moderation practices prioritize Western values and can be biased against racialized people, including in the African context (Iyer et al. 2021).

As noted by Bailey et al. (2021), social media companies also engage in structural violence in which AI content sorting and algorithmic profiling reinforce existing stereotypes about equity-seeking groups, provide biased and discriminatory outcomes, and create disparate access to information on their platforms. Algorithms can cause additional harms when they prioritize and serve up content on tech platforms that promotes extremist sexist, racist and homophobic content (Ribeiro et al. 2020). It is essential that technology companies commit to ending TFV on their platforms and devices by committing resources and developing best practices.

Most social media companies do have content moderation rules and terms of service that prohibit various forms of TFV and have tools for preventing abuse (Khoo 2021). In her book The Fight for Privacy, Danielle Keats Citron discusses some of the positive advancements social media companies have made, in part due to pressure from and engagement with academics, law makers and anti-violence advocates, while recognizing that there is a long way to go (Citron 2022). Citron describes the early work done by the Cyber Civil Rights Initiative and the US Cyber Exploitation Task Force, which advised law makers and large technology companies such as Google, Twitter, Meta (Facebook at the time) and Tumblr to improve their policies and laws. The work of these types of groups has played an important role in encouraging social media platforms to improve their policies and practices. For example, the Revenge Porn Helpline has worked with companies, including Meta, to promote tools that help prevent the spread of intimate images,127 while the dating app Bumble has developed a blurred image feature that uses AI to detect sexual images, in response to complaints about unsolicited nude images on its app, allowing users to decide whether or not to view the images.128 Many victims/survivors of TFV use technical tools on these sites, including blocking and reporting harmful content, to prevent future abuse (Iyer, Nyamwire and Nabulega 2020; Kovacs, Padte and SV 2013).

Although content moderation rules exist on most popular social media sites, many victims/survivors of TFV remain dissatisfied with these companies' overall responses to TFV (Dhrodia 2018; Ruiz, Valente and Neris 2019). Internet Sans Frontières (2019) found a low level of reporting (15 percent) of TFGBV to social media companies among women in West and Central Africa, which suggests that individuals may not believe that these organizations will adequately respond to complaints, or that companies have not made
their users aware of their content moderation systems. APC conducted a report on improving corporate policies to end TFGBV (Athar 2015). It found that while companies did have some policies in place to address TFGBV, there was "little to no public information...available about how internal review processes work" (ibid., 20) and a great deal of harmful content remained online due to inconsistent policies and application of those policies. There were also barriers for some people who could not find help-seeking information in their language.

The LEAF report Deplatforming Misogyny (Khoo 2021) outlined many of the challenges and barriers women and LGBTQ+ people faced when reporting to social media companies. Khoo found that content moderation policies could be opaque and inconsistently applied. Social media companies were more likely to address harmful content when there were negative media reports about the content that drew attention to it, rather than addressing it consistently. LEAF noted that the business models of these companies focus on user engagement, regardless of whether that engagement is positive or negative, thus disincentivizing those companies from removing content that engages users, even if it may be harmful. LEAF proposed an equality-focused and human rights-based approach to regulating social media companies' content moderation practices that would better protect victims/survivors of TFGBV. Many other organizations, such as the Internet Democracy Project (Bhandari and Kovacs 2021), APC (Athar 2015) and IT for Change (Gurumurthy and Dasarathy 2022), have called for improvements in platform governance to address TFGBV. As noted by Suzor et al. (2019), social media companies have a responsibility to prevent TFGBV on their platforms.

Civil Society Organizations

As mentioned in the section on the expert advisory committee, civil society organizations have been at the forefront of this issue for nearly two decades. Much of their research and advocacy was what brought this issue to the attention of governments and the public. Organizations such as APC, Derechos Digitales, the Internet Democracy Project, the Digital Rights Foundation, Amnesty International, the Cyber Civil Rights Initiative, GLAAD, the National Network to End Domestic Violence and the BC Society of Transition Houses have been conducting research, developing information and education on TFV, and providing reports and resources to victims/survivors for years. These organizations are essential in the ecosystem of stakeholders who are committed to ending TFV. However, there are relatively few civil society organizations doing this work and many of them are underfunded.

Civil society organizations are often the first place that victims/survivors find information about their rights and have their experiences validated as a violation of their rights. For example, the Digital Rights Foundation provides a helpline in Pakistan that people can call if they are harassed online.129 These organizations assist with formal and informal responses to TFV, advise governments and social media companies on how to improve their laws and policies and, most importantly, feature the voices of victims/survivors of TFV.130 The value of their work cannot be overstated. Institutions like these that centre gender equality, the rights of LGBTQ+ people and human rights have filled the gaps in countries where legal or governmental supports are lacking (Muya 2021). However, they often work with limited funding and supports. Additionally, traditional anti-violence organizations have had to quickly catch up to the novel issues that their clients experience when their abusers use technology to harm them (National Network to End Domestic Violence 2014; AWARE 2020). These organizations should be supported and adequately funded by governments so that they can continue doing their essential work and provide non-legal avenues for victims/survivors.

Research

As noted by the International Center for Research on Women, the concept of TFGBV is still being developed and understood (Hinson et al. 2018). There is a growing number of researchers working in this area in academic and civil society circles. There is a need for additional data collection and analysis on this subject for this issue to be better understood, in particular information
gathered from people who have experienced TFV (Global Partnership for Action on Gender-Based Harassment and Abuse, forthcoming 2023). APC stated, "more systematic documentation of [TFGBV], including in-depth case studies, is necessary to identify effective remedies and new policies" (Fascendini and Fialová 2011, 54), including consultations with organizations that do work on TFGBV. It further noted that, "particular attention should be given to women marginalised due to race, sexual orientation, intellectual and physical abilities, age and socio-economic factors such as geographical location, level of education, employment situations, and marital status" (ibid.). Additionally, policies and practices aimed at ending TFV should be evidence based. As noted by Hrick (2021, 599), any actions to address TFV should be "informed by research, evidence and the perspectives of survivors."

Education

Education initiatives can inform people of their rights to safety and privacy online (Malanga 2020). In a multi-country study in Asia, UN Women found that there was a lack of digital literacy among women that impacted their safety online (Aziz 2020). Education campaigns can also be used to educate legal actors on best practices in addressing TFV, as many law enforcement officers lack training and understanding on TFV (Shariff and Eltis 2017, 110; Dunn and Aikenhead 2022). Research by the Centre for Development Studies in India found that research on TFGBV is limited and more reliable country-specific data is needed to better inform individuals and law enforcement (Devika 2019). Additionally, education can play a preventive role by working to change people's behaviour online, including that of perpetrators. This can be done by providing information on what healthy digital interactions should look like and challenging the root causes of TFV, such as sexism, homophobia, transphobia, racism, ableism and colonialism.
Survey Results: Evaluation of Supports and Resources

The following section examines participants' opinions on supports and resources. Participants were asked how important certain resources and supports were in addressing OGBV, on a five-point scale with 5 being "very important" and 1 being "not important at all."

The following information includes those who rated these categories as "very important." As such, those who listed these resources and supports as moderately important or not important at all are not included in these numbers. Nearly half or more reported each category as a very important resource or support.

When participants were asked what resources would be most helpful to address OGBV, they identified tools for awareness: 57.5 percent reported education campaigns in schools as very important, 57.4 percent reported information on how to protect themselves online as very important, and 52.9 percent reported public education campaigns as very important.

When asked about legal and policy resources, 60.2 percent reported laws as very important, 54.4 percent reported police as very important, and 53.6 percent reported government support as very important.

When asked about tools for support, 53.1 percent reported technical supports for internet security as very important, and 52.1 percent reported helplines as very important.

When asked about non-governmental resources, 50.1 percent reported content moderation by social media companies as very important, 51.2 percent reported OGBV organizations as very important, and 45.2 percent reported civil society organizations as very important.

Survey Results: Effectiveness of Supports and Resources

Participants were also asked to rate which resources and supports they generally considered effective in responding to OGBV, on a five-point scale with 5 being "very effective" and 1 being "very ineffective." The following information includes those who rated these categories as "very effective," as a share of participants who provided a rating.

Among tools for awareness, 38.1 percent rated information on how to protect themselves online as very effective, 35.0 percent rated education campaigns in schools as very effective, and 31.2 percent rated public education campaigns as very effective.

Among tools for support, 35.1 percent rated internet security as very effective, and 31.8 percent rated helplines as very effective.

Among legal and policy resources, 35.5 percent reported laws as very effective, 32.9 percent reported police as very effective, and 30.4 percent reported government support as very effective.

Among non-governmental resources, 29.6 percent reported content moderation by social media companies as very effective, 31.5 percent reported OGBV organizations as very effective, and 26.5 percent reported civil society organizations as very effective.
Survey Results: Who Has Responsibility to Act?

When asked to rate which organizations have the most responsibility to address OGBV, respondents were most likely to rate police (23 percent) as having the highest responsibility, followed by governments (19.4 percent), law and policy makers (17.8 percent), social media companies (15.2 percent), schools and universities (9.6 percent), other internet users and community members (9.8 percent), and civil society organizations (5.2 percent).

Survey Results: Effectiveness of Resources

Participants were asked to rate how effective the person or organization they turned to was in helping them with their most serious incident, on a four-point scale with 4 being "very effective" and 1 being "completely ineffective."

Close relationships were most likely to be considered very effective: nearly half of the respondents rated spouses (49.7 percent), family members (48.3 percent) and friends (41.0 percent) as very effective.
Summary of Survey Results: Supports and Resources

...and managed the incident alone. Victims/survivors of online harms should not have to suffer alone and should be able to seek help from members of their...

...that require an immediate response, as they risk being amplified and spread online over time prior to a formal remedy.

When rating the effectiveness of these supports and resources, participants rated their informal support systems as most effective. Spouses/partners were the most commonly rated as very effective (49.7 percent), followed by family members (48.3 percent) and friends (41.0 percent). Although community-based supports were among the least commonly accessed supports or resources, they were rated as very effective more often than formal mechanisms such as the police (28.8 percent) or social media companies (22.4 percent). Community-based resources such as doctors and mental health workers (39.6 percent) and victim support organizations (36.7 percent) were more commonly rated as very effective. This data shows that, in practice, community-based organizations are providing proportionately more effective resources and supports. More resources should be provided to these types of organizations, as there are relatively few organizations that provide direct services for victims/survivors of online harms. Additionally, it suggests that there is a need for improved law enforcement practices when addressing online harms.

Conclusion

It is time to take action on TFV, particularly forms that negatively impact equity-seeking groups. The individual and systemic harms caused by TFV perpetuate unacceptable discrimination and must be addressed. The results of this study indicate that online harms are widespread, that effective supports are lacking and that a multi-stakeholder effort is needed to end TFV. Over the past two decades, TFV has only increased and, although some efforts have been made to curtail these types of harmful behaviour, currently there are inadequate resources dedicated to understanding and preventing this new form of violence. More support is needed across the world, but in particular for women and LGBTQ+ people in the Global South.

The research and data from this report predominantly represent the experiences of those in the Global South, and they demonstrate that LGBTQ+ people and women are at significant risk of experiencing online harms and being negatively affected by them. The data highlights the particularly high levels of perpetration of online harms against transgender and gender non-conforming, agender, non-binary and other gender-marginalized people who, in many countries, risk their safety and well-being when expressing themselves authentically in digital spaces. It shows that the negative impacts of online harms are most strongly felt by LGBTQ+ people and women, and that the wider community sees OGBV as a much more serious issue for these groups compared to men. As such, particular attention and resources need to be directed at these groups. Finally, it lays bare the high proportion of men who engage in this harmful behaviour, highlighting the fact that men have an essential role to play in changing their behaviour to make digital spaces safer.

The authors hope that the research, data and recommendations provided will assist in the development of a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach to social, policy, educational and technical changes. However, this data does not reveal a new story. It tells the story that civil society organizations, researchers and advocates internationally — but in particular in the Global South — have been alerting the world to for over 20 years. It is a reminder that not enough has been done and that things must change to ensure that all people have access to safe digital spaces. Without action on this issue, women and LGBTQ+ people will not be able to participate equally, safely and authentically in our increasingly digital world.
Recommendations
The following recommendations are applicable to addressing TFV
in general. However, significant attention and resources must be
directed at ending TFV against equity-seeking groups who face
disproportionate harms from TFV, such as members of the LGBTQ+
community, women, and racialized, disabled and young people. This
includes journalists, human rights defenders and politicians from those groups and those who advocate for equality and human rights. These groups
face additional barriers due to the systemic oppression they face in
society and the fact that their interests are often neglected or under-
represented by those in power.
Additionally, specific recommendations are made for governments, technology companies, civil society organizations and researchers, think tanks and academics. Some of these stakeholders may already be engaging in the actions recommended, while some may have a way to go. Although each of these groups has a specific role to play and each has drastically different levels of influence, responses to TFV must apply an ecosystem approach, where all actors are working together toward the common goal of eradicating TFV. Those with more resources and power, such as governments and technology companies, must commit to investing in change and meaningfully engage with other stakeholders, such as civil society organizations, researchers and academics, when working to end TFV. For example, governments should fund the work of civil society organizations and researchers whose work addresses TFV. Governments should meaningfully consult and collaborate with civil society organizations, researchers, and victims/survivors when developing and implementing policies, regulations and laws. Social media companies should do the same when developing policies and content moderation practices.

With each of these recommendations, it is essential that any efforts made take a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach.

Governments

Human Rights-Based Approach

1. Take a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach when addressing TFV through laws, policies and resource distribution.

2. Engage with specialists in TFV, including civil society organizations, victims/survivors and academics/researchers, to ensure the approaches and remedies governments propose fully address the real needs of those who have been harmed by TFV, especially those from equity-seeking groups.

3. Take a clear public stance against TFV, in particular against forms that are disproportionately harmful to equity-seeking groups, such as women, girls, LGBTQ+ people, people with disabilities, Indigenous people and members of racial, ethnic and religious groups who face discrimination.

4. Ensure concepts of freedom of expression, sexual autonomy and privacy rights use a human rights-based approach. Take into consideration the silencing effect of TFV and the rights of equity-seeking groups to express themselves safely and authentically in digital spaces.

5. Address how the silencing effect of TFV undermines freedom of expression.

Collaboration and Consultation

6. Meaningfully and regularly consult and collaborate with civil society organizations, researchers, academics and legal practitioners with expertise in TFV, as well as victims/survivors, when developing laws, policies and programs related to TFV.

7. Include the perspectives and voices of diverse members of equity-seeking groups in all consultations and collaborations.

Legal and Policy Responses

8. Review existing laws that could apply to TFV to ensure that the existing structure of those laws is able to capture TFV.

9. Avoid an overreliance on criminal law solutions and ensure that there are non-criminal legal, governmental and non-governmental options available to victims/survivors, such as civil laws, privacy/data protection laws, human rights laws, administrative solutions and/or community-based solutions, that address TFV.

10. Ensure that laws do not unjustly restrict sexual expression, human rights advocacy and criticism of governments and institutions.

11. Review existing laws, such as morality, anti-pornography and anti-obscenity laws to ensure that people are not unjustly at risk of surveillance and/or criminalization or legal penalties when they create consensual and non-harmful sexually expressive material.

12. Introduce laws to address forms of TFV that are not addressed by existing laws.
13. Ensure that legal responses to TFV include timely legal orders to have harmful content removed, deleted or de-indexed from the internet when appropriate.

14. Ensure that all laws related to TFV, including those addressing anonymity and encryption, respect human rights, including equality, privacy and freedom of expression. Any legal frameworks that impact those rights must be narrow, proportionate and justified. Broad, sweeping and generalized laws on these topics should be avoided.

15. Do not co-opt the vulnerability of equity-seeking groups to create overly broad government powers and protectionist laws that can unjustly infringe on human rights.

16. Provide adequate and appropriate training to all actors in the justice system — from police to judges — to ensure they have the skills and knowledge to properly address TFV using a human rights-based approach, including having the requisite knowledge on various technologies, digital evidence, human rights, racial bias, gender-based violence and discrimination against LGBTQ+ people.

17. Ensure that there are policies and legislation in place that adequately protect employees from discrimination and sexual harassment in the workplace. Ensure that there is particular attention paid to discrimination faced by employees in the technology sector.

18. Implement human rights-based content moderation regulation for internet intermediaries, including a requirement that companies publish transparency reports with anonymized disaggregated data on the types of violations and number of incidents faced by women, men, LGBTQ+ people and other equity-seeking groups, as well as the company's responses to them. These transparency reports should be published in ways that respect and protect the human rights, including privacy rights, of users.

19. Apply pressure to platforms to ensure user rights are respected and that those targeted with TFV on their platforms have accessible and understandable options regarding content removal, user suspension and other safety issues.

20. Implement privacy/data and consumer protection laws that require privacy by design and safety by design for technology companies and government actors creating or making use of digital technology.

21. Work collaboratively with the governments of other countries that are taking a human rights-based approach to addressing TFV. This could include cross-jurisdictional or international agreements that collectively address TFV and develop uniform human rights-based research, policies and legislation related to TFV.

22. Develop an international normative framework that outlines a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach to responding to TFV.

Funding and Resources

23. Provide adequate funding and resources to ensure that victims/survivors of TFV have a variety of options when seeking support, including legal and non-legal responses. These responses should allow victims/survivors time to consider their options. Responses should include accessible remedies that provide timely responses and do not require engagement with the legal system in all instances. Any support systems that are developed should take a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach to addressing TFV.

24. Ensure that there are independent and civil society organizations that are properly resourced to provide direct supports to victims/survivors of TFV. These organizations should be resourced to provide educational, social, technical, restorative and legal information about and supports around TFV that use a human rights-based approach. Support is particularly important for organizations that have expertise in supporting gender equality, LGBTQ+ rights, racial equality and other human rights. This could include developing resources such as independent helplines and civil society organizations where victims/survivors can get immediate psychosocial and technical support.

25. Support the development of independent government-funded bodies that employ a
human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach to provide public education and legal and non-legal supports for victims/survivors of TFV, including engaging with large tech companies to appropriately moderate content on their platforms. These bodies must be adequately resourced.

...educational campaigns on TFV. Public education campaigns on TFV should assist victims/survivors of TFV, as well as address the harmful behaviour of perpetrators. Campaigns should avoid employing messages suggesting people not engage in digital spaces to avoid being harmed and should not use victim-blaming language.
Collaboration and Consultation

4. Meaningfully engage with civil society organizations, researchers and academics with expertise on TFV, as well as victims/survivors, to improve policies and responses to TFV.

5. Work collaboratively with civil society organizations that support victims/survivors of TFV to help facilitate fast-track channels related to serious incidents reported to those organizations.

10. Using a human rights-based approach, ensure that content moderation policies are culturally specific and that content is evaluated within those contexts.

11. Content that breaches a company's content moderation policies should be removed reasonably swiftly to prevent the spread and repeated viewing of the content. Content that is particularly sensitive and harmful, such as child sexual abuse material and intimate images that have been shared without consent, should be prioritized to be removed as soon as possible.

12. Removal policies should not be discriminatory and should take a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach.

13. Ensure timely responses to complaints of TFV that violate content moderation rules, including appeals to decisions about the removal or non-removal of content. Responses must include clear explanations for why the decision was made.

14. Create dedicated flagging programs that fast-track cases. Publish a list of organizations that are a part of their trusted flagging programs. Provide information about how technology companies can join that program.

18. Provide accessible educational material to their users on digital safety and safe online practices, as well as information on how to navigate their content moderation practices.

19. Collect and publish transparency reports with disaggregated data on the types of violations and number of incidents faced by women, men, LGBTQ+ people and equity-seeking groups, as well as the responses to those violations, and routinely review the data to assess the effectiveness of policies and practices. Work collaboratively with civil society organizations and academics to review the data and determine best practices.
Equality and Safety in the Workplace

20. Employ human rights experts in TFV who can help develop technology and content moderation practices that best comply with human rights and safety standards.

21. Ensure that content moderation employees are adequately trained, earn a living wage and are provided relevant supports to manage the potential trauma related to viewing disturbing content as a part of their job.

22. Ensure that their workplaces are safe and welcoming to equity-seeking groups, including women, people of colour and LGBTQ+ people.

23. Diversify their workforce to ensure members of equity-seeking groups are represented and valued.

24. Ensure that their employees and developers are well trained about human rights, privacy and safety, including gender and sexual orientation discrimination.

25. Include privacy by design and safety by design practices in the development and implementation of their products and services.

...the agenda to end TFV and eliminate business models that benefit from or fail to address TFV.

5. Develop networks with other civil society organizations and academics to share research and support a global effort to end TFV.

Supports and Resources

6. Civil society organizations must be supported by governmental policies, legislation, regulation and resources to play a meaningful and central role in preventing TFV.

7. Engage with community members and victims/survivors to create culturally relevant resources and supports.

8. Provide social, technical, restorative and legal supports related to TFV that use a human rights-based, equity-focused, trauma-informed, survivor-centric and intersectional feminist approach. This could include resources such as independent helplines, online services and in-person services where victims/survivors can get immediate psychosocial and technical supports.

9. Ensure staff have expertise in supporting women, LGBTQ+ people and other equity-seeking groups.
racism, ableism, religious discrimination and colonialism).

15. Provide information on best practices for staying safe in digital spaces, and where to report and how best to manage incidents of TFV. This should include legal and non-legal options, including how to collect digital evidence needed in legal matters, how to report harmful content to technology companies, as well as community-based responses to TFV.

Research and Education

8. Further the research landscape on legal, regulatory, technical and social inputs to counter TFV and to monitor the status of TFV.

9. Conduct research on equity-seeking groups that are vulnerable to TFV, such as women, LGBTQ+ people, disabled people and members of marginalized racial, ethnic and religious groups.
75
Works Cited
7amleh — Arab Center for Social Media Advancement. 2018. Akbary, Artemis. 2022. “Afghan Queer Community’s Access to
“Online GBV in Palestine Means Losing Out on Women’s the Internet Is a Double Edged Sword Under Taliban Rule.”
Participation.” GenderIt, June 11. www.genderit.org/ GenderIT, December 12. https://genderit.org/feminist-talk/
feminist-talk/online-gbv-palestine-means-losing-out-womens- afghan-queer-communitys-access-internet-double-edged-
participation. sword-under-taliban-rule.
Abdullah, Mneera and Eliza Campbell. 2021. “Gendered violence Akter, Farhana. 2018. “Cyber Violence Against Women: The Case of
online: cybersecurity for whom?” Middle East Institute, Bangladesh.” GenderIT, June 17. www.genderit.org/articles/
March 16. www.mei.edu/publications/gendered-violence- cyber-violence-against-women-case-bangladesh.
online-cybersecurity-whom.
Albury, Kath, Christopher Dietzel, Tinonee Pym, Son Vivienne and
Access Now. 2021. “10 facts to counter encryption myths.” Access Teddy Cook. 2021. “Not your unicorn: trans dating app users’
Now Policy Brief. August. www.accessnow.org/wp-content/ negotiations of personal safety and sexual health.” Health
uploads/2021/08/Encryption-Myths-Facts-Report.pdf. Sociology Review 30 (1): 72–86. https://doi.org/10.1080/
14461242.2020.1851610.
Acoso. 2020. “Domestic violence and its online manifestations in the
context of the pandemic.” Submission to the United Nations Alcaraz, María Florencia. 2017. “#NiUnaMenos: Politicising the
Special Rapporteur on violence against women, its causes and Use of Technologies.” GenderIT, September 4.
consequences. June. https://genderit.org/feminist-talk/special-edition-
niunamenos-politicising-use-technologies.
Acosta, Rodrigo Vargas. 2020. “Regulatory issues in matters of
freedom of expression and the internet in Latin America Amnesty International. 2018. “A Toxic Place for Women.” In
2010–2020.” Derechos Digitales. www.unesco.org/en/world- #ToxicTwitter: Violence and Abuse against Women Online.
media-trends/regulatory-issues-matters-freedom-expression- March 21. www.amnesty.org/en/latest/research/2018/03/
and-internet-latin-america-2010-2020. online-violence-against-women-chapter-1-1/.
ADL Center for Technology and Society. 2021. Online Hate and ———. 2022. “South Korea: Online sexual abuse content proliferates
Harassment: The American Experience 2021. March. as survivors blame Google failings.” News, December 8.
www.adl.org/resources/report/online-hate-and-harassment- www.amnesty.org/en/latest/news/2022/12/south-korea-
american-experience-2021. online-sexual-abuse-content-proliferates-as-survivors-blame-
google-failings/.
African Declaration. 2015. “African Declaration on Internet Rights and
Freedoms.” https://africaninternetrights.org/en. Anasuya, Shreya Ila. 2018. “For Women in the Press Like Rana
Ayyub, It’s Scarily Easy for Online Threats to Turn Physical.”
African Development Bank Group. 2016. “Minding the Gaps: GenderIT, June 7. www.genderit.org/feminist-talk/women-
Identifying Strategies to Address Gender-Based Cyber Violence press-rana-ayyub-it%E2%80%99s-scarily-easy-online-threats-
in Kenya.” Policy Brief. March. www.afdb.org/fileadmin/ turn-physical.
uploads/afdb/Documents/Generic-Documents/Policy_Brief_
on_Gender_Based_Cyber_Violence_in_Kenya.pdf. Anderson, Briony and Mark A. Wood. 2021. “Doxxing: A Scoping
Review and Typology.” In The Emerald International Handbook
Aghtaie, Nadia, Cath Larkins, Christine Barte, Nicky Stanley, Marsha of Technology-Facilitated Violence and Abuse, edited by Jane
Wood and Carolina Øverlien. 2018. “Interpersonal violence Bailey, Asher Flynn and Nicola Henry, 205–26. Bingley,
and abuse in young people’s relationships in five European UK: Emerald. https://doi.org/10.1108/978-1-83982-848-
countries: online and offline normalisation of heteronormativity.” 520211015.
Journal of Gender-Based Violence 2 (2): 293–310.
https://doi.org/10.1332/239868018X15263879270302. Anwer, Zoya. 2022. “Navigating Identities: Experiencing Online
Violence as a Shia Woman in Pakistan.” GenderIT, October 29.
Aikenhead, Moira. 2021. “Revenge Pornography and Rape Culture https://genderit.org/feminist-talk/navigating-identities-
in Canada’s Nonconsensual Distribution Case Law.” In The experiencing-online-violence-shia-woman-pakistan.
Emerald International Handbook of Technology-Facilitated
Violence and Abuse, edited by Jane Bailey, Asher Flynn and
Nicola Henry, 533–53. Bingley, UK: Emerald.
https://doi.org/10.1108/978-1-83982-848-520211039.
76
APC & Humanist Institute for Cooperation with Developing Countries. Bailey, Jane and Carissima Mathen. 2019. “Technology-Facilitated
2013. Global Information Society Watch 2013: Women’s rights, Violence against Women & Girls: Assessing the Canadian
gender and ICTs. https://giswatch.org/sites/default/files/ Criminal Law Response.” The Canadian Bar Review 97 (3):
gisw13_chapters.pdf. 664–96. https://cbr.cba.org/index.php/cbr/article/
view/4562.
Arimatsu, Louise. 2019. “Silencing women in the digital age.”
Cambridge International Law Journal 8 (2): 187–217. Bailey, Jane and Suzie Dunn. Forthcoming 2023. “The More Things
https://doi.org/10.4337/cilj.2019.02.02. Change, The More They Stay the Same: Recurring Themes in
Tech-facilitated Sexual Violence Over Time.” In Criminalising
Ashley, Florence. 2018a. “Don’t be so hateful: The insufficiency of Intimate Image Abuse, edited by Kolis Summerer and Gian
anti-discrimination and hate crime laws in improving trans Marco Calletti. Oxford, UK: Oxford University Press.
well-being.” University of Toronto Law Journal 68 (1): 1–36.
https://doi.org/10.3138/utlj.2017-0057. Bailey, Jane and Sara Shayan. 2016. “Missing and Murdered
Indigenous Women Crisis: Technological Dimensions.”
———. 2018b. “Genderfucking Non-Disclosure: Sexual Fraud, Canadian Journal of Women and the Law 28 (2): 321–41.
Transgender Bodies, and Messy Identities.” Dalhousie Law www.utpjournals.press/doi/abs/10.3138/cjwl.28.2.321.
Journal 41 (2): 339–77.
https://digitalcommons.schulichlaw.dal.ca/dlj/vol41/ Bailey, Jane, Nicola Henry and Asher Flynn. 2021. “Technology-
iss2/3/. Facilitated Violence and Abuse: International Perspectives
and Experiences.” In The Emerald International Handbook of
Athar, Rima. 2015. “From Impunity to Justice: Improving Corporate Technology-Facilitated Violence and Abuse, edited by Jane
Policies to End Technology-Related Violence Against Women.” Bailey, Nicola Henry and Asher Flynn, 1–17. Bingley, UK:
APC, February 25. www.apc.org/en/pubs/impunity-justice- Emerald. https://doi.org/10.1108/978-1-83982-848-
improving-corporate-policies-end-0. 520211001.
AWARE. 2020. “AWARE’s Sexual Assault Care Centre saw 140 cases Bailey, Jane, Jacquelyn Burkell, Suzie Dunn, Chandell Gosse and
of technology-facilitated sexual violence in 2019, the most ever Valerie Steeves. 2021. “AI and Technology-Facilitated Violence
in one year.” December 2. www.aware.org.sg/ and Abuse.” In Artificial Intelligence and the Law in Canada,
2020/12/awares-sexual-assault-care-centre-saw-140-cases- edited by Florian Martin-Bariteau and Teresa Scassa. Toronto,
of-technology-facilitated-sexual-violence-in-2019-the-most- ON: LexisNexis.
ever-in-one-year/.
Bailey, Moya. 2021. Misogynoir Transformed: Black Women’s Digital
Ayres França, Leandro and Jéssica Veleda Quevedo. 2020. “Project Resistance. New York, NY: New York University Press.
Leaked: Research on Non-Consensual sharing of Intimate
Images in Brazil.” International Journal of Cyber Criminology Barboni, Giorgia, Erica Field, Rohini Pande, Natalia Rigol, Simone
14 (1): 1–28. Schaner and Charity Troyer Moore. 2018. A Tough Call:
Understanding barriers to and impacts of women’s mobile
Aziz, Zarizana Abdul. 2020. Online Violence Against Women in phone adoption in India. Harvard Kennedy School, October.
Asia: A Multicountry Study. UN Women, November. www.hks.harvard.edu/publications/tough-call-understanding-
www.aidsdatahub.org/sites/default/files/resource/unwomen- barriers-and-impacts-womens-mobile-phone-adoption-india.
online-violence-against-women-asia-2020.pdf.
Barton, Alana and Hannah Storm. 2014. Violence and Harassment
Badran, Mona Farid. 2019. “Bridging the gender digital divide against Women in the News Media: A Global Picture.
in the Arab Region.” IDRC. www.researchgate.net/ International Women’s Media Foundation and International
publication/330041688_Bridging_the_gender_digital_divide_ News Safety Institute. www.iwmf.org/wp-content/
in_the_Arab_Region. uploads/2018/06/Violence-and-Harassment-against-
Women-in-the-News-Media.pdf.
Baele, Stephane J., Lewys Brace and Travis G. Coan. 2019. “From
‘Incel’ to ‘Saint’: Analyzing the violent worldview behind the Bartow, Ann. 2009. “Internet Defamation as Profit Center: The
2018 Toronto attack.” Terrorism and Political Violence 33 (8): Monetization of Online Harassment.” Harvard Journal of Law
1667–91. https://doi.org/10.1080/ and Gender 32: 101–47.
09546553.2019.1638256.
Bauer, Greta R., Jake Pyne, Matt Caron Francino and Rebecca
Hammond. 2013. “Suicidality among trans people in Ontario:
Implications for social work and social justice.” Service Social
59 (1): 35–62. https://doi.org/10.7202/1017478ar.
77
BC Society of Transition Houses. 2020. Technology-Facilitated Chair, Chenai, Ingrid Brudivg and Calum Cameron. 2020. “Women’s
Violence: BC Anti-Violence Workers Survey Results Rights Online: Closing the digital gender gap for a more equal
Report. February. https://bcsth.ca/wp-content/ world.” World Wide Web Foundation, October.
uploads/2021/03/16.-BCSTH-Tech_Facilitated-Violence-BC- https://webfoundation.org/research/womens-rights-
Anti_Violence-Workers-Survey-Report-2020_Final.pdf. online-2020/.
Benjamini, Yoav and Yosef Hochberg. 1995. “Controlling the False Chandrasekar, Ramya. 2017. Policing online abuse or policing
Discovery Rate: A Practical and Powerful Approach to Multiple women? Our submission to the United Nations on online
Testing.” Journal of the Royal Statistical Society, Series B violence against women. Internet Democracy Project,
(Methodological) 57 (1): 289–300. November 7. https://internetdemocracy.in/policy/un-srvaw-
report.
Bhandari, Vrinda and Anja Kovacs. 2021. “What’s Sex Got to
Do with It? Mapping the Impact of Questions of Gender Chisala-Tempelhoff, Sarai and Monica Twesiime Kirya. 2016.
and Sexuality on the Evolution of the Digital Rights Landscape in “Gender, law and revenge porn in Sub-Saharan Africa: a
India.” Internet Democracy Project, January 20. review of Malawi and Uganda.” Palgrave Communications 2
https://news.cyrilla.org/2021/01/new-report-whats-sex- (1): 1–9. www.nature.com/articles/palcomms201669.
got-to-do-with-it-mapping-the-impact-of-questions-of-gender-
and-sexuality-on-the-evolution-of-the-digital-rights-landscape-i- Citron, Danielle Keats. 2009. “Law’s Expressive Value in Combating
n-india/. Cyber Gender Harassment.” Michigan Law Review 108 (3):
373–476. https://repository.law.umich.edu/cgi/viewcontent.
Boyd, Susan B. and Elizabeth Sheehy. 2016. “Men’s Groups: cgi?params=/context/mlr/article/1300/&path_info=.
Challenging Feminism.” Canadian Journal of Women and the
Law 28 (1): 5–10. www.utpjournals.press/doi/abs/10.3138/ ———. 2014. Hate Crimes in Cyberspace. Cambridge, MA: Harvard
cjwl.28.1.5?journalCode=cjwl. University Press.
Bradshaw, Samantha and Amélie Henle. 2021. “The Gender ———. 2019. “Sexual Privacy.” The Yale Law Journal 128 (7):
Dimensions of Foreign Influence Operations.” International 1870–1960.
Journal of Communication (15): 4596–618.
———. 2022. The Fight for Privacy: Protecting Dignity, Identity, and
Brandwatch. 2019. “Exposed: The Scale of Transphobia Online.” Love in the Digital Age. New York, NY: W. W. Norton.
www.brandwatch.com/reports/transphobia/.
Cole, Samantha. 2022. “The Legal System Is Completely Unprepared
Brown, Megan, Zeve Sanderson and Maria Alejandra Silva Ortega. for Apple AirTag Stalking.” Vice Motherboard, December 9.
2022. “Gender-based online violence spikes after prominent www.vice.com/en/article/5d3edx/apple-airtag-stalking-
media attacks.” TechStream, Brookings Institute, January 26. police-family-court.
www.brookings.edu/techstream/gender-based-online-
Collins, Patricia Hill. 1990. Black Feminist Thought: Knowledge,
violence-spikes-after-prominent-media-attacks/.
Consciousness, and the Politics of Empowerment. London, UK:
Burlock, Amanda and Tamara Hudon. 2018. “Women and men Routledge.
who experienced cyberstalking in Canada.” Statistics
Conway, Maura, Ryan Scrivens and Logan Macnair. 2019.
Canada, June 5. www150.statcan.gc.ca/n1/pub/75-
“Right-Wing Extremists’ Persistent Online Presence: History
006-x/2018001/article/54973-eng.htm.
and Contemporary Trends.” International Centre for Counter-
Canales, María Paz. 2017. “25N: Building an internet without Terrorism Policy Brief. November 25. https://icct.nl/app/
violence.” Derechos Digitales, November 24. uploads/2019/11/Right-Wing-Extremists-Persistent-Online-
www.derechosdigitales.org/11744/25n-construir-de-una- Presence.pdf.
internet-sin-violencia/.
Coombs, Elizabeth. 2021. “Human Rights, Privacy Rights, and
Center for Countering Digital Hate. 2022. “Hidden Hate: How Technology-Facilitated Violence.” In The Emerald International
Instagram fails to act on 9 in 10 reports of misogyny in DMs.” Handbook of Technology-Facilitated Violence and Abuse,
https://counterhate.com/wp-content/uploads/2022/05/ edited by Jane Bailey, Asher Flynn and Nicola Henry,
Final-Hidden-Hate.pdf. 475–91. Bingley, UK: Emerald. https://doi.org/10.1108/978-
1-83982-848-520211036.
78
Council of Europe. 2018. “Mapping study on cyberviolence.” Díaz, Marianne and Jamila Venturini. 2020. “Identity systems
T-CY(2017)10. July 9. Strasbourg, France: Council of and social protection in Venezuela and Bolivia: gender
Europe. https://rm.coe.int/t-cy-2017-10-cbgstudy- impacts and other inequalities.” Derechos Digitales.
provisional/16808c4914. www.derechosdigitales.org/wp-content/uploads/identity-
systems_ENG.pdf.
Cox, Cassie. 2014. “Protecting Victims of Cyberstalking,
Cyberharassment, and Online Impersonation through Dietzel, Christopher. 2021. “‘That’s Straight-Up Rape Culture’:
Prosecutions and Effective Laws.” Jurimetrics 54 (3): 277–302. Manifestations of Rape Culture on Grindr.” In The Emerald
www.jstor.org/stable/24395601. International Handbook of Technology-Facilitated Violence and
Abuse, edited by Jane Bailey, Asher Flynn and Nicola Henry,
Crenshaw, Kimberlé. 1991. “Mapping the Margins: Intersectionality, 351–68. Bingley, UK: Emerald. https://doi.org/10.1108/978-
Identity Politics, and Violence against Women of Color.” 1-83982-848-520211026.
Stanford Law Review 43 (6): 1241–99. https://doi.org/
10.2307/1229039. ———. 2022. “The three dimensions of unsolicited dick pics:
Sexual minority men’s experiences of sSending and receiving
Crockett, Cailin and Rachel Vogelstein. 2022. “Launching the Global unsolicited dick pics on dating apps.” Sexuality & Culture (26):
Partnership for Action on Gender-Based Online Harassment 834–52.
and Abuse.” Briefing Room (White House Gender Policy
Council blog), March 18. www.whitehouse.gov/gpc/briefing- Digital Rights Foundation. 2017a. Measuring Pakistani Women’s
room/2022/03/18/launching-the-global-partnership-for- Experiences of Online Violence: A Quantitative Research Study
action-on-gender-based-online-harassment-and-abuse/. on Online-Gender Based Harassment in Pakistan.
https://digitalrightsfoundation.pk/wp-content/
Cuellar, Lina and Sandra Chaher. 2020. “Being a journalist on uploads/2017/05/Hamara-Internet-Online-Harassment-
Twitter: Digital gender violence in Latin America.” UNESCO. Report.pdf.
Curlew, Abigail and Jeffery Monaghan. 2019. “Stalking ‘Lolcows’ ———. 2017b. “Surveillance of Female Journalists in Pakistan.”
and ‘Ratkings’: DIY Gender Policing, Far-Right Digilantes, and https://digitalrightsfoundation.pk/wp-content/
Anti-Transgender Violence.” In Disinformation and Digital uploads/2017/02/Surveillance-of-Female-Journalists-in-
Democracies in the 21st Century, edited by Joseph McQuade, Pakistan-1.pdf.
24–28. Toronto, ON: NATO Association of Canada.
https://natoassociation.ca/wp-content/uploads/2019/10/ ———. 2019. “Female Journalists in New Media: Experiences,
NATO-publication-.pdf. challenges and a gendered approach.”
https://digitalrightsfoundation.pk/wp-content/
Derechos Digitales. 2021. Rapid Response Fund for the Protection of uploads/2019/03/Female-journalists-in-new-media-
Digital Rights in Latin America: 2021 Milestones. experiences-challenges-and-a-gendered-approach.pdf.
www.derechosdigitales.org/wp-content/uploads/Rapid-
Response-Founf-2021-milestones.pdf. Dodge, Alexa. 2021. Deleting Digital Harm: A Review of
Nova Scotia’s CyberScan Unit. August. Halifax, NS: Dalhousie
DeKeseredy, Walter S. and Martin D. Schwartz. 2016. “Thinking University. www.vawlearningnetwork.ca/docs/CyberScan-
Sociologically About Image-Based Sexual Abuse: Report.pdf.
The Contribution of Male Peer Support Theory.”
Sexualization, Media, & Society 2 (4): 1–8. Dodge, Alexa, Dale Spencer, Rose Ricciardelli and Dale Ballucci.
https://doi.org/10.1177/2374623816684692. 2019. “‘This isn’t your father’s police force’: Digital evidence in
sexual assault investigations.” Journal of Criminology 52 (4):
Devika, J. 2019. “Gender-Based Cyber Violence against 499–515. https://doi.org/10.1177/0004865819851544.
Women in Kerala: Insights from Recent Research.” Centre
for Development Studies. https://cds.edu/wp-content/ Douglas, Heather, Bridget A. Harris and Molly Dragiewicz. 2019.
uploads/2021/02/7Commentary-min.pdf. “Technology-facilitated Domestic and Family Violence:
Women’s Experiences.” The British Journal of Criminology
Dhrodia, Azmina. 2018. “Unsocial media: A toxic place for women.” 59 (3): 551–70.
IPPR Progressive Review 24 (4): 380–87. https://doi.org/
10.1111/newe.12078.
79
Dragiewicz, Molly, Jean Burgess, Ariadna Matamoros-Fernández, European Institute for Gender Equality. 2017. “Cyber violence against
Michael Salter, Nicolas P. Suzor, Delanie Woodlock and women and girls.” https://eige.europa.eu/publications/cyber-
Bridget Harris. 2018. “Technology facilitated coercive control: violence-against-women-and-girls.
domestic violence and the competing roles of digital media
platforms.” Feminist Media Studies 18 (4): 1–17. https://doi.or ———. 2019. Gender equality and youth: opportunities and risks of
g/10.1080/14680777.2018.1447341. digitalisation. https://eige.europa.eu/publications/gender-
equalityand-youth-opportunities-andrisks-digitalisation.
Duggan, Maeve. 2017. Online Harassment 2017. Pew Research
Center. July 11. www.pewresearch.org/internet/2017/07/11/ ———. 2022. Combating Cyber Violence against Women and Girls.
online-harassment-2017/. https://eige.europa.eu/publications-resources/publications/
combating-cyber-violence-against-women-and-girls.
Dunn, Suzie. 2020a. Technology-Facilitated Gender-Based Violence:
An Overview. Supporting a Safer Internet Paper No. 1. Fairbairn, Jordan. 2015. “Rape threats and revenge porn: Defining
Waterloo, ON: CIGI. www.cigionline.org/publications/ sexual violence in the digital age.” In eGirls, eCitizens: Putting
technology-facilitated-gender-based-violence-overview/. Technology, Theory and Policy into Dialogue with Girls’ and
Young Women’s Voices, edited by Jane Bailey and Valerie
———. 2020b. “Identity Manipulation: Responding to Advances Steeves. Ottawa, ON: University of Ottawa Press.
in Artificial Intelligence and Robotics.” We Robot 2020
Conference Paper. Farokhmanesh, Megan. 2022. “The End of Kiwi Farms, the Web’s
Most Notorious Stalker Site.” Wired, September 8.
Dunn, Suzie and Moira Aikenhead. 2022. “On the Internet, Nobody www.wired.com/story/keffals-kiwifarms-cloudflare-blocked-
Knows You are a Dog: Contested Authorship of Digital Evidence clara-sorrenti/.
in Cases of Gender-based Violence.” Canadian Journal of Law
and Technology 19 (2): 371–409. Farrell, Andrew. 2021. “It’s Just a Preference: Indigenous LGBTIQ+
Peoples and Technologically Facilitated Violence.” In The
Dunn, Suzie and Alessia Petricone-Westwood. 2018. “More Palgrave Handbook of Gendered Violence and Technology,
than ‘Revenge Porn’ Civil Remedies for the Nonconsensual edited by Anastasia Powell, Asher Flynn and Lisa Sugiura,
Distribution of Intimate Images.” 38th Annual Civil Litigation 335–53. Cham, Switzerland: Palgrave Macmillan.
Conference.
Fascendini, Flavia. 2014. “Highlights on Tech-Related Violence
Eaton, Asia A., Divya Ramjee and Jessica F. Saunders. 2022. “The Against Women in Bosnia and Herzegovina, Mexico and the
Relationship between Sextortion during COVID-19 and Philippines.” GenderIT, December 12. https://genderit.org/
Pre-pandemic Intimate Partner Violence: A Large Study of feminist-talk/highlights-tech-related-violence-against-women-
Victimization among Diverse U.S. Men and Women.” Victims bosnia-and-herzegovina-mexico-and.
& Offenders 18 (2): 338–55. www.tandfonline.com/doi/
bs/10.1080/15564886.2021.2022057?journalCode=uvao20. Fascendini, Flavia and Kateřina Fialová. 2011. Voices from digital
spaces: Technology related violence against women. APC.
Eckert, Stine and Jade Metzger-Riftkin. 2020. “Doxxing, privacy December. www.apc.org/sites/default/files/APCWNSP_
and gendered harassment: The Shock and Normalization MDG3advocacypaper_full_2011_EN_0.pdfwww.apc.org/
of Veillance Cultures.” APC, August 10. www.apc.org/en/ sites/default/files/APCWNSP_MDG3advocacypaper_
press/doxxing-privacy-and-gendered-harassment-shock-and- full_2011_EN_0.pdf.
normalization-veillance-cultures.
Ferrier, Michelle. 2018. Attacks and Harassment: The Impact of
El Heraldo. 2020. “Feministas denuncian que filtración de fotos Female Journalists and Their Reporting. TrollBusters and
íntimas aumentó en cuarentena.” El Heraldo, April 25. International Women’s Media Foundation.
www.heraldo.mx/feministas-denuncian-que-filtracion-de-fotos-
intimas-aumento-en-cuarentena/. FRA — European Union Agency for Fundamental Rights. 2014.
Violence against women: an EU-wide survey.
eSafety Commissioner. 2019. eSafety for Women from Culturally and http://fra.europa.eu/en/publication/2014/violence-against-
Linguistically Diverse Backgrounds. February. www.esafety.gov. women-eu-widesurvey-main-results-report.
au/sites/default/files/2019-07/summary-report-for-women-
from-cald-backgrounds.pdf. France 24. 2021. “Hackers release Israeli LGBTQ dating site details.”
November 2. www.france24.com/en/live-news/20211102-
———. 2021. Protecting LGBTIQ+ voices online: resource hackers-release-israeli-lgbtq-dating-site-details.
development research. August. www.esafety.gov.au/research/
protecting-lgbtiq-voices-online. Franks, Mary Anne. 2015. “Drafting an Effective ‘Revenge Porn’ Law:
A Guide for Legislators.” https://ssrn.com/abstract=2468823.
80
Freed, Diana, Jackeline Palmer, Diana Minchala, Karen Levy, Thomas Goldberg, Carrie. 2019. Nobody’s Victim: Fighting Psychos, Stalkers,
Ristenpart and Nicola Dell. 2018. “‘A Stalker’s Paradise’: Pervs, and Trolls. New York, NY: Plume.
How Intimate Partner Abusers Exploit Technology.” CHI ‘18:
Proceedings of the 2018 CHI Conference on Human Factors in Goulds, Sharon, Miriam Gauer, Aisling Corr and Jacqui Gallinetti.
Computing Systems. https://doi.org/ 2020. Free to Be Online? Girls’ and young women’s
10.1145/3173574.3174241. experiences of online harassment. Plan International.
https://plan-international.org/publications/free-to-be-
GenderIT. 2015. “End Violence: Case Studies from Mexico.” online/.
GenderIT, February 15. https://genderit.org/resources/end-
violence-case-studies-mexico. Green, Yasmin. 2022. “Iran’s Internet Blackouts Are Part of a Global
Menace.” Wired, October 19. www.wired.com/story/iran-
———. 2018a. “We Can Be Heroes: Towards Public and Legal mahsa-amini-internet-shutdown/.
Recognition of Online Gender-Based Violence.” GenderIT,
June 17. www.genderit.org/edition/we-can-be-heroes- Gritten, David. 2022. “Iran sentences two LGBT activists to death.”
towards-public-and-legal-recognition-online-gender-based- BBC News, September 6. www.bbc.com/news/world-middle-
violence. east-62793573.
———. 2018b. “13 Manifestations of Gender-Based Violence Using GSMA and Cherie Blair Foundation for Women. 2013. Women &
Technology.” GenderIT, November 12. https://genderit.org/ Mobile: A Global Opportunity. A study on the mobile phone
resources/13-manifestations-gender-based-violence-using- gender gap in low and middle-income countries.
technology. www.gsma.com/mobilefordevelopment/wp-content/
uploads/2013/01/GSMA_Women_and_Mobile-A_Global_
Ghoshal, Neela. 2020. “Every Day I Live in Fear”: Violence and Opportunity.pdf.
Discrimination Against LGBT People in El Salvador, Guatemala,
and Honduras, and Obstacles to Asylum in the United Guerra, Juliana and Mallory Knodel. 2019. “Feminism and
States. Human Rights Watch, October 7. www.hrw.org/ protocols.” Internet Engineering Task Force, July 8.
report/2020/10/07/every-day-i-live-fear/violence-and-
Gurumurthy, Anita and Amrita Vasudevan. 2018. “Hidden Figures:
discrimination-against-lgbt-people-el-salvador.
A Look at Technology-Mediated Violence against Women in
Gill, Lex, Tamir Israel and Christopher Parsons. 2018. Shining a Light India.” Gender IT, June 11. www.genderit.org/articles/hidden-
on the Encryption Debate: A Canadian Field Guide. The Citizen figures-look-technology-mediated-violence-against-women-
Lab and the Canadian Internet Policy & Public Interest Clinic. india.
May. https://citizenlab.ca/wp-content/uploads/2018/05/
Gurumurthy, Anita and Amshuman Dasarathy. 2022. Profitable
Shining-A-Light-Encryption-CitLab-CIPPIC.pdf.
Provocations A Study of Abuse and Misogynistic Trolling
Gillwald, Alison. 2018. Understanding the Gender Gap in the Global on Twitter Directed at Indian Women in Public-political Life.
South. After Access. https://afteraccess.net/wp-content/ IT for Change, July. https://itforchange.net/sites/default/
uploads/2018-After-Access-Understanding-the-gender-gap- files/2132/ITfC-Twitter-Report-Profitable-Provocations.pdf.
in-the-Global-South.pdf.
Guy, Rachel. 2021. “Nation of Men: Diagnosing Manospheric
GLAAD. 2022. “Unsafe in America: New GLAAD Data Shows Misogyny as Virulent Online Nationalism.” Georgetown Journal
Unprecedented Threats and Attacks Against LGBTQ Americans of Gender and the Law 22 (3): 601–40. https-//ssrn.com/
Leading Up to the Deadly Mass Shooting in Colorado abstract=3939252.
Springs.” News release, November 22. www.glaad.org/
Hall, Matthew and Jeff Hearn. 2019. “Revenge pornography and
releases/unsafe-america%C2%A0-new-glaad-data-shows-
manhood acts: a discourse analysis of perpetrators’ accounts.”
unprecedented-threats-and-attacks-against-lgbtq.
Journal of Gender Studies 28 (2): 158–70.
Global Information Society Watch. 2017. Unshackling Expression: A
Hassan, Bushra, Tim Unwin and Akber Gardezi. 2018.
Study on Laws Criminalising Expression Online in Asia.
“Understanding the Darker Side of ICTs: Gender, Sexual
https://giswatch.org/2017-special-report-unshackling-
Harassment, and Mobile Devices in Pakistan.” Information
expression-study-law-criminalising-expression-online-asia.
Technologies & International Development 14: 1–17.
Global Partnership for Action on Gender-Based Online Harassment
and Abuse. Forthcoming 2023. Technology-Facilitated Gender-
Based Violence Evidence Situation Report.
81
Hassan, Fatma Mohammed, Fatma Nada Khalifa, Eman D. El Hrick, Pam. 2021. “The Potential of Centralized and Statutorily
Desouky, Marwa Rashad Salem and Mona Mohamed Ali. Empowered Bodies to Advance a Survivor-Centered Approach
2020. “Cyber violence pattern and related factors: online to Technology-Facilitated Violence Against Women.” In The
survey of females in Egypt.” Egyptian Journal of Forensic Emerald International Handbook of Technology-Facilitated
Sciences 10 (6): 1–7. https://ejfs.springeropen.com/ Violence and Abuse, edited by Jane Bailey, Asher Flynn and
articles/10.1186/s41935-020-0180-0. Nicola Henry, 595–615. Bingley, UK: Emerald.
https://doi.org/10.1108/978-1-83982-848-520211043.
Hasinoff, Amy Adele. 2014. “Blaming Sexualizaton for Sexting.”
Girlhood Studies 7 (1): 102–20. Hsu, Jason C. 1996. Multiple Comparisons: Theory and methods.
London, UK: Chapman & Hall/CRC Press.
Havron, Sam, Diana Freed, Rahul Chatterjee, Damon McCoy, Nicola
Dell and Thomas Ristenpart. 2019. “Clinical Computer Security Hupfer, Susanne, Sayantani Mazumder, Ariane Bucaille and
for Victims of Intimate Partner Violence.” 28th USENIX Security Gillian Crossan. 2021. “Women in the tech industry: Gaining
Symposium. www.usenix.org/conference/usenixsecurity19/ ground, but facing new headwinds.” Deloitte, December 1.
presentation/havron. www2.deloitte.com/us/en/insights/industry/technology/
technology-media-and-telecom-predictions/2022/statistics-
Hayes, Rebecca M. and Molly Dragiewicz. 2018. “Unsolicited dick show-women-in-technology-are-facing-new-headwinds.html.
pics: Erotica, exhibitionism or entitlement?” Women’s Studies
International Forum 71: 114–20. Human Rights Watch. 2019. “‘Don’t Punish Me for Who I Am’:
Systemic Discrimination Against Transgender Women in
Hébert, William, Nora Butler Burke, Tara Santini, Frank Suerich-Gulick Lebanon.” September 3. www.hrw.org/report/2019/09/03/
and Daphne Barile. 2022. A Qualitative Look at Serious Legal dont-punish-me-who-i-am/systemic-discrimination-against-
Problems: Trans, Two-Spirit, and Non-Binary People in Canada. transgender-women-lebanon.
Department of Justice Canada.
———. 2020. “Egypt: Security Forces Abuse, Torture LGBT People.”
Henry, Nicola and Asher Flynn. 2019. “Image-Based Sexual October 1. www.hrw.org/news/2020/10/01/egypt-
Abuse: Online Distribution Channels and Illicit Communities of security-forces-abuse-torture-lgbt-people.
Support.” Violence Against Women 25 (16): 193–95.
https://doi.org/10.1177/1077801219863881. Inter-Parliamentary Union. 2016. “Sexism, harassment and violence
against women parliamentarians.” Issues Brief. October.
Henry, Nicola, Clare McGlynn, Asher Flynn, Kelly Johnson, Anastasia www.iknowpolitics.org/en/learn/knowledge-resources/
Powell and Adrian J. Scott. 2020. Image-based Sexual Abuse: report-white-paper/ipu-issues-brief-sexism-harassment-and-
A Study on the Causes and Consequences of Non-consensual violence-against.
Nude or Sexual Imagery. New York, NY: Routledge.
International Federation of Journalists. 2017. “IFJ survey: One in two
Hernández Bauzá, Valentina. 2017. Technology for Privacy women journalists suffer gender-based violence at work.”
and Freedom of Expression: Regulation of Anonymity and November 24.
Encryption. Derechos Digitales. www.derechosdigitales.org/
wp-content/uploads/anonimity-and-encryption.pdf. Internet Sans Frontières. 2019. “#IWD2019: Online Gender-Based
Violence Affects 45% of Women on Social Media in West
———. 2020. “Trends related to regulatory developments in privacy and Central Africa.” https://internetwithoutborders.org/
and the internet in Latin America.” Derechos Digitales, January 1. iwd2019-online-gender-based-violence-affects-45-of-women-
on-social-media-in-west-and-central-africa/
Hinson, L., L. O’Brien-Milne, J. Mueller, V. Bansal, N. Wandera
and S. Bankar. 2019. “Defining and Measuring Technology- ITU. 2022. Measuring digital development: Facts and Figures 2022.
Facilitated Gender-Based Violence.” International Center www.itu.int/en/ITU-D/Statistics/Pages/facts/default.aspx.
for Research on Women. www.icrw.org/wp-content/
uploads/2019/03/ICRW_TFGBVMarketing_Brief_v4_ Iyer, Neema, Bonnita Nyamwire and Sandra Nabulega. 2020.
WebReady.pdf. Alternate Realities, Alternate Internets: African Feminist Research
for a Feminist Internet. APC. August. www.apc.org/en/pubs/
Horwitz, Jeff. 2021. “Facebook Says Its Rules Apply to All. Company alternate-realities-alternate-internets-african-feminist-research-
Documents Reveal a Secret Elite That’s Exempt.” The Wall Street feminist-internet.
Journal, September 13. www.wsj.com/articles/facebook-files-
xcheck-zuckerberg-elite-rules-11631541353.
82
Iyer, Neema, Garnett Achieng, Favour Borokini and Uri Ludger. 2021. Kapiyo, Victor. 2022. “Bridging the Gender Digital Divide is Critical
Automated Imperialism, Expansionist Dreams: Exploring Digital for Achieving Digital Rights in Africa.” Collaboration on
Extractivism in Africa. Pollicy. June. https://archive.pollicy.org/ International ICT Policy for East and Southern Africa, June 30.
wp-content/uploads/2021/06/Automated-Imperialism- https://cipesa.org/2022/06/bridging-the-gender-digital-
Expansionist-Dreams-Exploring-Digital-Extractivism-in- divide-is-critical-for-achieving-digital-rights-in-africa/.
Africa.pdf.
Karaian, Lara. 2012. “Lolita speaks: ‘Sexting,’ teenage girls and the
Jain, Tripti. 2021. “Tech Tools to Facilitate and Manage Consent: law.” Crime Media Culture 8 (1): 57–73.
Panacea or Predicament? A Feminist Perspective.” Data
Governance Network Working Paper 21. Internet Democracy ———. 2013. “Policing ‘sexting’: Responsibilization, respectability
Project, January 10. https://internetdemocracy.in/reports/ and sexual subjectivity in child protection/crime prevention
tech-tools-to-facilitate-and-manage-consent. responses to teenagers’ digital sexual expression.” Theoretical
Criminology 18 (3): 282–99. https://doi.org/10.1177/
James, Sandy, Jody Herman, Susan Rankin, Mara Keisling, Lisa 1362480613504331.
Mottet and Ma’ayan Anafi. 2016. The Report of the 2015 U.S.
Transgender Survey. Washington, DC: National Center for Kaul, Nitasha. 2021. “The Misogyny of Authoritarians in
Transgender Equality. December. https://transequality.org/ Contemporary Democracies.” International Studies Review
sites/default/files/docs/usts/USTS-Full-Report-Dec17.pdf. 23 (4): 1619–45. https://doi.org/10.1093/isr/viab028.
Jamil, Sadia. 2021. “From digital divide to digital inclusion: Kayastha, Shubha and Rita Baramu. 2021. Mapping Laws
Challenges for wide-ranging digitalization in Pakistan.” Relevant to Online Violence in Nepal. Body & Data.
Telecommunications Policy 45 (8): 102206. https://bodyanddata.org/wp-content/uploads/2021/12/
https://doi.org/10.1016/j.telpol.2021.102206. OnlineGBVLawsMapping-min.pdf.
Jane, Emma. 2018. “Gendered cyberhate as workplace harassment Kee, Jac sm. 2005. “Cultivating Violence Through Technology?
and economic vandalism.” Feminist Media Studies 18 (4): Exploring the Connections between Information Communication
575–91. https://doi.org/10.1080/14680777.2018.1447344. Technologies (ICT) and Violence Against Women (VAW).” APC,
April 16. https://genderit.org/es/node/1813.
Jankowicz, Nina, Jillian Hunchak, Alexandra Pavliuc, Celia Davies,
Shannon Pierson and Zoë Kaufmann. 2021. Malign Creativity: Kee, Jac sm and Sonia Randhawa. 2010. “Malaysia: Violence
How Gender, Sex, and Lies Are Weaponized against against Women and Information Communication Technologies.”
Women Online. Washington, DC: Wilson Center. GenderIT, June 1. https://genderit.org/resources/malaysia-
www.wilsoncenter.org/publication/malign-creativity-how- violence-against-women-and-information-communication-
gender-sex-and-lies-are-weaponized-against-women-online. technologies.
Jansen Reventlow, Nani. 2017. “Online harassment of women Khan, Irene. 2021. #JournalistsToo: Women Journalists Speak
journalists and iInternational law: not ‘just’ a gender issue, Out. UNESCO and United Nations Human Rights Special
but a threat to democracy.” Berkman Klein Center Collection, Procedures. https://srfreedex.org/journalists-too/.
November 16.
Khoo, Cynthia. 2021. Deplatforming Misogyny: Report on
Judson, Ellen. 2021. “Gendered disinformation: 6 reasons why liberal Platform Liability for Technology-Facilitated Gender-Based
democracies need to respond to this threat.” Demos, July. Violence. Toronto, ON: LEAF. www.leaf.ca/wp-content/
https://demos.co.uk/blog/gendered-disinformation/. uploads/2021/04/Full-Report-Deplatforming-Misogyny.pdf.
Judson, Ellen, Asli Atay, Alex Krasodomski-Jones, Rose Lasko-Skinner Khoo, Cynthia, Kate Robertson and Ronald Deibert. 2019. Installing
and Josh Smith. 2020. Engendering Hate: The contours of state- Fear: A Canadian Legal and Policy Analysis of Using,
aligned gendered disinformation online. Demos, October 26. Developing, and Selling Smartphone Spyware and Stalkerware
https://demos.co.uk/project/engendering-hate-the-contours- Applications. The Citizen Lab, June 12. https://citizenlab.ca/
of-state-aligned-gendered-disinformation-online/. 2019/06/installing-fear-a-canadian-legal-and-policy-
analysis-of-using-developing-and-selling-smartphone-spyware-
Kamran, Hija and Meher Ahmad. 2021. “Pakistan’s revenge porn law and-stalkerware-applications/.
is stronger than most. For one woman, that made no difference.”
Rest of World, April 30. https://restofworld.org/2021/ Kirchgaessner, Stephanie. 2022. “Two female activists in Bahrain and
pakistans-revenge-porn-law-is-stronger-than-most-for-one- Jordan hacked with NSO spyware.” The Guardian, January 17.
woman-that-made-no-difference/. www.theguardian.com/news/2022/jan/17/two-female-
activists-in-bahrain-and-jordan-hacked-with-nso-spyware.
83
Klein, Jessica and Kristen Zaleski. 2019. “Non-Consensual Image Lenhart, Amanda, Michele Ybarra and Myeshia Price-Feeney. 2016.
Sharing-Revenge Pornography and Acts of Sexual Assault “Nonconsensual Image Sharing: One in 25 Americans Has
Online.” In Women’s Journey to Empowerment in the Been a Victim of ‘Revenge Porn.’” Data & Society Research
21st Century: A Transnational Feminist Analysis of Women’s Institute. December 13. https://datasociety.net/pubs/oh/
Lives in Modern Times, edited by Kristen Zaleski, Annalisa Nonconsensual_Image_Sharing_2016.pdf.
Enrile, Eugenia L. Weiss and Xiying Wang. New York, NY:
Oxford University Press. Lenhart, Amanda, Michele Ybarra, Kathryn Zickuhr and Myeshia
Price-Feeney. 2016. “Online Harassment, Digital Abuse, and
Kovacs, Anja. 2017. “‘Chupke, Chupke’: Going Behind the Mobile Cyberstalking in America.” Data & Society Research Institute.
Phone Bans in North India.” Gendering Surveillance. November 21. www.datasociety.net/pubs/oh/Online_
Internet Democracy Project, February. Harassment_2016.pdf.
https://genderingsurveillance.internetdemocracy.in/
phone_ban/. Levy, Karen and Bruce Schneier. 2020. “Privacy threats in intimate
relationships.” Journal of Cybersecurity 6 (1): 1–13.
Kovacs, Anja and Tripti Jain. 2021. “Informed consent — Said who? https://doi.org/10.1093/cybsec/tyaa006.
A feminist perspective on principles of consent in the age of
embodied data — A policy brief.” Internet Democracy Project, Lirri, Evelyn. 2015. “The Challenge of Tackling Online Violence
March 22. https://internetdemocracy.in/policy/informed- Against Women in Africa.” Collaboration on International ICT
consent-said-who-a-feminist-perspective-on-principles-of- Policy for East and Southern Africa Blog, October 2.
consent-in-the-age-of-embodied-data-a-policy-brief. https://cipesa.org/2015/10/the-challenge-of-tackling-
online-violence-against-women-in-africa/.
Kovacs, Anja, Richa Kaul Padte and Shobha SV. 2013. “Don’t Let It
Stand”: An Exploratory Study of Women and Verbal Online Lo, Madison. 2021. “A Domestic Violence Dystopia: Abuse via the
Abuse in India. Internet Democracy Project. April. Internet of Things and Remedies Under Current Law.” California
https://internetdemocracy.in/reports/women-and-verbal- Law Review (109): 277–315. https://doi.org/10.15779/
online-abuse-in-india. Z38XW47X1J.
Kovacs, Anja, Avri Doria, Bruno Zilli, Margarita Salas and Women’s Lodhi, Annam. 2018. “Women Journalists & the Double Bind:
Legal and Human Rights Bureau. 2012. Critically Absent: Choosing silence over being silenced.” Media Matters for
Women in Internet Governance. APC. April 3. www.apc.org/ Democracy, July 18. www.apc.org/en/pubs/women-
en/pubs/critically-absent-women-internet-governance-policy. journalists-and-double-bind-choosing-silence-over-being-
silenced.
Kraicer, Eve. 2020. “Online and ICT facilitated violence against
women and girls during COVID-19.” UN Women. Lokot, Tetyana. 2018. “#IAmNotAfraidToSayIt: stories of sexual
www.unwomen.org/en/digital-library/publications/ violence as everyday political speech on Facebook.”
2020/04/brief-online-and-ict-facilitated-violence-against- Information, Communication & Society 21 (6): 802–17.
women-and-girls-during-covid-19. https://doi.org/10.1080/1369118X.2018.1430161.
Kraicer, Eve and Azmina Dhrodia. 2021. “‘Women shouldn’t be Lopes Gomes Pinto Ferreira, Gisella. 2021. “Technology as Both a
expected to pay this cost just to participate.’ Online Gender- Facilitator of and Response to Youth Intimate Partner Violence:
Based Violence and Abuse: Consultation Briefing.” World Wide Perspectives from Advocates in the Global-South.” In The
Web Foundation, October 6. https://webfoundation.org/ Emerald International Handbook of Technology-Facilitated
research/women-shouldnt-be-expected-to-pay-this-cost-just- Violence and Abuse, edited by Jane Bailey, Asher Flynn and
to-participate-online-gender-based-violence-and-abuse- Nicola Henry, 427–46. Bingley, UK: Emerald.
consultation-briefing/. https://doi.org/10.1108/978-1-83982-848-520211032/
full/html.
Kuroda, Reiko, Mariana Lopez, Janelle Sasaki and Michelle
Settecase. 2019. “The Digital Gender Gap.” www.gsma.com/ Louie, Yee Man. 2021. “Technology-Facilitated Domestic Abuse
mobilefordevelopment/wp-content/uploads/2019/02/ and Culturally and Linguistically Diverse Women in Victoria,
Digital-Equity-Policy-Brief-W20-Japan.pdf. Australia.” In The Emerald International Handbook of
Technology-Facilitated Violence and Abuse, edited by Jane
La Prensa. 2020. “Redes sociales, el espacio que han encontrado Bailey, Asher Flynn and Nicola Henry, 447–67. Bingley, UK:
las mujeres nicaragüenses para denunciar a sus agresores.” Emerald. https://doi.org/10.1108/978-1-83982-848-
April 14. www.laprensa.com.ni/2020/04/14/nacionales/ 520211033.
2662961-redes-sociales-el-espacio-que-han-encontrado-las-
mujeres-nicaraguenses-para-denunciar-a-sus-agresores.
84
Lui, Michelle, Brittany Singh and Giovanni Giuga. Forthcoming Massanari, Adrienne L. 2018. “Rethinking Research Ethics, Power, and
2023. “Pro-Pronouns: Gender Identities on Social Media, the Risk of Visibility in the Era of the ‘Alt-Right’ Gaze.” Social
Video Conferencing, and Learning Platforms.” In Can’t Media & Society 2 (4): 1–9. https://journals.sagepub.com/
Compute. Ottawa, ON: University of Ottawa Research Chair in doi/10.1177/2056305118768302.
Technology and Society. www.cantcompute.ca/.
Maurya, Chanda, T. Muhammad, Preeti Dhillon and Priya Maurya.
Machirori, Fungai. 2017. “Towards a gender-just internet: Combating 2022. “The effects of cyberbullying victimization on depression
online violence against women.” Africa Portal, September 29. and suicidal ideation among adolescents and young adults: a
www.africaportal.org/features/gender-just-internet-how-do- three year cohort study from India.” BMC Psychiatry 22 (599).
we-combat-online-violence/. https://doi.org/10.1186/s12888-022-04238-x.
Mahmutović, Aida, hvale vale and Vasilika Laçí. 2021. Cyber McCulloch, Jude, Sandra Walklate, JaneMaree Maher, Kate Fitz-
Violence against Women and Girls in the Western Balkans: Gibbon and Jasmine McGowan. 2019. “Lone Wolf Terrorism
Selected Case Studies and a Cybersecurity Governance Through a Gendered Lens: Men Turning Violent or Violent Men
Approach. Geneva, Switzerland: Geneva Centre for Security Behaving Violently?” Critical Criminology 27: 437–50.
Sector Governance. https://doi.org/10.1007/s10612-019-09457-5.
Makinde, Olusesan Ayodeji, Emmanuel Olamijuwon, Nchelem McGinley, Ann C. 2022. “Misogyny and Murder.” Harvard Journal
Kokomma Ichegbo, Cheluchi Onyemelukwe and Michael of Law & Gender 45: 177– 247. http://harvardjlg.com/
Gboyega Ilesanmi. 2021. “The Nature of Technology- wp-content/uploads/sites/19/2022/09/
Facilitated Violence and Abuse among Young Adults in Misogyny-and-Murder.pdf.
Sub-Saharan Africa.” In The Emerald International Handbook
of Technology-Facilitated Violence and Abuse, edited by Jane McGlynn, Clare. n.d. “Korean Lessons on Supporting Victims
Bailey, Asher Flynn and Nicola Henry, 83–101. Bingley, UK: of Image-Based Abuse.” https://claremcglynn.com/
Emerald. https://doi.org/10.1108/978-1-83982-848- imagebasedsexualabuse/korean-lessons-on-supporting-
520211005. victims-of-image-based-sexual-abuse/.
Malanga, Donald Flywell. 2020. “Tackling gender-based cyber ———. 2016. “Anonymity for Complainants of Image-Based
violence against women and girls in Malawi amidst the Sexual Abuse: focus on harms to victims, not motives
COVID-19 pandemic.” African Internet Rights. of perpetrators.” Centre for Gender Equal Media.
https://africaninternetrights.org/sites/default/files/Donald_ https://claremcglynn.files.wordpress.com/2016/07/mcglynn-
Flywell.pdf. anonymity-revenge-porn-11-july-2016.pdf.
———. 2021. “Survey of Cyber Violence against Women in Malawi.” McGlynn, Clare and Erika Rackley. 2017. “Image-Based Sexual
Proceedings of the 1st Virtual Conference on Implications of Abuse.” Oxford Journal of Legal Studies 37 (3): 534–61.
Information and Digital Technologies for Development. https://doi.org/10.1093/ojls/gqw033.
https://arxiv.org/ftp/arxiv/papers/2108/2108.09806.pdf.
McGlynn, Clare and Kelly Johnson. 2021. Cyberflashing: Recognising
Manjoo, Rashida. 2012. “The Continuum of Violence against Women Harms, Reforming Laws. Bristol, UK: Bristol University Press.
and the Challenges of Effective Redress.” International Human
McMahon, Rob, Tim LaHache and Tim Whiteduck. 2015. “Digital
Rights Law Review 1 (1): 1–29. https://doi.org/
Data Management as Indigenous Resurgence in Kahnawà:ke.”
10.1163/22131035-00101008.
International Indigenous Policy Journal 6 (3): 1–19.
Marwick, Alice E. 2017. “Scandal or sex crime? Gendered privacy https://doi.org/10.18584/iipj.2015.6.3.6.
and the celebrity nude photo leaks.” Ethics and Information
Media Foundation for West Africa. 2017. Women’s Rights
Technology, 19 (3): 177–91. https://doi.org/10.1007/
Online: Issues in Ghana. www.mfwa.org/wp-content/
s10676-017-9431-7.
uploads/2018/02/Baseline-Report-WRO-Issues-in-
Marwick, Alice E. and Robyn Caplan. 2018. “Drinking male tears: Ghana.pdf.
language, the manosphere, and networked harassment.”
Mehta, Ivan. 2022. “Twitter disperses the Trust & Safety Council after
Feminist Media Studies 18 (4): 543–59. https://doi.org/
key members resigned.” Tech Crunch, December 12.
10.1080/14680777.2018.1450568.
https://techcrunch.com/2022/12/12/twitter-disperses-the-
trust-safety-council-after-key-members-resigned/.
85
Miles, Katrina. 2020. “The law on underage sexting needs to Nguyen, Erica and Heather Barr. 2020. “Thinking beyond punishment
change — here’s how.” The Conversation, March 4. to combat digital sex crimes.” The Korea Times, May 20.
https://theconversation.com/the-law-on-underage-sexting- www.koreatimes.co.kr/www/
needs-to-change-heres-how-132473. opinion/2020/05/197_289806.html.
Milmo, Dan. “Elon Musk reinstates Donald Trump’s Twitter account Nova, Fayika Farhat, Md. Rashidujjaman Rifat, Pratyasha Saha,
after taking a poll.” The Guardian, November 20. Syed Ishtiaque Ahmed and Shion Guha. 2019. “Online sexual
www.theguardian.com/us-news/2022/nov/20/twitter-lifts- harassment over anonymous social media in Bangladesh.”
donald-trump-ban-after-elon-musks-poll. ICTD ‘19: Proceedings of the Tenth International Conference
on Information and Communication Technologies and
Moloney, Anastasia. 2019. “LGBT+ murders at ‘alarming’ levels in Development. https://doi.org/10.1145/3287098.3287107.
Latin America — study.” Reuters, August 8. www.reuters.com/
article/us-latam-lgbt-killings-idUSKCN1UY2GM. Nwaodike, Chioma and Nerissa Naidoo. 2020. Fighting Violence
Against Women Online: A Comparative Analysis of Legal
Moolman, Jan. 2018. “Recognition of Online GBV in International Frameworks in Ethiopia, Kenya, Senegal, South Africa, and
Law: The Highs and Lows.” APC, June 17. www.genderit.org/ Uganda. Pollicy. August. http://ogbv.pollicy.org/legal_
editorial/editorial-recognition-online-gbv-international-law- analysis.pdf.
highs-and-lows.
Odeh, Shahrazad. 2018. A Violent Network: Gender-Based Violence
Munusamy, Kiruba. 2018. “Intersection of Identities: Online Gender Against Palestinian: Women in Virtual Space. 7amleh — The
and Caste Based Violence.” GenderIT, June 7. Arab Center for the Advancement of Social Media. November.
www.genderit.org/articles/intersection-identities-online- https://7amleh.org/wp-content/uploads/2018/11/Report_
gender-and-caste-based-violence. GBV_-_KtK.pdf.
Muya, Catherine. 2021. “The Law Should Work for Us: Assessing Office of the Council of Europe Commissioner for Human Rights.
Gaps in Kenya’s Regulatory Framework to Build a Safer Internet 2021. “Human Rights of LGBTI People in Europe: Current
for Women and Girls.” Open Internet for Democracy Initiative. Threats to Equal Rights, Challenges Faced by Defenders, and
https://openinternet.global/sites/default/files/2021-09/ the Way Forward.” Online event, February 9. https://rm.coe.
The%20Law%20Should%20Work%20for%20Us.pdf. int/human-rights-of-lgbti-people-in-europe-current-threats-to-
equal-rights/1680a4be0e.
Namaste, Ki. 1996. “Genderbashing: Sexuality, Gender, and the
Regulation of Public Space.” Environment and Planning D: Office of the United Nations High Commissioner for Human Rights.
Society and Space 14 (2): 221–40. https://doi.org/10.1068/ 2021. “The Gender Digital Divide Is a Reflection of the Overall
d140221. Discrimination Faced by Women and Girls, High Commissioner
for Human Rights Tells Human Rights Council.” Press release,
National Coalition of Anti-Violence Programs. 2016. Lesbian, Gay,
September 27. www.ohchr.org/en/press-releases/2021/09/
Bisexual, Transgender, Queer, and HIV-Affected Intimate
gender-digital-divide-reflection-overall-discrimination-faced-
Partner Violence in 2015. New York, NY: National Coalition
women-and.
of Anti-Violence Programs. https://avp.org/wp-content/
uploads/2017/04/2015_ncavp_lgbtqipvreport.pdf. Oliver, Kelly. 2015. “Rape as Spectator Sport and Creepshot
Entertainment: Social Media and the Valorization of Lack of
National Network to End Domestic Violence. 2014. “A
Consent.” American Studies Journal (10): 1–16.
Glimpse from the Field: How Abusers Are Misusing
https://philpapers.org/rec/OLIRAS.
Technology.” https://static1.squarespace.com/
static/51dc541ce4b03ebab8c5c88c/t/54e3d1b6 O’Malley, Roberta Liggett and Karen M. Holt. 2022. “Cyber
e4b08500fcb455a0/1424216502058/NNEDV_ Sextortion: An Exploratory Analysis of Different Perpetrators
Glimpse+From+the+Field+-+2014.pdf. Engaging in a Similar Crime.” Journal of Interpersonal
Violence 37 (1–2): 258–83. https://doi.org/
Neris, Natália, Juliana Pacetta Ruiz and Mariana Giorgetti Valente.
10.1177/0886260520909186.
2018. “Fighting the Dissemination of Non-Consensual Intimate
Images: a comparative analysis.” Internet Lab. Ontario Human Rights Commission. 2014. “Policy on preventing
www.internetlab.org.br/wp-content/uploads/2018/11/ discrimination because of gender identity and gender
Fighting_the_Dissemination_of_Non.pdf. expression.” www.ohrc.on.ca/en/policy-preventing-
discrimination-because-gender-identity-and-gender-
expression.
86
Oswald, Flora, Alex Lopes, Kaylee Skoda, Cassandra L. Hesse and Powell, Anastasia and Nicola Henry. 2015. “Digital Harassment
Cory L. Pedersen. 2020. “I’ll Show You Mine so You’ll Show Me and Abuse of Adult Australians: A Summary Report.” RMIT
Yours: Motivations and Personality Variables in Photographic University. www.parliament.nsw.gov.au/lcdocs/other/7351/
Exhibitionism.” The Journal of Sex Research 57 (5): 597–609. Tabled%20Document%20-Digital%20Harassment%20and%20
https://doi.org/10.1080/00224499.2019.1639036. Abuse%20of%20A.pdf.
Padte, Richa Kaul and Anja Kovacs. 2013. “Keeping women ———. 2016. “Policing technology-facilitated sexual violence against
safe? Gender, online harassment and Indian law." Internet Democracy Project, June 29. https://internetdemocracy.in/reports/keeping-women-safe-gender-online-harassment-and-indian-law.

Paola, Steffania, Constanza Figueroa, Juliana Guerra, Rocío Consales, Violeta Cereceda and Vladimir Garay. 2017. Latin America in a Glimpse: Gender, Feminism and the Internet in Latin America. Derechos Digitales. November. www.derechosdigitales.org/wp-content/uploads/GlImpse2017_eng.pdf.

Palumbo, Mariana and Delfina Schenone Sienra. 2017. "EROTICS Global Survey 2017: Sexuality, rights and internet regulations." APC. December 19. www.apc.org/en/pubs/erotics-global-survey-2017-sexuality-rights-and-internet-regulations.

Parsons, Christopher, Adam Molnar, Jakub Dalek, Jeffrey Knockel, Miles Kenyon, Bennett Haselton, Cynthia Khoo and Ronald Deibert. 2019. The Predator in Your Pocket: A Multidisciplinary Assessment of the Stalkerware Application Industry. Research Report No. 119. June. The Citizen Lab. https://citizenlab.ca/docs/stalkerware-holistic.pdf.

Philip, Shannon. 2018. "Youth and ICTs in a 'new' India: exploring changing gendered online relationships among young urban men and women." Gender & Development 26 (2): 313–24. https://doi.org/10.1080/13552074.2018.1473231.

Posetti, Julie. 2017. "Fighting back against prolific online harassment: Maria Ressa." In An Attack on One is an Attack on All, edited by Larry Kilman, 37–40. Paris, France: UNESCO. https://en.unesco.org/an-attack-on-one.

Posetti, Julie and Nabeelah Shabbir. 2022. The Chilling: What More Can News Organisations Do to Combat Gendered Online Violence? UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000383043.locale=en.

Posetti, Julie, Nermine Aboulez, Kalina Bontcheva, Jackie Harrison and Silvio Waisbord. 2020. "Online Violence against Women Journalists: A Global Snapshot of Incidence and Impacts." International Center for Journalists and UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000375136.

Posetti, Julie, Nabeelah Shabbir, Diana Maynard, Kalina Bontcheva and Nermine Aboulez. 2021. "The Chilling: Global Trends in Online Violence against Women Journalists." Research Discussion Paper. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000377223/PDF/377223eng.pdf.multi.

adult victims: police and service sector perspectives." Policing and Society 28 (3): 291–307. https://doi.org/10.1080/10439463.2016.1154964.

———. 2017. Sexual Violence in a Digital Age. London, UK: Palgrave MacMillan. https://link.springer.com/book/10.1057/978-1-137-58047-4.

Powell, Anastasia, Adrian J. Scott, Asher Flynn and Nicola Henry. 2020. "Image-based sexual abuse: An international study of victims and perpetrators." Summary Report. RMIT University. https://research.monash.edu/en/publications/image-based-sexual-abuse-an-international-study-of-victims-and-pe.

Radhakrishnan, Radhika. 2020. "'I took Allah's name and stepped out': Bodies, Data and Embodied Experiences of Surveillance and Control During COVID-19 in India." Data Governance Network Working Paper 12. Internet Democracy Project, November 12. https://internetdemocracy.in/reports/i-took-allahs-name-and-stepped-out-bodies-data-and-embodied-experiences-of-surveillance-and-control-during-covid-19-in-india.

Ramaseshan, Geeta, Sudaroli Ramasamy, M. R. Sangeetha, S. Prabha, Nandita Krishna and Shreeja Kumar. 2019. "Towards a safer cyberzone: A study on gender and online violence in Tamil Nadu." IT for Change. https://itforchange.net/sites/default/files/add/TamilNadu-Report_Righting-Gender-Wrongs.pdf.

Ramos, Raphaela. 2020. "Violência contra a mulher na internet cresce na quarentena. Saiba como identificar e se defender." O Globo Celina, May 22. https://oglobo.globo.com/celina/violencia-contra-mulher-na-internet-cresce-na-quarentena-saiba-como-identificar-se-defender-1-24438989.

Reporters Without Borders. 2018a. Online Harassment of Journalists: Attack of the Trolls. https://rsf.org/sites/default/files/rsf_report_on_online_harassment.pdf.

———. 2018b. Women's Rights: Forbidden Subject. Paris, France: Reporters Without Borders. https://rsf.org/sites/default/files/womens_rights-forbidden_subject.pdf.

Ribeiro, Manoel Horta, Raphael Ottoni, Robert West, Virgílio A. F. Almeida and Wagner Meira Jr. 2020. "Auditing radicalization pathways on YouTube." FAT '20: Proceedings of the 2020 Conference on Fairness, Accountability and Transparency, 131–41.
Ruiz, Juliana Pacetta, Mariana Giorgetti Valente and Natália Neris. 2019. "Between the perpetrator and the victim: the role of internet intermediaries on violations against women." Sociology & Technoscience 9 (1): 9–27.

Ryakitimbo, Rebecca. 2018. "Meltdown of Protections for Data and Privacy in Tanzania for LGBTQIA and Others." GenderIT, November 15. www.apc.org/en/blog/meltdown-protections-data-and-privacy-tanzania-lgbtqia-and-others.

Sallam, Yara. 2018. "Online Violence Faced by Outspoken Activists: The Case from Egypt." GenderIT, June 20. www.genderit.org/feminist-talk/online-violence-faced-outspoken-activists-case-egypt.

Sambasivan, Nithya, Amna Batool, Nova Ahmed, Tara Matthews, Kurt Thomas, Laura Sanely Gaytan-Lugo, David Nemer, Elie Bursztein, Elizabeth Churchill and Sunny Consolvo. 2019. "'They Don't Leave Us Alone Anywhere We Go': Gender and Digital Abuse in South Asia." CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3290605.3300232.

Sambuli, Nanjira, Ana Brandusescu and Ingrid Brudvig. 2018. "Advancing Women's Rights Online: Gaps and Opportunities in Policy and Research." World Wide Web Foundation. August. http://webfoundation.org/docs/2018/08/Advancing-Womens-Rights-Online_Gaps-and-Opportunities-in-Policy-and-Research.pdf.

Sanya, Brenda Nyandiko. 2013. "Disrupting patriarchy: An examination of the role of e-technologies in rural Kenya." Feminist Africa 18: 12–24. https://feministafrica.net/wp-content/uploads/2019/10/features_disrupting_patriarchy.pdf.

Scheim, Ayden I. and Greta R. Bauer. 2019. "Sexual Inactivity Among Transfeminine Persons: A Canadian Respondent-Driven Sampling Survey." The Journal of Sex Research 56 (2): 264–71. https://doi.org/10.1080/00224499.2017.1399334.

Segal, Murray D. 2015. Independent Review of the Police and Prosecution Response to the Rehtaeh Parsons Case. https://novascotia.ca/segalreport/Parsons-Independent-Review.pdf.

Sequera, Maricarmen. 2021. "Non-Consensual Image Dissemination in Paraguay: An Exploratory Research." TEDIC. June 7. www.tedic.org/wp-content/uploads/2021/06/Imagen-no-consentida-Tedic-web-EN.pdf.

Sey, Araba and Nancy Hafkin, eds. 2019. Taking Stock: Data and Evidence on Gender Equality in Digital Access, Skills, and Leadership. Report of EQUALS Research Group, led by the United Nations University. March. https://i.unu.edu/media/cs.unu.edu/attachment/4040/EQUALS-Research-Report-2019.pdf.

Shanahan, Matthew. 2022. The Mobile Gender Gap Report. GSMA Intelligence. June. www.gsma.com/r/wp-content/uploads/2022/06/The-Mobile-Gender-Gap-Report-2022.pdf.

Shariff, Shaheen and Karen Eltis. 2017. "Addressing Online Sexual Violence: An Opportunity for Partnerships between Law and Education." Education & Law Journal 27 (1): 99–117.

Silva, Ilena. 2022. "Towards a feminist framework for AI development: from principles to practice." Derechos Digitales, August 16. www.derechosdigitales.org/publicaciones/towards-a-feminist-framework-for-ai-development-from-principles-to-practice/.

Silva, Jason and Emily Ann Greene-Colozzi. 2019. "Fame-seeking mass shooters in America: severity, characteristics, and media coverage." Aggression and Violent Behavior 48: 24–35. https://doi.org/10.1016/j.avb.2019.07.005.

Sívori, Horacio and Bruno Zilli. 2022. "Anti-rights Discourse in Brazilian Social Media: Digital Networks, Violence and Sex Politics." Latin American Center on Sexuality and Human Rights, March 10. www.apc.org/en/pubs/anti-rights-discourse-brazilian-social-media-digital-networks-violence-and-sex-politics.

Slane, Andrea and Ganaele Langlois. 2016. "Regulating Business Models that Capitalize on User Posted Personal Information of Others: How Can Canada's Privacy Regime Protect Victims of Online Shaming Businesses?" August. www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/completed-consultations/consultation-on-online-reputation/submissions-received-for-the-consultation-on-online-reputation/or/sub_or_01/.

Sobieraj, Sarah. 2017. "Bitch, slut, skank, cunt: patterned resistance to women's visibility in digital publics." Information, Communication & Society 21 (11): 1700–14. https://doi.org/10.1080/1369118X.2017.1348535.

———. 2020. Credible Threat: Attacks Against Women Online and the Future of Democracy. New York, NY: Oxford University Press.

Strand, Cecilia and Jakob Svensson. 2021. "Disinformation campaigns about LGBTI+ people in the EU and foreign influence." European Parliament Briefing. July. www.europarl.europa.eu/RegData/etudes/BRIE/2021/653644/EXPO_BRI(2021)653644_EN.pdf.

Sugiura, Lisa. 2021. "Legitimising Misogyny." In The Incel Rebellion: The Rise of the Manosphere and the Virtual War Against Women, 95–115. Bingley, UK: Emerald.

Suzor, Nicolas, Molly Dragiewicz, Bridget Harris, Rosalie Gillett, Jean Burgess and Tess Van Geelen. 2019. "Human Rights by Design: The Responsibilities of Social Media Platforms to Address Gender-Based Violence Online." Policy & Internet 11 (1): 84–103. https://doi.org/10.1002/poi3.185.
Tabachnick, Barbara G. and Linda S. Fidell. 2019. Using Multivariate Statistics. 7th ed. New York, NY: Pearson Education.

TGEU. 2021. "TMM Update TDoR 2021." November 11. https://transrespect.org/en/tmm-update-tdor-2021/.

Thomasen, Kristen. 2018. "Beyond Airspace Safety: A Feminist Perspective on Drone Privacy Regulation." Canadian Journal of Law and Technology 16 (2): 307–28. https://digitalcommons.schulichlaw.dal.ca/cjlt/vol16/iss2/4/.

Thomasen, Kristen and Suzie Dunn. 2021. "Reasonable Expectations of Privacy in an Era of Drones and Deepfakes: Expanding the Supreme Court of Canada's Decision in R v Jarvis." In The Emerald International Handbook of Technology-Facilitated Violence and Abuse, edited by Asher Flynn, Nicola Henry and Jane Bailey, 555–76. Bingley, UK: Emerald. https://doi.org/10.1108/978-1-83982-848-520211040.

Thoreson, Ryan and Sam Cook, eds. 2011. Nowhere to Turn: Blackmail and Extortion of LGBT People in Sub-Saharan Africa. International Gay and Lesbian Human Rights Commission. www.world-psi.org/en/nowhere-turn-blackmail-and-extortion-lgbt-people-sub-saharan-africa.

Trans PULSE. 2011. "We've Got Work to Do: Workplace Discrimination and Employment Challenges for Trans People in Ontario." Trans PULSE E-Bulletin, May 30.

Treuthart, Mary Pat. 2019. "Connectivity: The Global Gender Digital Divide and Its Implications for Women's Human Rights and Equality." Gender and International Law 23 (1): 1–54. https://gjil.scholasticahq.com/article/12338-connectivity-the-global-gender-digital-divide-and-its-implications-for-women-s-human-rights-and-equality.

Trevisan, João Silvério. 2018. Devassos no Paraíso: A homossexualidade no Brasil, da colônia à atualidade. 4th ed. Rio de Janeiro, Brazil: Objetiva.

Tucker, Nia. 2020. "COVID-19 is Leaving Women and LGBTQIA+ People in the USA Vulnerable to Online Surveillance." GenderIT, August 5. https://genderit.org/feminist-talk/covid-19-leaving-women-and-lgbtqia-people-usa-vulnerable-online-surveillance.

Tyers-Chowdhury, Alexandra and Gerda Binder. 2021. "What we know about the gender digital divide for girls: A literature review." UNICEF Gender and Innovation Evidence Briefs. www.unicef.org/eap/reports/innovation-and-technology-gender-equality-0.

Udwadia, Zarah and Baldeep Grewal. 2019. "Free to Be Mobile: Ensuring That Women, Girls, Queer and Trans Persons Can Inhabit Digital Spaces Freely and Fearlessly." Point of View, March 11. www.apc.org/en/pubs/free-be-mobile-ensuring-women-girls-queer-and-trans-persons-can-inhabit-digital-spaces-freely.

Uhl, Carolyn A., Katlin J. Rhyner, Cheryl A. Terrance and Noel R. Lugo. 2018. "An examination of nonconsensual pornography websites." Feminism & Psychology 28 (1): 50–68. https://journals.sagepub.com/doi/10.1177/0959353517720225.

UN Committee on the Elimination of Discrimination against Women. 2017. "General recommendation No. 35 on gender-based violence against women, updating general recommendation No. 19." CEDAW/C/GC/35. July 14. https://digitallibrary.un.org/record/1305057?ln=en.

UN Women. 2022. "Toolkit: Youth Guide to End Online Gender-Based Violence." December 14. https://asiapacific.unwomen.org/sites/default/files/2022-12/Youth-Toolkit_14-Dec_compressed-final.pdf.

———. 2023a. Innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls. Expert guidance and substantive inputs to preparations for the 67th Session of the Commission on the Status of Women. March 18. www.unwomen.org/sites/default/files/2023-02/CSW67-Expert-Group-Meeting-report-en.pdf.

———. 2023b. Innovation and technological change, and education in the digital age for achieving gender equality and the empowerment of all women and girls. Agreed Conclusions. 67th Session of the Commission on the Status of Women. March 20. www.unwomen.org/en/csw/csw67-2023/session-outcomes.

United Nations. 2018. "UN experts call on India to protect journalist Rana Ayyub from online hate campaign." Press release, May 24. www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=23126&LangID=E.

———. 2022. "Defenders of the human rights of LGBT persons constantly at risk, warn UN experts." Statement, March 24. www.ohchr.org/en/statements/2022/03/defenders-human-rights-lgbt-persons-constantly-risk-warn-un-experts.

United Nations General Assembly. 2018. "Report of the Independent Expert on protection against violence and discrimination based on sexual orientation and gender identity." A/73/152. https://digitallibrary.un.org/record/1639754?ln=en.
United Nations Human Rights Council. 2018. "Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective." A/HRC/38/47. June 18. https://digitallibrary.un.org/record/1641160.

United Nations Population Fund. 2021. Technology-facilitated Gender-based Violence: Making all spaces safe. December. www.unfpa.org/sites/default/files/pub-pdf/UNFPA-TFGBV-Making%20All%20Spaces%20Safe.pdf.

US Department of State. 2022. "2022 Roadmap for the Global Partnership for Action on Gender-Based Online Harassment and Abuse." March 16. www.state.gov/2022-roadmap-for-the-global-partnership-for-action-on-gender-based-online-harassment-and-abuse/.

Vaillancourt, Tracy, Heather Brittain, Amanda Krygsman, Ann H. Farrell, Sally Landon and Debra Pepler. 2021. "School bullying before and during COVID‐19: Results from a population‐based randomized design." Aggressive Behavior 47 (5): 557–69. https://doi.org/10.1002/ab.21986.

Valente, Mariana. 2018. "Do We Need New Laws to Address Non-Consensual Circulation of Intimate Images: The Case of Brazil." GenderIT, June 17. www.genderit.org/articles/do-we-need-new-laws-address-non-consensual-circulation-intimate-images-case-brazil.

Valente, Mariana Giorgetti, Natália Neris, Juliana Pacetta Ruiz and Lucas Bulgarelli. 2016. The Body is the Code: Legal Strategies to Combat Revenge Porn in Brazil. São Paulo, Brazil: InternetLab.

Van Der Wilk, Adriane. 2018. Cyber violence and hate speech online against women. Policy Department for Citizens' Rights and Constitutional Affairs. PE 604.979. September. www.europarl.europa.eu/RegData/etudes/STUD/2018/604979/IPOL_STU(2018)604979_EN.pdf.

Vashistha, Aditya, Abhinav Garg, Richard Anderson and Agha Ali Raza. 2019. "Threats, Abuses, Flirting, and Blackmail: Gender Inequity in Social Media Voice Forums." CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3290605.3300302.

Vasudevan, Amrita. 2018. Report of the National Dialogue on Gender-Based Cyber Violence. IT for Change. https://projects.itforchange.net/e-vaw/wp-content/uploads/2018/03/Event-Report-of-National-Dialogue-on-Gender-Based-Cyber-Violence.pdf.

Villamil, Daniela García. 2022. "Advancing gender equality and women's digital empowerment in the Global South." Policy Brief No. 14. Southern Voice, September 30. http://southernvoice.org/advancing-gender-equality-and-womens-digital-empowerment-in-the-global-south.

Virupaksha, H. G., Daliboyina Muralidhar and Jayashree Ramakrishna. 2016. "Suicide and Suicidal Behavior among Transgender Persons." Indian Journal of Psychological Medicine 38 (6): 505–09. https://doi.org/10.4103/0253-7176.194908.

Vogels, Emily A. 2021. The State of Online Harassment. Pew Research Center. January 13. www.pewresearch.org/internet/wp-content/uploads/sites/9/2021/01/PI_2021.01.13_Online-Harassment_FINAL-1.pdf.

Waisbord, Silvio. 2020. "Mob Censorship: Online Harassment of US Journalists in Times of Digital Hate and Populism." Digital Journalism 8 (8): 1030–46. https://doi.org/10.1080/21670811.2020.1818111.

Waldman, Ari Ezra. 2019. "Law, Privacy, and Online Dating: 'Revenge Porn' in Gay Online Communities." Law & Social Inquiry 44 (4): 987–1018. www.cambridge.org/core/journals/law-and-social-inquiry/article/law-privacy-and-online-dating-revenge-porn-in-gay-online-communities/BCCE05CF25AA4C2E05CCF8D64980E839.

Ward, Zara. 2021. "Intimate image abuse, an evolving landscape." Revenge Porn Helpline. https://revengepornhelpline.org.uk/assets/documents/intimate-image-abuse-an-evolving-landscape.pdf?_=1639471939.

Wells, Georgia, Jeff Horwitz and Deepa Seetharaman. 2021. "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show." The Wall Street Journal, September 14. www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739.

Wilton Park. 2022. Building a shared agenda on the evidence base for Gender-Based Online Harassment and Abuse. WP3057.

Wittes, Benjamin, Cody Poplin, Quinta Jurecic and Clara Spera. 2016. "Sextortion: Cybersecurity, teenagers, and remote sexual assault." Center for Technology Innovation at Brookings, May 11. www.brookings.edu/research/sextortion-cybersecurity-teenagers-and-remote-sexual-assault/.

Wolak, Janis and David Finkelhor. 2016. "Sextortion: Findings from a survey of 1,631 victims." Crimes Against Children Research Center. https://calio.dspacedirect.org/handle/11212/3037.

Woodlock, Delanie. 2017. "The Abuse of Technology in Domestic Violence and Stalking." Violence Against Women 23 (5): 584–602. https://doi.org/10.1177/1077801216646277.

Woodlock, Delanie, Karen Bentley, Darcee Schulze, Natasha Mahoney, Donna Chung and Amy Pracilio. 2020. Second National Survey on Technology Abuse and Domestic Violence in Australia. WESNET. November. https://wesnet.org.au/about/research/2ndnatsurvey/.
World Health Organization. 2021. "Violence against women." Fact sheet. March 9. www.who.int/news-room/fact-sheets/detail/violence-against-women.

Yahaya, Mardiya Siba and Neema Iyer. 2022. (In)Visible: The Digital Threats Muslim Women Human Rights Defenders Face in the Greater Horn of Africa. Pollicy and Musawah. May. www.musawah.org/wp-content/uploads/2022/05/InVisible-The-Digital-Threats-Muslim-Women-Human-Rights-Defenders-Face-in-the-Greater-Horn-of-Africa.pdf.
Appendix
Table A1: Experiences with Forms of Online Harm

Percentages are reported for the full sample, by gender (women, men, and transgender and gender-diverse people), by sexual orientation (heterosexual, LGB+) and, where the interaction was examined, by sexual orientation within gender. Chi-square tests are reported for form x gender x sexual orientation, form x gender and form x sexual orientation. NS = not significant.

Any form of online harm: full sample 59.7. Form x gender x sexual orientation: χ2(2, 12001)=6.193, p=.045; form x gender: χ2(2, 12001)=13.307, p=.001; form x sexual orientation: χ2(1, 12001)=126.611, p<.001. Women 59.9, men 57.0, transgender and gender-diverse people 67.8; heterosexual 57.2, LGB+ 75.8. Interaction: women (heterosexual 58.6, LGB+ 76.7), men (heterosexual 55.7, LGB+ 72.6), transgender and gender-diverse people (heterosexual 57.4, LGB+ 87.7).

Repeated unwanted contact: full sample 37.7. Form x gender x sexual orientation: NS; form x gender: χ2(2, 11846)=84.873, p<.001; form x sexual orientation: χ2(1, 11846)=49.535, p<.001. Women 39.4, men 31.3, transgender and gender-diverse people 40.3; heterosexual 34.6, LGB+ 46.3. Interaction: not examined.

Unsolicited sexual images: full sample 28.1. Form x gender x sexual orientation: NS; form x gender: χ2(2, 11865)=57.055, p<.001; form x sexual orientation: χ2(1, 11865)=95.585, p<.001. Women 28.9, men 22.8, transgender and gender-diverse people 31.1; heterosexual 24.8, LGB+ 40.1. Interaction: not examined.

Unauthorized access: full sample 24.5. Form x gender x sexual orientation: NS; form x gender: NS; form x sexual orientation: χ2(1, 11881)=30.530, p<.001. Gender breakdown: not examined; heterosexual 24.1, LGB+ 32.8. Interaction: not examined.

Monitored, tracked or spied on: full sample 14.7. Form x gender x sexual orientation: χ2(2, 11797)=7.466, p=.024; form x gender: χ2(2, 11797)=24.297, p<.001; form x sexual orientation: χ2(1, 11797)=13.860, p<.001. Women 12.5, men 14.6, transgender and gender-diverse people 24.0; heterosexual 13.3, LGB+ 18.6. Interaction: women (heterosexual 11.9, LGB+ 19.7), men (heterosexual 14.5, LGB+ 15.5), transgender and gender-diverse people (heterosexual 21.3, LGB+ 29.1).

Doxing: full sample 14.7. Form x gender x sexual orientation: NS; form x gender: χ2(2, 11866)=50.582, p<.001; form x sexual orientation: NS. Women 12.8, men 17.1, transgender and gender-diverse people 23.6; sexual orientation breakdown: not examined. Interaction: not examined.

Blackmail: full sample 12.1. Form x gender x sexual orientation: NS; form x gender: χ2(2, 11885)=35.368, p<.001; form x sexual orientation: χ2(1, 11885)=34.385, p<.001. Women 10.1, men 12.7, transgender and gender-diverse people 23.1; heterosexual 11.0, LGB+ 18.6. Interaction: not examined.

Networked harassment: full sample 11.8. Form x gender x sexual orientation: χ2(2, 11888)=7.396, p=.025; form x gender: χ2(2, 11888)=50.859, p<.001; form x sexual orientation: χ2(1, 11888)=56.919, p<.001. Women 9.3, men 11.5, transgender and gender-diverse people 27.8; heterosexual 9.9, LGB+ 19.6. Interaction: women (heterosexual 8.5, LGB+ 19.4), men (heterosexual 11.0, LGB+ 18.1), transgender and gender-diverse people (heterosexual 26.7, LGB+ 29.9).
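The chi-square statistics reported in these tables are tests of independence on contingency tables of counts (for example, experienced versus did not experience a given harm, crossed with gender). As a minimal illustration of how such a test is computed, and not the authors' analysis code, the sketch below runs scipy.stats.chi2_contingency on hypothetical counts; the numbers are invented for the example only.

```python
# Illustrative sketch only: a chi-square test of independence of the kind
# reported in Table A1 (e.g., form of online harm x gender).
# The counts below are hypothetical and are NOT the survey data.
from scipy.stats import chi2_contingency

# Rows: experienced / did not experience the harm.
# Columns: women, men, transgender and gender-diverse people.
observed = [
    [3050, 2880, 140],  # experienced the harm
    [2040, 2170,  60],  # did not experience the harm
]

chi2, p, dof, expected = chi2_contingency(observed)
n = sum(sum(row) for row in observed)
print(f"chi2({dof}, N={n}) = {chi2:.3f}, p = {p:.4f}")  # dof = 2 for a 2 x 3 table
```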
Table A2: Reported Impacts of Online Harm

Percentages are the share rating the impact as very negative, reported for the full sample, by gender, by sexual orientation and, where examined, by sexual orientation within gender. Chi-square tests are reported for impact x gender x sexual orientation, impact x gender and impact x sexual orientation.

Sexual autonomy/freedom: full sample 16.2. Impact x gender x sexual orientation: χ2(2, 6728)=7.717, p=.021; impact x gender: χ2(2, 6728)=16.159, p<.001; impact x sexual orientation: χ2(2, 6728)=36.241, p<.001. Women 16.8, men 14.6, transgender and gender-diverse people 28.4; heterosexual 14.9, LGB+ 25.1. Interaction: women (heterosexual 16.4, LGB+ 21.5), men (heterosexual 13.0, LGB+ 26.9), transgender and gender-diverse people (heterosexual 23.8, LGB+ 35.4).
Table A3: Perceptions of Who OGBV Is a Big Problem For (reported by gender, %)
Table A4: Perceptions of the Harmfulness of Online Harms

Percentages are the share of respondents rating the harm as "extremely harmful," reported for the full sample, by gender, by sexual orientation and, where examined, by sexual orientation within gender. Chi-square tests are reported for harmfulness x gender x sexual orientation, harmfulness x gender and harmfulness x sexual orientation.

Non-consensual distribution of intimate images: full sample 76.6. Harmfulness x gender x sexual orientation: NS; harmfulness x gender: χ2(2, 11637)=235.435, p<.001; harmfulness x sexual orientation: NS. Women 82.8, men 71.2, transgender and gender-diverse people 60.0; sexual orientation breakdown: not examined. Interaction: not examined.

Repeated unwanted contact: full sample 49.9. Harmfulness x gender x sexual orientation: χ2(2, 11550)=6.386, p=.041; harmfulness x gender: χ2(2, 11550)=171.896, p<.001; harmfulness x sexual orientation: χ2(1, 11550)=12.535, p=.035. Women 55.6, men 43.4, transgender and gender-diverse people 43.8; heterosexual 50.2, LGB+ 43.8. Interaction: women (heterosexual 56.5, LGB+ 46.2), men (heterosexual 43.6, LGB+ 42.2), transgender and gender-diverse people (heterosexual 49.0, LGB+ 39.7).
Table A5: Young People — Perceptions of the Harmfulness of Online Harms

Any form of online harm: 59.7; χ2(1, 17817)=325.054, p<.001; 57.2; 77.2.
Reputation and identity-based harms: 37.6; χ2(1, 17785)=560.933, p<.001; 34.3; 60.3.
Any coercion and harassment: 45.0; χ2(1, 17789)=388.193, p<.001; 42.2; 64.4.
Any privacy and security-based harms: 34.4; χ2(1, 17767)=462.260, p<.001; 31.5; 54.6.
Table A7: Most Serious Incident — Frequency

Percentages are reported for the full sample (% chronic), by gender, by sexual orientation and, where examined, by sexual orientation within gender. Chi-square tests are reported for frequency x gender x sexual orientation, frequency x gender and frequency x sexual orientation.

Chronic (monthly, weekly and daily) exposure: full sample 12.6. Frequency x gender x sexual orientation: NS; frequency x gender: χ2(2, 6501)=8.078, p=.018; frequency x sexual orientation: χ2(1, 6501)=11.552, p<.001. Women 13.7, men 14.3, transgender and gender-diverse people 25.5; heterosexual 13.5, LGB+ 19.3. Interaction: not examined.

Reason (% yes)

Your sexual orientation: full sample 7.0. Reason x gender x sexual orientation: χ2(2, 7039)=7.723, p=.021; reason x gender: χ2(2, 7039)=38.886, p<.001; reason x sexual orientation: χ2(1, 7039)=585.192, p<.001. Women 7.9, men 12.2, transgender and gender-diverse people 25.7; heterosexual 6.6, LGB+ 42.7. Interaction: women (heterosexual 5.4, LGB+ 32.6), men (heterosexual 7.8, LGB+ 53.3), transgender and gender-diverse people (heterosexual 11.2, LGB+ 42.3).

Your religion: full sample 12.1. Reason x gender x sexual orientation: NS; reason x gender: χ2(2, 7039)=70.390, p<.001; reason x sexual orientation: NS. Women 7.7, men 13.9, transgender and gender-diverse people 14.1; sexual orientation breakdown: not examined. Interaction: not examined.

Your disability: full sample 3.5. Reason x gender x sexual orientation: NS; reason x gender: χ2(2, 7039)=36.480, p<.001; reason x sexual orientation: NS. Women 2.7, men 5.4, transgender and gender-diverse people 7.0; sexual orientation breakdown: not examined. Interaction: not examined.
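The "interaction" entries in these tables are simply the percentage reporting the outcome within each gender by sexual orientation cell (for example, heterosexual women versus LGB+ women). As a minimal sketch of how such cell percentages can be tabulated, and not the authors' analysis code, the example below groups hypothetical respondent records with pandas; the records are invented for the example only.

```python
# Illustrative sketch only: percentage reporting an outcome within each
# gender x sexual orientation cell. The records are hypothetical, NOT the survey data.
import pandas as pd

df = pd.DataFrame(
    {
        "gender": ["woman", "woman", "man", "man", "woman", "man"],
        "orientation": ["heterosexual", "LGB+", "heterosexual", "LGB+", "LGB+", "heterosexual"],
        "harmed": [1, 1, 0, 1, 0, 0],  # 1 = reported the harm, 0 = did not
    }
)

# The mean of a 0/1 indicator within each cell, times 100, is the cell percentage.
cell_pct = df.groupby(["gender", "orientation"])["harmed"].mean().mul(100).round(1)
print(cell_pct)
```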
About the Authors
Suzie Dunn is a senior fellow at CIGI. She is an assistant professor at Dalhousie University's Schulich School of Law in Halifax, Nova Scotia. Her research centres on the intersections of gender, equality, technology and the law, with a specific focus on technology-facilitated gender-based violence (TFGBV), the non-consensual distribution of intimate images and impersonation in digital spaces. As a subject matter expert on TFGBV, Suzie has been a key contributor to CIGI's Supporting a Safer Internet project.

Tracy Vaillancourt is a senior fellow at CIGI. She is a Tier 1 Canada Research Chair in School-Based Mental Health and Violence Prevention at the University of Ottawa (uOttawa). She is cross-appointed as a full professor in counselling psychology, the Faculty of Education and the School of Psychology, Faculty of Social Sciences. At uOttawa, Tracy is a member of the Brain and Mind Institute, Faculty of Medicine, and the Centre for Health Law, Policy and Ethics with the Faculty of Law. She is the president of the International Society for Research on Aggression, a fellow and chair of the Royal Society of Canada Task Force on COVID-19, and the chief editor of the Child Mental Health and Interventions sections of Frontiers in Child and Adolescent Psychiatry.

Heather Brittain is a Vanier Scholar who is completing her doctoral degree in the Faculty of Education at the University of Ottawa in Tracy Vaillancourt's Brain and Behaviour Laboratory. She obtained master's degrees in education and statistics. Heather's research is focused on how the experience of childhood adversity, such as being the victim of peer abuse, impacts academic functioning and how these experiences relate to functional outcomes during adulthood, such as post-secondary educational success and job stability for women and men.
LEAF  Women's Legal Education and Action Fund
LGB+  lesbian, gay, bisexual or other non-heterosexual sexual orientations
UAE  United Arab Emirates
UNESCO  United Nations Educational, Scientific and Cultural Organization
67 Erb Street West
Waterloo, ON, Canada N2L 6C2
www.cigionline.org
@cigionline