Policy:Wikimedia Foundation Combating Online Child Exploitation Policy
This policy or procedure is maintained by the Wikimedia Foundation. Please note that in the event of any differences in meaning or interpretation between the original English version of this content and a translation, the original English version takes precedence.
The Combating Online Child Exploitation Policy is a set of guidelines and procedures regarding changes to or removal of content on the Wikimedia projects, or actions against specific individuals, performed by Foundation staff members upon receipt of one or more valid complaints from the volunteer community or the public, and/or as required by law.
This policy supports existing policies around user conduct, such as the Universal Code of Conduct, the Wikimedia Foundation Global Ban Policy, and the Office Actions Policy. It serves to formalize existing "unwritten" policies and similar gray areas between these policies. It also supports the Foundation's commitment to protect children's rights set out in our Human Rights Policy as well as recommendations made in the Foundation's 2023 Child Rights Impact Assessment. All office actions are performed pursuant to the Terms of Use.
Purpose and scope
The Wikimedia projects attract contributors of all backgrounds, spanning a large demographic range. Given this range, some volunteers, including minors, are inherently more at risk and potentially vulnerable to exploitation and disproportionate levels of abuse. The Wikimedia Foundation has a zero-tolerance policy regarding content or activity that sexually exploits or puts minors at risk.
This policy was developed following extensive discussion with and feedback from community, industry and subject matter experts, and Foundation staff stakeholders. Any updates to this policy will be recorded as soon as possible, but may go into effect before the public document is changed. Questions about changes or current practice may be raised on the talk page or emailed to ca@wikimedia.org.
Child sexual abuse material ("CSAM")
The definition of child sexual abuse material ("CSAM") across industry best practices is broad and covers a wide range of subjects and acts. Generally, CSAM refers to any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, live streams, and digital or computer-generated images, including AI-generated content that is indistinguishable from an actual minor.
CSAM is known legally as "child pornography", though this term is generally deprecated. Additional information including definitions is available from subject matter-specific organizations including RAINN, the National Center for Missing and Exploited Children, and INHOPE.
Material on the Wikimedia websites is intended for general informational purposes only, which informs our definition of CSAM. That definition includes all forms of text and media, whether depicting real people, computer generated, or illustrated. Under this policy, you must not upload material that depicts the sexual exploitation of minors (that is, people under the age of 18). This includes images of yourself.
Prohibited content
Prohibited content includes (but is not limited to) photographs, drawings, renderings, or videos of minors depicting:
- A nude body with a focus on the genital area, where the image has no obvious educational value aligned with the projects' purpose (see below for more details)
- Implicit or explicit sexual intercourse with or in proximity to a minor
- Simulated sexual activity (including if the minor is fully clothed)
- Masturbation
- A minor depicted with or engaging in sexual activity with sex toys
- Other "sexually explicit conduct" as defined in United States law (18 U.S. Code § 2256)
Sharing links to such material (or links purporting to lead to such material) also violates this policy. Other material may be removed by the Foundation at the discretion of the Legal department as a primary office action pursuant to the Terms of Use.
Situations and content outside the scope of this policy
In rare cases, images that may be removed on other platforms might be kept on the Wikimedia projects as they provide educational, medical, or scientific value in support of Wikimedia's free knowledge mission.
Examples of content that may be allowed on our websites include (but are not limited to) photographs, drawings, renderings, or videos of minors depicting:
- Media with demonstrable cultural or historic significance (for example, Phan Thi Kim Phuc / "napalm girl")
- Artistic representations of nude children with demonstrable educational value (such as, but not limited to, lolicon, lolita, historical cartoons, drawings, paintings, or photographs)
- Media from movies, television shows, or paintings/drawings from renowned artists
- Minors in minimal clothing (e.g., swimsuits or athletic clothing) in staged or competitive environments such as notable pageants or sporting events
- Media depicting naturist or nudist colonies in which context is demonstrably not sexual
- Media in which scientific, medical, or religious context is evident (for example, images used on Wikipedia to illustrate a medical condition in minors)
Reporting violations
If you come across material that you believe to be in violation of this policy, please report it to the Wikimedia Foundation immediately. You can send links to the material (or other links that may help us locate the material) to legal-reports@wikimedia.org.
If you are a volunteer administrator, you should also feel empowered to delete or suppress the content per the editorial policies of your language version of the project. The legal-reports address is monitored by the Trust and Safety team in collaboration with the Foundation's Legal team. Material sent there will be evaluated and, if in violation of applicable policy, removed and reported as documented below.
If you believe you are in immediate danger or facing threats of physical harm, contact your local law enforcement or Wikimedia's emergency email address.
Mitigating actions
The Wikimedia Foundation, as host of the Wikimedia websites and in compliance with applicable law, will remove apparent CSAM upon its discovery and report it to the National Center for Missing and Exploited Children.
Accounts found to have uploaded such material will be immediately locked in accordance with the Office Actions Policy. Actions related to the upload of CSAM are not appealable to the Case Review Committee due to the nature of the violation.
Content that promotes, supports or encourages child exploitation
Section 4 of the Wikimedia Terms of Use, "Refraining from certain activities", prohibits "encouraging, grooming, or advocating for others to create or share [CSAM]". You may not post material, through text or other means, which promotes, encourages, or advocates for child sexual exploitation, or which seeks or requests abusive material. This includes but is not limited to:
- Content that defends, advocates for, supports, or encourages participation in sexual relations involving minors
- Attempts to ascertain the age and/or personal identity of other users, or attempts to coerce them into revealing this information
- Attempts to move discussions with minors to off-platform venues, private or public
Please note that it is not a violation of this policy to merely edit content in the topic area of child abuse or pedophilia in good faith and consistent with the editorial policies of a specific language version of the project.
Reporting violations
If you believe a user is in violation of this policy, please report this to the Trust and Safety team by email to ca@wikimedia.org. Please include as much information about the situation as you feel comfortable sharing, ideally including links to problematic edits or uploads. You may use the process documented in the Global Ban Policy.
Mitigating actions
Upon receipt of a report, the Trust and Safety team will open an investigation into the incident(s) in accordance with the existing Global Ban Policy. These investigations follow the process and timeline documented in that policy on the Foundation Governance wiki.
Criminal convictions for child abuse
A current or past criminal conviction or record does not, by itself, constitute a violation of the Wikimedia Foundation Terms of Use. However, if the Foundation learns of such a conviction, it may still investigate whether the user's engagement presents a risk to other volunteers or to its services, whether on the platform directly or at Foundation-sponsored events, along with other related risk factors. Such investigations will follow the typical process and may consider the manner in which the Foundation obtained information about the conviction when determining recommended actions; any resulting actions may be performed pursuant to the Terms of Use.
Volunteer communities on individual Wikimedia websites may choose to set and enforce stronger policies in this regard.
Minor safety at events
The Wikimedia Foundation is committed to ensuring a safe and positive experience for minors and young attendees at events (both online and offline). Always refer to and enforce the Friendly Space Policy for all events that are supported, hosted or funded by the Wikimedia Foundation.
Frequently asked questions
How were the details of this policy informed?
Between January and June 2023, the Wikimedia Foundation's Trust and Safety team undertook extensive research into existing community practices and policies around child protection. These were analyzed in combination with the approaches taken by other organizations of a similar size and with websites using a similar system of community-led governance and moderation, general industry best practices, the Foundation's 2023 Child Rights Impact Assessment, and input from subject-matter experts. As part of this work, we conducted interviews with trusted volunteer community groups with access to nonpublic personal data, many of whom have encountered these issues in the past.
How does this policy interact with my community's policies?
There are a number of communities with explicit policies on child protection. This policy is intended to complement local policies on child protection where these exist and are enforced. Communities may create and enforce local project policies going beyond the shared minimal standards this topic-specific policy provides. Situations may also continue to be escalated to the Trust and Safety team by any community member.
How long does it take for an action to be issued under this policy?
Cases involving CSAM uploads are handled immediately after a complaint is processed. The Foundation follows statutory expectations in the United States in processing these types of issues. Other matters may vary depending on the nature of the report; most cases take 3–4 weeks to investigate and act on.
Can actions taken under this policy be appealed?
In general, actions related to content takedowns (discussed under the CSAM section above) are not appealable to the Case Review Committee. However, if you have reported a user for potentially violating this policy and the Trust and Safety team declines to act, or takes action you believe inadequate, this can typically be appealed.
How will this policy be modified?
The Trust and Safety team at the Wikimedia Foundation will review the impact of this policy after one year. After this time, we will conduct an audit of any actions taken under the policy and evaluate potential changes to its text.
Glossary
- Child Sexual Abuse Material (CSAM): Any content that depicts sexually explicit activity involving a minor. In the United States this is legally known as "child pornography".
- Minor: A person under the age of majority in their home country. For Wikimedia's purposes, we use the age of majority in the United States, which is 18. In other countries the age may differ, which can impact the legality of your actions on the projects as they relate to minors.
- Pedophilia: A paraphilia involving attraction to pre-pubescent minors. The term is sometimes used colloquially to refer to anyone who abuses children, but its clinical definition is much narrower.