2021-009-FB-UA: Case Number Case Description
2021-009-FB-UA
Case number
Case description
In May 2021 a Facebook user in Egypt shared a post by a verified Al Jazeera news
page about the escalating violence in Israel and the Occupied Palestinian Territories
of Gaza and the West Bank. The Al Jazeera post consists of text in Arabic and a
photo. The text states: "'He Who Warns is Excused'. Al-Qassam Brigades military
spokesman threatens the occupation forces if they do not withdraw from Al-Aqsa
Mosque." The Izz al-Din al-Qassam Brigades are the military wing of Hamas and
have been designated as a terrorist group by multiple states, either individually or
as part of Hamas.
The photo shows two people in camouflage fatigues with their faces covered
standing in front of a row of microphones and wearing headbands featuring Al-
Qassam's insignia. Superimposed over the photo is an Arabic language statement in
quotation marks attributed to a spokesperson for the Al-Qassam Brigades.
Translated into English, the statement on the photo reads: "The resistance
leadership in the common room [الغرفة المشتركة] gives the occupation a respite until
18:00 to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah
neighbourhood, otherwise he who warns is excused. Abu Obeida – al-Qassam
Brigades military spokesman." The Board notes that Al Jazeera's post (which the
user shared) is currently available on Facebook.
Facebook initially removed the user's post for violating its Community Standard
on dangerous individuals and organisations. In their appeal, the user stated that
they had shared the post to update people on the developing crisis and that it was an
important issue that more people should be aware of. The user also noted that their
post simply shared content from an Al Jazeera page.
After the Board asked Facebook to confirm the eligibility of this post for Board
review, Facebook identified the removal of this post as an enforcement error and
restored the content. The Board chose to proceed with reviewing this case as it
continues to raise important questions about Facebook's policies and enforcement
practices.
In its decisions, the Board can issue policy recommendations to Facebook. While
these are not binding, Facebook must respond to them within 30 days. As such, the
Board welcomes public comments proposing recommendations that are relevant to
this case.
Public Comment Appendix for
2021-009-FB-UA
Case number
To protect the privacy and security of commenters, comments are only viewed by
the Oversight Board and as detailed in the Operational Privacy Notice. All
commenters included in this appendix gave consent to the Oversight Board to
publish their comments. For commenters who did not consent to attribute their
comments publicly, names have been redacted. To withdraw your comment, please
email [email protected].
To reflect the wide range of views on cases, the Oversight Board has included all
comments received except those clearly irrelevant, abusive or disrespectful of the
human and fundamental rights of any person or group of persons and therefore
violating the Terms for Public Comment. Inclusion of a comment in this appendix is
not an endorsement by the Oversight Board of the views expressed in the comment.
The Oversight Board is committed to transparency and this appendix is meant to
accurately reflect the input we received.
Number of Comments: 26
Regional Breakdown:
Asia Pacific & Oceania: 0
Central & South Asia: 0
Europe: 7
Latin America & Caribbean: 1
Middle East and North Africa: 3
Sub-Saharan Africa: 0
United States & Canada: 15
2021-009-FB-UA PC-10128 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
Link to Attachment
No Attachment
2021-009-FB-UA PC-10130 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
It is my opinion that posts like this, which are meant to save lives by warning of terrorist behavior, SHOULD always be allowed. If they are verified as true, lives can be saved. FB has a responsibility to do no harm and to do good when it can. It is like a doctor who sees a patient and finds out that they are being beaten at home; he has a responsibility to try to notify the authorities or someone who can help. Don't be afraid to do good. There will always be someone who disagrees with your policies, but you should still do the right thing if it may save a life. Remember that human life is the most important thing on the planet, because everyone is someone else's loved one.
Full Comment
It is my opinion that posts like this, which are meant to save lives by warning of terrorist behavior, SHOULD always be allowed. If they are verified as true, lives can be saved. FB has a responsibility to do no harm and to do good when it can. It is like a doctor who sees a patient and finds out that they are being beaten at home; he has a responsibility to try to notify the authorities or someone who can help. Don't be afraid to do good. There will always be someone who disagrees with your policies, but you should still do the right thing if it may save a life. Remember that human life is the most important thing on the planet, because everyone is someone else's loved one.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10131 United States and Canada
Case number Public comment number Region
Withheld No
Organization Response on behalf of organization
––––
Short summary provided by the commenter
Full Comment
Our ideas of free speech are meant to guard against government and social tyranny. Government can try to limit such freedom through laws or force, and the citizens who have the loudest voices can do so by silencing views they don't like through things like ostracizing individuals and corporations. The goal is to kick ideas out of the marketplace, undermining individual and social growth. The more ideas we are exposed to, the more we strengthen our own (no dogmatism) or realize we were wrong or partially wrong. We need a robust and well-functioning marketplace. But words can harm, so we use the harm principle to determine when interference is justified. The harm in question should be defined very narrowly to keep the marketplace large, which is central to its functioning well. Bad ideas will eventually leave and natural growth (the best kind of growth) will occur. In addition, the marketplace only works if all voices are heard, and we must be careful not to silence vulnerable voices and work to help those voices get larger. Another key element needed for a well-functioning marketplace is open-minded people willing to engage in reflection
and civil discussion. I believe that if the ideas expressed are ones that actually add to the fruitful discussion of a topic that can further understanding, test people's current beliefs, and foster civil, fruitful discussion, then we must be very careful employing the harm principle, especially for vulnerable groups whose voices are often dismissed. We must be pretty tolerant of harm. In this case, however, the
language is a mere threat and doesn’t add to the marketplace. Rather, it undermines
it. It undermines discussion. Threats make people defensive and more likely to fight
and not be open minded. In addition, Hamas is not the same thing as Palestinians.
It’s not a vulnerable group. It’s a terrorist organization. Finally, the fact that it’s a
sharing of an article from a news source is of no relevance. Facebook has censored
articles from legitimate news sources many times. Therefore, Facebook was right to
censor this post and should also remove the original post as well. Facebook, I
believe, has an obligation to keep the marketplace large, but it also has the
obligation to keep it well-functioning. Thanks for reading this. Again, I'm a professor, but my thoughts are solely my own.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10133 United States and Canada
Case number Public comment number Region
None No
Organization Response on behalf of organization
––––
Short summary provided by the commenter
If Trump can be banned, then Facebook is hypocritical if they do not ban this terrorist forever, and Zuckerberg should go to jail for infringing and violating people's rights.
Full Comment
If Trump can be banned, then Facebook is hypocritical if they do not ban this terrorist forever, and Zuckerberg should go to jail for infringing and violating people's rights.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10134 United States and Canada
Case number Public comment number Region
Withheld No
Organization Response on behalf of organization
––––
Short summary provided by the commenter
Given the increasing violence in the Middle East as well as here in America, allowing posts of this nature does nothing more than add to the violence.
Full Comment
This post and all others like it do nothing more than escalate the violence. Allowing far-right and extremist statements like this does more harm and threatens a fragile peace. Giving voice to terrorists and wack jobs in the US government just emboldens them. News agencies and social media, by covering such people, just feed their narcissistic personalities; the sooner the mainstream media and social media realize this, the sooner these terrorists and narcissists will fade into the background and, I hope, in time become irrelevant.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10136 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
Link to Attachment
No Attachment
2021-009-FB-UA PC-10139 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
Although Al Jazeera was helping Hamas during and before this last conflict with
Israel, this seems to be information shared as a warning to all about Hamas’
intentions on the mosque area. That’s why all who informed were forgiven. In the
future I suggest that if Facebook decides to leave the post in place, they could add a
warning label or box noting that this is from a terrorist-aligned source and the info is from a
terrorist source. Observe with caution.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10145 Europe
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
This post was shared from a legitimate source and presents the situation from a
Palestinian perspective, which Facebook seems alarmingly eager to silence.
Additionally it shows the very underreported fact that Al Qassam gives warnings
before launching missiles. This is important information of which the public should
be aware. It shows that the brigades are misrepresented by Western media. To
remove this post was either negligent or deliberate censorship.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10149 Latin America and Caribbean
Case number Public comment number Region
––––
Short summary provided by the commenter
Jewish Voice for Peace would like to affirm Facebook’s original decision to restore
the user-in-question’s content as the just remedy to the user’s appeal. Users must
have the right to share news articles and information about the political situation on
the ground in Palestine. Removing content from a news article is over-moderation
and negatively impacts people’s abilities to exercise their right to freedom of
expression and the right to access information.
Full Comment
Jewish Voice for Peace (JVP) is a US-based, grassroots organization and registered
non-profit inspired by Jewish tradition to work for a just and lasting peace
according to principles of human rights, equality, and international law for all the
people of Israel and Palestine. JVP also has one of the largest Facebook followings
of any US-based Jewish organization. In reference to Oversight Board Case number
2021-009-FB-UA, JVP would like to affirm Facebook’s original decision to restore the
user-in-question’s content as the just remedy to the user’s appeal. Users must have
the right to share news articles and information about the political situation on the
ground in Palestine. Removing content from a news article is over-moderation and
negatively impacts people’s abilities to exercise their right to freedom of expression
and the right to access information. Social media is often one of the only vehicles
for Palestinians to share with the world their experiences facing a crippling military
occupation, including assaults on civilians, forced displacement and home
demolitions, a brutal apartheid regime, and other violence and oppression. The
freedom to share their stories and experiences is vital for Palestinians to seek
international support in holding the Israeli government accountable for its human
rights violations against the Palestinian community. JVP relies on the information,
documentation and stories from journalists, activists, and legal professionals on the
ground in Palestine in its work to end Israeli government violations of Palestinian
human rights. Facebook should protect users’ freedom to share information, visual
documentation, and opinions regarding events on the ground in Palestine all of
which are crucial to holding the Israeli government accountable. Facebook must
begin a sincere effort in rebuilding trust with Palestinian user communities and
greater movement for Palestinian rights, and the Facebook Oversight Board can
help play a key role in ensuring Facebook is a safe space for all. In particular,
Jewish Voice for Peace is very concerned about the possibly privileged relationship
between Facebook and the Israeli Ministry of Justice’s Cyber Unit, and the
unacceptable levels of silencing and censoring of Palestinians and Palestinian
human rights supporters on the Facebook platform. The FOB has been presented to
our communities as an independent, unbiased body – and we hope that its decision
will reflect the value it places on all users’ freedom of expression.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10159 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
I want to thank the Facebook Oversight Board for taking up this case – a case that is
emblematic of the far-reaching challenges to Palestine-related political free speech
on social media today. It is impossible to overstate the importance of social media to
Palestinians everywhere. It is especially important for Palestinians living under
Israeli military occupation in the West Bank, East Jerusalem, and Gaza Strip –
locales in which traditional media and major human rights organizations
sometimes have difficulty operating (e.g.:
https://www.washingtonpost.com/opinions/2021/05/18/israel-gaza-idf-ap-media-
attack-journalism/ & https://www.hrw.org/news/2019/11/25/israel-expels-human-
rights-watch-director-today). For many Palestinians, social media is the only means
they have to inform the world of their reality living under occupation; to document
and share evidence of this reality; and to communicate, from their own perspective
and in their own words, their struggle for rights and freedom. In this way, social
media is critical to the ability of Palestinians to engage the world. Social media
empowers Palestinians to push back against narratives that dehumanize them. It
offers them a mechanism to challenge/refute narratives that erase their history and
deny their present-day lived reality under occupation. And it enables them to non-
violently bring attention to their cause, and to build the kind of international
awareness and understanding of the situation on the ground that can translate into
meaningful pressure to hold Israel accountable for its treatment of Palestinians,
and in so doing can lead to meaningful changes. Palestinians today are mistrustful
when it comes to social media. This is understandable, particularly in the wake of
the recent violence on the ground in Jerusalem and Gaza, which was accompanied
by censoring of Palestine-related content and quashing of Palestinian voices on
some social media platforms (in some cases explained, after the fact, as algorithm-
generated errors). In this context, I urge the Board to uphold Facebook’s decision to
restore a user’s Palestine-related content. Such a ruling would be an important step
toward rebuilding trust that Facebook supports the freedom of expression of all its
users, including Palestinians, and supports freedom of access to information related
to Palestine – whether in the form of posts sharing first-hand news from the ground,
or through the posting of links to articles, video clips, or other sources of
information. Such a ruling by the Board would also send a much-needed signal that
Facebook will stand firm against efforts to use de-platforming and politicized
content moderation policies to bolster the dehumanization of Palestinians and the
silencing of Palestinian voices on social media.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10166 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
We hope that the Facebook Oversight Board will take seriously repairing the trust
that has recently been eroded with communities of human rights advocates and
Palestinians as we challenge apartheid Israel's human rights violations. We are very
concerned about the impact of the Israeli Ministry of Justice’s Cyber Unit’s efforts to
silence Palestinians and human rights supporters, and the impact that this unit may
be having on Facebook’s policies and practices. The FOB has been presented to our
communities as an independent, unbiased body and we hope that its decision will
reflect the valuing of all users’ freedom of expression.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10168 Europe
Case number Public comment number Region
Mnemonic Yes
Organization Response on behalf of organization
––––
Short summary provided by the commenter
In Palestine, Facebook both underinvests resources needed to address existing and
future human rights impacts of its products, and collaborates opaquely with
governments in ways that actively silence vulnerable voices. In addition to making
policy recommendations to Facebook on this case, the Board should again direct
Facebook to clarify its Dangerous Individuals and Organizations policy to make it
clear that discussion about important political matters that is not incitement to
violence does not fall under the policy. Furthermore, the Board should require
Facebook to indicate where it is using automation in content moderation, as well as
conduct a complete and thorough audit of its content moderation policies and
enforcement in Palestine.
Full Comment
This case makes it clear yet again that the cost of doing business where human
rights are being repressed must include investing more resources into upholding
human rights. In Palestine, Facebook both underinvests resources needed to
address existing and future human rights impacts of its products, and collaborates
opaquely with governments in ways that actively silence vulnerable voices. In
addition to making policy recommendations to Facebook on this case, we urge the
Board to try something new: recommend Facebook and Instagram undertake a full,
independent, public audit of content moderation policies and enforcement with
respect to Palestine. First, the Board appropriately asks about the state of media
freedom in Palestine and beyond; there is little media freedom in the whole region.
Both Israeli and Palestinian governments suppress vulnerable voices, including
activists and independent media. Israel surveils and detains activists, and pushes
social media platforms to take down content through its “Cyber Unit.” Despite
repeated requests by civil society, Facebook has refused to provide transparency
about this relationship. The Israeli Supreme Court just rejected a legal challenge to
the Unit -- but also required the Unit to start documenting referrals for transparency
and recommended that the Israeli legislature ensure oversight of the Unit through
legislation. At the same time, authorities in Gaza and the West Bank repress dissent.
The Palestinian Authority just arrested multiple activists, and an activist critical of
the Authority died in custody last month. Despite these challenges people continue
to use these platforms to share their stories with the world, have open discussions
about political affairs, and create open source archives of human rights related
content. Social media offers one of the few avenues for them to do so, and when live
streaming can even provide protection from police and military violence. Second,
it’s clear that this removal was inconsistent with both Facebook’s policies and its oft-
stated values, including a commitment to free expression. With regards to
referencing designated groups for the purpose of “report[ing] on, condemn[ing], or
neutrally discuss[ing] them or their activities," the Dangerous Orgs policy has just been updated in response to this Board's policy recommendations to state that it is "designed to allow room for these types of discussions, but we require people to
clearly indicate their intent.” The post in this case was branded by a news
organization. It was clearly allowed under the policy. Unfortunately, this is one
instance amongst many in Palestine in which Facebook improperly removed or
limited important political content and accounts. What’s more, a brief perusal of
current content in Hebrew brings up myriad posts that repeat the warning from the
“Izz al-Din al-Qassam Brigades” and contain similar imagery. Unlike Al Jazeera’s
post, which came from a verified page, these posts lack a clear indication that they
are coming from news agencies. The difference? They’re in Hebrew. This removal
was consistent with Facebook’s abysmal content moderation record in the entire
Arabic-speaking world, but particularly in Palestine. In this context, the claim that it
was an “enforcement error” is disingenuous, to say the least. Facebook has claimed
too many times that removals of important speech in Palestine were an
enforcement error. For example, Instagram supposedly removed posts about Al
Aqsa mosque because the name of the holy site is “unfortunately included in the
names of several restricted organizations.” Facebook has been claiming that
removals of important content in Palestine were mistakes since at least 2016, when
it disabled accounts of several Palestinian journalists. Facebook is either completely
broken in the way it works in Palestine, in which case it needs to invest more
resources, or Facebook is covering up biased handling of content moderation by
claiming mistakes. Either way, Facebook needs to address the patently obvious
issue: enforcement in Palestine is silencing vulnerable voices and that is especially
harmful to human rights because of the context of poor media freedom and ongoing
human rights violations by state and non-state actors. Finally, regarding contexts
where designated individuals or orgs play a significant role in public life: current
discussions around content moderation taking place in the multistakeholder forums
of the Global Internet Forum to Counter Terrorism and the Christchurch Call are
considering the issue of terrorist and violent extremist designations and the role
those designations play in automated content moderation. These forums are also
considering the impact of increased removal of “terrorist and violent extremist
content” on human rights broadly, and on documentation of human rights abuses
specifically. In line with the human rights concerns being raised in these
discussions, Facebook needs to undertake a more public and thorough audit of its
Dangerous Individuals and Organizations policy. Furthermore, in line with the
Oversight Board’s growing body of work in this area, including the Board’s decisions
in cases 2021-006-IG-UA, 2021-003-FB-UA and 2020-005-FB-UA, Facebook must
consider context when taking down content that references an individual or
organization on Facebook’s internal lists, or on external lists, rather than
automatically moderating that content. The Board should again direct Facebook to
clarify its Dangerous Individuals and Organizations policy to make it clear that
discussion about important political matters that is not incitement to violence does
not fall under the policy. Furthermore, the Board should require Facebook to
indicate where it is using automation in content moderation, as well as conduct a
complete and thorough audit of its content moderation policies and enforcement in
Palestine.
Link to Attachment
PC-10168
2021-009-FB-UA PC-10169 Middle East and North Africa
Case number Public comment number Region
––––
Short summary provided by the commenter
7amleh - The Arab Center for the Advancement of Social Media would like to affirm
Facebook’s original decision to restore the user in question's content as the just
remedy to the user’s appeal as this protects user’s right to freedom of expression
and their right to access information, which is essential for discussing important
social, economic and political issues and to make decisions and assess the risks
related to violence on the ground that can impact the safety and lives of
Palestinians.
Full Comment
7amleh - The Arab Center for the Advancement of Social Media (7amleh) is a
Palestinian organization and non-profit working to protect the digital rights of
Palestinians. In reference to Oversight Board Case number 2021-009-FB-UA, 7amleh
would like to affirm Facebook’s original decision to restore the user in question's
content as the just remedy to the user's appeal, as this protects users' right to
freedom of expression and their right to access information. Palestinians rely on
social media to both learn from and share with their family, friends and the world
the reality of their lives living under Israeli occupation and as second-class citizens
in Israel. Facebook is also used to discuss important social, economic and political
issues and to make decisions and assess the risks related to violence on the ground
that can impact the safety and lives of Palestinians. This includes sharing news
articles from media outlets and journalists who are covering political developments,
including statements made by political leaders, such as in the case under review, so
that these leaders can be held accountable. As was apparent in May, during the
increased Israeli attacks on Palestinians in the Gaza Strip and efforts to forcibly
displace Palestinians from their homes in East Jerusalem, Facebook is an incredibly
important place for sharing news and information about events happening on the
ground. Unfortunately, during this period, Facebook’s response to increased
activism on its platform was increased censorship of journalists and activists.
During this time, 7amleh documented (and shared with Facebook) hundreds of
cases of content takedowns that did not violate Facebook’s community standards,
further alarming advocates and Facebook users. While Facebook has provided
access to information and freedom of expression, many of the policies and practices
of Facebook over the past decade have disproportionately silenced Palestinians.
Overmoderating the content of Palestinians has resulted in a general belief amongst
Palestinian users and human rights advocates that Facebook’s content moderation
policies are biased and discriminatory. It is our hope that the decision of the FOB to
uphold Facebook’s original decision to reinstate this political and newsworthy
content will be respected, and that the FOB will be able to show to the Palestinian
community that its decisions are unbiased and independent and contribute to better
relations between Facebook, Palestinians and human rights advocates worldwide.
Link to Attachment
PC-10169
2021-009-FB-UA PC-10170 Europe
Case number Public comment number Region
––––
Short summary provided by the commenter
Concern about yet another claim by Facebook that the case is an enforcement error
by a third-party contractor. Either these people are not being trained and guided
properly, or the claim that they're responsible for over half the cases so far is false.
Either way, there's no point in the Board having opinions on policy if it can't be
reliably implemented.
Full Comment
Facebook have designated this case as yet another "enforcement error." FB has
blamed the person(s) implementing policy in over half of the cases where the Board
has found FB at fault. There is not much point changing or refining policy if the
people on the front line are not doing their job properly. This claim assumes there's
nothing wrong with FB's policies and the Board doesn't need to get involved. The
problem is under-performance of external contractors who are not even at FB's
offices. First, are these cases a representative sample of FB's moderation problems?
If they're not a symptom of any systemic or policy issue, is reviewing them a good
use of the Board's time? Even if moderators were to achieve 99.9% accuracy, tens of
thousands of mistakes would still be made every day. Are these cases just outliers,
representative of the 0.1% of decisions where someone got it wrong? But each of
these cases has already been reviewed, at FB or by contractors, and the original
decision upheld. Surely any genuine enforcement errors have already been
rectified? It appears that FB only, and always, claims there has been an
enforcement error for certain types of case. When it appears that FB's decision-
making is at odds with the majority public opinion, the blame is passed to the
lowest-paid individuals in the content-moderation food-chain, people who are not
even employed by FB. If these cases reflect the current state of content moderation,
and FB's explanation is correct, this means the moderating teams, and the people
reviewing their decisions, are all making the same errors and not implementing
policy properly. There's a problem with training and/or the way policy is
communicated. It's important for the Board to understand what's going on when FB
asserts there has been an enforcement error. Start by reviewing the original
decision to the same standard that the moderators are held to: the Implementation
Standards. Multiple people reviewed the same content, applied the IS as supplied by
FB, and reached the same conclusion. It's not appropriate for you to use a different
standard, such as the public rules or human rights law. There's no point in the Board
ruling on FB's policy if it's not implemented properly by people outside of FB due to
errors in the IS. There is however a strong possibility the Board will find that the
moderator made the correct decision per the IS and there was no enforcement
error. It's been my experience that FB passes blame down the food-chain and
punishes the person at the bottom for things beyond their control. I can name four
individuals known to me personally who were fired after implementing policy as it
had been taught to them, but inadvertently displeased FB. I am sure that FB will tell
you the person(s) responsible for the alleged error have not been sanctioned in any
way. But moderators are not generally employed by FB. They work for contracting
companies which have been widely reported as treating staff badly. Mark
Zuckerberg has dismissed these reports as "a little over-dramatic" but there is a
documented pattern of abusive and incompetent management of content
moderators by these firms. It is easily imaginable that a low-level manager in a
contracting company might take action against an individual alleged to have made
an error that causes a problem for FB. It's not a given, but it's a fair concern to have.
I ask the Board to speak, privately, without "supervision", to the person accused of
making an error, to understand their decision and obtain confirmation from them
that they haven't been unfairly sanctioned. No individual at FB ever seems to be
held accountable for failures of policy, and it would be only fair to confirm that the
outside contractors trying to enforce those policies are also protected appropriately
when the Board investigates their activities. As to whether the original decision was
consistent with the company's stated values ... In my experience, actually
implementing the policies as written, I always felt there was a genuine commitment
on the part of the writers to doing the right thing. But it's incredibly difficult to write
a set of standards that will always give the same outcome when implemented by
people from different cultures and with differing levels of life experience. There are
just too many variables to predict everything in advance. There were many
occasions during my time as a moderator where, after being compelled by a rule to
make a decision I was unhappy about, I asked myself how I would write the rule
differently. I usually gave up in despair, because I could never improve on what was
there. It's very easy for an armchair critic to sit back and complain that the rules are
inadequate, or overly broad, to complain about the number of false positives, or the
content that is allowed to stay up when it shouldn't. But I am not confident anyone
can ever create a set of standards that will do the job without fail every time. The
policy teams are constantly trying to refine the rules, trying to nail the issues down,
but cases like this one are always going to slip through the cracks. I have watched
as, every two weeks, Facebook's policy team issued updates to the IS, constantly
tweaking the wording to try and catch the edge cases they missed last time without
also penalising the marginal-but-acceptable content. If they have missed a trick this
time, it's just part of the inevitable and never-ending process of learning and
adapting. I don't see any big policy failure. If the detailed nitty-gritty of the IS did
cause anyone to make what the public views as the "wrong" call in this case, I would
not be willing to claim that this is intentional on anyone's part. Nor would I accuse
anyone of incompetence in this case. It just means that the IS need to be tweaked,
but that's not going to happen if FB just blames the outside contractors instead.
Link to Attachment
No Attachment
2021-009-FB-UA PC-10171 Europe
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
Link to Attachment
PC-10171
2021-009-FB-UA PC-10172 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
My comment is in response to Facebook's arguably incorrect legal interpretation of
its obligations with respect to U.S. law regarding designated terrorist organizations,
and the impact of the company's "Dangerous Groups and Individuals" policy on
vulnerable communities, including artists, activists, and human rights
documentarians.
Full Comment
July 14, 2021
Submission of comment to Facebook Oversight Board re: Case 2021-009-FB-UA
From: Jillian C. York, Electronic Frontier Foundation
The case in question, involving content shared by a verified news organization, may have
violated Facebook’s Community Standards—in particular the prohibition on
“Dangerous Individuals and Organizations.” However, as numerous civil society
groups have argued, this standard is an ad hoc one, lacking any semblance of
transparency. Hamas, the parent organization of Izz al-Din al-Qassam Brigades, is
indeed designated by some states (including the U.S.) as a terrorist organization. To
users in Palestine, however, the group is part of a legitimately elected political
entity. While it may not be Facebook’s place to decide the appropriateness of such a
designation, it is Facebook’s duty to be transparent about the basis of its Community
Standards. If, as it appears, Facebook is basing its definition of “Dangerous Groups”
on its lawyers’ questionable interpretation of U.S. law, then it is incumbent on the
company to be transparent about that legal underpinning so that users in any
country can understand the rules and act accordingly, as a user cannot comply with
a rule of which they are not aware. Despite being aware of this issue for many years,
Facebook has done no such thing. Therefore, the company is not acting in
accordance with its own stated principles. Furthermore, the fact that the United
States designates Hamas as a terrorist organization may be irrelevant. As we and
other rights groups have argued, U.S. law is not determinate as to whether hosting
speech of a U.S. designated foreign terrorist organization (FTO) constitutes
“material support” and is in violation of the law. In the case in question, the speech
was not by Hamas or Izz al-Din al-Qassam Brigades, but rather, was posted by a
verified and respected news organization. The question of legality as presented by
Facebook in this case is therefore seemingly irrelevant and should not
be taken into account. Given that Facebook voluntarily restored the content, the
focus of my argument will therefore be on how the company should proceed with
respect to this policy. Facebook has chosen to be a global platform with a diverse
userbase, but this particular rule reflects a U.S.-centric, colonial outlook. If, as it has
been argued, Facebook does not have a legal obligation to remove the content in
question, then the company should review its “Dangerous Individuals and
Organizations” policy, taking into account the global nature of its userbase, and the
historically uneven application of this rule toward Islamic organizations.
Furthermore, the company must grapple with the fact that, in banning certain
political parties in a country like Palestine (or Lebanon, or Turkey), they are
inherently choosing sides and effectively meddling in foreign politics—as I argued
in my book, Silicon Values: The Future of Free Speech Under Surveillance
Capitalism. But perhaps most importantly, it must be considered that banning not
just the groups themselves, but large swaths of discussion about such groups has a
chilling effect on counter speech in locales that are most affected by violent
extremism. We have ample examples demonstrating that overbroad content
moderation—and in particular, the ever-increasing use of automation in such
processes—has the effect of removing not just harmful extremist speech, but critical
counter speech, academic research, art, human rights documentation, and even
panoramic imagery in which (for instance) a flag of a terrorist group is present. This
policy serves not Facebook’s global user base, but the interests of the United States
government, if anyone. Facebook should, at minimum, maintain a public,
transparent list of “Dangerous Groups and Individuals” so that users can make
informed choices about whether they want to use the company’s services. At best,
Facebook should align its policy with international standards, not U.S. ones. I would
be remiss if I didn’t note that I believe some of the questions Facebook is asking in
this consultation are irrelevant. Again, I emphasize that the use of “designated
individuals or organizations” (as noted in point 3) requires serious interrogation.
While Facebook’s staff have repeatedly stated to members of civil society, myself
included, that it is their legal obligation to remove groups designated by the United
States government as “terrorist” organizations, neither their own rules nor the law
itself seem to back that up. It is therefore incumbent on Facebook to revise its
Community Standards. Finally, with respect to questions 5 and 6, I believe that it is
abundantly clear at this stage that Facebook’s ad hoc, opaque rules regarding
“Dangerous Groups” are negatively impacting the ability of Palestinians to speak
freely about injustices in their country and are repressing vulnerable voices in their
efforts to speak out. I would point the Oversight Board to the recent campaign led
by a significant number of respected Palestinian and international organizations,
and signed by prominent figures (including former Facebook policy staffers) calling
on the company to take concrete steps to ensure that the Community Standards are
transparent and in line with international human rights frameworks.
Link to Attachment
PC-10172
2021-009-FB-UA PC-10175 United States and Canada
Case number Public comment number Region
––––
Short summary provided by the commenter
Full Comment
Link to Attachment
No Attachment
2021-009-FB-UA PC-10176 Europe
Case number Public comment number Region
Withheld Yes
Organization Response on behalf of organization
––––
Short summary provided by the commenter
Full Comment
Link to Attachment
No Attachment