We, the undersigned organizations, welcome the recent decision by Facebook’s Oversight Board regarding Facebook’s unjustified removal, under its Community Standard on Dangerous Individuals and Organizations (DIO), of news content related to the recent Israeli aggression on Palestine.
On May 10, 2021, a Facebook user in Egypt shared a news item from Al Jazeera’s verified Arabic page which contained a threat of violence by the spokesperson of the Qassam Brigades, the military wing of the Palestinian political faction Hamas. Facebook initially removed the content for violating its Community Standard after it was reviewed by two content moderators. Both the Qassam Brigades and its spokesperson are designated as dangerous by Facebook.
Restricting freedom of speech
As the Oversight Board rightly concluded in its review, Facebook’s content removal was an unjustified restriction of freedom of expression on a subject of public interest. The Board further noted that the content removal was not necessary since it did not reduce real-world harm, which is the aim of the DIO policy.
While Facebook restored the content, the case is emblematic of Facebook’s systematic, arbitrary, and non-transparent over-enforcement of this Community Standard, particularly against Arab and Muslim communities, often to the detriment of users’ freedom of expression and their freedom to seek, receive, and impart information. Between May 6 and 19, 2021, Instagram removed or restricted at least 250 pieces of content related to Palestine and the #SaveSheikhJarrah campaign, while Facebook removed at least 179. These reported cases are only the tip of the iceberg; the true figures are believed to reach into the thousands.
Arbitrary and non-transparent enforcement
Furthermore, the case details shared by the Board raise a number of serious concerns.
Firstly, Facebook restored the content only after the Board declared its intention to review the complaint lodged by the user. Facebook stated it had mistakenly removed the content but failed to answer the Board’s request to explain why the reviewers who manually assessed the content rated it as a violation of the DIO Policy. The decision to remove, then restore, this content demonstrates Facebook’s arbitrary and non-transparent enforcement of its content moderation policies, a widely shared grievance among journalists, activists, and human rights defenders in the Middle East and North Africa (MENA) region.
Secondly, according to the Board, the content was first reviewed and rated by a moderator in North Africa. It was then re-reviewed by another moderator based in Southeast Asia following the user’s objection. The second reviewer did not speak Arabic and had access only to an automated translation of the content. This is a systemic problem: civil society organizations have repeatedly urged Facebook to invest in the local and regional expertise necessary to develop and implement context-based content moderation decisions aligned with human rights in the MENA region. A bare minimum in this case would have been to assign content moderators with adequate Arabic language skills and an understanding of regional context and nuance.
Thirdly, it is deeply worrying that Facebook not only removed the piece of content but also restricted the user’s account, allowing him read-only access for three days. It also restricted his ability to broadcast live stream content and use advertising products on the platform for 30 days. Such disproportionate responses were reported by many users during that period and amount to suppression of speech by Facebook.
Recommendations
We therefore support the Board’s recommendations and call on Facebook once again to:
- Conduct a full, independent, public audit of content moderation policies with respect to Palestine, and commit to co-designing policies and tools that address deficiencies or overreach of content moderation found during the audit. Furthermore, rules should be based on existing human rights frameworks and must be applied consistently across jurisdictions.
- Provide complete transparency on requests, both legal and voluntary, submitted by the Israeli government and its Cyber Unit, including the number of requests, the type of content enforcement, and data on compliance with such requests. Users should also be able to appeal content decisions.
- Clearly indicate through notices and public statements where automation and machine learning algorithms are being used to moderate content related to Palestine, including error rates and classifiers used.
- Publish any policies, guidelines, and procedures related to the classification and moderation of terrorism and extremism, including any internal lists of groups classified as “terrorist” or “dangerous.”
- At a minimum, Facebook should provide statistics about the list, including but not limited to: the number of Dangerous Individuals and Organizations per country and/or region; and the number of DIOs unique to Facebook (i.e., not present on the U.S. Foreign Terrorist Organizations list or any other global or national list).
- Facebook should also make explicit all policies related to how DIOs are designated and handled internally. Additionally, Facebook should engage in a co-design process with civil society to draft a state actors policy requiring DIO experts to disclose requests by state actors, and a de-designation policy clarifying the process for removing DIOs from the list where their designation is deemed unjust. Users cannot adhere to rules that are not made explicit.
- Commit to a general co-design process with civil society to improve upon policies and processes involving Palestinian content.
Organizations:
Access Now
ARTICLE 19
SMEX
Kandoo
INSMnetwork — Iraq
7amleh
Mnemonic
JOSA
Electronic Frontier Foundation