In December 2024, Syrians witnessed the unthinkable: the fall of the Assad regime to a coalition of armed groups led by Hayat Tahrir al-Sham (HTS). As people rushed to document this historic moment and share critical updates online, Meta once again proved unequipped to moderate communications. It deleted key posts, including statements from HTS leaders and Syria’s new interim president, Ahmed al-Sharaa.
These removals, made under Meta’s Dangerous Organizations and Individuals policy and the Violence and Incitement policy, have raised urgent questions about Meta’s capacity to moderate content in volatile political moments such as regime change. The Oversight Board, an independent body that reviews contested moderation decisions, is now evaluating two such cases. SMEX has submitted a public comment urging Meta to restore the posts and reexamine its approach to content moderation at this pivotal moment in Syrian history.
The Two Cases
The first post, shared by the owner of a public page, includes an image and quote in Arabic from Sharaa shortly after he assumed leadership. The text appears to be an excerpt from a speech given by Sharaa that day, congratulating the group’s revolutionary soldiers for subduing their enemy, encouraging them to keep fighting to liberate Syria and restore people’s rights, urging them to “not waste a single bullet except in the chests of your enemy, for Damascus awaits you.” Following the post’s removal, the user appealed and asked “why Meta would ban mention of people fighting for freedom in Syria while supporting dictatorship by allowing photos of former president Bashar al-Assad.”
The second post involves a short video in Arabic shared by a user who self-identified as a journalist in their appeal to the Board. The video shows a similar speech from an HTS commander, in which he celebrates and encourages rebel groups and addresses Assad’s forces, saying, “You have no choice but to be killed, flee or defect.” In this case, the user appealed the decision by citing freedom of the press and explaining that they were trying to inform their audience of factual developments.
Unfortunately, the Oversight Board’s case announcement provided very little information about the accounts that shared this content, in particular the public page that shared the photo and speech excerpt from Ahmed al-Sharaa. We know that the second case in this bundle was shared by a user who self-identified as a journalist. We do not know what kind of public page shared the first post, or whether it identified itself as a source of news. Regardless, there is no indication that these accounts are dedicated to promoting HTS or inciting violence.
The removal of content documenting the fall of the Assad regime or providing information on how the HTS has engaged with troops loyal to Assad is not just a technical moderation mistake. It is another example of Meta’s inability to deal with the political complexity of the SWANA region, and of its persistent overenforcement of Arabic content. As we argued in our submission to the Oversight Board, both pieces of content at the heart of this case should have remained on the platform. The policies Meta relied on to justify their removal do not apply in this context, and the decision reflects deeper structural failures in how Meta moderates speech during conflict and crisis.
The Syrian Context Matters
Since 2011, the Assad regime conducted a well-documented campaign of repression that led to international sanctions and, in some cases, war crimes prosecutions. The regime responded to peaceful protests with brutality, killing more than 200,000 civilians and disappearing nearly 100,000 others, according to the Syrian Network for Human Rights. After the fall of the regime, Ahmed al-Sharaa was named interim president and now heads the transitional Syrian government. Western governments have begun meeting with these former HTS members, but Meta’s moderation practices failed to adapt to the new political reality in time, perhaps as a symptom of structural shortcomings in its general content moderation policies. Internal guidance at Meta reportedly allows “official communications from/on behalf of al-Sharaa exclusively when shared in his official capacity as interim president,” but Meta never made this rule public. It was only revealed in the Oversight Board’s case announcement.
Meta’s lack of transparency keeps users in the dark about what they’re allowed to post on its platforms. During the rapid advance of armed groups arriving in Damascus, Syrians inside and outside the country relied on posts (like the ones removed by Meta) for real-time updates from the ground. At the same time, Syria’s minority communities, particularly the Alawites, Druze, and Christians, have found themselves increasingly vulnerable to violence. Updates shared on Meta’s platforms could make the difference between life and death for families fleeing areas of heightened tensions and revenge killings in the chaos of Assad’s ousting.
Meta’s Policies Were Not Applied Correctly
Sectarian and extremist rhetoric online has manifested in real-life violence in Syria. Content that blatantly incites against minorities in Syria should be moderated and kept in check. However, circulating news and unfolding updates about political developments in Syria should not be censored under the pretext of content moderation. This will only be possible if Meta invests in engagement with impacted communities and uses that engagement to directly build the cultural literacy of its automated and human moderation systems.
Meta removed both posts under its Dangerous Organizations and Individuals (DOI) policy and then later cited the Violence and Incitement (V&I) policy as a secondary justification. In both cases, we argue these policies were misapplied.
The V&I policy includes explicit exceptions. It allows aspirational or conditional threats of violence directed at terrorist or violent actors. It also allows such content when shared to raise awareness. The statements in these posts were directed at armed Assad loyalists and framed as conditional: surrender or face confrontation. This is well within what Meta already permits.
Similarly, the DOI policy allows for reporting on or neutral discussion of designated groups. It even permits quoting these groups if the intent is news reporting, not praise. In the case of the photo and quote from al-Sharaa, and the video of the HTS commander, there is no indication that the accounts behind these posts were promoting HTS or supporting violence. In fact, the journalist in the second case explicitly said the post was for informational purposes. The same principle was applied in previous Oversight Board cases, such as the “Shared Al Jazeera” case, where users shared official statements from designated groups without endorsing them.
Policy Gaps and Hidden Decisions
Meta’s decisions around content unfolding during heightened political upheaval lack transparency. In Syria, whether al-Sharaa remains on Meta’s DOI list is still unclear. So is the process Meta follows to activate its Crisis Policy Protocol (CPP), which is supposed to guide how content is handled during times of crisis.
These are not minor technicalities. They shape what millions of users can see, share, and say. Yet the criteria for activating the CPP, the question of who is responsible for activating it, and the list of who is or is not designated under the Dangerous Organizations and Individuals policy are all opaque. This lack of transparency and inconsistent enforcement are not compatible with accurate content moderation.
What Meta Should Do
We urged the Board to restore the two deleted posts on Meta platforms, and to use this opportunity to implement further improvements. We encourage the Board to take into consideration the following recommendations:
- Engage Impacted Communities for Cultural Context: We urge the Board to ask Meta to work with people from impacted Syrian communities to ensure appropriate cultural context and understanding. To better identify and address genuinely harmful content, collaboration with individuals from affected communities is essential.
- Strengthen the Violence and Incitement Policy: The current policy addresses explicit threats of low-severity violence against “Protected Characteristic” groups, but requires additional context to act on “veiled or implicit” threats. We recommend that the inclusion of a protected group in any veiled or implicit threat be recognized as relevant context, and that the policy state this clearly. Achieving this will require Meta to deepen its understanding of cultural contexts so it can recognize such threats effectively. In other words, if a post includes a “veiled threat” (an indirect threat) that targets or mentions a protected group (e.g., a religious minority), this should be treated as important context when deciding whether to remove it.
- Increase Transparency Around the Crisis Policy Protocol (CPP): We recommend that Meta publish more information about the CPP on its Transparency page. There is a need for greater clarity regarding when and how the CPP is activated, particularly in cases involving the targeting of minority groups. The Board should reiterate its recommendation in the UK Riots case that Meta “revise the criteria it has established to initiate the Crisis Policy Protocol,” and should specifically note that those criteria should include the targeting of protected characteristic groups, including religious and ethnic minorities.
- Publicly Announce Delisting from the DOI List: We recommend that Meta publicly announce when a group or individual has been delisted from the DOI list.
SMEX has previously submitted comments on similar cases from the SWANA region. In a recent case, the Oversight Board considered how Meta moderated content using the word “Shaheed,” a term that broadly translates to “martyr” in Arabic and that Meta was overmoderating. The Oversight Board recommended that Meta allow the use of “Shaheed” under clearer, more context-specific moderation standards, as well as increase transparency around Meta’s DOI policy and its use of automation. SMEX also submitted a comment on a case regarding the pro-Palestine slogan “From the river to the sea,” where we argued that the phrase does not violate the Hate Speech, Violence and Incitement, or Dangerous Organizations and Individuals policies. The Oversight Board found that the standalone phrase “cannot be understood as a call to violence against a group based on their protected characteristics, as advocating for the exclusion of a particular group, or of supporting a designated entity – Hamas.”
The current cases involving HTS and the transitional government in Syria are a continuation of a broader struggle. Once again, Meta is confronting the fact that decontextualized, black-and-white policies cannot account for the region’s complexities with nuance, transparency, and respect for local realities.
These cases raise broader questions about how content moderation should function in countries under political turmoil, war, and sometimes, bloody transitions in state rule.
When government leaders emerge from previously proscribed groups, should their speech be automatically excluded from public discourse on private platforms? Can platforms like Meta enforce rigid policies when cultural context is key, political categories are in flux, and the stakes for freedom of expression are so high? And how should Meta balance the need to prevent harm with the equally urgent need to allow users in Syria to access vital information during a time of deep uncertainty?