The New Rules of the Game
In early 2025, Instagram and TikTok flagged several Arabic-language sexual and reproductive health rights (SRHR) posts using medically accurate terms like “contraception” and “menopause” as sexual content. Many were removed or shadow-banned despite being educational.
Global regulations are pushing platforms to change. The EU’s Digital Services Act (DSA), implemented in 2024, tightened accountability for illegal and harmful content. At the same time, U.S. lawmakers increased pressure on companies to curb disinformation and abuse. The result is inconsistent enforcement: some rules loosen, others tighten, leaving users caught in the middle.
For Arabic-language SRHR content, already vulnerable to misinterpretation, these shifts often translate into removals, hidden reach, or demonetization. Posts meant to inform are re-categorized as “adult.” Even Meta’s 2025 announcement promising more leniency left Arabic-speaking activists unsure whether anything would change in practice.
Policy Shifts: What’s Changing?
Meta announced policy changes in early 2025 aimed at supporting free expression, which resulted in 33% fewer content removals globally on Facebook and Instagram between January and March. However, critics noted rushed changes, weak human rights assessments, and the end of fact-checking programs.
TikTok has leaned on automation since 2021 to remove nudity and sexual activity. This often sweeps up SRHR content in Arabic, erasing nuance and context. X (formerly Twitter) generally applies lighter moderation, but SRHR content in Arabic is often tagged as “sensitive,” limiting its visibility. YouTube keeps strict monetization rules: videos on contraception and reproductive health are often demonetized as “non-advertiser-friendly.”
Taken together, these shifts show a common thread: platforms may change policies in response to global pressure, but enforcement remains heavily automated and rarely adaptive to Arabic-language contexts. The impact is uneven. While some English-language SRHR creators report more freedom under Meta’s loosening rules, Arabic-speaking activists often experience the opposite: shadowbans, demonetization, or ad rejections that treat health content as explicit.
For Ghewa Nasr, who works in Arabic-language SRHR media and activism, the past three years have contradicted Meta’s promise of more freedom on these platforms. Commenting on the gap between policy claims and real enforcement, she said that while policies may shift on paper, Arabic-language SRHR content continues to face restrictions and takedowns in practice. The daily reality in Arabic is not debate over policy language but guessing which post will be flagged next, and how to rebuild reach after every unexplained shadowban.
SRHR Activism Under Pressure
For activists across the Arabic-speaking region, errors in content moderation make it harder for their communities to find essential and reliable health education. One key issue is weak content moderation in Arabic. Automated moderation systems are often trained on English, meaning Arabic words for basic health terms like “contraception” or “menstruation” are more likely to be flagged as “explicit.” At the same time, human reviewers often lack the cultural or linguistic expertise to differentiate between educational sexual health content and pornography.
As Mariam Jabali, founder of the Inootha initiative, explained: “The same post in English might go through without problems, but in Arabic, it gets flagged or restricted. We find ourselves punished for using our own language, even when the meaning is identical.”
The language problem runs deeper than translation. When Sharika wa Laken launched a series of videos with Dr. Sandrine Atallah on sexual pleasure and rights, Ghewa recalls, engagement was steady until the team added English subtitles. Instagram stories linked to the episode then dropped in views without warning, and without the content ever being flagged as potentially violating.
There are also inconsistent advertising rules. SRHR organizations that try to run ads for awareness campaigns, such as promoting contraception access, often find their ads blocked in Egypt or Jordan, even though the same content might be approved in Europe or the U.S. These disparities are not accidental: platforms frequently calibrate enforcement based on local laws, political pressures, or “community standards” tied to local norms. The result is that Arabic-language campaigns are restricted at the very moments they are most needed, such as during health awareness periods or humanitarian crises.
Ghewa Nasr recalls a Period Poverty toolkit from her time with Fe-Male, a Lebanese feminist organization that works on gender justice, media representation, and digital rights. This tool, which focused on the lack of access to menstrual products, education, and safe facilities, was first produced in Arabic and repeatedly failed Meta’s ad approval with only a generic “policy violation” notice. The toolkit itself contained necessary educational information and resources, without provocative or graphic imagery, making the rejection unclear. Members of her team had to eventually escalate the case through Meta’s designated ‘focal points’ and consult external digital rights experts just to nudge the process forward. Even when approved, the delays meant missing key awareness windows, such as Menstrual Hygiene Day (May 28), International Women’s Day (March 8), or during moments of heightened crises.
In occupied Palestine, censorship compounds these problems. Human Rights Watch documented how Meta systematically suppressed Arabic posts on Gaza, violating the rights to freedom of expression and access to information. Posts about Israeli evacuation notices directed at Gaza residents, updates on bombed health facilities, and humanitarian advisories were repeatedly removed or down-ranked. When SRHR groups in the region attempted to post health-related guidance with these updates, their content often disappeared into the same moderation void.
This creates an atmosphere of uncertainty for educators like Mariam Jabali. She explains that the same Arabic word might pass moderation once and be removed the next time, with no clear explanation. People creating this content are forced to waste valuable time rewriting entire posts, swapping terminology, or migrating content to another platform just to avoid takedowns. This constant unpredictability makes long-term planning nearly impossible and drains energy that could otherwise go into producing accurate health education, especially during war or economic crisis, when SRHR information is most essential.
Regional feminist collectives such as MARSA Sexual Health Center in Lebanon and Chouf Minorities in Tunisia show how human rights defenders working on SRHR navigate these pressures strategically. MARSA, for example, tailors its messaging by using Lebanese dialect rather than formal Arabic to reduce the chance of being flagged online. Chouf leans on audiovisual art to tell stories that are harder for algorithms to mislabel.
Both frame their content around positive, scientifically grounded education. For both organizations, social media remains essential for outreach but uncertain, as posts on contraception or consent can be removed or targeted with backlash. Instead of retreating under mounting anti-rights rhetoric, these groups continue to adjust their language and campaigns, publishing even when takedowns are likely.
Human rights defenders and entities working on SRHR issues cannot predict whether a post will be flagged, hidden, or allowed. For communities already facing stigma offline, these digital barriers only deepen the silence.
Adaptive Strategies
Faced with persistent censorship, SRHR activists across the region have become experts at bending platform rules just to stay visible. Many resort to coded language and symbols to avoid takedowns. In Lebanon, for instance, educators have replaced the word ‘حيض’ (menstruation) with a 🌙 emoji, while contraception is referred to through slang terms like “tools” or “plans” to evade detection.
When posts or ads are blocked outright, activists often shift to alternative platforms. Facebook’s refusal to approve contraception awareness ads in Jordan pushed some organizations to migrate campaigns into closed Telegram and WhatsApp groups, where conversations could continue safely but to much smaller circles already in the know. Others go a step further by building their own platforms entirely. Hossam Chehade, executive lead of Intersection–تقاطع, explained to SMEX how they shifted to archival practices for their SRHR-focused e-magazine AJSADOUNA: “Anything posted on social media can be taken down easily with just a few reports, and it isn’t always safe to keep there. That’s why we wanted to have it on our own platform. Part of it is a reaction to the censorship that occurs, but also because we want to keep this as a long-term asset and tool, to archive these efforts.”
Because no single platform is safe, many activists scatter their work across multiple spaces. Nour Emam’s Motherbeing is a clear example: after repeated takedowns, she built her audience across Instagram and YouTube, and finally into her own app, Daleela, ensuring that years of work could not be lost to wrongful content moderation or demonetization. Others in Egypt mirror this approach by posting the same videos on TikTok and YouTube Shorts, calculating that if one platform censors them, the content still survives elsewhere.
Even with these precautions, censorship persists, but activists rarely face it alone. In Lebanon, feminist networks have reposted Ghewa Nasr’s flagged health education videos to keep them circulating. In Tunisia, queer collectives linked to Chouf Minorities step in when content is taken down, resharing materials across accounts to prevent total erasure. In practice, the removal of one post often sparks a wave of reuploads, transforming an attempted silencing into collective amplification.
Other defenders choose to fight from within. Ghead Hamdy, who launched Speak Up in 2020 to document harassment, transformed it into Egypt’s first digital helpline for survivors of online gender-based violence. When survivor testimonies were removed under Meta’s “community standards,” the helpline not only supported women in removing abusive content but also pressed Meta and TikTok to refine their moderation practices. By challenging platforms directly, Hamdy highlights how policies often protect abusers more effectively than survivors, forcing SRHR work to double as digital rights advocacy.
Creative expression offers another route. In Tunisia, Chouf Minorities uses audiovisual storytelling through feminist art and online campaigns to address sexuality, bodily autonomy, and queer rights. Similarly, Love Matters Arabic has circulated cartoons and short animations explaining contraception and consent, crafted to pass moderation filters while remaining highly shareable.
“We want to be bold and honest about bodies,” Ghewa told SMEX. “But the systems punish frankness, so we storyboard around the filters.” That paradox captures the reality for SRHR defenders: activists spend as much energy outsmarting algorithms as they do producing vital health education.
Advocacy and Pushback
Despite these adaptations, activists and organizations across the region emphasize that workarounds are not enough. They continue to pressure platforms to address systemic biases. Coalitions like the MENA Alliance for Digital Rights have issued open letters to Meta and TikTok, calling for greater transparency around takedowns, better appeal mechanisms, and meaningful consultation with Arabic-speaking communities. These campaigns argue that policies designed for Western contexts cannot simply be applied wholesale to the region.
Other groups have pushed for safer online environments more broadly. In early 2024, Helem, in partnership with Human Rights Watch and others, co-launched the Secure Our Socials campaign, which called on Meta to improve protections for LGBTQ users in the SWANA region. The campaign pressed for tools such as account lockdown features, greater transparency around content removals, and investment in human moderators familiar with regional Arabic dialects.
While not focused exclusively on SRHR, these efforts underscore how moderation failures affect marginalized communities whose digital presence is already precarious, including those advocating for sexual and reproductive health. By strengthening digital safety and accountability, such campaigns indirectly create safer grounds for SRHR content to exist online.
Hossam Chehade elaborates on how AJSADOUNA fills a gap left by mainstream platforms, where SRHR discussions in Arabic are heavily constrained by banned terminology and viral rules. By contrast, the e-magazine provides space for contributors to write and create freely, even under pseudonyms, while archiving these testimonies outside the unpredictability of platform moderation.
Yet platform accountability remains elusive. Meta’s 2025 easing of sexual content rules was rolled out globally, but without consultation with SRHR activists in West Asia and North Africa.
“The answer lies not in waiting for favors from companies but in building alternative power,” explained Nasr. She argues that SRHR work in the region needs Global South infrastructure, community-run platforms, legal frameworks, and long-term training, so activists are not perpetually guests in spaces designed to erase them.
Donors, too, must be pushed to stop playing politics and start backing systemic change rather than one-off campaigns. “What is needed is bigger than appeals,” she said, “but a movement that brings Global South voices together to demand digital rights on their own terms.”
SMEX’s Zeinab Ismail explained that platforms carry a huge part of the responsibility; they should ensure that SRHR content is treated as accurate and necessary knowledge. That requires companies to move away from relying solely on automation and AI, and instead employ moderators from across the region with the language skills and cultural awareness to know what should and shouldn’t be removed. Crucially, these systems must be built in collaboration with organizations and experts already working on SRHR, so that information remains safe, accurate, and accessible.
The question remains: Can activists influence platform policies at all, or must they remain in a constant state of adaptation? Until tech companies take regional input seriously, activists fear the latter.
What’s at Stake
SRHR activism is about access to knowledge, the power to speak openly, and the visibility of marginalized communities. In a region where reproductive rights are heavily stigmatized offline, online platforms remain essential.
When these digital spaces fail, the cost is immediate: fewer people access critical information about their own bodies and rights. To move forward, platforms must invest in context-aware moderation, transparent appeals, and collaboration with local actors.
For now, activists continue to adapt through coded language, creative visuals, and solidarity. “Social media is unavoidable, especially in our current context, but I’m happy to see self-publications and personal platforms expanding as a model for future SRHR advocacy,” explained Hossam. Such models give some independence to activists who want to express themselves freely on SRHR and to create content that reaches their target audiences.
Until donors, platforms, and policymakers take Arabic-speaking communities seriously, SRHR activists will remain trapped in survival mode. What is needed is not only technical fixes like better algorithms, but systemic change: one that recognizes SRHR as a right, not a violation, and creates digital spaces where this knowledge can circulate freely.