Acknowledging the importance of protecting sexual and reproductive health rights (SRHR) both offline and online in the West Asia and North Africa (WANA) region, SMEX has conducted comprehensive research as part of the Masarouna project, a five-year program that mobilizes the power of young people in the region so they can claim SRHR. Through this work, SMEX explored SRHR within a digital rights scope and analyzed the content moderation policies and practices of social media platforms — Facebook, Instagram, TikTok, X (formerly Twitter), and YouTube — surrounding Arabic-language SRHR content in the region. Based on this analysis, SMEX is now leading an advocacy effort to improve content moderation practices for SRHR content.
Sexual and reproductive health rights (SRHR) in the WANA region are neither sufficiently recognized nor protected. SRHR content touches upon, among other things, protection from sexually transmitted diseases, access to contraception, sexual pleasure, gender equality, and the health of mothers and children at all stages of their lives, as well as their roles in society. Various factors, such as socioeconomic status, societal norms, tradition, access to education, local laws, and family environment, hinder access to SRHR-related information and educational content. In the WANA region in particular, this translates into a lack of access to abortion services and difficulty obtaining contraceptives, among other challenges. Where societal and state actors fail to provide adequate education on sexual and reproductive health, information available online and disseminated on social media often fills the gap. Social media platforms, however, have content moderation policies that impede the circulation of SRHR content, posing a serious challenge to rights activists, educators, experts, and NGOs active in the sector, as well as to affected individuals seeking and imparting information.
Policies around SRHR content are unclear on all the platforms included in our study. The companies do not have specific policies addressing SRHR content; instead, vague mentions of SRHR appear in advertising policies or in the community guidelines' "adult," "sexual," "mature," or "adult products and services" content policies (X, YouTube, Instagram, Facebook). Only TikTok clearly states that it allows reproductive health and sex education content under its "sensitive and mature themes" policy.
Without comprehensive and targeted policies on SRHR, platforms tend to remove or restrict posts, ads, and accounts unjustifiably, without explanation, and mostly through automation. This practice has major impacts not only on SRHR but also on freedom of expression, as it pushes users and SRHR-promoting civil society organizations toward self-censorship. The unclear moderation of SRHR content often results in contradictory and inconsistent moderation decisions, leaving users without a proper understanding of what content is permissible online, how it is moderated, and what the appeal processes entail. In addition, platforms often adopt different practices across the globe that can be considered biased against users in the WANA region.
Advertising policies on all the platforms included in the study place several restrictions on promoting content, services, and products related to SRHR and wellbeing, mainly under community guidelines that ban nudity and sexual content, in addition to rules banning advertising for certain "adult" products and services. This leads to self-censorship among rights activists, educators, experts, and NGOs active in the sector. For example, X prohibits promoting non-prescription contraceptives in Egypt, and Google does not allow ads related to birth control or fertility products in countries including Bahrain, Djibouti, Egypt, Iraq, Jordan, Kuwait, Lebanon, Libya, Morocco, Oman, the Palestinian Territory, Qatar, Saudi Arabia, Syria, Tunisia, the United Arab Emirates, and Yemen.
To address these issues, platforms need to adopt clear, comprehensive, and rights-based policies on SRHR and take the following measures.
- We call on platforms to clarify the reasoning for restricting any “adult” and “sexual” content.
- Platforms should enforce and improve exceptions for educational, medical, scientific, and artistic content, and review advertising policies on sexual and reproductive health and wellbeing.
- We recommend that platforms clarify why SRHR-related ads are subject to increased restrictions. Platforms should also be fully transparent about restrictions on visibility, and improve the content moderation appeal process to cover all types of restrictions, including removal of content; rejection of ads; suspension of accounts, pages, and advertising accounts; and decisions to decrease the visibility of content and accounts globally.
- Moreover, platforms should provide comprehensive training that equips human moderators to recognize SRHR content, and dedicate resources to fair and human rights-centred content moderation in the region. Lastly, platforms should take action against users who abuse flagging mechanisms by reporting content frivolously. To achieve this, platforms should diligently check whether flagged content should actually be removed or whether the mechanism has been used to censor content the platform permits.
We call on social media companies to implement all necessary measures to ensure that SRHR content is protected online. In particular, this includes performing human rights due diligence, including examining the impact of their policies and practices on SRHR; addressing reports of bias against the region and its languages and dialects; and ending double standards in the moderation of content dealing with female anatomy and sexual pleasure. Platforms must also increase their transparency on all issues related to SRHR, including how they moderate SRHR-related content.