When it comes to social media reach and engagement, content can perform very differently depending on the perceived gender of its audience. In a recent experiment, I published two nearly identical videos on Instagram—one for men, the other for women. The men’s version drew 15,000 views and 151 likes, while the women’s post reached just 7,000 views and 59 likes.
In another video intended for men, the analytics show 76,000 views, 739 likes, 126 comments, and 3,597 shares. Meanwhile, a video tailored for women reached 8,400 views, 94 likes, 28 comments, and 134 shares.
Why am I sharing these examples with you? As a content creator, I find that these two cases clearly illustrate two problems. The first is shadow-banning, a restriction Meta does not admit to imposing on its users, justifying the reduced reach instead by claiming that the video was simply not categorized under “recommendations.” The second is how much more heavily content aimed at women is restricted compared to content aimed at men.
Creators of educational content, especially those addressing sexual and reproductive health, struggle to reach their audiences and generate engagement, because Meta, applying Instagram’s policies, classifies their content as sexual, pornographic, inappropriate, or misleading.
Yet the contradiction in these standards is alarming. How can non-educational nudity or defamatory content spread widely without any warnings, while educational and awareness-raising material is consistently restricted or even removed?
Restrictions and takedowns without justification
As creators of content related to sexual and reproductive health and rights, we resort to tricking social media platforms by manipulating how we present and phrase information to fit within Meta’s “community guidelines.”
For instance, we write the Arabic word for “sexual” in a distorted form, such as “جنىىىي,” and we replace “sexuality” with “reproductive,” a term the algorithms treat more favorably. This strategy helps boost visibility and shields the content from being restricted or shadow-banned.
Three or four years ago, Instagram tried to restrict and delete two videos I had published about HIV. I submitted an appeal to have the decision reviewed, since the content was purely educational and focused on raising health awareness.
My online presence was not limited to Instagram. I also had an active account on TikTok before it was banned by the Jordanian government, which announced that it was blocking the platform “after its misuse and its failure to deal with posts inciting violence and calls for chaos.”
Content spread much faster on TikTok compared to Instagram, but TikTok was also more restrictive with my material, labeling it as pornographic and inappropriate, and warning me multiple times that my account could be shut down.
Why must we, as creators of sexual and reproductive health and rights content, always struggle to have our material reach audiences? And whose interest does this restriction serve?
Publishing content by credible experts and specialists poses no threat to the public. On the contrary, it helps raise awareness and deliver accurate information in an accessible and low-cost way. What frustrates me and my colleagues is that Instagram presents itself as a space that welcomes diverse causes and individuals, yet in practice, it is dominated by political agendas that prevent users from accessing and benefitting from accurate knowledge.
“Shame” culture
Some followers express their appreciation for my content when they run into me in public, despite the sensitivity of sexual health issues. Yet from what many have told me, people often avoid engaging with sexual health content on Instagram so that their relatives and friends will not notice.
Those around them may start asking questions out of curiosity, ridicule, or shaming, such as: “Why are you following a page about AIDS? Do you have it?” “This fool is promoting perversion,” or “How can a girl talk openly about her period?”
My Instagram page has around 14,000 organic followers, yet a large share of those who message me do so through fake accounts that are completely empty and carry random names. And when they do have the courage to ask questions and share personal details, they do so without following the page.
Others resort to the disappearing messages feature, which ensures the entire conversation is erased once the chat is closed.
Based on my experience, I can say that Jordanian society in particular, and Arabic-speaking societies in general, still view sexual health issues as “Western.” Many assume these issues are promoted to corrupt society’s morals and values and have no connection to the life of the Arab individual, whose role is reduced to reproduction.
Sometimes, those seeking information end up relying on the internet and social media as their trusted source for answers to their sexual and reproductive health questions to avoid embarrassment and stigma from society. This is especially the case for people who engage in non-normative behaviors, such as sexual relations outside of marriage, same-sex practices, or abortion.
In addition, I have noticed many people sharing their conversations with ChatGPT, instructing it to interact with them in a warm, intimate, and affectionate manner. This directly affects people’s mental health.
AI tools are designed to respond agreeably to users, but what we often fail to examine critically is the risk of becoming addicted to them, since they offer a seemingly safe space that provides emotional comfort. People carry accumulated psychological burdens that can push them to rely on AI, and that reliance threatens their psychological and social well-being.
The price of bad moderation
From my experience, again, I believe social media platforms shape social cultures, absorb them into capitalist technical policies, guidelines, and standards, and then promote them back to users.
For example, how much more likely is Instagram to boost a video or post of a half-naked, fit man compared to one featuring a man who is overweight? And what are the chances of the platform supporting the content of someone pulling silly pranks compared to another person explaining the correct use of a condom? I will leave the answer to the reader.
Instagram has been draining my mental energy immensely since 2021, when I began producing content on sexual, reproductive, and mental health. I am not an influencer chasing empty entertainment, the kind of content the platform supports.
Instagram rarely pays attention to the complaints that I and other creators submit about the incitement and violent hate speech we repeatedly receive in our comments.
For a large segment of Jordanian society in particular, I am not “man enough,” and my comment sections overflow with insults and accusations rooted in distorted cultural legacies.
What I have been experiencing since the start of this journey pushes me to ask many questions, the most pressing of which is: Does Instagram really believe that educational content about sexual and reproductive health is more dangerous than violent, hateful, and inciting speech? Or is it simply indifferent to fair content moderation practices that respect the rights of its users and protect them from violations?