Since October 2023, both pro-Palestinian groups and their opponents have increasingly relied on online political ads to shape public discourse. On the pro-Israeli side, multiple state and non-state actors have spent millions of dollars on disinformation campaigns across Google, Meta, X, and YouTube.
For example, in November 2023, the IDF sponsored a YouTube ad targeting children’s videos, spreading disturbing messages that repeated the debunked claim of 40 beheaded Israeli babies. Meta also profits from ads promoting the illegal sale of property in the Occupied Palestinian Territories. And just this week, users in Lebanon once again received political ads on Instagram, in Arabic, urging them to join the Mossad.
In 2023 and 2024, we received numerous reports of unjustified takedowns of pro-Palestinian ads about Gaza on Instagram and Facebook. Considering Meta’s history of biased content moderation, we investigated whether similar patterns extend to political advertising policies.
Using a combination of quantitative and qualitative methods, SMEX investigated more than 4,500 ads extracted from Meta’s Ad Library datasets across three key timeframes: October 16–21, 2023; May 7–21, 2024; and August 5–10, 2024.
The sample was limited to English-language ads published in the United States with large audience reach. Each ad was categorized according to political alignment (pro-Palestine or pro-Israel), removal status, factual accuracy, and presence of hate speech or violent language.
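As a rough illustration of this filtering and coding step, the sketch below (in Python with pandas) narrows an Ad Library CSV export to the three study windows and adds the coding columns described above. The file name, the ad_delivery_start_time column, and the coding labels are assumptions modeled on Meta’s public export format, not the exact pipeline used in the study.

```python
import pandas as pd

# The three timeframes analyzed in the study (inclusive date ranges).
TIMEFRAMES = [
    ("2023-10-16", "2023-10-21"),
    ("2024-05-07", "2024-05-21"),
    ("2024-08-05", "2024-08-10"),
]

# Hypothetical file name; ad_delivery_start_time is modeled on Meta's
# Ad Library export format and may differ in practice.
ads = pd.read_csv("ad_library_export.csv", parse_dates=["ad_delivery_start_time"])

# Keep only ads whose delivery started inside one of the three windows.
mask = pd.Series(False, index=ads.index)
for start, end in TIMEFRAMES:
    mask |= ads["ad_delivery_start_time"].between(pd.Timestamp(start), pd.Timestamp(end))
sample = ads[mask].copy()

# Columns filled in manually during qualitative coding (names are illustrative).
for col in ("political_alignment", "removal_status", "factual_accuracy", "hate_or_violent_speech"):
    sample[col] = pd.NA

sample.to_csv("coded_sample.csv", index=False)
```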
Findings reveal that pro-Palestinian ads are often published by internationally recognized non-governmental organizations (NGOs), such as Doctors Without Borders or UN bodies, and call for humanitarian aid or a ceasefire, whereas pro-Israeli ads cheering on the war are usually run by private media companies funded by Israeli lobbies (e.g., Facts for Peace) or by the Israeli Government Advertising Agency.
We found that when an ad violated Meta’s ads or content moderation policies, pro-Palestinian ads were removed at a faster rate than pro-Israeli ads. As a result, pro-Israeli ads remained visible for longer periods despite violating Meta’s policies. This disparity in enforcement led to unequal reach and engagement.
The study also found significant shortcomings in the functionality and transparency of the Ad Library itself. Some ads visible in the Comma-Separated Values (CSV) export were inaccessible via public search. Metadata related to removal justifications, targeting criteria, or moderation timestamps was either missing or inconsistent. The lack of filtering and analysis tools within the interface made comprehensive analysis time-consuming and technically demanding.
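A consistency check of this kind can be scripted against the CSV export, as in the short sketch below. The column names (removal_reason, targeting_criteria, moderation_timestamp) are hypothetical stand-ins for the metadata a transparent archive would expose; the point is that such fields were often absent or blank.

```python
import pandas as pd

# Hypothetical metadata fields that a transparent ad archive would expose.
EXPECTED_FIELDS = ["removal_reason", "targeting_criteria", "moderation_timestamp"]

ads = pd.read_csv("ad_library_export.csv")

for field in EXPECTED_FIELDS:
    if field not in ads.columns:
        print(f"{field}: column missing from the export")
    else:
        empty = ads[field].isna().sum()
        print(f"{field}: {empty} of {len(ads)} rows have no value")
```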
This was not the case two years ago. In August 2024, Meta shut down CrowdTangle, a crucial transparency tool used by journalists and researchers for real-time monitoring and analysis of ads and other public content.
Our research confirms that Meta’s political ads policy, as demonstrated in the case of Israel’s war on Gaza, is marred by inconsistency, limited accountability, and structural opacity. These flaws not only silence Palestinian perspectives but also hinder civil society’s ability to assess whether content moderation practices are applied fairly.
We demand that Meta update its policy for greater clarity, adopt improved transparency tools, and reinstate real-time monitoring systems for equitable and accountable digital governance.
Read the full research here.