Study finds that Meta and X approved ads containing violent anti-Muslim and antisemitic hate speech ahead of the German election.
The German federal elections take place on Sunday, February 23, 2025.

The five ads Meta approved included hate speech comparing Muslim refugees to “vermin” and “rodents” and calling for them to be sterilized, burned, or gassed. X approved all 10 hateful ads, including another five containing violent hate speech against Jews and Muslims. These additional ads included messages attacking immigrants as “rodents” who were “flooding the country” to “steal our democracy,” and an antisemitic slur suggesting Jews were lying about climate change in order to gain economic power and destroy European industry.

Another ad X approved was a direct attack on the SPD, the center-left party that leads Germany’s coalition government, claiming the party wanted to take in 60 million Muslim refugees before going on to incite violence. X also scheduled an ad suggesting “leftists” want “open borders” and calling for the extermination of Muslim “rapists.”
Elon Musk, the owner of X, has used the social media platform, where he has close to 220 million followers, to personally intervene in the German election. In X’s case, it is not clear whether the platform moderates ads at all, given that all 10 violent ads were approved for display.
EU’s Digital Services Act in the frame
Eko’s tests suggest that neither platform is enforcing the bans on hate speech that both claim to apply in their own ad policies. Eko reached the same conclusion in 2023, when it conducted a similar test ahead of the new EU online governance rules coming into effect — suggesting that the regime has had no impact on how Meta operates.
“Our findings suggest that Meta’s AI-driven ad moderation systems remain fundamentally broken, despite the Digital Services Act (DSA) now being in full effect,” an Eko spokesperson told TechCrunch.
“Rather than strengthening its ad review process or hate speech policies, Meta appears to be backtracking across the board,” they added, pointing to the company’s recent announcement about rolling back moderation and fact-checking policies as a sign of “active regression” that they suggested puts it on a direct collision course with DSA rules on systemic risks.

Eko submitted its findings to the European Commission, which is responsible for enforcing the DSA. It also said it shared the results with both companies, but neither responded.
The EU has open DSA investigations into Meta and X, which include concerns about election security and illegal content, but the Commission has yet to conclude these proceedings. In April, it said it suspected Meta of not properly moderating political ads. However, the full investigation into X, which kicked off in December 2023, also concerns illegal content risks, and the EU has yet to arrive at any findings on the bulk of that probe well over a year later.
Confirmed breaches of the DSA can attract penalties of up to 6% of global annual turnover, while systemic non-compliance could even lead to regional access to violating platforms being temporarily blocked. The EU has yet to reach final decisions in the Meta and X investigations, so any DSA sanctions remain to be determined.
Meanwhile, it’s now just a matter of hours before German voters go to the polls — and a growing body of civil society research suggests that the EU’s flagship online governance regulation has failed to shield the major EU economy’s democratic process from a range of tech-fueled threats.
Earlier this week, Global Witness released the results of tests of X’s and TikTok’s algorithmic “For You” feeds in Germany, which suggest the platforms are biased in favor of promoting AfD content over content from other political parties. Civil society researchers have also claimed that X has blocked their access to data in order to prevent them from studying election security risks ahead of the German vote – access the DSA should enable.

“Our findings and mounting evidence from civil society groups show that Big Tech won’t clean up their platforms on their own. Meta and X are continuing to allow hate speech and incitement to violent acts, as well as election disinformation, to spread widely, despite the legal obligations they have under the DSA,” said the spokesperson. (We have not named the spokesperson to avoid exposing them to harassment.)

“Regulators need to take action – both by enforcing the DSA and also, for example, implementing mitigation measures before elections. This could include turning off profiling-based recommender systems immediately before elections, and implementing other appropriate ‘break-glass’ measures to prevent algorithmic amplification of borderline content, such as hateful content, in the run-up to elections.”
The campaign group also warns that the EU is now facing pressure from the Trump administration to soften its approach to regulating Big Tech, and that in the current political climate there is a real risk the Commission will not fully enforce these laws in order to appease the U.S.