EU Asks Meta To Curb Inaccurate Content On Israel-Palestine Conflict

‘Both Meta and TikTok are bound by obligations set out in the DSA’

The European Union has given Meta a week to explain in further detail how it is combating the spread of unlawful content and misinformation on its Facebook and Instagram platforms regarding the Israel-Palestine conflict.

The European Commission, the EU's executive arm, announced Thursday that it has made the official request for information to Meta (META).

The commission also requested further information from TikTok on the actions it has taken to prevent the spread of "terrorist and violent content and hate speech," though without specifically mentioning the Israel-Hamas conflict.

Last week, EU Commissioner Thierry Breton sent letters to several social media companies, including Meta and TikTok, asking them to describe the steps they were taking to comply with the content moderation requirements of the EU's Digital Services Act (DSA).

On Friday, Meta said its teams have been monitoring its platforms "around the clock" since Hamas's October 7 attacks and described some of the steps it has taken against disinformation and content that violates its policies and standards.

On Sunday, TikTok said that, among other things, it has set up a command center to coordinate the work of its "safety professionals" around the world and is enhancing the automated systems it uses to detect and remove graphic and violent content.

However, the European Commission says more information is required. In its notice on Thursday, it gave Meta and TikTok until October 25 to respond to its requests and warned that it could impose financial penalties if it is not satisfied with their answers.

According to the commission, both companies have until November 8 to outline how they intend to protect the "integrity of elections" on their platforms.

Both Meta and TikTok are bound by the obligations set out in the DSA, a landmark law that took effect for the largest online platforms in August and aims to regulate major digital companies more stringently while protecting users' rights online.

The formal requests came a week after the commission issued a similar demand to X, the company formerly known as Twitter, asking for details on how it intends to limit the spread of unlawful, deceptive, violent, and hateful content.

The commission said it has opened an investigation into X's compliance with the DSA. It has not announced any parallel investigations into Meta or TikTok.