The European Union has issued a stern warning to TikTok regarding the proliferation of “disinformation” on its platform following the recent conflict involving Hamas and Israel. In a letter addressed to TikTok’s CEO, Shou Zi Chew, the EU urged the company to intensify its efforts to combat false information and requested that the platform provide a clear plan for compliance with European law within the next 24 hours.
Social media platforms, including TikTok, have experienced a surge in the spread of misleading content related to the conflict, such as manipulated images and inaccurately labeled videos. The EU had previously issued warnings to other major tech companies, such as X (formerly Twitter) and Meta, concerning the dissemination of similar content.
The EU emphasized the importance of TikTok recognizing its substantial user base among young people and asserted that the platform bears a particular responsibility to safeguard children and teenagers from violent content, terrorist propaganda, dangerous challenges, and potentially life-threatening material. EU Commissioner Thierry Breton stressed these concerns in a statement posted on X.
TikTok has not yet publicly responded to the EU’s communication; the BBC has contacted the platform for comment.
X was also given a 24-hour deadline to address the issue of disinformation. In response, Linda Yaccarino, X’s chief executive, informed the EU that her company had taken action by removing or flagging “tens of thousands of pieces of content” since the conflict involving Hamas and Israel began. Additionally, X reported the removal of hundreds of accounts.
The EU also issued a similar warning and deadline to Meta, the parent company of Facebook and Instagram. Although the EU declined to comment on whether it had received a response from Meta, a spokesperson from the European Commission revealed that they were maintaining ongoing contact with Meta’s compliance teams.
Meta’s spokesperson assured the BBC that the company had established a special operations center staffed with experts, including individuals fluent in Hebrew and Arabic, to closely monitor and respond to the evolving situation stemming from the terrorist attacks by Hamas. The company’s teams are working tirelessly to ensure the safety of its platforms, take action against content that violates policies or local laws, and collaborate with third-party fact-checkers in the region to mitigate the spread of misinformation.
In her letter, Ms. Yaccarino detailed X’s response to the EU’s requests, stating that the company had addressed more than 80 EU requests to remove content and had added explanatory notes to certain posts to provide context. These notes were displayed on more than 700 posts related to the attacks and ongoing events, with additional notes automatically generated for matching images or videos when they were reused in new posts.
Regarding the EU’s claim of “illegal content,” Ms. Yaccarino confirmed that X had not received any notices from Europol.
EU Commissioner Thierry Breton emphasized the necessity for both X and Meta to demonstrate their prompt, diligent, and objective actions in addressing the situation.
The EU’s Digital Services Act (DSA), whose obligations for “very large online platforms” took effect in August 2023, regulates the types of content permitted online. The DSA requires such platforms to proactively remove “illegal content” and to demonstrate, when requested, that they have implemented measures for its removal.
The EU has not disclosed its next steps in these specific cases, but it clarified that, hypothetically, under the law, it can conduct interviews and inspections. If unsatisfied with a platform’s compliance, the EU can initiate a formal investigation. Depending on the outcome, the EU may impose significant fines or, as a last resort, request that the platform be temporarily banned from operating within the EU.