The European Commission has expressed concerns that Meta’s Facebook and Instagram platforms are not adequately protecting the EU elections from disinformation, particularly in light of potential vulnerabilities to Russian networks. The Commission is investigating suspected infringements related to deceptive advertising and political content on Meta’s services. Additionally, Meta’s advertising network is seen as vulnerable to misinformation and as a potential target for Russian influence operations. The absence of third-party real-time election-monitoring tools on the platforms, notably following the deprecation of CrowdTangle, has made it difficult for researchers and journalists to track the company’s efforts to remove illegal content.

With over 250 million monthly active users in the EU, Meta’s platforms have a substantial reach, making it crucial for the company to have effective mechanisms in place to tackle misinformation and protect the integrity of electoral processes. The Commission is particularly concerned about the lack of transparency regarding political content and about users’ ability to flag illegal content on the platforms. While there is no specific deadline for Meta to make the necessary changes, the Commission expects the company to cooperate and act promptly. Meta’s designation as a Very Large Online Platform (VLOP) under the Digital Services Act (DSA) means it must adhere to strict rules, including transparency requirements and the protection of minors online.

Meta announced earlier this year that it would establish its own elections operations center to quickly identify potential threats and implement mitigations in real time. The company also plans to begin labeling AI-generated content in May 2024. In response to the Commission’s investigation, a Meta spokesperson stated that the company has a well-established process for identifying and mitigating risks on its platforms and looks forward to continuing its cooperation with the European Commission. The Commission recently invited Meta, TikTok, and other platforms to stress-test election guidelines under the DSA to help mitigate risks that could affect the integrity of elections and of their services.

Last week, the Commission launched an investigation into TikTok, and earlier this year it opened probes into X and AliExpress over illegal content and compliance with the DSA. Meta itself filed a legal complaint at the General Court in Luxembourg in February regarding a supervisory fee imposed by the Commission under the DSA. The concerns raised by the Commission highlight the importance of online platforms taking adequate measures to combat disinformation and protect the integrity of elections. It is essential for companies like Meta to have effective tools and mechanisms in place to ensure that their platforms are not exploited by malicious actors seeking to undermine democratic processes. Transparency, accountability, and cooperation with regulatory authorities are key to ensuring the safe and secure use of online platforms in the context of elections.
