The European Commission has launched an investigation into Meta Platforms’ Facebook and Instagram over their suspected failure to tackle disinformation and deceptive advertising ahead of the European Parliament elections. The move by EU tech regulators comes amid concerns about potential sources of disinformation from Russia, China, and Iran, as well as from within the EU itself.
The Digital Services Act, which came into effect last year, requires Big Tech companies to take more action against illegal and harmful content on their platforms or face fines of up to 6% of their global annual turnover. The EU investigation will focus on a Russia-based influence operation network known as Doppelganger, which clones authentic media and was exposed by Meta in 2022.
EU digital chief Margrethe Vestager expressed concerns about Meta’s content moderation practices and the transparency of its advertising systems. Meta, which has over 250 million monthly active users in the EU, defended its risk mitigation process and stated its willingness to cooperate with the European Commission.
The Commission highlighted Meta’s alleged non-compliance with DSA obligations concerning deceptive advertisements, disinformation campaigns, and coordinated inauthentic behavior in the EU. It also raised concerns about the absence of an effective third-party tool for real-time monitoring of civic discourse and elections ahead of the European Parliament vote.
Meta has been given five working days to address the concerns raised by the EU and provide details of remedial actions taken. The investigation underscores the growing scrutiny of tech companies’ role in combating disinformation and ensuring the integrity of democratic processes.