EU to investigate Meta over handling of Russian disinformation

Regulators set to express concern over insufficient moderation of political ads that risk undermining electoral process

The European Union is set to open a probe into Meta’s Facebook and Instagram as soon as Monday over concerns the social media giant is failing to do enough to counter disinformation from Russia and other countries.

Regulators suspect that Meta’s moderation does not go far enough to stop the widespread dissemination of political advertising that risks undermining the electoral process, the European Commission is expected to say on Monday, two people with knowledge of the matter said.

EU officials are particularly worried about the way Meta’s platforms are handling Russia’s efforts to undermine upcoming European elections. The commission, however, is not expected to single out Russia in its statement and will only make reference to the manipulation of information by foreign actors.

EU officials also fear that the company’s mechanism to let users flag illegal content is not easily accessible or user-friendly enough to comply with the EU’s Digital Services Act, the bloc’s landmark legislation designed to police content online.

The law, approved in April last year, includes measures to force platforms to disclose what steps they are taking to tackle misinformation or propaganda. If the EU finds Meta to be in breach of the Act, it could be fined up to 6 per cent of its global annual turnover.

The move represents the latest regulatory action taken by the commission against Big Tech groups, as fears grow among member states that Russia is pushing disinformation on social media to undermine democracy in advance of Europe-wide elections in early June.

The commission is to start the investigation based on a report sent by Meta in September on how it is handling disinformation risks on its platform as well as the EU’s own assessment. The investigation will assess whether the way Facebook and Instagram place political content on their sites is compliant with the law.

Investigators will examine whether Meta has failed to mitigate risks arising from its plan to discontinue CrowdTangle, a tool that shows publishers how content spreads across its sites, and will outline concerns about how Meta tracks disinformation to help fact-checkers and journalists.

The commission is expected to give Meta five working days to say what it will do to remedy the situation, or face measures under the DSA, the people said.

There is no set deadline for the investigation to end and it will depend on Meta’s willingness to co-operate, the EU is expected to say.

“We have a well-established process for identifying and mitigating risks on our platforms,” said Meta. “We look forward to continuing our co-operation with the European Commission and providing them with further details of this work.”

The commission did not reply to a request for comment. The timing of the announcement can still shift, the people said.

The Meta probe follows a separate investigation into X over illegal content and disinformation, after violent and terrorist content spread on its platform in the wake of Hamas’s October 7th attacks on Israel.

It also comes after regulators introduced election safeguards aimed at countering online threats to the integrity of electoral processes.

As a result of the guidelines, social media platforms such as X and Meta will be required to scrutinise the risks of online disinformation across the bloc. – Copyright The Financial Times Limited 2024