Meta ordered to take action after Facebook found to be exposed to ‘terrorist content’

Examples of terrorist content can include photos of crime scenes glorifying terrorist acts, or posts or comments which incite the commission of terrorist offences

Meta will now be obliged to take specific measures to protect its services from being used for the dissemination of terrorist content. Photograph: Jim Wilson/The New York Times

Social media giant Meta has been ordered by Ireland’s media regulator to take mitigating actions after one of its platforms, Facebook, was found to be exposed to “terrorist content”.

Coimisiún na Meán published its decision on Monday, determining that Meta's service, in respect of Facebook, is exposed to terrorist content following notification of two or more final removal orders from EU authorities in the past 12 months.

It follows a decision last month which similarly found that services operated by Meta, in respect of Instagram, and by other social media companies TikTok and X are “exposed to terrorist content”.

Examples of terrorist content can include photos of crime scenes glorifying terrorist acts, or posts or comments which incite, solicit or advocate for the commission of terrorist offences.


Material that includes instructions on the making of weapons or explosives to commit or contribute to the commission of terrorist offences also constitutes terrorist content under the EU Terrorist Content Online Regulation.

The regulation provides an EU-wide mechanism for counteracting the dissemination of terrorist content online and enabling the “speedy removal” of terrorist content by hosting service providers.

Meta, which also owns Instagram, WhatsApp and Threads, will now be obliged to take specific measures to protect its services from being used for the dissemination of terrorist content and to report to Coimisiún na Meán on the action taken within three months.

“These measures shall be effective, targeted and proportionate and respectful of the fundamental rights of users.

“Among the measures a hosting service provider exposed to terrorist content is required to take is the inclusion in its terms and conditions of provisions to address the misuse of its service for the dissemination to the public of terrorist content,” Coimisiún na Meán’s statement reads.

The regulator added that it will supervise and assess the actions taken by Meta to mitigate terrorist content.

“Where An Coimisiún considers that the specific measures taken do not comply with legislative requirements, An Coimisiún will address a decision to the hosting service provider requiring it to take the necessary measures so as to ensure that legislative provisions are complied with,” it said.

In September, the regulator was designated the competent Irish authority under the EU Terrorist Content Online Regulation to impose penalties on hosting service providers who do not comply with their obligations.

Violations of the regulation by hosting service providers such as Meta can lead to penalties, including financial fines of up to four per cent of global turnover.

In issuing its determination on Monday, the regulator urged social media users to report suspected terrorist content to the respective social media platform and to report “any content which could be a threat to life to An Garda Síochána”.

Jack White is a reporter for The Irish Times