
New online safety code in the crosshairs as Elon Musk’s X takes media regulator to the High Court

Big tech fears more stringent rules being applied to their online platforms under a series of domestic and EU laws

Under EU law, the Irish State is obliged to regulate video-sharing platforms and streaming services established out of Dublin for the whole of Europe. The new code means platforms such as Elon Musk's X (formerly Twitter) are obliged to comply or face fines of up to €20 million, or 10 per cent of a platform's annual turnover, whichever is greater. Illustration: Paul Scott

In September, Taoiseach Simon Harris had a message for social media companies. The world of self-regulation was changing, he said.

“And that’s not just a slogan,” he added.

The previous day, Harris had been briefed by regulators and the Garda on efforts to improve online safety.

At the end of October, the new era, forecast by the Taoiseach, arrived when media regulator Coimisiún na Meán published its new online safety code. This set out binding rules for video-sharing platforms based in Ireland.


Minister for Media Catherine Martin said the code represented “a big step forward in online safety” that would “make all of us, but particularly our children, safer online”.

She said the rules would introduce “real accountability” for online video-sharing platforms and require them “to take action to protect those that use their platforms, including by having robust complaints-handling procedures and introducing effective age-verification”.

The online safety code brings the State in line with the EU’s Audiovisual Media Services Directive (AVMSD).

Under the EU law, the Irish State is obliged to regulate video-sharing platforms and streaming services established out of Dublin for the whole of Europe. Not only must the State enshrine the directive in Irish law, it must also set up a new regulator to oversee that it is obeyed and enforced.


The new code meant platforms such as Facebook, Instagram, YouTube, TikTok, LinkedIn, X (formerly Twitter), Pinterest, Tumblr and Reddit would be obliged to comply or face fines of up to €20 million, or 10 per cent of a platform’s annual turnover, whichever was greater.
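The "whichever is greater" formula means the cap scales with a platform's size rather than stopping at €20 million. A minimal sketch of the calculation (the turnover figures below are hypothetical, purely for illustration):

```python
def max_fine(annual_turnover_eur: float) -> float:
    """Maximum fine under the code: the greater of a flat
    €20 million or 10 per cent of annual turnover."""
    return max(20_000_000, 0.10 * annual_turnover_eur)

# A hypothetical small platform (€50m turnover) faces the flat cap:
print(max_fine(50_000_000))      # 20000000 (€20m)

# A hypothetical large platform (€1bn turnover) faces the percentage cap:
print(max_fine(1_000_000_000))   # 100000000.0 (€100m)
```

For the largest platforms, the 10 per cent figure dominates, which is why the turnover-based limb matters most in practice.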

But this is not the only set of rules in place.

The EU’s Digital Services Act (DSA), which aims to prevent illegal and harmful activities online and the spread of disinformation, was applied across member states from February 2024.

Then there is the Digital Markets Act, which legal experts said introduced additional requirements for entities that had very significant power in the market.

How these measures interact or overlap is a concern to some companies.

Harris said the new online safety code would not come as a surprise to the tech companies. “They’ve all been following this, I’m sure, very closely over a significant period of time,” he said.

The companies had indeed been monitoring developments, but not all were happy.

On Monday, tech giant X is scheduled to begin a legal challenge to the new online safety rules. A month ago, Twitter International Unlimited Company (TIUC), the company behind the social network, brought judicial review proceedings against Coimisiún na Meán. Twitter was renamed as X in July 2023 following its acquisition by Elon Musk, one of the richest men in the world and a close ally and supporter of incoming US president Donald Trump.

On Monday it is set to apply for leave to seek judicial review, essentially the opening stage of a legal challenge, at which it is expected to ask the courts to quash aspects of the new code.

In August it had warned the regulator it was reserving its rights to “challenge the lawfulness of the code”.

In a submission to a consultation process run by Coimisiún na Meán, the company raised concerns about complications arising from the potentially overlapping nature of the EU laws that regulate it.

X said it strongly supported “the co-regulatory approach encouraged by the AVMSD, to achieve protection of all users, including children and young people, from harmful online content”.

But it cautioned that it was “important” that the transposition of the EU directive does not impose obligations that “go beyond what is required by the AVMSD and which potentially conflict with the Digital Services Act.”

The details of X’s case have not yet been revealed.

Some observers believe it may argue points similar to those made by lobbyists for social media giants: that the code included overly prescriptive obligations to moderate illegal and harmful online content.

However, X was not the only tech company with concerns.

Lobby group Technology Ireland, a division of the employers’ group Ibec which represents companies in the ICT, digital and software sectors, told Coimisiún na Meán that its members were “very concerned” that the rules cut across the Digital Services Act and “apply an overly prescriptive rather than outcome-based approach”.

It argued that aspects of the code were “disproportionate” to its objectives and “fail to recognise evolving risks and solutions”.


In adopting the AVMSD and DSA, EU politicians “sought to strike a practicable and proportionate balance” between protecting EU citizens online and upholding their fundamental rights.

“The application of the code must not disturb that careful balance or conflict with requirements or terms set out in the DSA and AVMSD,” Technology Ireland said.

The judicial review by X will not be the first legal action faced by the regulator over the online safety code.

In June, the High Court dismissed separate challenges by Reddit and Tumblr against Coimisiún na Meán’s decisions to include them on a list of video-sharing platforms to be regulated under the code.

Both companies claimed they should not have been included, and the decisions to do so would severely and adversely affect their operations if allowed to stand.

In June, the High Court dismissed separate challenges by Reddit and Tumblr against Coimisiún na Meán’s decisions to include them on a list of video-sharing platforms to be regulated under the online safety code. Photograph: Amy Lombard/New York Times

Dr TJ McIntyre, associate professor at the school of law at UCD, said the tech sector was facing significantly more regulation that was largely being driven at EU level.

There had been “huge resistance at the EU level in terms of lobbying in relation to this”, he said.

“The nature of the change is that we have gone from having a situation where controls on the various platforms were largely a matter of national law and it was largely done in an after-the-fact sort of way to a situation where it is now a matter of EU law,” he said.

This was “being done in a way that imposes obligations on how they design their services and how they structure their services”, he added.

Previously, where racist content, denial of the Holocaust or defamation was identified online, internet companies were immune from liability if they removed content “promptly” once notified, said Dr McIntyre, and national law would determine whether they faced any further liability.

“Germany, for example, had rules regarding how firms had to handle certain types of complaints and procedures that had to be put in place and reporting they had to do,” he said.

Ireland had nothing comparable, said Dr McIntyre.

In most countries, and particularly in Ireland, regulation focused on how companies responded to a complaint after the fact, without requiring them to build in any proactive designs or safeguards, he said.

“That has now changed dramatically. The position under the AVMSD, which is what was involved in [drawing up] the online safety code and also in relation to the Digital Services Act, is one where regulators can proactively begin to impose obligations as to how firms run their business,” he said.


While some tech companies may feel that legislation and regulations are pushing them too hard, others argue they have not gone far enough.

Dr Johnny Ryan, a director of Enforce, a unit of the Irish Council for Civil Liberties, who previously held senior roles in online advertising, media and technology, said there were two main problems in the online world: the publication of illegal content and the amplification of extreme views for profit.

Where illegal material was published, it must be unpublished and regulators such as Coimisiún na Meán may help there, he said.

The “real problem”, he said, was polarising material being intentionally selected and pushed to people whom the tech companies’ algorithms knew would be incensed, or somehow engaged, by it.

This was not an issue the regulator could fix, and the online safety code would not help either, he added.

“It could have helped. In the draft, they had a supplementary section that had a kind of a soft ban on those algorithms being on by default,” he said.

This was subsequently removed.

While tech companies and civil liberties groups differ on the appropriate measures to protect EU citizens online, it may ultimately be left to the courts to decide.