Study shows few effective age controls at social media firms

More than 1m underage users of Meta, Snap in Australia, report says, ahead of law banning such users

The watchdog’s research depicted a social media industry with few effective controls over who signs up to their services, despite the risks to children from harmful content and other users. Photograph: KIRILL KUDRYAVTSEV/AFP via Getty Images

Meta Platforms, TikTok, Snapchat and other leading social media platforms probably have more than 1 million underage users in Australia, a regulatory report said, highlighting the scale of policy failure at the companies before the country enforces unprecedented user age limits this year.

About 80% of children in Australia aged between 8 and 12 used at least one platform in 2024, research by the country’s eSafety Commissioner released Thursday showed. The findings indicate social media companies are allowing about 1.3 million kids nationwide to flout their own rules barring under-13s from their platforms, the regulator said.

The watchdog’s research depicted a social media industry with few effective controls over who signs up to their services, despite the risks to children from harmful content and other users. The report suggests the platforms are largely ill-prepared for controversial legislation, due to take effect by December, that will raise the minimum age for social media users in Australia to 16 years.

“All of these companies have a long way to go,” eSafety Commissioner Julie Inman Grant said in an interview. “Some of them are starting from a very low bar.”


The legislation means the number of social media users deemed underage in Australia will suddenly balloon. More than 1 million users aged between 13 and 15 are currently on the most popular platforms, according to the eSafety Commissioner. The regulator said it used its powers to make the platforms disclose the data.

Australia’s crackdown is part of a global effort to raise social media age limits or tighten oversight of online content. Inman Grant said the discussions she’s had with platforms, including Snap, about the upcoming law changes indicate they’re worried that other countries will follow Australia’s model. “They are concerned about the first domino,” she said.

“Understanding age is a complex challenge. We continue to invest in AI and other technologies to address this issue,” Meta, the owner of Facebook and Instagram, said in a statement. A TikTok spokesperson said the safety of its users was the company’s highest priority. Since 2023, TikTok has used age-detection tools to remove more than 1 million Australian users suspected of being under the age of 13, the spokesperson said. Snapchat-owner Snap had no immediate comment.

Under the new Australian law, digital platforms will themselves be responsible for enforcing the higher age limit, with penalties of as much as A$50 million ($32 million) for breaches. The legislation requires platforms to take “reasonable steps” to stop under-16s from signing up. YouTube and Discord are exempt from the ban.

Inman Grant said what’s considered a reasonable step hasn’t yet been determined. But she said it will have to go beyond a simple self-declaration of age by the user. Options include determining age through facial analysis, online behaviour or word choice, she said. She rejected claims by companies including Meta that Australia’s new law overlooks the reality of age-assurance technology.

“It’s not a matter of capability, resources or access to advanced technology,” she said. “Frankly, it’s been a form of wilful blindness. They haven’t been forced to implement more robust forms of age assurance.”

“Think about how some of these platforms can target you with deadly precision when it comes to advertising,” she said.

To be sure, Inman Grant said it’s inevitable that some under-16s will get around whatever new controls the platforms put in place, but the law will still offer more protection to vulnerable kids.

“The hope is that we can at least shield them from some of the more harmful and deceptive features,” she said. “With the manipulative algorithms, the dark patterns and the rabbit holes, it’s not really a fair fight, when you think about it.” – Bloomberg