TikTok fined £12.7m for misusing UK children’s data

Censure from British watchdog comes amid global concerns over conduct of Chinese-owned social media app

The UK’s data watchdog has fined TikTok £12.7 million (€14.5 million) for breaking the law on the protection of children’s data, amid mounting global concern about the Chinese-owned social media app.

The Information Commissioner’s Office (ICO) on Tuesday said it estimated that up to 1.4 million UK children aged under 13 had used the viral-video app in 2020, even though TikTok’s own rules forbid children younger than 13 from creating accounts.

The platform had failed to gain parental consent to use children’s data, contravening UK data protection laws, the regulator said.

“TikTok should have known better, TikTok should have done better,” said John Edwards, the UK information commissioner.

TikTok has the right to appeal the fine within 28 days. “We will continue to review the decision and are considering next steps,” the company said.

The fine, the first the ICO has issued over under-13s’ access to an online service, comes as TikTok faces a regulatory onslaught from governments around the world.

TikTok’s chief executive was grilled by US legislators last month as the social media app attempted to head off a potential US ban over national security fears linked to its Chinese ownership.

A number of jurisdictions, including the UK, EU, Canada and the US, have banned TikTok from government devices.

In response to mounting pressure, TikTok last month laid out new measures to protect users’ data in Europe. It will open two data centres in Dublin and a third in Norway to store videos, messages and personal information generated by 150 million European users of the platform.

The ICO investigation found that TikTok “did not respond adequately” when a concern was raised internally with senior employees about children under 13 using the platform.

TikTok said it had taken steps to prevent children under 13 from accessing its platform. It also publishes information on how many accounts are linked to users it suspects are under 13.

The social media app said it removed more than 17 million such accounts in the last three months of 2022.

“We invest heavily to help keep under-13s off the platform and our 40,000-strong safety team works around the clock to help keep the platform safe for our community,” said TikTok.

The UK government is set to introduce its Online Safety Bill, which will bring in tougher requirements for social media companies and could lead to jail sentences for tech executives who fail to protect children online.

The ICO’s investigation into TikTok, which covered activity from May 2018 to July 2020, concluded before the regulator introduced stricter requirements around the processing of children’s data.

TikTok’s fine was reduced from the £27 million penalty proposed in September, after the regulator dropped findings relating to the processing of “special category data”, such as biometric data.

Baroness Beeban Kidron, who founded children’s privacy charity 5Rights Foundation, said the tech sector should “accept the principle of delivering products and services with basic safety built in by design”. “The future of tech will not be built on the back of children’s anxiety, inappropriate content and dangerous activities – but will be an accountable and regulated sector that prioritises the impact on children over profits,” she added. – Copyright The Financial Times Limited 2023