Karlin Lillington: Long past time for closer scrutiny of TikTok

Youth-focused social media platform has admitted allowing access to user data from China, and there are fears about how it handles the data of minors

The large size of TikTok’s young user base is the reason why it should have been closely scrutinised from the start. Photograph: Dan Kitwood/Getty Images

We’ve needed to talk about TikTok for a long time, well before governments – including India, Canada, the US, and the European Commission, council and parliament – decided the app should be banned from work devices.

At the moment, it can feel as if every other news bulletin brings some new woe for the app owned by Chinese-headquartered company ByteDance, from government bans or discussions of bans to two investigations by Ireland’s Data Protection Commission (DPC).

Why all the current worry over an app most people associate with fun and frivolity – Covid lockdown dance challenges, influencers influencing, or comedy skits? The concern is that TikTok might be surveilling individuals, gathering user data, and sending it back to China.

TikTok had routinely dismissed such fears, noting, for example, that US data couldn’t move beyond its US and Singapore data centres. It became particularly indignant when Buzzfeed published a story last June alleging Chinese employees had accessed US user data.

In September testimony before the US Congress, a senior executive denied the allegations, but by Christmas the company had admitted that US and Chinese employees had accessed the data of journalists from Buzzfeed and the Financial Times, plus some of their contacts, in an attempt to identify staff sharing information with reporters.

Meanwhile, the company’s Irish-based European headquarters acknowledged in November that some EU citizens’ data is sent to China (information tucked quietly into a statement with an alphabetised list of the countries handling EU data).

The DPC is now looking into this, with a draft decision expected this year. It is also looking into how TikTok handles the data of EU minors. TikTok has already been fined $5.7 million by the US Federal Trade Commission for illegally collecting children’s data, and has faced fines and investigations in other countries over related concerns about minors. On Wednesday it laid out new measures to protect users’ data in Europe, as it attempts to address these growing security concerns.

Then there’s the opacity of TikTok’s algorithm which, from the second a user signs up, starts to populate that user’s feed with viewing recommendations. This has been described as TikTok’s “secret sauce”: an instantaneous content response to, apparently, the tiniest of user cues. But as studies are finding, the algorithm also seems able to pile on alarming self-harm, suicide and eating-disorder videos. TikTok has said it takes all these concerns seriously and has teams to spot and remove dangerous content.

The app remains, overwhelmingly, one used by the more vulnerable young. Two-thirds of US teens use it, 40 per cent of global users are under 24, and 75 per cent are under 34. Usage falls steeply after 34.

I’ve read suggestions that the current focus on TikTok is an unfair overreaction, in particular because it is such an important communications platform for younger people. Why weren’t other platforms being scrutinised to the same degree?

But hang on. Over the past half decade, as other US-originating platforms came, rightly, under growing scrutiny, TikTok mostly skated below serious political, regulatory or media focus. The early, big-headline congressional inquiries – the ones you can probably recall – featured CEOs and senior executives of companies like Facebook, Alphabet, Amazon, and Twitter, not TikTok. That has only gradually changed.

And the large size of TikTok’s young user base is precisely why it should have been closely scrutinised from the start: for improper registrations (children under 13 are not supposed to be able to register), for the collection of personal information from minors, for any Chinese access to such sensitive data, and for the content served to minors and the potential harms that content can cause.

Attention should have swung to the platform as TikTok’s growth, especially among the young, exploded from 2018 onwards (the new EU Digital Services Act now requires companies to report user numbers regularly, helpfully enabling hockey-stick growth to be flagged to regulators early on).

The snowballing focus on this company is long overdue and only misplaced if it doesn’t have broader outcomes.

It should remind us that all commercial platforms, everywhere, make money by intensively gathering and monetising personal information, including from minors and from people who aren’t even registered users. To do this, platforms boost the content that encourages the most engagement (hence promoting, even if arbitrarily, the sensational and the vile). Automated moderation will not effectively filter this fire hose of worrisome content.

We’ve had so many examples of social media platforms inappropriately and even illegally accessing user data that stopping this should by now be a regulatory priority. And we now have another example of why no one should believe any company when it declares that data cannot possibly be accessed across supposedly technologically closed borders. This is a particular concern with companies based in countries with autocratic governments.

The world badly needs to heed the bigger-picture warnings from security experts: we should not be so engaged with, or dependent on, components, devices, software and apps from countries that may have an interest in surveilling citizens, human rights activists and governments, anywhere. Such critical cautions need to go TikTok viral.