Logan Paul and YouTubers like him are not just Google’s responsibility

The conversation about regulation shouldn’t be silenced by promises of greater monitoring

YouTube chief executive Susan Wojcicki: ‘Too many of you unfortunately had to learn a new word in 2017.’ Photograph: Justin Sullivan/Getty Images

When the historians of the internet look back upon 2018, Logan Paul may be considered a mere footnote in the broader, grimmer clash between the tech giants on one side and, on the other, the people baffled by how those giants have eluded regulation.

But he will likely be considered a watershed in how YouTube-owner Google deals with controversies concerning the content on its platform. It has taken its fingers firmly out of its ears. Perhaps even more importantly, it has been seen to do so.

To be familiar with the existence and work of Logan Paul, a video blogger popular with children, is to know two things: that he is awful, and that he can’t be the worst of what’s out there. His prominence on YouTube, and the unlikely wealth that springs from it, means it is his tasteless misdemeanours, and not the next idiot’s, that have attracted censure.

On New Year’s Eve, Paul (22) uploaded a video, filmed in Japan’s “suicide forest”, in which he posed near a dead body. The ensuing controversy saw this self-styled “polarising dude” take a break from vlogging “to reflect” for all of two minutes. Last week, he took to Tasering dead rats.

It is tempting to conclude “this man-child is out of control” and hope that some responsible adults can get through to him in private before his unhappy public showcase gets nastier and nastier, like the kind of film that starts with a game of truth or dare but ends in full-blown torture-porn.

But before moving on to the next unedifying story, consider this: Paul is part of the massive commercial network that is Google. It may not have given birth to him, but it adopted him. Like PewDiePie, a previous headline-maker whose videos featured anti-Semitic messages, Paul is Google’s child. And it has a responsibility.

After the dead body video, YouTube removed him from the Google Preferred programme, which sells premium advertising on behalf of the top 5 per cent of YouTubers. This time, it has temporarily suspended all advertising on his channels, citing a damaging “pattern of behaviour”. He can still upload his messages to the world. He just can’t make direct ad cash on the back of them – not on YouTube anyway. So having first told Paul to go stand in the corner of the room, Google has now taken away his pocket money.

‘Demonetisation’

The tone of a recent blog post from YouTube chief executive Susan Wojcicki was that of a weary teacher giving wayward charges one last chance. “Too many of you unfortunately had to learn a new word in 2017: ‘demonetisation’,” Wojcicki wrote to YouTubers, some of whom had been kicking off about their videos being unfairly flagged by Google’s algorithm as unsuitable for advertising. (They call this their “adpocalypse”.)

Wojcicki said YouTube would make its appeals system faster and that a more accurate system would include a “more human review”. Any “egregious” behaviour by errant vloggers, meanwhile, would “lead to consequences”. They wouldn’t be allowed to spoil things for the rest of the class.

It would be wrong to assume that just because Wojcicki’s post was addressed to “YouTube Creators”, they were its only audience. Every statement that Google or Facebook or any other tech platform makes about monitoring, flagging or reviewing content, or content policies, has the same subtext: don’t even think about regulating us.

When Google announced its army of human monitors would reach 10,000 this year, it was partly a response to last year’s desertion by advertisers (finally spooked by the sight of their brands appearing next to some of the most hate-filled extremism imaginable), and partly a way to damp down criticism from politicians belatedly waking up to the issue.

Google, in this conversation, is less of a parent figure and more like the child promising to change its own behaviour. Gone is the see-no-evil, hear-no-evil assertion that it cannot possibly monitor the hundreds of hours of video uploaded to YouTube every minute. In its place is the promise to do better next time, and to catch out more of those who violate its policies.

But should the content policies set by YouTube, or any other tech giant, really be the only ones in play here? No, they shouldn’t. These platforms may talk to many admirable third-party organisations, such as those that make it their mission to protect civil rights or prevent suicide, but this is not a substitute for independent oversight.

It is an undeniable anomaly that the licensed broadcasters that commission content are subject to thorough, paternalistic regulation (with special provisions governing content consumed by children), whereas a platform such as YouTube, making editorial decisions after the fact, can rule its own global kingdom. If YouTube doesn’t agree that a piece of content is harmful, the only option for complainants is to target advertisers or try to register objections through social media. Such pressure may work on occasion, but it’s a queasy fallback.

That dead rat is going to cost Logan Paul. Worryingly, he has a brother who is also a YouTuber in the prankster mould. Becoming a YouTuber is now a career ambition for kids. To make it a sounder one, Google has to do more, and society needs to accept its responsibility too.