Facebook guidelines offer disturbing insight into moderation policy

Social media giant says videos of physical abuse and self-harm can be permissible

The publication by the Guardian of Facebook’s internal guidelines for dealing with offensive content shines a light into what the world’s largest social media network really thinks its responsibilities are to its public. The picture is not particularly reassuring.

Despite what some might say, Facebook and other social networks can never realistically be expected to vet every piece of content before it appears on their platforms.

A requirement to do so would effectively shut all such services down permanently. But they do have a responsibility to monitor and remove harmful material.

Like most digital platforms – including the reader comment facility on The Irish Times website – Facebook relies on users to report inappropriate content.


Community standards

Moderators then review reported posts and remove them if they fall foul of Facebook’s community standards.

These standards commit the company, in broad and rather vague terms, to removing certain types of material, including hate speech and violent or graphic material. But what content should be removed, how quickly should that happen, and how transparent should the process be?

The documentation seen by the Guardian gives more detail, and many will find parts of it disturbing.

For example, the guidelines state that remarks such as “Someone shoot Trump” should be deleted, because a head of state is in a “protected category”. But it may be permissible to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die” because these are not regarded as “credible threats”.

Sadistic or celebratory

Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless they are deemed to be sadistic or celebratory. Videos of abortions are allowed, as long as there is no nudity. And Facebook permits people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”.

The publication of these minimalist and often confusing guidelines will add to the swelling chorus of voices calling for Facebook to be held more accountable for the content it hosts.

Not all such calls are free from self-interest; the news media, in particular, has relished the opportunities which controversies such as the recent livestreaming of murders in the US and Thailand have presented.

Editorial standards

News organisations see a chance of regaining lost ground from a competitor which, they believe, is profiting unfairly from their content without being held to the same legal and editorial standards.

Such criticisms are not unreasonable. Facebook’s assertions that it is simply a content-neutral technology company, and that responsibility for what gets posted on the platform rests primarily with its users, are looking increasingly frayed.

With two billion users worldwide, Facebook is now the most powerful media organisation on the planet, and with that enormous power comes greater responsibility.

Publishers and broadcasters in every country are bound by longstanding legislation covering defamation, threats, intimidation and the right to privacy. Meeting these standards is an expensive, time-consuming and labour-intensive business which Facebook is understandably reluctant to take on.

Media responsibilities

However, over the last couple of years, slowly and grudgingly, the company has been moving towards accepting that, even if it is not a publisher in the traditional sense of the word, it is indeed a media organisation with the responsibilities that implies.

The shift is largely due to pressure from countries such as Germany, where members of the Bundestag have proposed new legislation which would impose seven-figure fines for unacceptable content if it is not swiftly removed.

This is where the Guardian's revelations are so telling. They suggest the company is still struggling to take its responsibilities seriously. Facebook currently has 4,500 content moderators. Three weeks ago Mark Zuckerberg announced it was adding a further 3,000.

Those numbers might look significant but, set against the platform’s massive user base, they are less impressive. According to a leaked document, moderators had to assess nearly 54,000 potential cases of revenge pornography alone in a single month.

Most moderators apparently work for subcontractors, with Facebook refusing to disclose their exact number or locations. Moderators get two weeks' training, with guidelines drafted by Facebook executives based at the company headquarters in California.

In addition to human moderators, Facebook uses a number of technological tools to filter or block violent or sexually explicit content. It regularly maintains that future solutions to problematic material are more likely to come from developments in artificial intelligence than from hiring more employees.

Impatient legislators

This is unlikely to prove sufficient to satisfy increasingly impatient legislators. In the UK, a House of Commons inquiry, set up following the murder last year of Labour MP Jo Cox by a far-right gunman, was highly critical of multinational social networks which it said prioritised commercial objectives over public safety.

“Social media companies currently face almost no penalties for failing to remove illegal content,” the MPs wrote.

“We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”

Technology companies routinely use the utopian vocabulary of the early internet to justify their actions. Mark Zuckerberg, for example, speaks frequently of the importance of “openness” and “community”.

But Facebook is one of the world’s most profitable companies and is notoriously secretive about the algorithms and commercial strategies it deploys.

The hundreds of millions of people it monetises every day through targeted advertising may not form a real community in any meaningful sense of the word, but they are citizens who deserve the protection of the law.

Hugh Linehan is Culture Editor