Una Mullally: Facebook experts at shrugging off culpability

Firm does not even directly employ these low-paid workers in their distressing roles

Mark Zuckerberg testifies remotely during the Senate judiciary committee in Washington. Photograph: Bill Clark/Pool/AFP via Getty

Last week, more than 200 Facebook moderators, including 114 working in Dublin, published an open letter addressed to Mark Zuckerberg and Sheryl Sandberg of Facebook, CPL/Covalen chief executive Anne Heraty, and Accenture chief executive Julie Sweet.

In this letter, Facebook content moderators wrote to “express our dismay at your decision to risk our lives – and the lives of our colleagues and loved ones – to maintain Facebook’s profits during the pandemic . . . Before the pandemic, content moderation was easily Facebook’s most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support. Now, on top of work that is psychologically toxic, holding on to the job means walking into a hot zone.”

Moderators review and remove content that violates Facebook’s policies. In that work, they see countless distressing images, videos and statements every day, including extreme violence and child abuse. These moderators sit at screens in offices around the world, earning little, protecting people on social media from the worst “content” imaginable.

In order to do that, they have to see it. An ex-Facebook moderator I spoke to recently began our conversation with a memory from the job interview: “One of my interview questions was: ‘Okay, so you’ve come into work, you’ve grabbed your cup of coffee, you sit down at your desk, and the first thing you see is a baby being raped. How do you cope?’ ”

In addition to the obvious stresses of this work, the pandemic has created further challenges. Though content moderators were initially sent home to work, some have since been ordered back to offices in Dublin during the Level 5 lockdown, including an office in Sandyford where cases of Covid-19 have been reported (Facebook said last week that it has “exceeded health guidance on keeping facilities safe for any in-office work”).

Perhaps the pandemic is now offering another of its reveals: shining a light on the working conditions and rights of content moderators, and highlighting the way they are hired – working on Facebook’s platform, but subcontracted by companies such as CPL and Accenture.

Legal actions

Facebook is facing multiple legal cases from moderators. Earlier this year the company reached a €48 million settlement with lawyers representing about 10,000 former and current moderators in the US, and 30 content moderators are now planning legal action in the Irish courts. Sinn Féin TD Louise O’Reilly raised the issue in the Dáil on November 10th.

“I want to ask if the Ministers are aware of this new form of work,” she said, “and the fact that the people engaged in it are subject to serious psychological damage and injury due to being exposed to explicit content during the course of their everyday work, and if the Tánaiste will engage with those workers, and, indeed, with their employers, to address this situation, because in some instances, as we now know, this is causing post-traumatic stress disorder, among other psychological damage, to many of those workers exposed to this type of content.”

When the Government pats itself on the back about new “tech jobs” or tech companies expanding their “operations” here, there’s rarely any discussion about what those jobs actually are.

We know that in 2019, for example, the average salary of a staff member at Facebook in Dublin was about €161,000 (there is a large disparity in earnings across roles in the company, but the average still tells us something). Yet a large number of people working for Facebook in Ireland are not directly employed by the company but subcontracted; they work in far less salubrious conditions than the shiny Grand Canal Dock offices, earn far less, and do extraordinarily distressing work.

The average pay for someone working in “content review” for Facebook is about €25,000 annually. There are about 15,000 people working as Facebook moderators globally, all of them low-paid relative to other salaries in their industry. Dublin is a significant site for this work.

Distorting language

If this work is so vital, why isn’t it remunerated as such? If this work is so core to Facebook’s “mission”, why aren’t people directly employed by Facebook? If the working conditions are ideal, why are moderators suing? If the unusual pressures of the job are being dealt with adequately, why are people developing PTSD?

Some job notices posted by Accenture call for “a level of resilience and maturity” while also saying “recent graduates welcome”. Is this a job you’d want your child, fresh out of college, to be doing?

Facebook tends to respond to questions about, or exposés of, its endless dubious practices with the disposition and vocabulary of a child tasked with coming up with its own scolding. It says things such as: “We are committed to getting this right.”

The company has become expert at manufacturing a language that shrugs off culpability – a language so bland and meaningless that it attempts to absolve Facebook of its sins while contemplating the potential existence of solutions or action only in the abstract.

In the past, jargon designed to hide awful things sought to invent impenetrable terms as a disguise: “collateral damage”, “extraordinary rendition”, “direct provision”. Yet Facebook pulls an even greater trick. It steals from us words that used to mean something – “community”, “values” – and depletes their meaning in ways that are as insidious as they are blatant. For what could be so rotten about a company that professes to want to “bring the world closer together”?