Facebook must prioritise children’s wellbeing, Zuckerberg told

Letter from global alliance sent to founder in wake of revelations from Facebook whistleblower

Mark Zuckerberg is urged to take five steps to address concerns over Facebook’s approach to protecting children on its social media platforms. Photograph: Mandel Ngan via Getty Images

Facebook has lost the trust of parents, is prioritising commercial gain over children’s needs and must take steps to restore faith in its platforms, a global alliance of child protection campaigners and experts has warned Mark Zuckerberg.

The Facebook founder and chief executive is urged to publish the company’s internal assessments of the risks young people face on its services in a letter with 59 signatories, including the UK’s National Society for the Prevention of Cruelty to Children and the US-based Child Rescue Coalition.

“The company must do significantly better to regain the trust of parents and child protection professionals, and most importantly, to ensure its product decisions contribute to rather than compromise children’s safety and wellbeing,” said the letter.

Mr Zuckerberg is urged to take five steps to address concerns over the company’s approach to protecting children on its eponymous social media platform, its Instagram photo and video-sharing app and its WhatsApp messaging service. Those steps are:

- Share all its internal research on the impact its platforms have on children’s wellbeing.

- Set out what research has been conducted on how the company’s services contribute to child sexual abuse.

- Publish risk assessments of how its platforms affect children.

- Provide details of an internal reputational review of its products.

- Review the child protection implications of encrypted messaging.

The letter was sent to Mr Zuckerberg in the wake of revelations from Facebook whistleblower Frances Haugen, who has accused the company of a lax approach to safety in testimony to US senators, and whose document leaks formed the backbone of a series of damning articles in the Wall Street Journal.

Teen wellbeing claim

One of the most damaging leaks was internal Instagram research on the app’s impact on teen wellbeing, including one slide showing that 30 per cent of teenage girls felt Instagram made dissatisfaction with their bodies worse.

Echoing Haugen’s claim that the company puts profit before people, the group letter states: “We cannot continue with a situation in which children’s needs are or appear to be secondary to commercial motivations, and in which young people’s right to safety, privacy and wellbeing is traded off to prioritise the interests of adults and other more influential drivers.”

A spokesperson for Facebook said: “We’re committed to keeping young people who use our platform safe. We’ve spent $13 billion [€11.19bn] on safety in recent years – including developing tools to enhance the safety and wellbeing of young people across Facebook and Instagram. We’ve shared more information with researchers and academics than any other platform and we will find ways to allow external researchers more access to our data in a way that respects people’s privacy.”

The letter was sent as hearings resumed into the UK’s draft Online Safety Bill, which imposes a duty of care on social media companies to protect children from harmful content and to prevent the proliferation of illegal content such as child pornography. MPs and peers on the committee were told by Laura Edelson, a social media expert at New York University, that Facebook’s algorithms pushed vulnerable users towards more harmful content because those users found it so engaging.

“Attention is what these platforms sell,” said Ms Edelson. “They sell advertising. So this is their business model, to engage users, and they have to build algorithms to do that, and certain types of harmful content are just more engaging.”

Another witness, Guillaume Chaslot of the campaign group AlgoTransparency, said social media and video-sharing companies should be fined according to how widely damaging video content is viewed. Under such a regime, platforms would have an incentive to “react quickly” to dangerous content and ensure that their algorithms do not recommend unacceptable posts. – Guardian