‘Global first’ ruling that online content contributed to 14-year-old Molly Russell’s death

London coroner said teenager ‘died from an act of self-harm while suffering depression and the negative effects of online content’

Molly Russell, 14, who died in 2017. Picture: Russell family/PA

A senior coroner’s conclusion that English schoolgirl Molly Russell died while suffering from the “negative effects of online content” has been hailed as the first of its type.

Andrew Walker said online material viewed by the teenager on sites such as Instagram and Pinterest “was not safe” and “should not have been available for a 14-year-old child to see”.

Andrew Burrows, head of child safety online policy at the children’s charity the NSPCC, said it was the “first time globally it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death”.

Molly’s father Ian Russell said he hoped the conclusion would be an “important step in bringing about much-needed change” and asked Meta chief Mark Zuckerberg to “just listen… and then do something about it”.


Welling up as he ended a press conference in Barnet, north London, on Friday, Mr Russell said, his voice breaking: “Thank you, Molly, for being my daughter. Thank you.”

The Prince of Wales, who met Mr Russell in November 2019, said on Twitter: “No parent should ever have to endure what Ian Russell and his family have been through.

“They have been so incredibly brave. Online safety for our children and young people needs to be a prerequisite, not an afterthought.”

Concluding it would not be “safe” to rule the cause of Molly’s death as suicide, Mr Walker said the teenager “died from an act of self-harm while suffering depression and the negative effects of online content”.

At North London Coroner’s Court, he said: “At the time that these sites were viewed by Molly, some of these sites were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see.

“The way that the platforms operated meant that Molly had access to images, video clips and text concerning or concerned with self-harm, suicide or that were otherwise negative or depressing in nature.

“The platform operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text – some of which were selected and provided without Molly requesting them.

“These binge periods, if involving this content, are likely to have had a negative effect on Molly.”

The inquest was told Molly accessed material from the “ghetto of the online world” before her death in November 2017, with her family arguing sites such as Pinterest and Instagram recommended accounts or posts that “promoted” suicide and self-harm.

Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family say “encouraged” suicide, were safe.

Pinterest’s Judson Hoffman told the inquest the site was “not safe” when Molly used it.

Ian Russell, the father of Molly Russell, speaks to media outside Barnet Coroners Court, north London, after the inquest into the death of the schoolgirl. Photograph: Joshua Bratt/PA

Speaking after the conclusion of the inquest, Mr Burrows said: “This is social media’s big tobacco moment.

“For the first time globally it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death. The world will be watching their response.”

Cross-bench peer and internet safety campaigner Baroness Beeban Kidron said she would bring forward “an amendment to the Online Safety Bill in the House of Lords that seeks to make it easier for bereaved parents to access information from social media companies”.

Digital Secretary Michelle Donelan said the inquest had “shown the horrific failure of social media platforms to put the welfare of children first.”

Shadow digital, culture, media and sport secretary Lucy Powell said it was a “scandal that just as the coroner is announcing that harmful social media content contributed to Molly Russell’s death, the Government is looking to water down the Online Safety Bill”.

Online safety campaigners at the NSPCC said the conclusion that Molly died while suffering from the “negative effects of online content” should “send shockwaves through Silicon Valley”.

Out of 16,300 posts that Molly saved, shared or liked on Instagram in the six months before her death, 2,100 were related to depression, self-harm or suicide, the inquest was told.

The court was played 17 clips the teenager viewed on the site – prompting “the greatest of warnings” from the coroner.

The inquest also heard details of emails sent to Molly by Pinterest, with headings such as “10 depression pins you might like” and “New ideas for you in depression”.

Continuing his conclusions, Mr Walker said: “Other content sought to isolate and discourage discussion with those who may have been able to help.

“In some cases, the content was particularly graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that could not be recovered from.

“The sites normalised her condition, focusing on a limited and irrational view without any counterbalance of normality.

“It is likely that the above material viewed by Molly, already suffering with a depressive illness and vulnerable due to her age, affected her mental health in a negative way and contributed to her death in a more than minimal way.”

The coroner said on Thursday that he intends to issue a Prevention of Future Deaths (PFD) notice, which will recommend action on how to stop a repeat of the Molly Russell case.

The Russell family’s lawyer, Oliver Sanders KC, asked the coroner to send the PFD to Instagram, Pinterest, media regulator Ofcom and the Department for Digital, Culture, Media and Sport.

A spokeswoman for Meta said in a statement following the conclusion that the company is “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers” and would “carefully consider the coroner’s full report when he provides it”. — PA

Samaritans’ free helpline is at 116123, or you can email jo@samaritans.ie or jo@samaritans.org; Pieta’s free helpline is at 1800-247247, or text help to 51444