Molly Russell: Sector supports coroner's recommendations to protect children online

Emily Harle
Wednesday, October 19, 2022

Digital safety experts have welcomed a report urging social media providers to better protect children online after a coroner ruled exposure to harmful content contributed to the death of a 14-year-old girl.

Coroner Andrew Walker ruled the 'negative effects' of social media contributed to Molly Russell's death. Picture: Leigh Day solicitors

Coroner Andrew Walker told an inquest, held at North London Coroner's Court, that unsafe online content contributed to Molly Russell's death "in a more than minimal way".

Recording a narrative verdict, he ruled that the 14-year-old "died from an act of self-harm while suffering depression and the negative effects of online content".

The inquest heard that, prior to her death in 2017, Molly was exposed to extensive harmful content across various online platforms, relating to depression, self-harm and suicide.

Hannah Ruschen, senior child safety online policy officer at the NSPCC, described Walker's decision as a “landmark ruling” for its acknowledgement of the dangers of online harms.

She welcomed a report by the coroner which lays out a series of recommendations to government around children's online safety as “another step towards ensuring no more children are exposed to the horrendous content experienced by Molly Russell on social media”.

Walker's prevention of future deaths report outlines the findings of the inquest and has been sent to Michelle Donelan, secretary of state for digital, culture, media and sport, and a number of social media companies - including Pinterest and Meta - which were used by Molly prior to her suicide.

Recommendations in the report urge the government and social media platforms to consider:

  • Reviewing the provision of internet platforms to children, in relation to harmful online content

  • Implementing separate platforms for children and adults

  • Age verification to access platforms

  • The role of social media algorithms in providing content

  • The use of advertising

  • Parental/carer control, including access to material viewed by their child

Walker also recommends that the government consider using an independent regulatory body to monitor online content across platforms, in relation to these recommendations.

The relevant parties must respond to the report within 56 days of receipt.

Responding to the report, a spokesman for The Molly Rose Foundation (MRF), a suicide prevention charity founded in Molly’s name by her family and friends, said: “The MRF supports the coroner’s view that an independent regulator with sufficient powers to protect children and where necessary take proportionate sanctions against those who have failed in their duty of care, must be set up.”

Ruschen said: "The Culture Secretary’s promise to double down on child protection should mean every social media site has to ensure children are not seeing harmful content. This should be overseen by a senior manager responsible for child safety who’s personally liable for gross failure, particularly when it involves the death of a child.”

Will Gardner, chief executive of online safety organisation Childnet, added: “The conclusions of the Molly Russell inquest are hugely significant, and absolutely underline the need for regulation in this space.

"Online platforms remain hugely popular with children and young people, and so at the same time as pushing for regulation we need to make sure that young people have the information and skills to look after themselves and others whilst online.”

He said the ruling also highlights the "importance of pushing forward the Online Safety Bill", which is currently passing through parliament.

The MRF has continued to push for the swift implementation of the bill, saying: “This must be an utmost priority; it doesn’t have to be perfect and it can be fine-tuned and strengthened as needed. In any case, legislation will need to be fleet of foot to keep up with the pace of tech change.”

However, some experts have questioned the bill's effectiveness, with David Wright, director of the UK Safer Internet Centre, saying “it doesn’t go far enough” and may be further weakened around protections for people who view harmful content.

He added: “The processes around ‘Impartial Dispute Resolution’ are currently being removed and this will take away the essential practice of giving victims the opportunity to seek independent recourse when it comes to the devastating impact of legal but harmful content.

“If we are to best protect our children online, then the kind of content, viewed by Molly and thousands of others, needs to be addressed, assessed and removed - with those who fail to do so being held to account.”
