Youth workers instrumental in building children’s 'digital resilience', study finds

Emily Harle
Tuesday, October 4, 2022

Youth workers must work with education professionals, parents and carers to help children become aware of and deal with the impact of online risks, new research finds.

Supporting children to build resilience online may be more beneficial than protecting them from harmful content, research suggests. Picture: Adobe Stock

A study by the University of East Anglia suggests that a collective focus by youth services, schools and families on supporting children to build "digital resilience" may be more beneficial than trying to shield them from online harms altogether as they navigate the digital world.

The research - conducted as part of a UK Research and Innovation-funded initiative supporting work that promotes young people’s mental health online - also finds that local and national government, policymakers and internet corporations, including social media providers, must play a role in boosting this skill set among young people.

"Digital resillience" is described by researchers as "the ability to recognise and recover from online risks, such as inappropriate content or online bullying".

Dr Simon Hammond, lead author of the study, notes that current UK Council for Internet Safety (UKCIS) guidance discusses "digital resilience" at an individual level, placing the onus on the child to avoid harmful content.

This risks “marginalising how home, community and societies support children to learn how to navigate and grow from risky online experiences”.

Dr Kimberley Bartholomew, also from the University of East Anglia, added that the findings could “shape new ways of teaching which promote controlled exposure to risky opportunities”.

This could be more productive in building digital resilience than simply trying to avoid risky situations online altogether, she adds.

The research was conducted using 10 focus groups across England, involving 59 children aged between eight and 12, with support from six 16- to 17-year-olds who helped to collect and analyse the views of the younger children.

The research also involved interviews with parents and teachers, as well as internet safety experts from across the UK.

It comes amid sector concerns that the Online Safety Bill – currently making its way through parliament – will not provide enough protection for young people.

In a new report on online harms, Dame Rachel de Souza, children’s commissioner for England, states that she is “simply not satisfied that enough is being done to keep children safe online”.

De Souza's report, based on a national survey of 2,005 children and parents regarding online safety, raises concerns over a wide range of inappropriate content available to children, including “sexualised and violent imagery, anonymous trolling, and material promoting suicide, self-harm and eating disorders”.

It finds that 45 per cent of children aged eight to 17 have been upset or worried by inappropriate content online, and that boys were more likely than girls to have been exposed to harmful content.

It also highlights a link between disadvantage and online harm, revealing that more than half of children eligible for free school meals have been exposed to harmful content online, making them 14 per cent more at risk than peers who do not receive free meals.

The survey further reveals that just half of children who were exposed to harmful content online reported it, with 40 per cent of those who did not saying they felt there was "no point" in doing so.

De Souza said it was vital that children’s views are central to the Online Safety Bill, adding that it is her duty to “ensure that children’s voices underpin each stage of the legislative process, as well as inform Ofcom’s work in drafting the Codes of Practice which will define their regulation”.

The publication of the children's commissioner's report coincided with a ruling by coroner Andrew Walker that the "negative effects of online content" were a factor in the death of a girl who took her own life.

Molly Russell died in November 2017, aged 14, after viewing thousands of images of self-harm and suicide online.

Walker told the inquest at Barnet Coroner's Court in north London, which concluded on 30 September, that he will compile a report based on his findings and write to social media company Meta, which is responsible for Instagram, and to Pinterest, as well as the government and media regulator Ofcom.
