
Office for National Statistics data for 2018 shows nine out of 10 households have internet access - up from 65 per cent a decade ago - and in homes with children, 100 per cent had accessed the internet in the past three months.
Ofcom's most recent Media Use and Attitudes Report, published in January, gives a snapshot into how children - from pre-school to mid-teens - are consuming media and helps to identify trends on the type of content they use and how they are accessing it.
Some of the more striking findings include:
- 52 per cent of three- to four-year-olds spend nearly nine hours a week online
- 42 per cent of five- to seven-year-olds have their own tablet computer
- 74 per cent of eight- to 11-year-olds play games online for 10 hours a week
- 71 per cent of 12- to 15-year-olds take their mobile phone to bed
The Ofcom report finds the proportion of 12- to 15-year-olds with a social media profile fell over the past year - from 74 per cent in 2017 to 69 per cent in 2018; and from 23 per cent to 18 per cent over the same period for eight- to 11-year-olds. However, there was a slight rise in the proportion of under-eights with a social media profile.
Girls are more likely than boys to use social media. The UK Household Longitudinal Survey shows that 17.4 per cent of girls said they spent more than three hours a day on social media in 2015/16, compared with 8.3 per cent of boys. The gap between the sexes of 9.1 percentage points is a third higher than the previous year, and more than double the difference in 2009/10.
A clear trend in recent years is the increased popularity of children watching online videos via websites such as YouTube. Separate research by Ofcom found that online streaming services tended to be used as a "time-filler" in much the same way that previous generations watched television. The report, Life on the small screen, highlighted how live streaming sites are particularly popular with boys, many of whom watch vlogs and livestreams linked to popular computer games (see research evidence).

Safeguarding concerns
The growth in online media consumption by children and the integration of smart technology into most aspects of their daily lives has led to rising concern about the risk this poses to their wellbeing. Much of the risk is linked to children being exploited by adults for financial, sexual or political gain. However, peer-on-peer abuse and the misuse of technology to share content are also major risk factors for children.
In terms of the scale of the threat, new research by charity Internet Matters found a growing gap between the majority of young people who are becoming increasingly digitally adept and a vulnerable minority whose online life puts them at greater risk of harm. The research found that some of the most vulnerable young people were at greatest risk, including children in care, those with special needs and/or disabilities, and those with mental health difficulties (see research evidence).
Ofcom's research shows that young people are generally conscious of the pros and cons of using social media - of 12- to 15-year-olds, 78 per cent said there is pressure to "look popular", while nine out of 10 said people are "mean to each other". Girls particularly said they felt pressure to project a "glamorous" image of themselves on platforms such as Snapchat. However, the vast majority of research respondents said using social media made them feel happy, while two thirds reported using it to give support to friends and organisations.
Despite this, research by The Children's Society found overwhelming support among young people for social media companies to do more to tackle cyberbullying. According to Ofcom, one in eight young people have been bullied on social media, although an evidence review of online risks published by the UK Centre on Child Internet Safety suggests the rate could be twice that. Cyberbullying can take many forms, including threats, teasing and spreading rumours, with potentially serious negative outcomes for children's wellbeing - evidence shows that children who spend more than three hours using social networking websites on a school day are twice as likely to report high or very high scores for mental ill-health.
In October, Health Secretary Matt Hancock commissioned England chief medical officer Dame Sally Davies to undertake a review into the impact that cyberbullying and other online abuse can have on children's mental health.
Another area of concern is children talking to strangers online. In the Ofcom study, 22 per cent of 12- to 15-year-olds said they had been contacted online by someone they didn't know, and one in 10 said they had seen something of a sexual nature that made them feel uncomfortable. A key risk factor, according to the Children in our Media Lives study, is that some young people are exposing themselves to contact from strangers by keeping their social media profiles public or allowing people to add them without knowing who they were. Charity Childline recorded 2,200 counselling sessions with young people where online sexual abuse was a factor in 2017/18. Meanwhile, the Internet Watch Foundation identified 78,000 websites containing child sexual abuse images in 2017.
How companies collect and store children's personal information has also been raised as a concern by parents. The issue was highlighted in a report by the Children's Commissioner for England last year. Who Knows What About Me? highlights how everyday devices used by children - from toys to smart speakers - can collect information about their owners, which is then stored by corporations. How tech companies harvest data from a child's online footprint, and how it is then used to target advertising at children, was one of the greatest concerns among parents in the Ofcom study.
Children, like other vulnerable groups, are at greater risk of having personal information - such as date of birth, address, images and bank details - stolen in phishing scams and used fraudulently. This was cited by 32 per cent of parents in the Ofcom research, and has prompted City of London police to develop an education programme for children about online fraud (see practice example).
How young people share information and content among themselves is another area of rising concern among parents. The capture and sharing of sexualised images among groups - known as sexting - is widespread. NSPCC research from 2016 suggests one in 10 young people had taken naked photos of themselves, with half of them sharing the images with others. Meanwhile, a third of young people said they had shared photos of other people, even though this is illegal and could lead to prosecution. Research carried out by charity Childnet found that a quarter of young people admitted sharing sexual images of peers (see practice example). Related concerns include the ease of accessing pornography online and the damage that can do to young people's self-esteem.
The internet is a very effective tool for terrorist groups to radicalise young people. This is a concern for 22 per cent of parents in the Ofcom research, and is high on the political agenda due to high-profile cases of children travelling to Syria to join Islamic State. The scale of the problem is unknown, although more than 2,000 children have been referred to the Home Office's de-radicalisation Prevent programme.
Key e-safety policies
The tragic case of 14-year-old Molly Russell, who took her life after being able to access images of self-harm and suicide on Instagram, has led to widespread calls for social media companies to do more to prevent inappropriate content being hosted on their platforms - and for the government to better regulate the internet giants. This was picked up by the Ofcom research, which found nearly a third of parents were concerned about this, up from one in five the year before.
It is an issue that is likely to be addressed in the Online Harms white paper, set for release later this year. In the meantime, the Children's Commissioner for England, Anne Longfield, has called for social media companies to sign up to a statutory duty of care - a legal obligation to prioritise the safety and wellbeing of children using their platforms - and fund a digital ombudsman to act as an independent arbiter between themselves and users.
The statutory duty of care would mean online providers would have a duty to "take all reasonable and proportionate care to protect children from any reasonably foreseeable harm". The definition of harmful content would cover a wide range of e-safety threats to children.
Longfield says the duty would "strengthen the incentives" for online providers to develop better systems that prevent children from being able to access adult content, with sanctions for failure to do so (see expert view, below). These would include a public admission of failure in addition to financial penalties.
A duty of care is also supported by Dame Sally Davies, the UK and England chief medical officer, who this month published findings from her government-commissioned review of the impact of social media use on children's mental health. Her review found evidence that using social media for more than five hours a day doubled the risk of depression. She and the UK's other chief medical officers have developed guidance that recommends parents restrict screen use to a maximum of two hours at a time, not allow phones in bedrooms or at the dinner table, and talk to their children about the risks of sharing information and images online.
The Online Harms white paper was announced as part of the government's response to the 2017 Internet Safety Strategy green paper. The response outlines details of the Digital Charter, which sets out the "norms and rules" of the online world to make it a safe space to explore and the government's principles to ensure this is delivered. Key areas covered by the principles include open access, the use of personal data, rights and responsibilities, and protections for vulnerable groups.
The green paper set out a raft of changes to tackle e-safety issues related to children, and proposed the creation of the UK Council for Internet Safety (UKCIS) to absorb the work of the UK Council for Child Internet Safety (UKCCIS). This went ahead, with the UKCIS appointing a new board in December.
The white paper outlines how separate policy initiatives across government are addressing e-safety issues, including the role of social media in youth street violence, creating a commission to counter online extremism and tackling children's use of online dating sites. It also sets out how it is supporting schools to tackle bullying online, particularly of vulnerable groups such as children with special educational needs and disabilities.
The revised Keeping Children Safe in Education guidance, which came into effect in September 2018, recognises the role the internet can play in child abuse and exploitation and sets out how education professionals should be trained in this. It particularly highlights the role the internet can play in sexual harassment by pupils, and the damaging impact this can have. This is likely to be a feature of statutory relationships and sex education (RSE) when it is introduced in September 2020. The government is set to publish its response to a consultation on the issue shortly.
Another policy development is the anticipated introduction of age verification for pornography websites from April 2019. Campaigners hope the introduction of age verification - which has been delayed by a year - will reduce the ease with which children and young people can access - either deliberately or unwittingly - adult sexual content.
Last year, the UKCCIS also produced its Education for a Connected World framework, which describes the knowledge and skills children and young people should have at different stages of their lives to be able to navigate the internet successfully and safely. It outlines how children use technology, the impact it has on their behaviour and what teachers can do to develop effective strategies for understanding and managing online risks.
In February, UKCIS produced online safety guidance for early years professionals reflecting the rising exposure that pre-school children have to online technology. As digital technology becomes ever-more integrated into how children socialise and learn from an early age, government agencies and policymakers will need to remain vigilant so that sufficient safeguards are in place and increased opportunities do not come with increased risks.
TECH GIANTS HAVE A DUTY OF CARE TO CHILDREN
By Anne Longfield, children's commissioner for England
Children tell me they love using the internet and social media, but they also sometimes worry about how to handle the very intense and overwhelming digital world. I have been calling for young people to be given the power, information and resilience they need to make informed choices about what they do online, and I have urged parents to engage with what their children are up to online. However, it is the policymakers and, crucially, the big tech companies, who now need to really step up to the plate.
Until recently, government has been cautious, the internet giants aloof, defensive and unwilling to take their share of responsibility for what appears on their platforms and for the fact that young children are often using their apps. These attitudes are starting to change.
For far too long we have allowed the tech giants to self-regulate. So I'm pleased that government ministers are now looking at new legislation to protect children online.
I have published my own proposals for what legislation could look like, including a 12-clause statutory duty of care law, drawn up with the help of the privacy law firm Schillings. It is a powerful yet simple proposal which would mean online providers like Facebook, Snapchat, Instagram and others would owe a duty to take all reasonable and proportionate care to protect children from any reasonably foreseeable harm. Harmful content would include images around bullying, harassment and abuse, discrimination or hate speech, violent or threatening behaviour, glorifying illegal or harmful activity, encouraging suicide or self-harm, and addiction or unhealthy body images.
We know that children are looking for this content and I believe the companies have the technological nous and responsibility to anticipate this behaviour and respond. If they fail, there should be sanctions, including fines and being made to admit publicly when they are failing to protect children, including posting information about the breach on their platform for all users to see. And a new regulator must do that - not the company's PR team.
I hope the government will look carefully at these proposals and introduce policy that has the same effect, quickly.
It is time we recognised that young people are one of the biggest users of tech and that those companies who play such a big part in their digital lives must now accept that with their power also comes responsibility.
ADCS VIEW
We must give children the tools to navigate online
By Stuart Gallimore, president, the Association of Directors of Children's Services
Social media, smartphones and the internet are ubiquitous in our everyday lives but it's increasingly difficult to ignore the impact they have on mental, emotional and physical health.
Our children are more digitally connected than ever before yet growing numbers report feeling isolated. New technologies can also be a gateway to bullying, self-harm, harassment or blackmail.
While the law struggles to keep up, our safeguarding responses to these risks are constantly evolving. As with child sexual exploitation several years ago, it is tempting to reach for a criminal justice response to online grooming, for example, or the distribution of indecent imagery. However, it's vital we recognise that young people involved in these activities can be both an abuser and a victim of exploitation themselves by unscrupulous adults.
The challenge for all of us is not just about getting better at spotting harm once it has occurred; a purely child protection approach isn't helpful. Instead, we need to empower young people with the knowledge and tools to navigate their life online in a way which protects and promotes their health, safety and wellbeing, and, if they're too young to act with agency, then we should educate parents on digital dangers.
Research in 2017 showed there is a high level of public support for tighter regulation of social media companies, particularly in relation to the negative impacts on children and young people. The public also felt these companies - and the government - were not doing enough. The introduction of a statutory code of practice for social media companies or the appointment of a digital ombudsman have both been floated as solutions. With parliament mired in Brexit planning, this doesn't seem likely any time soon.
Online safety will feature in the new statutory relationships and sex education (RSE) curriculum in all schools from 2020 onwards. Yet while the new RSE guidance is being developed, there remains little evidence on the consequences of exposure to social media over time, or on who is at risk of experiencing harm online and why.
Multiple studies show that young people think social media has a negative effect on their mental wellbeing, something that health professionals put down to increased feelings of inadequacy and anxiety.
All too often children are an afterthought - it's time we put their rights and interests ahead of those of corporations and their shareholders.
Online gaming risks: five key questions
By John Logan, clinical lead, The Edge
Q. What is the scale of online gaming among young people?
A. Data from Statista.com shows that 13.6 million people aged six to 64 play online games and around 12 million play app or "packaged" games. Of this group, 14 per cent are males and 11 per cent females aged 15 to 24.
Q. When does gaming become a problem?
A. If you're spending 12 to 14 hours a day gaming, your studies are suffering as a result, you hardly leave your room except to eat, and you spend all of your time in your bedroom either gaming or sleeping, then your life has become unmanageable. Relationships with family tend to suffer; gamers usually don't have many friends apart from the people they are gaming with, and the only communication they have is about gaming.
Q. What are the risks of excessive online gaming?
A. Gambling goes hand-in-hand with gaming; that's the biggest risk factor in terms of what happens online. Drug abuse is also a very big risk factor - gaming goes hand-in-hand with marijuana addiction, that's how people are able to stay gaming for long periods of time. The two mental health issues that tend to be very prevalent are anxiety and depression.
Q. What are the signs a young person has a problem?
A. Their physical health tends to deteriorate - weight fluctuation, poor nutrition and poor self-care where people don't shower or brush their teeth. They don't take care of themselves, or pay attention to their physical appearance.
A huge tell-tale sign is that schoolwork is neglected or ignored completely. Gamers tend to get angry if parents or carers try to confiscate their gaming device or turn off the Wi-Fi router in the house; that's when they can become really angry and break things.
Q. What can parents do to minimise risk?
A. First, restrict the amount of access young people, and children in particular, have to technology. I recommend that children and adolescents not be allowed to game for more than one hour a day.
Second, I would recommend no computer in the bedroom where there is no oversight or accountability. If you're going to allow a child or adolescent to use technology, they should use it in a communal area.
- The Edge runs a gaming addiction treatment programme
FURTHER READING
- Statutory Duty of Care for Online Service Providers, Children's Commissioner for England, February 2019
- Screen-Based Activities and Children and Young People's Mental Health and Psychosocial Wellbeing: A Systematic Map of Reviews, UK Chief Medical Officers' Commentary, February 2019
- Safeguarding Children and Protecting Professionals in Early Years Settings: Online Safety Considerations, UK Council for Internet Safety, February 2019
- Media Use and Attitudes Report, Ofcom, January 2019
- Keeping Children Safe in Education guidance, Department for Education, September 2018
- Internet access data, Office for National Statistics, August 2018
- Internet Safety Strategy, green paper and government response, Department for Digital, Culture, Media & Sport, May 2018
- Education for a Connected World framework, UK Council for Child Internet Safety, February 2018
- Digital Charter, Department for Digital, Culture, Media & Sport, January 2018