
The bill is of significant interest to councils, covering a wide range of issues, from child protection and public health to abuse, intimidation and free speech. Given the wide-ranging nature of the bill and the significant role the internet plays in most people's lives, further issues of importance to councils are likely to emerge.
Key proposals include:
- The introduction of duties of care for some user-to-user services and search engines, as well as duties on providers to protect users' rights to freedom of expression and privacy.
- A duty of care on all regulated services in relation to illegal content and, where a service is deemed accessible by children, a duty to protect them from harm. Regulated services will be grouped into three categories based on their size and functionality, with different duties applying to each category.
- A definition of harm and of how harmful content will be prioritised for child and adult users. However, local government leaders have called for the government to set out the proposed categories of content harmful to children as soon as possible.
- The creation of four new criminal offences: a harm-based communication offence, a false communication offence, a threatening communication offence and a cyber-flashing offence.
- More regulatory powers for Ofcom, which will also be responsible for drafting codes of practice for all duties and for ensuring services have the systems in place to adhere to them.
- A new user empowerment and verification duty that will enable users to control what content and which users they interact with.
First published on 17 March 2022, the bill is about to enter its third reading in the House of Commons before being debated in the House of Lords. Many of the bill's proposals were set out in a draft bill, following the publication of the government's Online Harms white paper in 2019, and were subject to consultation and subsequent parliamentary scrutiny.
A number of original proposals in the bill have been amended or removed during the parliamentary process, raising concerns among campaigners that protections for children and young people are being watered down (see below). More changes are likely before the bill is enacted. Key amendments affecting children's services include:
Amendment 159 would enable Ofcom to categorise regulated services as category 1 (and therefore subject to the most stringent duties) based on its assessment that they pose a very high risk of harm, regardless of the number of users.
In a briefing on the bill's amendments, the Local Government Association (LGA) has highlighted the growing risks posed to children and young people by criminals using online spaces to groom and exploit victims. While welcoming the government's ambition to keep children safe online, the LGA says the bill could be strengthened so that it tackles the full range of methods abusers use on social networks.
Amendment NC28 would establish an advocacy body to represent the interests of child users of regulated digital services. The body, which could be newly created or an existing organisation, would protect and promote children's interests.
The LGA backs this amendment. “Only by considering the ‘real world’ impact of online activity – both positive and negative – can we hope to effectively ensure online spaces that allow us to safely harness all the benefits offered by social media and search platforms,” states its briefing.
Amendment NC16 would create a new offence under the Suicide Act 1961 of encouraging or assisting self-harm. An offence would be committed if a communication is sent encouraging another person to inflict serious physical harm on themselves, whether or not that person goes on to self-harm.
Again, the LGA backs this amendment as providing protection to children and young people whose mental health could be affected by viewing harmful content online. “In extreme cases this can include encouragement to self-harm, and it is appropriate that such encouragement be made a specific offence,” it states.
The amendments will be voted on at the bill's third reading before it transfers to the Lords. The government has pledged that the bill will become law during the current parliamentary session.
- More from bills.parliament.uk/bills/3137
NSPCC raises concerns over bill delay
The NSPCC estimates that more than 13,000 online child sex offences will have been recorded by the police while the Online Safety Bill was delayed last summer by the Conservative party leadership contest.
The society says that more than 100 online grooming and child abuse image crimes are likely to be recorded every day that the legislation is delayed.
Childline counselling sessions about online grooming jumped 35 per cent between April and September 2022 compared with the previous year.
In response to the uncertainty, almost 50,000 people signed a petition calling on the Prime Minister to make it his mission to pass the Online Safety Bill as soon as possible.
NSPCC chief executive Peter Wanless said: “The scale of online child abuse and continued inaction from tech firms to tackle damaging suicide and self-harm content being targeted at children should be a wake-up call to the Prime Minister to make passing the Online Safety Bill his mission.
“There is overwhelming public consensus for the crucial legislation to be a priority and with strengthened protections for children, so they are systemically and comprehensively safe from harm and abuse for years to come.”