Use of algorithms in children's social care 'risks families being missed'

Use of data algorithms by children’s services to identify “at risk” children could lead to some being missed and other families being forced to undergo unnecessary scrutiny, a report has warned.
Machine learning “without proper ethical oversight” poses serious risks of reinforcing biases, researchers said. Picture: Adobe Stock
A review of the use of "machine learning" in children’s social care, carried out by the University of Oxford’s Rees Centre and The Alan Turing Institute for What Works for Children’s Social Care, called for the approach to be used “cautiously”.

Machine learning is a general approach in computer science that allows algorithms to carry out tasks on the basis of data, without being explicitly and completely pre-programmed by designers, researchers said.

This means systems may use data mining to flag up children deemed to be "at risk".

However, the report warns that inaccuracies in machine learning systems could lead to "‘false negatives’ that miss children in need of protection". 
