
The National Crime Agency (NCA) is concerned that the "hyper realistic images and videos" created through AI threaten to normalise abuse and heighten the risk of offenders escalating their behaviour to commit abuse themselves.
The concerns are raised in the NCA's 2023 national strategic assessment report, which details how the agency has increased its focus on digital threats over the past year, "reflecting the fact that more crime takes place online or is enabled by technology".
The use of augmented and virtual reality to manipulate or merge virtual and real imagery of abuse is among the evolving AI threats, the report notes.
In addition, the use of AI “will make it harder for us to identify real children who need protecting, and further normalise abuse,” said NCA director general Graeme Biggar.