Dear Secretary of State,
Re: AI-Driven Technology to Tackle Child Sexual Abuse Material

It is a shock to discover that the UK is among the top three global consumers of for-profit livestreamed child sexual abuse. [1] This form of abuse is happening at scale on everyday apps and platforms, streamed through devices such as smartphones, tablets, and laptops. Unlike known child sexual abuse images and videos, which are sometimes detected and reported, livestreamed child abuse is largely going undetected. [2]
A report from International Justice Mission (IJM) and Nottingham Rights Lab (2023) found that in 2022 alone, nearly 500,000 children in the Philippines were trafficked to produce new child sexual exploitation material, including livestreamed videos. That’s roughly 1 in every 100 Filipino children. [4] The Philippines is just one of the many places where children are trafficked to produce for-profit livestreamed child sexual abuse. Children in Colombia, Romania, and Brazil, amongst other countries, are similarly exploited.
Worryingly, men in the UK who report online sexual offending behaviours against children are 2.5 times more likely to also report seeking sexual contact with children in person. [5] The risk is higher still for livestreaming offenders, as research shows that ‘...individuals who sexually offend against a child must first cross a psychological threshold. Arguably, CSA [child sexual abuse] live streaming offenders have already done this, by directing and watching the live sexual abuse of a child online – which is on par with abusing the children themselves.’ [6]
However, technology now offers a breakthrough: AI-powered classifiers can detect illegal child sexual abuse material (CSAM) and disrupt individuals who attempt to create or consume it. This technology can be embedded directly into the operating systems of internet-connected, camera-enabled smart devices.
It can identify and disrupt CSAM in real time, preventing it from being captured or viewed. Crucially, because detection happens entirely on device, it would preserve user privacy and be compatible with end-to-end encryption. This technology could not only disrupt livestreamed child sexual abuse but also help prevent the creation and sharing of videos and images linked to grooming, child sexual extortion, or other image-based harms.
Similar tools using on-device nudity classifiers are already deployed by technology companies on children’s accounts to detect nudity and disrupt its capture or sharing.
Given that UK offenders are driving the abuse of hundreds of thousands of children, it is vital that the UK Government acts. Whilst the Government is already taking some steps on this issue, through investment in a network of Undercover Online Officers and supporting the work of the NCA and GCHQ, we urge you to go further.
The UK Online Safety Act is an excellent starting point for addressing the online sexual exploitation of children. However, the scope of the Act is limited to platforms and search engines. Regulation must extend to device manufacturers and operating systems, so that CSAM and livestreamed abuse can be prevented sustainably and at scale.
The current Ofcom child protection codes include strong measures to protect children’s accounts, and Ofcom has recently consulted on codes covering broadcast livestreaming. However, these codes leave a dangerous gap: CSAM is increasingly produced, streamed, and accessed in private communications on adult accounts and devices. Additional measures are needed to stop these activities.
Legislating to require device manufacturers and operating system providers to deploy such measures would be a groundbreaking step in protecting children in the UK and around the world from online sexual abuse and exploitation, and would demonstrate global leadership on the part of the UK. We call upon you to act.