The growth of the online world is a technological revolution, the likes of which have not been witnessed in centuries.
The internet has enhanced our lives immeasurably, by opening up education, communication and research in ways those of us who are now well into our adulthoods might never have imagined. For children growing up in 2025, who are among the first generations to have never known a solely analogue life, being online is second nature.
It is an incredible asset in our daily lives, but it has also fundamentally changed how we interact with each other, how we stay safe, and how we maintain privacy. For most children, if not all, it has introduced a darker side. They are forever in their digital playgrounds.
Every day, children tell me about the violent, upsetting or degrading things that are shown to them online by algorithms designed to capture their attention. That’s why, as Children’s Commissioner, I have been relentlessly focused on driving for greater safety online. It’s also been driven by what I saw in children’s changing behaviour during my years as a teacher and headteacher, as they learned to navigate a life through a digital lens.
But the subject of this report – sexually explicit ‘deepfakes’ – is not one I was familiar with until more recently, despite having worked with children every day of my professional life. Of all the worrying trends in online activity children have spoken to me about – from seeing hardcore porn on X to cosmetics and vapes being advertised to them through TikTok – the evolution of ‘nudifying’ apps to become tools that aid in the abuse and exploitation of children is perhaps the most mind-boggling.
The technology used by these tools to create sexually explicit images is complex. It is designed to distort reality, to fixate and fascinate the user – and it confronts children with concepts they cannot yet understand. The evolution of these tools is happening at such scale and speed that it can be overwhelming to try to get a grip on the danger they present.
I have heard from a million children in my four years as Children’s Commissioner. As I enter my fifth and penultimate year, I also have a bank of data from almost 90% of schools in England in my toolkit on the most pressing issues facing their children – that’s evidence from around 19,000 safeguarding leads, headteachers, and senior leaders.
It is rare for children and adults to hold the same position on a subject. On this issue, however, they agree: the risks posed by children’s unfettered access to harmful content are a constant and growing worry – and one on which children’s experiences cannot be sidelined.
On this subject, there is little nuanced debate required. There is no good reason for tools that create naked images of children. They have no value in a society where we value the safety and sanctity of childhood. Their existence is a scandal.
As one 16-year-old girl asked during the research phase of this report: “Do you know what the purpose of deepfake is? Because I don’t see any positives.”
The act of making such an image is rightly illegal – the technology enabling it should also be. Any individual or organisation motivated by the idea of making profit by creating a tool that supports the exploitation of a child must be held to account.
The Government has set an admirable and ambitious mission to halve the incidence of violence against women and girls within this Parliament. Getting rid of these exploitative apps and tackling this emerging threat to children would be a significant step towards this mission. Women and girls are almost exclusively the subjects of these sexually explicit deepfakes: 99% of these images online are of women and girls. Chillingly, evidence suggests that many of the tools designed to create them only work on female images because they have been trained to do so.
The risks to boys are different but equally harmful – studies have shown that teenage boys are predominantly at risk from the influence of Artificial Intelligence and from online communities promoting extremist material, including sexually extremist content.
When we talk about children’s rights, we mean the ‘always’ and the ‘nevers’: the things children should always have, and the things that should never happen to them. This report offers a stark example of the latter. Children are growing up fearing that a smartphone might, at any point, be used as a way of manipulating them.
We owe it to our children, and the generations of children to come, to tackle this now, instead of accepting it as just one more restriction placed on their freedom, and one more risk to their mental wellbeing.
In my Big Ambition survey last year, just one in five children told me that politicians and leaders listen to their views.
It is time to prove otherwise. Children growing up today are paying the price of our inaction.