An abridged version of the blog below appeared in the BMJ on April 10.  

Throughout my time as Children’s Commissioner, I have heard from a million children and young people about their hopes, ambitions, and concerns. An issue which frequently comes up in these conversations is how young people can spend time online safely and protect themselves from distressing or harmful content.  

As a former teacher and headteacher, I know the online world is an integral part of young lives today and have witnessed the spike in children’s time spent online during the last few years. Last month, my own nationally representative poll of children aged eight to 15 found that 25% of children spent two or three hours a day using an internet-enabled device such as a computer, smartphone, tablet or gaming console, and 23% of children spent more than four hours a day on such a device.   

In speaking to young people, I have been deeply shocked by their experiences of the online world. Children have told me that they see harmful and illegal content frequently online – including pornography, suicide, self-harm, and eating disorder content, as well as abusive content, anonymous trolling and hate content, including racism and sexism. 

Children have told me this harmful content often finds them, rather than them actively seeking it out. Girls as young as nine have told my team about strangers asking for their home address online, and in a room of 15- and 16-year-olds, three quarters had been sent a video of someone being beheaded. Content like this is often promoted and offered to children by complex recommendation algorithms designed to capture and retain their attention, serving business purposes rather than the interests of users.  

In recent months there has been much debate around the role of mobile phones in children’s lives, and many calls to ban them – particularly in schools. I wanted to know what is actually happening, so I used my unique statutory powers to conduct the largest-ever survey of schools and colleges to discover the policies in place today.   

My landmark research, with responses from 19,000 schools and colleges – representing nearly 90% of schools in England – provided the first comprehensive evidence on mobile phone policies in schools. The overwhelming majority of schools – 99.8% of primary and 90% of secondary – already have policies in place limiting or restricting the use of mobile phones during the school day. And yet schools have told me in the same survey that, despite these policies, they remain deeply concerned about children’s online safety.  

It’s clear that children are not spending this time online during school hours. If we want to protect children, we need to turn our attention and our energy to keeping them safe online when they aren’t under the rules of their teachers.  

Outside of school, children are too often left to explore the internet unsupervised and unprotected. Parents believe that parental controls, combined with in-app features like reporting and blocking, will keep children safe online. But online platforms are constantly being created and changing, so even the most tech-savvy parents are still learning to navigate the online world their children experience. We cannot rely solely on these functions to keep children safe.  

Tech companies will promote their platform’s safety features to reassure parents, but young people are still often seeing harmful content online – whether they look for it or not. Last year I called the tech companies into my office, and my Young Ambassadors told them about their online experiences and the inappropriate and harmful content they have seen, and what they want these platforms to do to keep them safe online.  

Young people’s experiences highlight the need for better regulation – something young people told me they wanted in The Big Ambition. Ofcom recently announced guidance, as part of the broader Online Safety Act, that requires robust age verification measures to prevent children from accessing harmful online content, particularly pornography. These measures are expected to be implemented this summer.   

While welcome, further evidence is needed to check that these measures have been adopted and are working effectively. Ofcom has a duty to hold tech companies accountable for enforcing the protections, to ensure online safety for children.  

I have been clear that the Online Safety Act needs to be implemented in a way that keeps pace with an evolving online world. The tech companies will need to assess the function of their platforms, the algorithms that recommend content, and even their company governance for any risks they might present to children. 

We cannot continue to think of the internet in terms of adult spaces and children’s spaces – it is a shared space. It’s vital that children’s voices are at the heart of these discussions, which is why I have invited the tech companies to a roundtable with some of my Young Ambassadors today.  

Moving forward, we have the chance to take important steps towards protecting children online – but we cannot forget children’s voices in these decisions. Every incoming measure in the Online Safety Act is an opportunity to safeguard children, and each measure must be robust and ambitious. 

We owe it to our children to build a better online world. It’s not enough to react to harms after they happen, so we must create a digital world that is safe by design, where children can explore, learn and connect without being exposed to danger.  