
This week the Children’s Office is recognising the importance of online safety for children and young people. In the first of a series of blogs, the Children’s Commissioner discusses why online safety is a major concern for her, and the work she has done during her time as Commissioner to secure a safer digital world for all young people.

I am delighted to kick off online safety week with a look back at my work over the past 18 months, and a look ahead to my key priorities for online safety.

It was a privilege to be commissioned by the Department for Digital, Culture, Media and Sport (DCMS) and the Department for Education in May 2021 to explore children’s experiences of online harm, and to offer my formal recommendations to Government. As the Online Safety Bill completes its passage through Parliament, I will continue to represent children’s views in the legislative process.

My key concerns

Children are born into an online world which they did not create. While digital technologies offer boundless new opportunities for play, connection and learning, they also have a darker side. Below, I outline a few of my key concerns:

  1. Children’s daily lives are now shaped by highly complex algorithms. These can push and amplify harmful content to children’s accounts. Heart-breaking cases where children have taken their own lives, such as Molly Russell and Frankie Thomas, are symbolic of the real-life harm of unregulated algorithms. I pay tribute to their families who have done so much to raise awareness about this issue and the change that is needed.
  2. A huge number of children are using platforms when they are too young. In a recent survey I found that, across seven social media platforms, between 36% and 79% of users aged 8-17 were below the minimum age. I have repeatedly challenged tech firms on the number of underage children using their services, yet they remain slow to develop meaningful age assurance and to abide by their own terms and conditions.
  3. Social media platforms don’t listen to children’s complaints. My recent survey of children aged 8-17 found that just 50% of children report harmful content to platforms, and of those who did, 25% had not received a response. Worryingly, children are less likely to report harmful content the older they get – despite being more likely to be exposed to harm. Platforms must rebuild children’s trust by providing reliable and responsive complaints routes, operated by human analysts.
  4. Pornography is just one click away for most children. Unlike in the offline world, where strict legislative and regulatory rules are in place, children have free access to adult content online. There is currently no legal duty for platforms hosting pornography to verify the age of users. The burden is placed on parents and carers to deploy monitoring and filtering tools – leaving many children without protection from the harms of pornography.

My work so far

My ongoing priorities
