Since becoming Children’s Commissioner, I have spoken to more than one million children. They have shared their hopes for the future with me, as well as their worries.
Among the biggest worries is how to protect themselves online. Ofcom’s consultation on the Children’s Code closes this week. This is a key moment in the delivery of the Online Safety Act. Through this Code, we have the opportunity to radically improve online safety for young users and make tech platforms safe by design – but we are letting these platforms off the hook through low ambition and low accountability.
In its current form, this is a Children’s Code that protects corporations, not children.
It wasn’t written with children in mind, but with tech firms and lawyers. It should be an opportunity to force tech companies to face up to their responsibilities, using the full force of the Online Safety Act.
Children’s voices are entirely missing from this Code, despite it being intended to protect them. Children are digital pioneers. They know how to navigate the online world seamlessly and have witnessed first-hand the evolution of these platforms from places full of funny videos to networks where dangerous rhetoric or irresponsible stunts are promoted for clicks.
Tech companies continue to downplay children’s experiences on their sites. I want to see this code include a requirement for them to consult with children before rolling out any new feature or product and prove they are safe by design – not wait for children to come across disturbing content before removing it. We can’t allow platforms to adopt a ‘wait and see’ approach to emerging technology like chatbots, or AI that creates ‘deepfake’ images – how many times are we going to let children be the victims of tech companies’ inability to put protection before profits?
As one of my young Ambassadors told tech companies directly: “X is actually Triple X.”
I am writing to these tech companies once again to compel them to find and share the numbers of children that are on their sites – something I’ve repeatedly challenged them to do and which they have previously claimed not to know.
How else can we believe tech companies will take the steps required to keep children safe, if they cannot even identify who those children are and where on their platforms they are?
Children’s safety online shouldn’t be a tick-box exercise. As it stands, that’s all the Code expects of tech companies – paying lip service to the idea of safer online spaces instead of properly assessing the risks to children and harnessing all the technology at their disposal to do this well.
Safety isn’t just about regulation. It’s about meaningful, practical action.
As a headteacher, I protected my pupils not because I was being regulated, but because it was my duty. I protected every student, not just the ones who came to me to report a concern – the onus was on me to keep them safe.
In my Big Ambition survey, published in March, just one in five children said they felt adults in charge listened to them. Yet they have faith that these decision makers can make great and positive changes in their lives.
Ofcom’s Code and the Online Safety Act are a good start, but more needs to be done. There’s no time to waste. I am eager to work closely with the government to strengthen and build on this foundation and I welcome the commitment from the Prime Minister to look at better protecting children from harmful content.
Children’s voices deserve to be heard. They are on the front line when it comes to social media. Their authentic experiences bring huge value to the debate. We ignore them at our peril.