
As part of The Big Ambition, I co-hosted the latest roundtable with some of my recently appointed Ambassadors focusing on online safety. The roundtable brought together representatives from Google, Meta, TikTok, Ukie – the UK video games trade body – and X to discuss the theme: “What more could platforms do to keep children safe online?” 

My Ambassadors spoke directly to representatives of the tech companies and social platforms to voice concerns they have experienced. The roundtable sparked some great conversations about keeping children safe online and I was delighted that the representatives committed to regular meetings with my Ambassadors to discuss online safety.  

The ideas discussed during the online roundtable will help shape the recommendations in The Big Ambition, which I will be taking to policymakers and politicians, as we approach the upcoming general election.  

The Ambassadors who attended the online roundtable have shared some thoughts:  

Ben, Children’s Commissioner Youth Ambassador  

Online safety is a vital part of a young person’s life. Social media has been evolving so quickly over the past few years that it can be hard to keep up. I think we need to better inform parents, teachers and other adults in a child’s life about how social media can be used safely, while making everyone aware of the potential dangers. Schools need to teach young people about being safe online, instead of just telling us to ‘report and block’, which is what most teachers seem to tell us.  

My experiences with social media haven’t always been smooth sailing, but that’s why I and the Children’s Commissioner want to see change. Children need to be safe online, so our childhoods aren’t ruined so early.  

Overall, the online roundtable was informative, and it was great that the representatives from the online platforms answered all my and the other youth Ambassadors’ questions. They told us about the things they are constantly doing to try to improve the experiences of young people online, and the things they agreed with us need to change. I left the roundtable feeling more motivated than ever to help young people, and ready to make a difference, not only now but for future generations of children too.  

Maximilian, Children’s Commissioner Youth Ambassador  

Tech companies are already aware that age verification is notoriously unreliable. It is therefore absolutely necessary to find a reliable method of age verification: one that does not infringe on the child’s rights to data protection and privacy, yet still ensures, to the highest degree of certainty, that a child is indeed a child.  

As we have unfortunately seen, many tech companies have already violated our trust, by amassing tons of data on us. So, the question remains: why would we trust them to ensure accurate age verification, without selling that data to the highest bidder?  

Conversely, rather than trying to separate children from adults online with technical solutions such as age verification and overcomplicated AI-powered approaches, why don’t companies prioritise overall internet safety, which would make the digital environment safer for children – and for all users?  

Platforms cannot burden parents with parental controls and other incomprehensible features, and they cannot blame parents for failing to understand new technologies. As the creators of these platforms, tech companies should be proactive and act responsibly to keep all their users safe.  

Tech companies need to remember that transparency is key: they need to make it easy for teenagers to understand what data they’re collecting. They shouldn’t bury issues in overcomplicated terms and conditions pages that only lawyers can understand, and they should implement security and safety features by default for all users. And lastly, we don’t need shiny new features and “tools” that are buried deep in settings pages. We need safety by design.  

Mehul, Children’s Commissioner Youth Ambassador  

From a young person’s perspective, some default account settings do not make sense. I’m always surprised that when a young person registers for a social media account, they are more often than not able to receive messages from other young people, even ones they do not follow each other. Closing this oversight could help protect young people from unwanted messages.  

There are now more content filters, warning users that what sits behind the warning message might be extreme or upsetting. But I always find that the way these warnings are presented makes users more likely to click to reveal what is behind them, defeating the objective of the warning system. I think online platforms should review how they filter such content and make the warnings less inviting and safer for all users.  

While tech companies say they have all these protections in place for their young users, they need to be better at staying ahead of an extremely tech-savvy generation. More often than not, young people will find ways to bend and circumvent the protections designed to keep us safe, rendering them useless.  

Tech companies need to better explain these protections to young users too – many of the restrictions they put in place for children can feel more suffocating than protective, which feeds young people’s desire to bend them. It’s vital that tech companies find ways to keep us safe online while striking a healthy balance.  

Rhea, Children’s Commissioner Youth Ambassador  

With younger children becoming more adept at using technology, it is crucial that we recognise the advantages and dangers of using social media.  

I think tech companies need to understand the danger of finding harmful communities online. During their teenage years, many children turn to the internet for support and understanding, yet they face the danger of falling into spaces that aim to pull them down, rather than build them up.  

On apps and sites like TikTok and X, some spaces encourage users to share tips on things like self-harm, and these groups and spaces are extremely popular. They are widely known to users and all too easy to access.  

Recently, my social media accounts have shown me unhealthy ways to lose weight – ones that often lead to eating disorders – despite my account clearly belonging to, and being registered by, someone under the age of 18. The danger here is that for people who may struggle with body image issues, one post may cause them to spiral. For others, as soon as you interact with one post you are constantly served similar content, creating an almost inescapable echo chamber.  

The same problem is replicated where people find spaces full of harmful ideologies and misinformation – they fall victim to people who aim to manipulate rather than inform.  

To impressionable teenagers, spaces online may appear to be an escape, a place where they can feel understood. Therefore, it is imperative that tech companies and online platforms do more to understand and take visible action against these dangerous communities, to ensure that social media is a true safe space for all.  

I would like to thank my Ambassadors for attending the online roundtable and for their valuable contributions. The outcomes of The Big Ambition will be published next month.  
