Earlier this month I co-hosted a roundtable for the major tech platforms operating in England with some of my Youth Ambassadors, where we focused on children’s hopes for the online world. A year on from the last roundtable on online safety, I was keen to convene representatives from the tech companies again to understand what the industry was doing to prepare for the implementation of Ofcom’s Children’s Code.
My Ambassadors told representatives from Meta, Google, X, TikTok and Ukie about their online experiences and concerns – here, they share some of their experiences and their ideas for making the online world safer for children:
Stanley, 2025 Youth Ambassador:
One thing I would tell the government and tech companies about young people’s experiences online would be that many of us are trying our best to spend less time online. However, this can be difficult as coping with the addictive nature of social media is not easy when your brain is still developing.
I want tech companies to focus on designing their apps and platforms so they are less appealing to children and young people. Obviously, many young people currently use these platforms; however, as the minimum age is often 13, the best thing that companies could do would be to try to reduce consumption amongst young people through their designs. But I don’t think tech companies would do this, as they would miss out on profiting off teenagers who are addicted to their platforms.
Penelope, 2024 Youth Ambassador:
Tech companies continue to marginalise young people from platforms, especially platforms which perpetuate harmful content. Although it is important for young people to stay engaged with the digital age we live in, the issues of content regulation, age identification and the use of AI and mass advertising promote a toxic culture for young people that has detrimental impacts on health, ethics and the environment.
I think it’s vital that tech companies promote youth engagement with their platforms to help transform the uses of AI and the advertisements young people consume. The issues of mass consumerism and reliance on AI ruin the attention span and analytical thinking of young people. These should not be a replacement for young people’s key thinking skills and should only be used where appropriate.
Chan, 2025 Youth Ambassador:
If I could tell the government, the online regulator or tech companies one thing about young people’s experiences online, it would be that they are having a significant impact on our mental health. Instead of the online world being used as a tool to connect and educate young people, it is doing the complete opposite. The internet is full of fake news and information that is based on feelings and not facts, so many impressionable young people are influenced by social media.
After speaking at the roundtable, I would like the tech companies to focus on online safety and protection. I do not feel online platforms are protecting me – I feel unsafe, especially when people try to approach me online pretending to be someone they are not. There must be better sharing of responsible and fair information online, so we are less divided as a society.
Maximilian, 2024 Youth Ambassador:
Young people don’t feel protected online – they feel abandoned. We’re navigating platforms flooded with hate, misinformation, and exploitative features while companies focus on optics over safety. If regulation only results in box-ticking and PR statements, then it has failed us. Real accountability must start with listening to young people – not just as users, but as stakeholders with a right to safety and dignity online.
Tech companies must prioritise genuine youth engagement – not tokenistic consultations, but real seats at the table where we can influence policies, product decisions, and safety protocols. If we’re old enough to be profiled, tracked, and targeted by these platforms, we’re old enough to be heard and to help shape the systems that govern our digital lives.
Valerie, 2025 Youth Ambassador:
The online world allows young people to express themselves and also learn so much, especially from others. However, there is an increasing problem where content online is having a negative effect on people. I think that young people can definitely be influenced subconsciously by the content they view online, and more needs to be done by the online regulator and tech companies to ensure that young people are aware that they don’t need to become or conform to what they see, which would hopefully have a more positive effect on their mental and physical health.
I would like tech companies to focus more on support for people of different age groups, but particularly teenagers and young adults. As much as the internet and other online platforms can be informative and enjoyable, there is still a lot of negative content on many of these platforms. It is important for tech companies to take steps to remove negative content and also provide support for young people, whether that be through links to helpful youth-focused services or the opportunity for young people to comment on what they’ve seen and get a proactive response from tech companies.
Rylie, 2024 Youth Ambassador:
Online spaces still aren’t truly built with young people in mind. Harmful content, targeted advertising, and algorithmic risks are the norm, not the exception, for too many young people. We’re told to report, block, and ‘be resilient,’ but that places the burden on us instead of holding platforms to account. If we have to report something, the harm has already been done. Online safety needs to be proactive, not reactive, and embedded into every platform young people use.
Instead of tweaking safety settings, tech companies need to redesign how content is promoted and what’s allowed on their platforms in the first place. Algorithms shouldn’t be pushing harmful content to young people at all, whether it’s unrealistic beauty standards, violence, or harmful ‘advice’. If platforms know what keeps us scrolling, they also know what puts us at risk and they need to take responsibility for that.
Ben, 2024 Youth Ambassador:
The online world plays such a big part in young people’s lives, and it needs to be made safer for children. Too many children are able to bypass age requirements for social media by putting in the wrong age when they sign up for an account. Social media companies need better checks in place to stop children signing up for accounts before they meet the age requirements – there has to be stricter age verification to protect younger children online. This is an important step in making the online world safer for everyone, and if tech companies focused on that, they would make it safer for their young users too!