Two months ago, I sat in a meeting with senior executives from Instagram, Facebook, Google and some of the other social media companies, listening to them tell Ministers that yes, there is a problem with some of the content they host, and yes, they would do all they could to take it down urgently. Two months on, a search for the self-harm hashtag returns 710,000 results. In February, it was 640,000. Far from being tackled, the problem has got worse.
That’s why this era of self-regulation needs to end, and today’s proposals from the Government to introduce a statutory duty of care cannot come a moment too soon.
Ask any parent and they will tell you about the hold the digital platforms have on children’s lives. Messages about how they should look, the way they spend their time and the views they have are relentless. Some are affirming and positive; some, as we know, much less so. For the digital native, the world is fast – driven by addictive technology that demands loyalty and punishes those who dare think about switching off. Whilst this new childhood may bring untold opportunities, it also brings an avalanche of pressures that children are struggling to come to terms with.
And we know that children are using social media and the internet from a younger and younger age, and spending more and more time online. There are thousands of under-13s on apps that were not designed for them – their parents either oblivious, or crossing their fingers and hoping their child won’t be the one for whom it all goes wrong. You could feel the chill in the room when I asked the social media companies what they were doing to test whether users were underage. Kids inflate their age when they sign up, and the internet giants use that as an excuse to do nothing.
A statutory duty of care would change all of this. It would give companies providing online services a legal responsibility to keep users safe. It would make them take the most harmful material down quickly. If they didn’t, there would be a regulator to levy fines. Alongside this, there will be more support for parents and children.
Thirty years on from the birth of the internet, this could be the turning point at which the digital world is reimagined as a place children can have confidence in. But it will depend on decisions taken now, which is why I want the government to be both decisive and bold. I want the new regulator to have teeth, with strong powers to stand up for children. Social media companies have held all the cards for too long. It’s time for the balance of power to shift decisively.
So meaningful financial penalties, public apologies and a requirement to show they have changed their ways all need to be on the regulator’s menu when a company fails in its duty. That doesn’t mean just social media platforms. Millions of children spend far longer on online games than they do on Facebook or Instagram. The companies that make these games should have the same duty of care as the internet giants.
Nor should we assume that Facebook and the other well-known apps are the only ones that need regulation. Smaller companies should have to follow the same rules, because apps like TikTok show how quickly a newcomer can become popular with children. At the same time, just like TV ads placed in family shows, it would be a mistake to focus only on companies that ‘market to children’. Many apps don’t, but are used by children anyway.
Let’s remember that social media is still in its infancy. Remarkably, if Facebook were a child, it would only just be old enough to use its own platform. But the time for the chaos and excitement that so often comes with new innovations has now passed. The internet and social media are part and parcel of every childhood. However much some parents might wish them away, they are going nowhere. It is time for the internet giants to grow up.
The public now expect more from this new presence in their lives. Accepting that with such power comes responsibility is the first step towards meeting those expectations – with the strength of government regulation behind it, in case the tech companies don’t think we mean it.