
The following open letter to Nick Clegg, Facebook’s Head of Global Affairs, was published in The Daily Telegraph on Jan 30, 2020.

Dear Sir Nick,

I was very interested to hear the interview you gave the BBC from Davos as Facebook’s Head of Global Affairs. You were discussing concerns I have been raising about the worrying ease with which children, some of whom are particularly vulnerable, can access harmful, upsetting and even dangerous content on your company’s platforms and those of other social media giants.

You accepted on the BBC’s Today programme that, in terms of tackling this problem across Facebook, WhatsApp and Instagram, “It’s not something we are completely on top of”. I welcome that recognition but regard it as an understatement, despite some movement from your company and others, and waves of warm words.

I detected a sense of frustration, and your answers conveyed the idea that somehow parents and those raising concerns don’t understand the difficulties Facebook has in practically tackling this, or appreciate what has already been done. I can only say that any frustration you feel is matched by my own that social media companies have in reality failed so far to match the scale of the valid concerns that parents and children have raised with me in large numbers.

I accept that there has been some action taken, but I know you recognise that vulnerable children can still see inappropriate content across your platforms right now. A platform with a third of the world’s population using it may indeed have practical problems ‘policing’ its usage, but that raises more questions about Facebook than it provides reassurance to parents. The scale of Facebook’s growth is down to Facebook, and Facebook has to manage the problems that brings. With power comes responsibility.

The word ‘policing’ is also troubling. Removing clearly harmful content shouldn’t carry a fear of being branded heavy-handed or at odds with the spirit of free speech. We believe that, when it comes to safeguarding children, removing all harmful content quickly should be the main priority of your social responsibilities as a company. Achieving that should be a key focus for the whole company across all three platforms.

Any lack of appreciation of progress from parents or wider society might be due to the selective way in which Facebook is prepared to be transparent. You could quickly pinpoint that “149 billion messages” had been shared on your platforms on New Year’s Eve, and yet repeated requests from us, and others, to reveal how many children under 13 regularly use Facebook, WhatsApp and Instagram have never been met with a figure.

I am particularly concerned by Facebook’s reliance on arguments around privacy in explaining its plans to encrypt Facebook Messenger and Instagram messages. The decision to encrypt represents a real threat to children, who may come to harm when interacting with other users via these routes, with your company and the police left with no real way of knowing or intervening.

You told the interviewer that Facebook takes material down “when it is reported to us”. Despite your own proactive work, Facebook still relies very heavily on users reporting material to you themselves – self-policing – by which time, of course, the content has already been seen, often by children.

From what children tell us, there are still big issues here, ranging from a lack of response, to long waiting times for a response if one comes, to a lack of action excused with a simple explanation that the content doesn’t breach your terms and conditions, without actually addressing the reasons why the child felt it was harmful and reported it. Children have told me many times that this is one of their biggest issues, especially with regard to bullying and its detrimental effect on their mental health. Indeed, many tell me platforms have so often been unresponsive in the past that they now no longer bother to alert you.

You said in response to one question, “There is nothing in the business model that lends itself to showing harmful and unpleasant, and offensive or dangerous material to anybody”. What would be more exciting and positive would be to see your business model recognise the commercial advantage of very publicly tackling online harms on a scale it hasn’t yet come close to. Saying, albeit with regret, that it would be hard to do is not good enough for a company that has been at the cutting edge of huge “speed and scale” solutions when it wants to promote its own growth. Algorithms and the smartest minds can be found to do the latter, so why not the former?

Furthermore, the business model provides a disincentive to applying one possible solution. Social media platforms build huge user numbers by offering “seamless on-boarding”, i.e. using platform design to make it incredibly easy and very quick to become a user. There seems to have been little appetite so far from Facebook, Instagram or WhatsApp, and indeed other companies, to retro-fit safety measures, such as age verification, lest they clutter up the gateway with delays that damage the ability to grow user numbers. Given that the Information Commissioner’s Age Appropriate Design Code will come into effect in 12-18 months’ time, that argument will no longer be viable, and we’d welcome hearing how the company is thinking ahead of that.

That code should mean you have to do what you already could have done, which is to genuinely restrict the platforms to the ages they are designed for. The age limit is laid out in your own Terms and Conditions. For children under 13, the best way to remove their access to harmful content is not to allow them access to your sites, under your own rules.

You were in Davos making the argument that the UK Government shouldn’t impose a tech tax on companies such as yours until there has been global discussion of how to do it properly. As an experienced politician, you’ll know that, historically, such arguments regularly have the secondary effect of delaying something one doesn’t want. However, on the issue of online harms and vulnerable children, this debate has been taking place for some years; the arguments have been made, and suggestions for solutions offered, repeatedly. Facebook has had time to make its case. From my perspective, and that of too many parents and children, you and many other social media giants simply haven’t made that case convincingly enough, forcing Government to be the driver here.

We warned, when Government did indeed start looking at ways to tackle these problems, that any proposed legislation or code would get significant pushback from tech companies. So it has proved, and perhaps your interview should be seen in that context. I find that surprising. When you were at the top of Government yourself, I find it hard to believe you wouldn’t have been four-square behind the legislation and codes now being suggested. If you are, I’d welcome you saying so.

Anne Longfield
Children’s Commissioner for England
