Anne Longfield, Children’s Commissioner for England, has published a report, “Access Denied: How end-to-end encryption threatens children’s safety online”, looking at children’s use of private messaging services like WhatsApp and Facebook Messenger.

The study suggests that millions of children in England are using messaging platforms that they are not old enough to be accessing. The report comes following announcements by Facebook, and indications by other platforms such as Snap, that they plan to apply end-to-end encryption to all their messaging services. End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse.

The report includes a survey revealing the extent of children’s use of messaging services, including by children much younger than the minimum age requirement.

The report warns that the privacy of direct messaging platforms can conceal some of the most serious crimes against children, including grooming, exploitation and the sharing of child sexual abuse material. An NSPCC investigation found that Facebook, Instagram and WhatsApp were used in child abuse image and online child sexual offences an average of 11 times a day in 2019. It also found that the rate of grooming offences committed in the UK appears to have further accelerated over the course of lockdown, with 1,220 offences recorded in just the first three months of national lockdown. Facebook-owned apps (Facebook, Instagram, WhatsApp) accounted for 51% of these reports, and Snapchat a further 20%.

The Children’s Commissioner’s survey found that WhatsApp – an end-to-end encrypted service owned by Facebook – is the most popular messaging app among all age groups, used by 62% of children surveyed. Chat services attached to large social media sites, such as Snapchat, Instagram, Facebook and TikTok, are also popular, particularly among teenagers. None are yet end-to-end encrypted by default, but all – with the exception of TikTok – have made public their plans to do so in the near future or suggested that they are looking into it. All have age limits which children routinely ignore, and which platforms do little to meaningfully enforce.

It is now over 18 months since the publication of the Government’s Online Harms White Paper, and over three years since the publication of the Internet Safety Strategy green paper which preceded it. Added to this delay, the Children’s Commissioner is concerned that end-to-end encrypted messaging services could be defined as “private communications” and would therefore not be subject to the duty of care in the same way as other platforms.

The Children’s Commissioner is also warning that the move to end-to-end encryption could be a cynical attempt on the part of some tech firms to side-step sanctions and litigation, as the UK Government prepares to establish a new legal ‘duty of care’ on companies towards their users. If a platform cannot read a message sent across its service, it would be hard for the Government to hold it accountable for that message’s contents.

The report makes a series of recommendations.

Anne Longfield, Children’s Commissioner for England, commenting on the report, said:

“This report reveals the extent to which online messaging is a part of the daily lives of the vast majority of children from the age of 8. It shows how vigilant parents need to be but also how the tech giants are failing to regulate themselves and so are failing to keep children safe.

“The widespread use of end-to-end encryption could put more children at risk of grooming and exploitation and hamper the efforts of those who want to keep children safe.

“It has now been 18 months since the Government published its Online Harms White Paper and yet little has happened since, while the threat to children’s safety increases.

“It’s time for the Government to show it hasn’t lost its nerve and that it is prepared to stand up to the powerful internet giants, who are such a big part of our children’s lives. Ministers can show they mean business by promising to introduce legislation in 2021 and getting on with the job of protecting children from online harms.”

Simone Vibert, Senior Policy Analyst for the Children’s Commissioner, and author of the report, said:

“Messaging services play an important role in children’s lives, helping them to keep in touch with family and friends. But there is a more sinister side to these platforms. This research shows that hundreds of thousands of children are using messaging apps to contact strangers, including sharing images, and that they are receiving messages and images back which make them feel uncomfortable.

“The fact that there are age limits on these apps shows that the tech giants themselves are aware of the risks, and yet most do very little, if anything, to reliably check the age of their users. Our research shows a majority of children are using a messaging app which they aren’t old enough to be using.

“It is yet more evidence of the need for a statutory duty of care on online platforms, including messaging apps.”