Tech firms’ plans for more encrypted messaging ‘risk greater child exploitation and abuse’

Encryption of online messages could make it harder to police child abuse and grooming online, warns the Children’s Commissioner for England, Anna Longfield.

It comes as a new report by the commissioner examining how children use online apps finds that the vast majority of youngsters aged eight and above have used some sort of messaging service.

The study suggests that millions of children in England are using messaging platforms that they are not old enough to be accessing. It also follows announcements by Facebook, and indications by other platforms such as Snap, that they plan to apply end-to-end encryption to all their messaging services.

End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse.

Nine out of 10 children aged 8-17 are using messaging apps

Within the report is a survey showing the extent of children’s use of messaging services, including by children much younger than the minimum age requirement. It shows:

  • Nine out of ten children aged 8-17 are using messenger services.
  • 60 per cent of 8-year-olds and 90 per cent of 12-year-olds reported using a messaging app with an age restriction of 13 or older.
  • Almost one in ten children report using a messaging service to talk to people they don’t already know.
  • One in six girls aged 14-17 reported having received something distressing from a stranger via a private message.
  • One in 20 children say they have shared videos or photos of themselves with strangers.
  • Over a third of 8-10-year-olds and over half of 11-13-year-olds admit that they have said they were older than they were in order to sign up to an online messaging service.

Privacy can conceal some of the most serious crimes

The report warns that the privacy of direct messaging platforms can conceal some of the most serious crimes against children, including grooming, exploitation and the sharing of child sexual abuse material. An NSPCC investigation found that Facebook, Instagram and WhatsApp were used in child abuse image and online child sexual offences an average of 11 times a day in 2019.

It also found that the rate of grooming offences committed in the UK appears to have accelerated further over the course of lockdown, with 1,220 offences recorded in just the first three months of national lockdown. Facebook-owned apps (Facebook, Instagram, WhatsApp) accounted for 51 per cent of these reports, and Snapchat a further 20 per cent.

The Children’s Commissioner’s survey found that WhatsApp – an end-to-end encrypted service owned by Facebook – is the most popular messaging app among all age groups, used by 62 per cent of children surveyed.

Chat services attached to large social media sites, such as Snapchat, Instagram, Facebook and TikTok, are also popular, particularly among teenagers. None are yet end-to-end encrypted by default but all – with the exception of TikTok – have made public their plans to do so in the near future or suggested that they are looking into it. All have age limits which children routinely ignore, and which platforms do little to meaningfully enforce.

It shows how vigilant parents need to be

It is now over 18 months since the publication of the Government’s Online Harms White Paper, and over three years since the publication of the Internet Safety Strategy green paper which preceded it. Added to this delay, the Children’s Commissioner is concerned that end-to-end encrypted messaging services could be defined as “private communications” and could therefore not be subject to the duty of care in the same way as other platforms.

The Children’s Commissioner is also warning that end-to-end encryption could be a cynical attempt on the part of some tech firms to sidestep sanctions and litigation, as the UK Government prepares to establish a new legal ‘duty of care’ on companies towards their users. If a platform is unable to read a message shared on its servers, it follows that it would be hard for a Government to hold it accountable for that message’s contents.

‘Duty of Care’ should cover private communications

The report makes several recommendations:

The Government should introduce its online harms legislation to Parliament in 2021. It should set a strong expectation on platforms to age-verify their users and allow for stiff sanctions against firms which breach their duty of care, including fines.

If tech giants are unable to demonstrate that new features they introduce will not put younger users at increased risk, then those features should not be implemented.

Companies should not apply end-to-end encryption to children’s accounts if doing so reduces children’s safety, and they must introduce better mechanisms for proactively monitoring their platforms for child sexual exploitation. Firms should also retain the ability to scan for child sexual abuse material. Platforms failing to meet these tests should be judged to have breached their duty of care.

The Government’s proposed duty of care should cover ‘private communications’, including those which are end-to-end encrypted.

“This report reveals the extent to which online messaging is a part of the daily lives of the vast majority of children from the age of eight,” said Commissioner Longfield. “It shows how vigilant parents need to be but also how the tech giants are failing to regulate themselves and so are failing to keep children safe. The widespread use of end-to-end encryption could put more children at risk of grooming and exploitation and hamper the efforts of those who want to keep children safe.

“It has now been 18 months since the Government published its Online Harms White Paper and yet little has happened since, while the threat to children’s safety increases.

“It’s time for the Government to show it hasn’t lost its nerve and that it is prepared to stand up to the powerful internet giants, who are such a big part of our children’s lives. Ministers can show they mean business by promising to introduce legislation in 2021 and getting on with the job of protecting children from online harms.”

‘Ministers can show they mean business’

Simone Vibert, Senior Policy Analyst for the Children’s Commissioner, and author of the report, added: “Messaging services play an important role in children’s lives, helping them to keep in touch with family and friends. But there is a more sinister side to these platforms. This research shows that hundreds of thousands of children are using messaging apps to contact strangers, including sharing videos and photos of themselves, and that they are receiving images and messages back which make them feel uncomfortable.

“The fact that there are age limits on these apps shows that the tech giants themselves are aware of the risks, and yet most do very little, if anything, to reliably check the age of their users. Our research shows a majority of children are using a messaging app which they aren’t old enough to be using.

“It is yet more evidence of the need for a statutory duty of care on online platforms, including messaging apps.”

Author: Simon Weedy
