Guidance

End-to-end encryption and child safety

Published 20 September 2023

Key facts and stats

According to recent data:

  • there were 34,485 offences in the UK relating to online indecent images of children in the year ending December 2022 - an increase of 13% on the previous year [footnote 1]
  • there are up to 830,000 people in the UK who could pose a sexual threat to children, either through online or in-person abuse [footnote 2]
  • the fastest-growing age group appearing in online child sexual abuse imagery is 7 to 10 year olds (Internet Watch Foundation data)
  • the prevalence of the most severe forms of online child sexual abuse has more than doubled since 2020 (Internet Watch Foundation data)
  • there are over 400,000 searches for online child sexual abuse material every month in the UK [footnote 2]
  • an estimated 27 million images have been identified through UK law enforcement investigations of child sexual abuse [footnote 3]
  • currently, the information that social media companies give to UK law enforcement contributes to over 800 arrests of suspected child sex offenders, and results in an estimated 1,200 children being safeguarded from child sexual abuse on average every month [footnote 2]

According to YouGov polling commissioned by the NSPCC of 1,723 adults across the UK:

  • 8 in 10 (79%) think companies should seek to develop technology that allows them to identify child sexual abuse in end-to-end encrypted messaging apps
  • 8 in 10 (79%) think this should be a legal requirement for social media sites

End-to-end encryption

End-to-end encryption (E2EE) is a secure communication system where messages can only be seen by the sender and receiver.

Technology companies currently use encryption positively to keep your bank transactions and online purchases safe and secure. Encryption has many other uses throughout everyday life, but some social media companies such as Meta are proposing to implement or already have implemented E2EE in private messaging spaces.

Implemented without safeguards, E2EE overrides the controls currently in place that help to keep children safe, and poses a serious risk.

E2EE and child safety

At the moment, social media companies scan their platforms to find and report child sexual abuse material (such as images, videos, and grooming conversations) to the National Center for Missing and Exploited Children (NCMEC). NCMEC passes these referrals to the relevant law enforcement agencies, so that abusers are arrested and children are protected.

The implementation of E2EE, without robust child safety measures in place, poses a catastrophic risk to the safety of children online: social media companies will no longer be able to find and report child sexual abuse material in the same way.

Intentionally implementing E2EE without necessary safety features will blind social media companies to the child sexual abuse material that is being repeatedly shared on their platforms. More child sexual abuse content will go unreported and unchecked and that will put more children in greater danger.

The UK government supports strong encryption. We are not asking companies to stop the implementation of E2EE across their messaging services. We are instead urging all social media companies to implement sufficient child safety measures on their messaging platforms that will maintain and/or enhance the identification and prevention of child sexual abuse.

Meta

Meta has been a leading industry player in the fight to tackle child sexual abuse. For over a decade, Meta has utilised hash matching technologies to detect child sexual abuse material being shared on its platforms. This has made it one of the leaders in detecting and reporting online child sexual abuse, providing law enforcement with leads to safeguard children and arrest child sex offenders.

However, Meta and other companies are now planning to implement E2EE, without similar technologies in place, across their messaging platforms such as Facebook Messenger and Instagram Direct Messages. The roll-out of E2EE is likely to happen later this year. NCMEC estimates that up to 70% of Meta referrals could be lost following the roll-out of end-to-end encryption.

Risks to child safety

The safety and security of children is at the heart of this issue. It is crucial that all technology companies implement safety-by-design across their platforms, especially when they are implementing design choices that are detrimental to child safety, such as E2EE.

Currently, Facebook and Instagram account for over 85% of the global referrals of child sexual abuse instances from tech companies.

The implementation of E2EE will significantly reduce the number of monthly referrals of suspected child sex offenders to UK law enforcement.

We are urging Meta and other social media companies to work with us and use their vast engineering and technical resources to develop a solution that protects child safety online and suits their platform design best.

Current detection and reporting methods

For over a decade, Meta has utilised hash matching technologies to detect known child sexual abuse images and videos, which it then reports to law enforcement. Every month, this contributes to the arrest of approximately 800 suspected child sex offenders and the safeguarding of 1,200 children.
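In broad terms, hash matching works by comparing a fingerprint of an uploaded file against a list of fingerprints of known illegal material. The sketch below is a hypothetical illustration only: production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-compression, whereas this sketch uses exact SHA-256 matching against an assumed known-hash list.

```python
import hashlib

# Illustrative stand-in: in practice this set of hashes is supplied by
# bodies such as NCMEC or the Internet Watch Foundation.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-illegal-file").hexdigest(),
}

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint appears in the known-material list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# A matched upload would be blocked and reported; anything else passes through.
```

The key property is that the platform only needs the fingerprint list, not copies of the illegal material itself, to detect re-shared content.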

Meta has implemented several measures that they believe will address child sexual abuse across their platforms, including Facebook, Instagram, and WhatsApp. Some of these measures include:

Detection and reporting: Meta uses technologies such as artificial intelligence to detect and remove content that violates its community standards, including child sexual abuse content. In an end-to-end encrypted environment, detection will be limited to child sexual abuse material shared in public spaces - a small fraction of this illegal imagery, since the vast majority is shared in private messaging. Additionally, Meta encourages users to report such content, which is then reviewed by a team of human moderators. However, children often do not understand that they are being groomed by older individuals on these platforms, so a reporting function has limited success.

Age verification: Meta requires users to be at least 13 years old to create an account. To enforce this, Meta uses a combination of machine learning and human review to identify and remove accounts that belong to underage users. Despite this, stronger age verification methods need to be implemented.

Privacy and safety tools: Meta provides various privacy and safety tools to its users to help them control who can see their content and who can contact them. These tools include blocking and reporting features, private messaging settings, and content filtering options.

All of these measures are on the right track, yet they do not currently protect children sufficiently from harm on platforms with E2EE in place, and they rely on the child to report abuse. Meta and other companies need to work with the UK government and invest in technologies that will maintain and/or enhance the identification and prevention of child sexual abuse on their messaging services - specifically Facebook Messenger and Instagram Direct - as they roll out E2EE more widely.

The Online Safety Bill and E2EE

The Online Safety Bill is new legislation intended to make the UK the safest place to be online. The bill has been designed to protect both the safety of users as well as their right to privacy. It is deliberately tech-neutral and future-proofed, to ensure it keeps pace with technologies, including end-to-end encryption.

It sets out a legal duty for social media companies to put in place systems and processes to tackle child sexual abuse content on their services irrespective of the technologies they use, including services using E2EE.

The bill will give Ofcom the power, where necessary and proportionate, to require that a company use accredited technology, or make best efforts to develop technology, to tackle child sexual abuse on any part of its service including public and private channels.

If they fail to do so, Ofcom will be able to impose fines of up to £18 million or 10% of the company’s global annual turnover, whichever is higher.

The technical solutions available

The Safety Tech Challenge Fund is a UK government funded challenge programme that first ran from 2021 to 2022. The fund was designed to support the development of proof-of-concept tools to detect child sexual abuse material across E2EE environments, whilst upholding user privacy. The fund demonstrated that this would be technically feasible.
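One family of approaches explored in this space involves checking content on the sender's device before it is encrypted, so the service itself never sees message content. The sketch below is a toy illustration under that assumption: the `KNOWN_HASHES` set and the XOR "cipher" are illustrative stand-ins, not a real design, and real E2EE uses vetted cryptographic algorithms.

```python
import hashlib
import secrets

# Illustrative stand-in for a fingerprint list of known illegal material.
KNOWN_HASHES = {hashlib.sha256(b"known-illegal-image").hexdigest()}

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy cipher for illustration only; not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def send(image: bytes, key: bytes):
    # The check happens on the sender's device, before encryption,
    # so the platform never handles plaintext content.
    if hashlib.sha256(image).hexdigest() in KNOWN_HASHES:
        return None  # flagged for reporting instead of being sent
    return xor_cipher(image, key)  # only ciphertext leaves the device

key = secrets.token_bytes(32)
assert send(b"known-illegal-image", key) is None        # known material blocked
ciphertext = send(b"holiday photo", key)
assert xor_cipher(ciphertext, key) == b"holiday photo"  # recipient can decrypt
```

The design point illustrated is that detection and encryption need not be mutually exclusive: the check runs where the plaintext already legitimately exists (the user's own device), leaving the encrypted channel intact.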

It is recognised, though, that every online social media platform and service is different, and solutions will therefore need to be tailored. Companies such as Meta should utilise their vast expertise and engineering resources to build on the outputs of this fund and develop solutions for their individual platforms and services.

In addition, some of the UK’s leading cryptographers have written an academic paper outlining a variety of techniques that could be used as part of any potential solution in E2EE to provide both user privacy and security, while protecting child safety and enabling law enforcement action.

Technical solutions and people’s privacy

The government is not requiring companies to use “off the shelf” products and, where appropriate, encourages firms to use their vast engineering and technical resources to develop solutions that work for their own platforms. Technical solutions should be designed with a particular focus on safeguarding people’s right to privacy and data protection.

The Safety Tech Challenge Fund demonstrated that it would be technically feasible to protect both child safety online and individual privacy. The government supports technical innovation and strong privacy, including E2EE, but we have always said that this cannot come at the cost of public safety.

Further information and support

The UK government and Internet Watch Foundation have produced a guide to help parents understand the risks that Meta’s plans could pose to child safety.

If you are worried about online sexual abuse or grooming of under 18-year-olds, you can report it to the National Crime Agency’s CEOP Safety Centre.

For further advice there are a number of organisations that can help, visit the Internet Watch Foundation website for details: Additional help - TALK Checklist by Internet Watch Foundation (iwf.org.uk).

The UK government is continuously working to protect children online. The full details of our efforts have been recently set out in the 2021 Tackling Child Sexual Abuse Strategy.

  1. Statistic provided by Home Office 

  2. Statistic provided by National Crime Agency

  3. Statistic provided by the Home Office Child Abuse Image Database