Independent report

CDEI publishes briefing paper on facial recognition technology

The CDEI has published a Snapshot briefing paper examining the uses and potential implications of deploying facial recognition technology in the UK.

Documents

Snapshot Paper - Facial Recognition Technology

Details

What is a CDEI Snapshot?

CDEI Snapshots are designed to introduce non-expert readers to important ethical issues related to the development and deployment of AI and data-driven technology. Their purpose is to separate fact from fiction, clarify what is known and unknown about an issue, and outline possible avenues of action by government and industry. Previous Snapshots have looked at deepfakes, smart speakers and the use of AI in the personal insurance sector.

What does this paper cover?

Facial recognition technology (FRT) is among the most controversial data-driven innovations in use today. Advocates claim that it could make our streets safer, our bank accounts more secure, and our public services more accessible. Critics argue that it poses a threat to privacy and other civil liberties. This paper attempts to bring clarity to this debate, putting claims under scrutiny and helping readers understand where to direct their attention. It seeks to answer several fundamental questions about FRT systems, including how they are developed, where they have been deployed to date, and the mechanisms by which they are governed.

The paper’s findings are informed by interviews with experts from across civil society, academia, industry, law enforcement and government.

What are the key findings?

The paper finds that:

  • FRT can be used for varied purposes. Some systems aim to verify people’s identity (e.g. to unlock an electronic device), while others seek to identify individuals (e.g. by scanning a group of people to see if any are featured on a watchlist). A simple sketch illustrating this distinction follows the list.

  • FRT systems have been deployed across the public and private sectors. Several police forces have trialled live FRT, while banks have installed FRT functionality within apps to improve customer experience.

  • Used responsibly, FRT has the potential to enhance efficiency and security across many contexts. However, the technology also presents several risks, including to privacy and the fair treatment of individuals.

  • The extent to which an FRT system is beneficial or detrimental to society depends on the context, as well as the accuracy and biases of the specific algorithm deployed. Each use must be assessed according to its own merits and risks.

  • The use of FRT is regulated by several laws, including the Data Protection Act and the Human Rights Act (for public sector applications). However, a standalone code of practice for FRT has yet to be drawn up.

  • The regulatory regime governing the use of FRT in the private sector is less extensive than the one for public law enforcement. Policymakers should consider whether there is sufficient oversight of FRT in contexts such as retail and private property developments.
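
To make the distinction in the first finding concrete, the sketch below contrasts one-to-one verification with one-to-many identification against a watchlist. It is an illustrative outline only, written in Python: the embed() stand-in, the 128-dimension vectors and the 0.8 similarity threshold are assumptions made for this example, not details taken from the paper or from any particular FRT system.

    # Illustrative sketch: 1:1 verification vs 1:N identification.
    # embed(), the vector length and the threshold are placeholders, not
    # drawn from the CDEI paper or any real FRT product.
    import numpy as np

    def embed(image) -> np.ndarray:
        # Stand-in for a face-embedding model: maps an input to a unit vector.
        rng = np.random.default_rng(abs(hash(image)) % (2 ** 32))
        vector = rng.normal(size=128)
        return vector / np.linalg.norm(vector)

    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity; the vectors are already unit length.
        return float(np.dot(a, b))

    def verify(probe_image, enrolled_vector, threshold=0.8) -> bool:
        # 1:1 verification, e.g. unlocking a device: does the probe match one enrolled face?
        return similarity(embed(probe_image), enrolled_vector) >= threshold

    def identify(probe_image, watchlist, threshold=0.8):
        # 1:N identification: compare the probe against every face on a watchlist.
        probe = embed(probe_image)
        scores = {name: similarity(probe, vec) for name, vec in watchlist.items()}
        best = max(scores, key=scores.get)
        return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])

The structural difference is the point: verification compares a probe against a single enrolled identity, while identification compares it against every entry on a watchlist, which is why the two uses raise different privacy questions.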

What happens next?

The CDEI will continue to examine the impacts of FRT on society. We are particularly interested in how the technology is being used in the private sector, and where it might be deployed to support COVID-19 response efforts (e.g. to power the digital ID systems behind COVID-19 health certificates).

Updates to this page

Published 28 May 2020
