Police use of facial recognition: factsheet
Published 4 December 2025
This factsheet provides information on police use of facial recognition technology. It offers an easy-to-understand guide to how the technology works and how the police use it.
Types of facial recognition
The police use three types of facial recognition:
- Retrospective Facial Recognition (RFR)
- Live Facial Recognition (LFR)
- Operator Initiated Facial Recognition (OIFR)
Retrospective Facial Recognition
Retrospective Facial Recognition (RFR) is an important tool that helps the police to identify suspects. It can also help them to identify missing or deceased people.
RFR is used after an event or incident as part of a criminal investigation. Images typically come from CCTV, mobile phone footage, dashcam or video doorbell footage, or social media. The images are then compared against images of people taken on arrest to identify a suspect.
When there is a possible match, a specially trained operator visually assesses the returned results. If a match is confirmed, it is passed to an investigating officer, who reviews it for accuracy and considers all the available evidence, as they would in any normal investigation.
Police forces use the RFR search facility on the Police National Database and carry out over 25,000 facial image searches every month. Some police forces also have their own local RFR search facility.
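To illustrate the human-in-the-loop process described above, the sketch below is a simplified, hypothetical illustration of how a candidate list returned by a facial image search is only ever treated as intelligence pending confirmation by a trained operator and an investigating officer. It is not the software the police use, and all names, fields and the threshold value are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Candidate:
    """One possible match returned by a facial image search (illustrative only)."""
    record_id: str
    similarity: float  # algorithm's similarity score, from 0.0 to 1.0

def review_candidates(
    candidates: list[Candidate],
    operator_confirms: Callable[[Candidate], bool],
    threshold: float = 0.6,
) -> Optional[Candidate]:
    """Hypothetical sketch of the human-in-the-loop step described above.

    The algorithm only produces a ranked list of possible matches; a specially
    trained operator must visually confirm a candidate before it is passed to
    an investigating officer as one piece of intelligence.
    """
    ranked = sorted(
        (c for c in candidates if c.similarity >= threshold),
        key=lambda c: c.similarity,
        reverse=True,
    )
    for candidate in ranked:
        if operator_confirms(candidate):   # human visual assessment
            return candidate               # treated as intelligence, not proof of identity
    return None                            # no confirmed match: nothing is passed on
```

In practice, a confirmed candidate is then reviewed by the investigating officer alongside all the other available evidence, as in any normal investigation.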
Real life examples of how police forces use this technology to catch criminals and keep people safe include:
- Craig Walters, aged 39, was jailed for life after attacking a woman he followed off a bus. He was arrested within 48 hours of the incident by South Wales Police, who used RFR on images captured by CCTV, including on the bus.
- RFR was used to quickly identify and arrest Bradley Peek, aged 20, for a knife attack on a bus driver in Tower Hamlets. He was subsequently charged with grievous bodily harm with intent and sentenced to more than 5 years in prison.
- The National Police Chiefs' Council reported that 127 suspects who engaged in serious disorder across the UK in August 2024, following protests sparked by the tragic killings in Southport, were identified using RFR. Northumbria Police used the technology to help identify those who took part in the disorder and made 19 arrests.
- Several men forcibly entered a vulnerable woman's home, and one of them assaulted her. An identification was made within 10 minutes using RFR, and a man was arrested within 24 hours. This early evidence collection enabled safeguarding measures for the victim.
- Phillip Thompson, aged 35, admitted to burgling 79 properties after being identified using RFR. He pleaded guilty and was sentenced to 7 years and 1 month in prison for the burglaries and other offences, including assault and coercive and controlling behaviour.
Live Facial Recognition
Live Facial Recognition (LFR) technology uses live video footage of people passing an LFR camera in a public place and compares their images to a specific list of people wanted by the police (known as a watchlist). It means the police can quickly locate wanted people and take them off the streets.
All LFR deployments are targeted, intelligence-led, time-bound, and geographically limited. The police will inform the public when they are using the technology, including through visible signage at deployment locations and posts on social media. The police will also provide information on where the public can obtain more information on its use.
Following a possible LFR alert, a police officer on the ground will decide what action, if any, to take. The officer must have reasonable grounds to suspect that the person they have located is responsible for an offence and that an arrest is justified.
If the LFR system does not make a match with the watchlist, the person’s biometric data is deleted immediately and automatically. The watchlist is also destroyed after each operation.
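The following Python sketch is a simplified, hypothetical illustration of the matching and deletion behaviour described above. It is not the software the police use; the function, field names and threshold value are invented for the example, and only the overall behaviour (compare against the watchlist, discard non-matches, refer possible matches to an officer) reflects this factsheet.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class WatchlistEntry:
    """One person on the deployment's watchlist (illustrative only)."""
    record_id: str
    template: bytes  # stand-in for a stored facial template

def process_live_frame(
    face_template: bytes,
    watchlist: list[WatchlistEntry],
    matcher: Callable[[bytes, bytes], float],
    threshold: float = 0.64,
) -> Optional[dict]:
    """Hypothetical sketch of the behaviour described in this factsheet.

    A biometric template taken from a live camera image is compared against
    the deployment's watchlist. If nothing meets the threshold, the template
    is discarded rather than stored; if something does, a possible alert is
    raised for a police officer on the ground to assess.
    """
    best_score, best_entry = 0.0, None
    for entry in watchlist:
        score = matcher(face_template, entry.template)
        if score > best_score:
            best_score, best_entry = score, entry

    if best_entry is None or best_score < threshold:
        # No match: the biometric data is deleted immediately and automatically;
        # nothing about this person is retained.
        return None

    # A possible alert is only ever a prompt for a human decision.
    return {"possible_match": best_entry.record_id, "score": best_score}
```

After the operation ends, the watchlist itself is also destroyed, so neither matched nor unmatched biometric data persists beyond the deployment.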
As of November 2025, thirteen police forces have used or are using LFR: South Wales Police, Metropolitan Police, Essex Police, Leicestershire Police, Northamptonshire Police, North Wales Police, Hampshire and Isle of Wight Constabulary, Bedfordshire Constabulary, Suffolk Police, Greater Manchester Police, West Yorkshire Police, Surrey Police and Sussex Police.
When using LFR, police forces must follow the College of Policing Authorised Professional Practice guidance and comply with the Data Protection Act 2018, Human Rights Act 1998, Equality Act 2010, Police and Criminal Evidence Act 1984, Surveillance Camera Code of Practice and specific published policing policies.
Real life examples include:
- An LFR alert helped the Metropolitan Police locate a man wanted for two outstanding rapes and an offence of indecent assault committed in 2017.
- Deployments of LFR in London, from January 2024 to September 2025, led to over 1,300 arrests of individuals wanted for a variety of serious crimes, such as rape, domestic abuse, aggravated burglary, grievous bodily harm, robbery, drug supply, animal cruelty, aggravated harassment, cruelty to children and criminal damage.
- The deployments in London also led to other positive outcomes, such as allowing the police to ensure that registered sex offenders were complying with court-imposed conditions. This led to over 100 arrests during the same period. For example, David Cheneler, aged 73, was jailed for two years after he was spotted walking with a six-year-old girl. Police checks confirmed that he was in breach of his Sexual Offences Prevention Order, which prohibited him from being alone with a child under 14. He was also in possession of a knife.
- South Wales Police used LFR to help locate and subsequently arrest individuals wanted for a variety of serious crimes, such as grievous bodily harm with intent, robbery, intentional strangulation, actual bodily harm, breach of sex offender notification conditions, domestic violence-related malicious communications, breach of a court order relating to a dwelling burglary, vehicle interference and drug offences.
- A high-risk missing 14-year-old girl, with significant concerns relating to child sexual exploitation and criminal exploitation, was identified following South Wales Police's use of LFR. She was subsequently safeguarded.
- Suffolk Police's use of LFR led to four people being located and arrested for failing to appear at court, and a fifth person for theft.
Operator Initiated Facial Recognition (OIFR)
Operator Initiated Facial Recognition (OIFR) is a mobile app that allows officers, after engaging with a person of interest, to photograph them and check their identity where it is in doubt, without having to arrest them and take them into custody. South Wales Police and Gwent Police are using the technology.
Real life examples include:
- South Wales Police stopped a 15-year-old they believed had been reported missing. He refused to say who he was or to provide any identification. Using the OIFR app on their phone, officers were able to identify him and take him to safety.
- Using OIFR, the police identified a man wanted for burglary and theft after he stole items from staff areas of a store. He was arrested, charged with two commercial burglaries and shop theft offences, and received a 24-week suspended sentence. Since then, he has been arrested and charged with multiple further thefts.
- A man had attempted suicide by jumping from a bridge into the water. He was in significant pain and barely conscious when located and did not share his details with the police. OIFR was used to check his identity, which allowed officers to share this, alongside warning markers and medication details, with the ambulance service.
- Police officers responded to a report of an insecure property and found a man who failed to identify himself. He gave suspected false details, but a match was made using OIFR. The man was wanted under a European Arrest Warrant that had been outstanding for 13 years, in connection with drug offences and the assault of an officer with a weapon. He was arrested and brought before the court.
Frequently asked questions
What laws currently govern police use of facial recognition?
- The police have common law powers to prevent and detect crime, and must comply with data protection, human rights, equality and other relevant laws.
- This means that all deployments must be for a policing purpose and be necessary, proportionate, and fair.
- Police forces also need to comply with the Surveillance Camera Code of Practice, which is supplemented by published policing policies, and, in the case of LFR, follow the College of Policing's national guidance.
- Although there is a legal basis for police use of facial recognition, the current legal framework is complicated, inflexible and difficult to understand, which in turn limits the extent to which facial recognition and similar technologies can be confidently rolled out.
- That is why the government is going back to first principles and consulting on a new legal framework to create consistent, durable rules and appropriate safeguards for facial recognition and the similar technologies that are likely to follow it.
What is the government’s plan for facial recognition?
- The government is focused on enabling more police forces to safely expand their use of facial recognition.
- We have invested in LFR as well as funding improvements to national RFR capabilities. We have also funded national evaluation work to understand the impact of facial recognition on police and crime outcomes (currently underway) and its relationship to public trust and confidence (already published).
- The evaluation will help build an evidence base of where and how facial recognition impacts policing and crime outcomes, and how its use can be aligned with public expectations. The findings will inform the development of the future legal framework and support police forces in deploying the technology in ways that are both operationally effective and publicly accountable.
Didn’t the courts say that police use of live facial recognition was unlawful?
- The Court of Appeal in Bridges found that South Wales Police did not fully comply with privacy, data protection and equality laws during two of their LFR pilots.
- The Court helpfully set out what needed to be done to ensure compliance with the legal framework. Since then, the police have addressed the Court's findings:
  - The College of Policing has issued Authorised Professional Practice guidance on LFR, in particular setting out the circumstances in which the police can use it, and the categories of people they can look for.
  - The National Physical Laboratory (NPL) has independently tested the LFR algorithm South Wales Police and the Metropolitan Police use and found that there were no statistically significant differences in performance based on age, gender or ethnicity, at the settings they use. The 10 LFR vans rolled out in August 2025 are using the same algorithm that was tested by the NPL.
What safeguards are in place to make sure facial recognition is operated fairly and without bias?
- Police use of facial recognition is governed by data protection, equality, and human rights laws. It can only be used for a policing purpose, where necessary, proportionate, and fair.
- Following a possible LFR alert, it is always a police officer on the ground who will decide what action, if any, to take.
- For RFR, specially trained operators visually assess matches returned from the Police National Database. Once a match is confirmed by the operator, it is passed to an investigating officer who reviews it. The standard procedures for investigation, evidence collection, arrest, charge, and prosecution must be followed.
- Facial recognition technology is not automated decision making: police officers and trained operators will always make the decisions about whether and how to use any suggested matches.
- Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for bias. Independent testing is important because it helps determine the settings at which an algorithm can safely and fairly be used.
- Where potential bias is identified, the Home Office supports the police in ensuring their guidance and procedures minimise the risk of bias.
Is it true that live facial recognition is not as accurate with female or non-white faces?
- Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for bias. The police must ensure their guidance and procedures minimise the risk of any bias.
- The National Physical Laboratory (NPL) tested the algorithm South Wales Police and the Metropolitan Police Service have been using for LFR.
- At the settings police use, the NPL found that for LFR there were no statistically significant differences in performance based on age, gender or ethnicity.
- There was an 89% chance of identifying someone on the specific watchlist of people wanted by the police, and at worst a 1 in 6,000 chance of incorrectly identifying someone on a watchlist with 10,000 images (known as a false alert). In practice, the false alert rate has been far lower than this (see the illustrative calculation after this list).
- The 10 LFR vans rolled out in August 2025 are using the same algorithm that was tested by the NPL.
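To put the 1 in 6,000 figure in context, the short Python sketch below works through what that worst-case false alert rate would mean for a hypothetical deployment. Only the 1 in 6,000 rate and the 10,000-image watchlist come from the NPL testing described above; the footfall figure is invented purely for illustration.

```python
# Illustrative arithmetic only: the 1-in-6,000 worst-case false alert rate is taken
# from the NPL testing described above; the footfall figure is a hypothetical example.
worst_case_false_alert_rate = 1 / 6000   # per person passing the camera, 10,000-image watchlist
people_passing_camera = 50_000           # invented footfall for a single deployment

expected_false_alerts = people_passing_camera * worst_case_false_alert_rate
print(f"Expected false alerts (worst case): about {expected_false_alerts:.0f}")
# Expected false alerts (worst case): about 8
```

Each such alert would still be reviewed by a police officer on the ground before any action is taken, and in practice the observed false alert rate has been lower than this worst case.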
How do the police decide who to include on their watchlists?
- The College of Policing's national guidance sets out the categories of people that can be included on watchlists, which depend on the nature of the deployment.
- Watchlists may include wanted individuals, suspects, missing or vulnerable people, or those posing a risk to themselves or others. Watchlists must be tailored to a policing objective and reviewed before each deployment to ensure they meet the legal tests of necessity and proportionality.
Is there any oversight of police use of facial recognition?
- Oversight is fragmented and we want to improve it through the new legal framework. It is currently provided by a number of public bodies who are responsible for ensuring compliance with the law and safeguarding people's rights.
- The Information Commissioner's Office regulates all use of personal data (including police use of facial recognition) and has issued guidance on the use of video surveillance cameras.
- The Equality and Human Rights Commission is responsible for upholding equality and human rights laws. The courts system also plays a vital role in ensuring the law is upheld.
- His Majesty's Inspectorate of Constabulary and Fire & Rescue Services inspects, monitors and reports on the efficiency and effectiveness of police forces.
- The Independent Office for Police Conduct holds the police accountable for their actions in order to improve police practices.
- The Biometrics and Surveillance Camera Commissioner is responsible for promoting police compliance with the laws governing DNA and fingerprints and the Surveillance Camera Code of Practice.
Has the RFR search facility on the Police National Database been tested for bias?
- The algorithm used by police forces to conduct facial image searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL).
- The NPL testing found that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.
- At the settings used by the police, testing also found that if a correct match was on the database, the system found it in 99% of searches.
What steps have been taken to address the bias identified?
- The Home Office and police take the findings of the NPL report very seriously and have already taken steps to address them.
- A new algorithm has been procured by the Home Office and independently tested by the NPL; it can be used at settings with no statistically significant bias. It is due to be operationally tested in early 2026 and will be subject to evaluation.
- A retrospective facial recognition match is only ever one piece of intelligence, as part of a wider police investigation.
- Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the PND to be visually assessed by a trained user and investigating officer. These safeguards were in place before the NPL testing and report.
- Established training and guidance have been reissued and promoted to remind trained PND users and investigators of the long-standing manual safeguards, ensuring the algorithm alone does not drive investigations and operational decisions. These safeguards reduce the risk of images incorrectly returned by the PND informing operational decisions.
- The National Police Chiefs' Council (NPCC) has supported activity to ensure all forces continue to comply with the College of Policing training and monitor how PND facial search is used operationally.
- The NPCC has updated impact and equitability assessments in line with its Public Sector Equality Duty.
- While we want to remove bias in facial recognition systems entirely, the police are confident that established training, processes and operational practices protect the public from the identified bias.
- Given the importance of this issue, the Home Secretary has asked His Majesty's Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), with support from the Forensic Science Regulator, to conduct an inspection into police and relevant law enforcement agencies' use of retrospective facial recognition and assess the effectiveness of the mitigations, which the NPCC supports.
- The timing and publication of the inspection findings will be a matter for HMICFRS, but we anticipate that the inspection process will start before the end of March 2026.
Further information
Further information on police use of facial recognition, public attitudes towards its use, and what is in place to ensure the police use it fairly and responsibly can be found using the links below.