Summary: accuracy and equitability evaluation of Cognitec FaceVACS-DBScan ID v5.5
Published 4 December 2025
Independent Testing of Facial Recognition Technology: National Retrospective Facial Recognition Algorithm used to search the Police National Database
What is the Police National Database?
The Home Office, in collaboration with the National Police Chiefs’ Council (NPCC), provides the Police National Database (PND), a national intelligence system that enables the sharing of information across police forces and law enforcement agencies to safeguard vulnerable people, prevent and detect crime, and protect communities from harm. The PND is designated Critical National Infrastructure (CNI). All PND users receive training from the College of Policing (CoP).
The PND was created in response to the murder of two schoolgirls in Soham in 2002 and the subsequent Bichard Inquiry, which recommended a national intelligence system providing a national view of local force data.[footnote 1] The PND went live in 2011.
It brings together data from all 43 police forces in England and Wales, as well as Police Scotland, the Police Service of Northern Ireland, British Transport Police, the Ministry of Defence Police, the National Crime Agency, and others. Recently, new data sources have been added, including the National Firearms Licensing Management System and His Majesty’s Prison and Probation Service (HMPPS) Seized Media Database.
The PND provides a national source of information on major crime, Organised Crime Groups, County Lines, Modern Slavery and Human Trafficking. It presents a consolidated view to UK and European law enforcement agencies, enabled through the Prüm Treaty.
PND national search functions include ‘Person’, ‘Object’, ‘Location’ and ‘Event’ (including crime). Retrospective Facial Recognition (RFR) was added in 2013 and upgraded in 2021. Police use a Cognitec algorithm procured by the Home Office. Forces and law enforcement agencies conduct around 25,000 searches each month. PND users can search approximately 19 million still images of people taken when detained at a police station (‘custody images’) in order to identify unknown persons of interest.
As with all PND searches, RFR search results are not evidential and are treated as intelligence. PND users and investigators must complete visual reviews of facial search results and always seek additional corroborative lines of enquiry to validate any matches before taking operational or investigative decisions. No decisions are made by the algorithm or solely on the basis of the potential matches provided by the algorithm.
What is Retrospective Facial Recognition (RFR)?
RFR is used after an event or incident as part of a criminal investigation when the police need to identify someone. Images are typically supplied from CCTV, mobile phone footage, dashcam or doorbell footage or social media. These images are then compared against images of people taken on arrest to identify a suspect.
Specially trained operators visually assess the images returned from the PND. When an operator confirms a facial match, the image is passed to an investigating officer, who reviews it for accuracy and considers all the available evidence, as they would in any normal investigation.
This is a key tool for the police to identify suspects more quickly and accurately. It can also help identify missing or deceased people, where other methods would not be appropriate or effective.
Why test the system?
The Home Office, in collaboration with the Office of the Police Chief Scientific Adviser (OPCSA) and the NPCC, commissioned independent testing of Facial Recognition Technology which is currently used by specially trained operators in all police forces to search the PND.
Independent testing was commissioned under the previous government in Winter 2023 alongside work to significantly increase facial recognition (FR) searching. The aim was to help police better understand and mitigate bias and inform future investment decisions.
The testing was conducted by the National Physical Laboratory (NPL), a world-leading centre of excellence that provides cutting-edge measurement in science, engineering and technology. The aim of the testing was to develop an in-depth understanding of the equitability and performance of the algorithm when it was being used in operational environments for RFR.
The NPL testing utilised a similar, but not identical, methodology and dataset to testing conducted on the NEC NeoFace algorithm in 2023 for the Metropolitan Police Service and South Wales Police[footnote 2] and was specifically designed to help identify any impact this technology may have on any protected characteristics, in particular race, age and gender.
What do the test results tell us?
The NPL report gives an impartial, scientifically underpinned and evidence-based analysis of the performance of the FR algorithm currently used by PND.
We are now able to better understand the identification performance of the RFR algorithm. At the operational setting used by police, the testing identified that if the true match to the probe was in the gallery, then the algorithm would return a match in over 99% of searches.
We are now also able to better understand the demographic performance of the RFR algorithm in the PND. At the operational setting used by police, the testing identified that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.
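The two measures described above can be sketched in code. This is an illustrative example only, using hypothetical data, function names and outcomes chosen for clarity; it is not the NPL’s methodology and the numbers bear no relation to the published results.

```python
# Illustrative sketch (hypothetical data): how a true-positive identification
# rate and per-demographic-group false-positive rates might be computed in an
# evaluation of this kind. Not the NPL's actual methodology.

def true_positive_rate(search_results):
    """Fraction of searches where the true match was in the gallery
    and the algorithm returned it in the candidate list."""
    mated = [r for r in search_results if r["match_in_gallery"]]
    hits = [r for r in mated if r["true_match_returned"]]
    return len(hits) / len(mated)

def false_positive_rate_by_group(search_results):
    """For each demographic group, the fraction of searches with NO true
    match in the gallery that still returned at least one candidate
    above the operational threshold."""
    rates = {}
    for group in sorted({r["group"] for r in search_results}):
        non_mated = [r for r in search_results
                     if r["group"] == group and not r["match_in_gallery"]]
        false_hits = [r for r in non_mated if r["candidates_returned"] > 0]
        rates[group] = len(false_hits) / len(non_mated) if non_mated else 0.0
    return rates

# Hypothetical outcomes for four searches across two groups:
results = [
    {"group": "A", "match_in_gallery": True,  "true_match_returned": True,  "candidates_returned": 1},
    {"group": "A", "match_in_gallery": False, "true_match_returned": False, "candidates_returned": 0},
    {"group": "B", "match_in_gallery": True,  "true_match_returned": True,  "candidates_returned": 1},
    {"group": "B", "match_in_gallery": False, "true_match_returned": False, "candidates_returned": 1},
]

print(true_positive_rate(results))           # 1.0
print(false_positive_rate_by_group(results))  # {'A': 0.0, 'B': 1.0}
```

In this toy data the algorithm always finds the true match when it is present, but group B’s non-mated searches still return candidates while group A’s do not. A gap of that kind between groups is what the report means by some demographic groups being more likely to be incorrectly included in search results.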
How have Policing and the Home Office responded to the results?
Public authorities have a duty under the Public Sector Equality Duty to have due regard to certain equality considerations when exercising their functions.
The results from the testing assist policing with further understanding of how to use facial recognition technology fairly to prevent and detect crime, safeguard national security and keep people safe.
Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the PND to be visually assessed by a trained user and an investigating officer. These safeguards have always been in place, even before the NPL report, and reduce the risk that the wrong person is subject to investigation.
The Home Office has acted on the findings of the report. A new algorithm has been procured and independently tested, which can be used at settings with no significant demographic variation in performance. The new algorithm is due to be operationally tested early next year and will be subject to evaluation. The report has been published on GOV.UK: https://www.gov.uk/government/publications/facial-recognition-technology-tests-national-physical-laboratory
The police have also acted on the findings of the report. Established training and guidance have been reissued and promoted to remind trained users of the long-standing manual safeguards to ensure the bias in the algorithm does not influence investigations. The NPCC has updated impact and equitability assessments in line with their Public Sector Equality Duty.
We want to remove bias from facial recognition algorithms entirely, and policing is confident that established training, processes and operational practices protect the public from the identified bias. However, given the importance of this issue we have asked His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), with the support of the Forensic Science Regulator, to conduct an inspection, which the NPCC supports. The timing and publication of the inspection findings will be a matter for HMICFRS, but we anticipate that the inspection process will start before the end of March 2026.
The full results are presented in the NPL’s commissioned report: Facial Recognition Technology in Law Enforcement Equitability Study