Equalities impact assessment: consultation on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies
Published 4 December 2025
Policy proposal – summary
1. The proposal is to create a durable legal framework for law enforcement’s use of biometrics, facial recognition and similar technologies.
2. Law enforcement organisations can use facial recognition and similar technologies to assist with:
- Supporting the location and arrest of people wanted for criminal offences.
- Preventing people who may cause harm from entering an area (for example, fixated threat individuals, persons subject to football banning orders).
- Supporting the location of people about whom there is intelligence to suggest that they may pose a risk of harm to themselves or others (for example, stalkers, terrorists, missing persons deemed at increased risk).
- Retrospective (that is, after-the-event) searching of images of suspects (for example, captured on CCTV, doorbell cameras or phones) against a database of images, to identify them.
- Checking the identity of an individual by searching their image against a police database of images.
3. Whilst these use cases are important to support the Safer Streets Mission, it is also important to ensure that any use of these technologies proportionately balances the potential benefits to public safety against the level of interference in individual rights, such as the rights to privacy and freedom of peaceful assembly (Human Rights Act 1998), and that there is sufficient oversight and governance of their use.
4. This consultation asks about principles that could be applied to police use of facial recognition, as well as a wider range of technologies, which all have the potential to interfere with people’s privacy and other rights, such as freedom of expression and freedom of assembly and association. The Home Office will consider whether the new legal framework should extend to the use of other biometric and inferential technologies.
5. The legal framework should be easily understandable to the public and to the law enforcement bodies to which it applies.
6. The proposals are for a new legal framework that will:
- be accessible and transparent – ensuring the police and the public can easily understand it and foresee how data and the technology will be used.
- set clear limits on police authority and discretion – defining who makes decisions and guarding against misuse.
- clearly define who sets rules and checks compliance – consolidating and simplifying oversight and regulation.
- be adaptive, allowing new technologies to be adopted consistently – making the framework future-proof.
7. The Home Office is seeking consultation feedback on the following:
- The parameters of the framework, that is, the types of biometric and/or inferential technologies it should apply to.
- The types of law enforcement the framework should be applicable to.
- Assessment of interference in privacy and other rights caused by these technologies versus the seriousness of harm they seek to prevent.
- Necessary purposes for acquisition, retention, and use of biometrics and associated technologies.
- Safe, fair, consistent uses of these technologies, including prevention of bias and discrimination.
- Circumstances in which access to images on government databases might be justified.
- Powers and remit of oversight and regulatory bodies.
8. The evidence from this consultation will support creation of a legal framework governing use of biometrics, facial recognition and similar technologies for law enforcement. This will include clarifying oversight and regulation into a new body, which could also provide advice and guidance to law enforcement organisations.
Policy proposal – overall equalities considerations
9. The Court of Appeal in R (on the application of Bridges) v Chief Constable of South Wales Police (Information Commissioner and others intervening) (2020)[footnote 1] ruled that there is a legal basis for the police to use facial recognition technology; this relies on common law powers and a patchwork of legislation, codes of practice and published police guidance. However, it also highlighted areas where the police had not complied with the law in the case being examined and needed to make improvements, including in relation to their Public Sector Equality Duty (PSED).
10. Since the judgment, the police have acted to address the Court’s findings, including those on the impact of the technology on people with protected characteristics. In particular, the National Physical Laboratory (NPL) has since independently tested the live facial recognition algorithm that South Wales Police and the Met Police have been using[footnote 2] and found it to be highly accurate, with no statistically significant differences in performance in terms of gender, age or ethnicity at the settings they use.
11. Whilst the improvements made by policing addressed the Court’s findings and ensure that the police are operating within the law, the patchwork of legislation means the rules are complex and difficult to understand and may not give the police sufficient confidence to use the technology at greater scale. This complexity also limits public confidence that these technologies will be used safely or fairly. The consultation aims to address these issues, whilst keeping the PSED in mind.
12. The consultation asks questions about a new legal framework, the aim of which will be to ensure technologies like facial recognition are adopted and used by policing transparently, safely, fairly and consistently. One key objective of the new framework will be to ensure the use of biometrics, facial recognition and similar technologies does not result in direct or indirect discrimination against people with protected characteristics as usage of these technologies becomes more diverse and widespread.
13. Alongside the consultation, a report will be published on the independent testing of the algorithm currently used to carry out retrospective facial recognition searches on the Police National Database (PND), together with the actions that have been and will be taken to address the demographic bias identified by that testing. Whilst we are assured that the steps necessary to mitigate this bias are being taken, the outcome of this testing illustrates the need for consistency. We would expect the proposed new independent oversight body to, for example, set national quality standards for facial recognition and other biometric systems, ensuring the necessary testing is conducted before deployment to minimise bias and potential discrimination against people with protected characteristics in future.
14. The consultation is open to any member of the public, as well as stakeholders and interested groups. It is available on gov.uk and will be publicised, to ensure a wide response, including from people with protected characteristics. We also plan to undertake targeted engagement with under-represented groups (e.g. those who might not engage with a traditional public consultation), including those with protected characteristics, to ensure their views are heard and fully considered when developing the new legal framework.
Consideration of aim 1 of the duty: eliminate unlawful discrimination, harassment, victimisation, and any other conduct prohibited by the Equality Act 2010
15. The equality impact assessment below first considers the equalities implications of the current policy and then considers how potential discrimination could be mitigated by the proposals listed above, including the new framework likely to arise from the outcome of the public consultation. This document remains under constant review and will be amended as the policy and new framework develop.
16. The evidence used in considering the PSED has come from a variety of sources, including stakeholder engagement with law enforcement, civil society groups, regulators, operational partners, technology providers and more. Alongside this, we have considered algorithm testing reports by the National Physical Laboratory (NPL), statistics provided by police forces, and internal analysis conducted by the Home Office Analysis and Insight team. We have considered all the feedback received, particularly where this pertains to possible equality issues. Evidence from the public consultation will significantly contribute to the drafting of the new legal framework to ensure potential discrimination is actively considered and mitigated against.
17. It should be noted that there is limited data available linking specific arrests and criminal justice outcomes to facial recognition, although individual police forces, such as the Met, may record and/or publicly report this information.[footnote 3] The Home Office is undertaking formal evaluation work to better understand and evidence this. Further, it is envisaged that the proposed new legal framework would improve this situation – for example, the new independent oversight body could require the consistent recording and reporting of such data.
Direct discrimination within the current position
18. For the purposes of this duty, we have considered the following protected characteristics:
- Age
- Disability
- Gender Reassignment
- Marriage and Civil Partnership
- Pregnancy and Maternity
- Race
- Religion or Belief
- Sex
- Sexual Orientation.
19. We do not consider that any of the proposals currently subjects – or will subject – any person to less favourable treatment than any other person because of a protected characteristic, and therefore no direct discrimination arises in relation to the protected characteristics, aside from potentially age.
20. Although still subject to legal advice, it is likely that any measures proposed as a result of the consultation will not apply in full to children (this will depend on the detailed codes of practice proposed as a result of the consultation; for example, a child under 18 will not have their image taken in custody, but will still be subject to a small element of biometric processing as a result of an LFR van). The policy therefore subjects adults to less favourable treatment than children because of the protected characteristic of age. However, this is a proportionate means of achieving the legitimate aim of ensuring the protection and welfare of children.
Indirect discrimination within the current position
21. Independent testing has identified bias in the algorithm currently used for retrospective facial recognition on the PND. There is therefore potential for indirect discrimination on the basis of sex, race and age in the way the police currently use this particular facial recognition tool. However, the purpose of the proposals set out above, that is, the consultation on a new legal framework, is to mitigate and prevent such discrimination in future, as set out in paragraphs 11 to 12 above.
22. In the meantime, a comprehensive package of measures is in place to ensure any potential indirect discrimination is appropriately mitigated. These measures include the police updating their Equality Impact Assessment (EIA), Data Protection Impact Assessment (DPIA), processes for checking image matches and further training for PND users, as well as obtaining independent assurance on these mitigations from HMICFRS.
23. We have also identified that there could be indirect discrimination as a result of the use of facial recognition or similar technologies against people who cover their head or face for religious, health or other reasons, were they, for example, to be treated differently by police as a result of wearing such coverings. We assess that this could amount to indirect discrimination in relation to religion or age. We do not currently have evidence that this is taking place but will address this possibility through the consultation and the development of the new legal framework.
24. No potential indirect discrimination has been identified in relation to the protected characteristics of gender reassignment, disability, sexual orientation, marriage and civil partnership, pregnancy and maternity.
How we expect the potential indirect discrimination identified above to be minimised by the proposals
25. Notwithstanding the particular disadvantages identified above, it is considered that the current policies for law enforcement use of biometrics, facial recognition and similar technologies are justified and proportionate in pursuit of the department’s aim to equip law enforcement with the tools necessary to safeguard the public effectively. Although unfortunate, the Home Office considers that any marginal impact on the basis of sex, race, age or religion is justified by the overall benefits for policing and public safety. Nevertheless, action has been taken to minimise these impacts, as outlined above.
26. We also understand that there is room for potential indirect discrimination to be further minimised, which we will address through the consultation as outlined in paragraphs 21 to 24 above.
Consideration of aim 2 of the duty: Advancing equality of opportunity between people who share a protected characteristic and people who do not share it
27. The proposed policy changes resulting from the consultation are intended to deliver better outcomes for those with protected characteristics, by ensuring that wider and more diverse use of these technologies in future is done according to a bespoke legal framework, ensuring transparent, safe, fair and consistent adoption.
Consideration of aim 3 of the duty: Fostering good relations between people who share a protected characteristic and persons who do not share it
28. The proposed legislative changes (dependent on the outcomes of the consultation) aim to ensure consistency in approach to the use of these technologies by law enforcement and strengthen oversight. As such, the aim is to build public trust by highlighting the specific legal frameworks that will be put in place and the statutory bodies for oversight, which will apply to everyone in England and Wales, irrespective of any protected characteristic, whether shared or not.
Ongoing compliance with the Public Sector Equality Duty
29. The Home Office will keep the PSED under review throughout the consultation process, including during engagement with the consultation, responding to the consultation and developing the framework that results from it. Once the framework is created, responsibility for ensuring ongoing compliance with the PSED lies with the intended users of the technologies, such as law enforcement. However, we envisage that the changes proposed through this consultation will aid in ensuring ongoing compliance with the PSED.
Section 55 duty (for immigration, asylum and nationality considerations only)
30. It is possible that facial recognition could in the future be used for immigration, asylum and nationality functions, but the consultation focuses on use by law enforcement, noting that questions in the consultation about scope extend to other public bodies. However, children will not be processed without legal consent. (This is to be updated as policy develops).
Risks to vulnerable individuals and other groups
31. The proposed legislative changes are unlikely to create risks for vulnerable individuals or groups; rather, they aim to enhance the protections afforded to vulnerable individuals. Facial recognition and similar biometric technologies have already demonstrated considerable value in monitoring offenders in breach of conditions where vulnerable individuals might be at risk as a result of the breach. The increased use of these technologies as a result of the proposed legislative changes is likely to lead to greater monitoring of such offenders and thus enhance the protections afforded to vulnerable individuals.
Section 8: Declaration and sign off
32. I have read the available evidence, and I am satisfied that this demonstrates compliance, where relevant, with section 149 of the Equality Act 2010 and that due regard has been had to the need to eliminate unlawful discrimination, advance equality of opportunity and foster good relations.
This EIA will be reviewed as the policy is developed through 2026.
SCS Name & Title: Alex Macdonald
Directorate/Unit: Data and Identity, Identity Policy Unit, Science, Technology, Analysis and Research and Strategy (STARS) Group
Lead contact: [Name redacted]
Date: 24/11/2025
1. R (on the application of Bridges) v Chief Constable of South Wales Police (Information Commissioner and others intervening) case judgment: https://www.judiciary.uk/wp-content/uploads/2020/08/R-Bridges-v-CC-South-Wales-ors-Judgment.pdf ↩
2. The same algorithm is now used for the small national live facial recognition capability (10 vans) funded by the Home Office. ↩
3. Live Facial Recognition Annual Report September 2025: https://www.met.police.uk/SysSiteAssets/media/downloads/force-content/met/advice/lfr/other-lfr-documents/live-facial-recognition-annual-report-2025.pdf ↩