The Biometrics and Surveillance Camera Commissioner's response to the AI Growth Lab call for evidence (accessible)
Published 8 January 2026
Response from the Biometrics and Surveillance Camera Commissioner, Professor William Webster
About you
1. Are you responding on behalf of an organisation or in a personal capacity?
On behalf of an organisation.
2. Please select the group you most closely belong to:
A regulator or public sector organisation.
3. What is the size of your organisation (number of employees)?
1 to 9.
4. Which sector do you work in?
Public Administration and Defence – specifically law enforcement (public sector).
5. We may want to follow up with you - if you are happy to be contacted, please provide us with a contact name, organisation (if relevant), and email address
AI Growth Lab questions
6. The AI Growth Lab would offer a supervised and time-limited space to modify or disapply certain regulatory requirements. To what extent would an AI Growth Lab make it easier to develop or adopt AI?
It would make it (select one option):
Somewhat easier
7. What advantages do you see in establishing a cross-economy AI Growth Lab, particularly in comparison with single regulator sandboxes?
Open-ended, word limit: 300 words
The proposed AI Growth Lab would provide a recognised, established and consistent methodological approach to facilitate cross-sector working, enabling emerging technologies, such as Live Facial Recognition (LFR) and AI applications in policing, to develop in a safe, regulated environment. This would benefit a range of policing and law enforcement organisations, as well as the retail sector and local government. A cross-economy approach can also ensure that ethical and responsible practices are embedded in the process.
8. What disadvantages do you see in establishing a cross-economy AI Growth Lab, particularly in comparison with single regulator sandboxes?
Open-ended, word limit: 300 words
It would be important that the Lab is available across sectors for relevant cross-cutting AI applications and capabilities, and that there is a transparent mechanism for prioritising access to the Lab. The Lab should be available to local authorities, devolved governments, and policing and law enforcement organisations, and these organisations should be able to influence the Lab's priorities. If prioritisation means a project proposal is not taken forward, there should be a mechanism to deploy the sandbox application method locally or with local funding. Consideration also needs to be given to how the Lab interfaces with other existing labs and sandboxes to avoid duplication.
9. What, if any, specific regulatory barriers (particularly provisions of law) are there that should be addressed through the AI Growth Lab? If there are, why are these barriers to innovation? Please provide evidence where possible.
Open-ended, word limit: 300 words
The AI Growth Lab could be used to address the use of AI in policing and law enforcement, which is a unique public service context in that the use of such technologies can be seen as state or mass surveillance. Much of policing is governed by common law, and pilots and trials using emerging technologies may conflict directly with human rights legislation. The lack of a dedicated legal framework governing these technologies in policing leads to an uncertain environment and a lack of confidence in their deployment. The AI Growth Lab could provide a safe, regulated environment for the testing of such technologies. A good example of a barrier to innovation is the lack of established regulation for LFR, which is currently being consulted on separately by the Government in the law enforcement area (see Legal framework for using facial recognition in law enforcement). That consultation is a once-in-a-generation opportunity to address this gap in the law enforcement sector and will give law enforcement agencies more clarity on how fast-moving technologies such as LFR might be deployed in future.
10. Which sectors or AI applications should the AI Growth Lab prioritise?
Open-ended, word limit: 300 words
The reform of the regulatory landscape around AI in law enforcement presents a good opportunity for agencies such as the police to consider using a new Growth Lab as their use of AI technologies develops.
11. What could be potential impacts of participating in the AI Growth Lab on your company/organisation?
Other (please specify)
It presents opportunities for bodies such as the police to test new technologies to assist with law enforcement, crime reduction, and wider Government objectives to support victims in the criminal justice system and to tackle Violence Against Women and Girls (VAWG). This could allow for a ‘safe’ environment in which to test emerging technologies and to ensure that critical principles around ethics and human rights are embedded at the outset of technological development.
12. Several regulatory and advisory sandboxes have operated in the UK and around the world, for example, the FCA’s Innovate Sandbox, the Bank of England / FCA Digital Securities Sandbox, the MHRA’s AI Airlock, and the ICO’s Data Protection Sandbox. Have you participated in such an initiative?
No.
13. What lessons from past sandboxes should inform the design of the AI Growth Lab?
Open-ended, word limit: 300 words
Sandboxes have been used successfully internationally for a number of years in relation to digital products and services.
14. What types of regulation (particularly legislative provisions), if any, should be eligible for temporary modification or disapplication within the Lab? Could you give specific examples and why these should be eligible?
Open-ended, word limit: 300 words
The AI Growth Lab should provide a platform for experimentation in a safe regulatory environment. This experimentation with digital technologies should be accompanied by embedded ethical and responsible approaches to data use, especially in relation to human rights. Data protection is one area where flexibility could be applied. Law enforcement regulation might also be considered here, although, in the interests of transparency, public communication and consultation are key so that the public can be informed about which technologies and regulatory changes are under consideration. The Government might also wish to use the Growth Lab, as it develops during 2026, in parallel with its proposals for the regulatory framework for LFR and similar technologies in law enforcement and other sectors.
15. We propose that certain types of rules and obligations, such as those relating to human rights, consumer rights and redress mechanisms, and workers’ protection and intellectual property rights, could never be modified or dis-applied during a pilot. What types of regulation (particularly legislative provisions) should not be eligible for temporary modification or disapplication within the Lab (e.g. to maintain public trust)?
Open-ended, word limit: 300 words
Human rights are sacrosanct in relation to policing and law enforcement and are critical to maintaining public trust and confidence in policing.
16. What oversight do you think is needed for the Lab?
Other (please specify)
Oversight must be meaningful if it is to command public confidence. It should include a clear process for soliciting, evaluating, and managing proposals and pilots. Transparency is crucial, and there should be clear audit and reporting mechanisms. Oversight should be an enabler as well as a regulator, and the oversight mechanism should have credible independence and expertise. Parliamentary scrutiny may place a burden on Parliament and could be satisfied by periodic statutory reporting. An oversight committee with statutory powers and responsibilities may be an appropriate vehicle for oversight.
17. How would this oversight work most effectively?
Open-ended, word limit: 300 words
It needs to be transparent and proportionate to command the confidence of Parliament, devolved bodies, and the public. Decision-making and audit processes should be public and transparent, enshrined in legislation, and independent of Government.
18. What criteria should determine which organisations or projects are eligible to participate in the Lab?
You have an innovative product you want to bring to market.
Your innovation is intended for the UK market, or you are a UK based firm.
Your innovation is directly connected to AI.
There is a regulatory barrier (legislation) which the AI Growth Lab would help overcome.
Other (please specify):
New regulation is being considered by Parliament which might be informed by projects within the Lab.
19. Which institutional model for operating the Lab is preferable?
Other (please specify)
The AI Growth Lab should be independent of Government, with the support of sectoral regulators.
20. What is your reason for selecting this institutional model?
Open-ended, word limit: 300 words
Regulators need to have independence from Government if they are to be effective. Powers and remits can be derived from legislation. Such a cross-economy Lab should not be captured by individual regulatory bodies, although they should all input into the process.
21. What supervision, monitoring and controls should there be on companies taking part in the Lab?
Open-ended, word limit: 300 words
It is critical that relevant regulators are kept informed of developments that impact their regulatory responsibilities, so each regulator needs to have oversight of the companies participating in the Lab. Commercial companies participating in the Lab are part of the service ecosystem and can understand governmental needs better if fully integrated into the Lab.
22. Do you think a successful pilot in the AI Growth Lab would justify streamlined powers for making changes permanent, as opposed to following existing legislative processes which would take considerably longer?
Don’t know.
23. If you answered ‘yes’ or ‘maybe’ to question 22, what is the most effective way to achieve streamlined powers to make permanent legislative changes?
Not applicable.
24. Would there be value in extending the AI Growth Lab to other high-potential technologies?
Yes.
25. If you answered ‘yes’ or ‘maybe’ to question 24, which technologies would benefit the most?
Open-ended, word limit: 300 words
The AI Growth Lab needs to accommodate service areas where the regulatory framework is immature or in development, and which may emerge from its own ecosystem. For example, a particular challenge in the law enforcement space lies in those areas where the regulatory landscape has not matured enough for the deployment of innovative digital and biometric technologies, such as LFR.
26. Thank you for taking the time to complete the survey. We really appreciate your time. Is there any other feedback or evidence that you wish to share?
No.