Biometrics and Forensics Ethics Group: annual report 2022 to 2024 (accessible)
Published 5 September 2025
Chair’s introduction
It has been some time since the last Biometrics and Forensics Ethics Group (BFEG) Annual Report, whose reporting period ended in June 2022, and a lot has happened in the meantime. With the growing interest in the ethics around matters such as the use of artificial intelligence (AI) or the use of biometric technology for facial recognition, the importance of the independent advisory contribution that BFEG brings to the Home Office has never been more significant.
With a membership of nationally and internationally renowned experts, BFEG plays a key role in challenging and supporting the ethical underpinning of Home Office priorities (specifically with regard to biometrics, forensics, and large and complex datasets). BFEG assists the Home Office in developing ethically considered policy and programmes of work.
I have identified some highlights of BFEG’s achievements since our last report in my introduction.
In March 2023, BFEG held a launch event for the updated BFEG ethical principles. This event was attended by Lord Sharpe, our then Ministerial Sponsor, and Professor Jennifer Rubin, the Chief Scientific Adviser for the Home Office. This provided us with an opportunity to share our work and the resource we offer not only internally to the Home Office but also externally to our network of stakeholders. I am pleased to note, for example, that the College of Policing has already engaged with the BFEG ethical principles and our work on AI in the design of their own data ethics principles.
Exciting areas of work for us have included the development of ethics for data use in policing and providing ethical advice to the Accelerated Capability Environment (ACE). These areas of work, including BFEG’s work on the requirements for a national policing data ethics committee and BFEG’s ethical advice to the ACE Deepfake Detection Challenge, are discussed later in the report.
With many members reaching the end of their terms this year, BFEG’s membership has many new faces. I was very sad to lose the expertise of those members whose final terms expired over the past year and enormously grateful for their considerable contributions and for volunteering their time. The new members bring with them a wealth of knowledge, experience and expertise in AI, data management, law, and awareness of advanced technologies and their applications. It is pleasing to see we continue to attract some of the leading experts in their fields, particularly since recruitment is by competitive public appointment and members are not remunerated for their service.
Finally, I was pleased to learn, in August 2023, that my appointment as Chair of BFEG had been extended for a further two-year term. It has been an incredible privilege and honour for me to steer this independent non-departmental public body through a period of significant change. Given that BFEG’s work has always tracked scientific and technological advances, our group’s ever-varying work sees some of the most novel and challenging ethical issues today. I look forward to my next year with much excitement.
Professor Mark Watson-Gandy OBE
Chair, BFEG
What we do
The Biometrics and Forensics Ethics Group (BFEG) is an advisory non-departmental public body sponsored by the Home Office. It provides independent ethical advice on issues related to the collection, use and retention of biometric and forensic material, and on the use of large and complex datasets and projects using artificial intelligence (AI).
The Home Office has commissioned BFEG to consider the ethical impact on society, groups, and individuals from:
- the use of large datasets within the Home Office, including the implementation of systems using machine learning and AI;
- the collection, retention and use of human biometric identifiers, such as DNA, fingerprints and facial recognition;
- the retention and use of forensic data such as extracted digital forensic material;
- policy and projects from the Forensic Information Databases Strategy Board (FIND SB); and
- relevant projects from the Home Office Biometrics (HOB) programme, including advice on Data Protection Impact Assessments (DPIAs).
The BFEG also considers:
- issues raised by key stakeholders such as FIND SB, the Biometrics and Surveillance Camera Commissioner, and the Forensic Science Regulator (FSR); and
- issues raised by members of the BFEG as part of its self-commissioned work (roughly 30%).
2022/2023 Commission
For the 2022/23 commissioning year, the Home Office requested BFEG continue the important work of its existing working groups covering the full advisory remit of the group. This included prioritising the provision of advice on biometrics policy and projects as well as projects using advanced data processing techniques and/or large and complex datasets. In addition to continuing the work of previous years, BFEG was asked to:
- support policy development regarding the governance of powerful data-driven technologies in the Home Office; and
- provide advice on the ethical use of novel biometric technologies, including gait and voice recognition.
The full 2022/23 commissioning letter can be read on the BFEG GOV.UK website. In response to the commission, BFEG continued the work of its existing working groups and developed two new working groups, to address the increasing prevalence of, and interest in, artificial intelligence (AI) and to support the progression of new workstreams. The summaries of working group activity below outline what BFEG has done to address requests from the Home Office.
It is important to note that, due to the delayed receipt of the formal BFEG commissioning letter in November 2022 and to align the annual report with the financial reporting year, it was agreed that this commission cycle would run from the end of the last annual report to the end of the 2023 to 2024 financial year. As such, this annual report covers a period of 21 months, from July 2022 to March 2024.
As part of our continued commitment to innovation, BFEG will review the current working groups during the next commission cycle, to ensure they reflect and are aligned to the Home Office priorities and our other work commitments. We will publish the revised working groups and their terms of reference on our website in due course. This will give BFEG the opportunity to utilise the extensive expertise and experience of our current and new membership in a focused and dedicated manner. It will also reflect our commitment to greater transparency and engagement.
Summary of BFEG working group activity
Home Office Biometrics Ethics Working Group (HOB E WG)
The HOB E WG advises the Home Office Biometrics (HOB) programme and related Home Office initiatives on projects at an early stage of their development, as well as providing advice on HOB Data Protection Impact Assessments (DPIAs).
BFEG made its availability and capacity known to support and advise on any requests from the HOB programme team across the 21 months of this report. While engagement was reduced during the period due to the working priorities of the HOB programme, BFEG has supported and delivered against all requests within this period and has provided guidance and ethical oversight in the progression of the HOB programme. BFEG continues to work with Home Office colleagues to develop a more structured and specific delivery programme for the coming year.
Complex Datasets Working Group (CDWG)
The CDWG advises on projects considering the adoption and/or use of explainable data-driven technology and may be asked to assist with development of a discipline-specific ethics framework for use by those commissioning, designing, and using machine learning applications.
This working group has not had an allocation of work, with much of the remit going to the newly formed AI working group. This highlights the need for BFEG to review the current working group structures, to ensure they are fit for purpose. Before undertaking this, BFEG has been keen to ensure all outstanding work from previous years has been addressed and resolved. This includes addressing any outstanding actions from BFEG quarterly meetings and an outstanding commitment to produce a publication in response to a freedom of information (FOI) request.
As part of their workplan, the CDWG intended to produce a public report which provided guidance on the ethical issues in automated classification systems. This had been recorded as an action within the minutes of BFEG quarterly meetings since March 2020. However, at the time, BFEG deemed this report to be too high level and generic to be a valuable document for publication. Since then, BFEG has been involved in review of further data science applications and has been able to collate a wider series of recommendations, which could be broadly applicable to projects involving machine learning systems. To address the outstanding action, BFEG has included these recommendations within this annual report as an appendix (please see appendix 1). As the recommendations were limited, BFEG felt it would be an inefficient use of resources to undertake a separate publication. This addresses another priority for BFEG - to ensure outputs and commitments are not duplicated or unnecessarily complicated.
Data Ethics Advisory Group (DEAG)
The DEAG considers ethical issues in Home Office data projects, especially security or policing projects, and provides a source of advice and guidance for project teams. Specifically, the Group advises on projects that use advanced data processing techniques and/or large and complex datasets.
DEAG has continued to receive submissions and successfully deliver to agreed timescales, continuing to develop longstanding stakeholder relationships and support the Home Office in considering ethical dimensions of various projects.
In December 2023, a submission was made to DEAG, via the College of Policing, to consider a possible use case for an AI application for use by a local police force. The review and consideration by DEAG (as a working group of BFEG) was used as an initial test case and as an evaluation tool to trial the capacity of BFEG to act as a possible National Policing Ethics Body (PEB), in advance of conducting a pilot. Further detail about BFEG’s work on the PEB is provided later in this report.
Biometrics Recognition Technology Working Group (BRT WG)
The BRT WG advises on the ethical implementation and use of new and emerging biometric recognition technology, for example for use by policing or at the border. The Group also advises on ethical considerations related to the retention of biometrics and forensic material.
As part of the commissioning brief that BFEG received from our Home Office Policy Sponsor, BFEG were asked to consider novel biometrics, such as voice and gait technologies, and their application. In response to this commission the BRT working group and wider BFEG, led by Dr Peter Waggett and Dr Nóra Ní Loideáin, agreed to focus on delivering a report on the use and application of voice technologies for recognition, including recommendations for the adoption of this technology.
The BRT working group hosted two in-person evidence gathering sessions to engage with a wide network of stakeholders, which included civil liberties groups, industry and other government departments. A range of questions designed to support and inform the recommendations of the final report were considered at these sessions. Written evidence was also accepted.
BFEG is on target to publish this report in early 2025. BFEG has separately considered the use of gait technology and is due to submit its findings to the Home Office and ministers alongside the voice report.
Working with the Home Office Forensic Information Database Service
The Home Office Forensic Information Database Service (FINDS) is a key stakeholder of BFEG. While we do not have a dedicated working group for this area of work, BFEG has continued to work closely with officials within FINDS by providing advice as requested and through membership of the FINDS Strategy Board. Having a member of BFEG sit on the FINDS Strategy Board ensures there is synergy and consistency in the delivery of advice and that ethics is always on the agenda.
As part of our review of working groups, we will look at ways of ensuring and enhancing engagement and delivery with FINDS.
New BFEG working groups and programmes of activity
AI Working Group
In response to the request to consider powerful data-driven technologies and given the nature and significance of AI, BFEG established a working group to focus on this development.
This working group developed an AI Charter which was submitted to the department in July 2023 alongside recommendations for the development of an AI risk triage tool.
At the request of our policy sponsor, BFEG has refrained from publishing the AI Charter, to support the work of our policy sponsor and align with incoming Home Office policy around the responsible and ethical adoption of AI. In the coming year we will support both the joint publication of the AI Charter and work to ensure that the Home Office’s policy on the adoption of AI products is ethically sound. BFEG’s ambition is to use the AI Charter to hold the Home Office accountable in the ethical adoption of these technologies and to ensure that the many streams of work associated with the Home Office’s vision for AI are aligned to the BFEG AI Charter.
The wealth of expertise on the BFEG panel continues to remain available to provide ethical advice to the Home Office on procurement, adoption, use and evaluation of AI technologies on a project-specific basis.
Policing Ethics Body (PEB)
As a new stream of work, BFEG has been exploring the possibility of becoming a national body for the provision of ethical advice to policing in digital and data matters as they relate to use by law enforcement. BFEG has proposed that it has the availability and capacity to run a pilot scheme, which would run for one year and be evaluated at the end. Both the design and the evaluation would be co-designed with policing representation.
BFEG already advises the Home Office on national level policing projects, such as the roll out of facial recognition or the integration of datasets into the national criminal justice laboratory. The role of BFEG as a source of ethical advice to policing would seek to formalise and validate the calibre of BFEG’s ethical advice already provided to Policing via the Home Office.
The National Police Chiefs’ Council (NPCC) has taken responsibility for governance and oversight of this process, thus enhancing the independent element of BFEG’s role in this process.
Prior to the governance being taken over by the NPCC, BFEG initially engaged with the College of Policing on this work. At the point of engagement between BFEG and the NPCC, BFEG was exploring with the College of Policing a test case, which would act as a case study to understand BFEG’s processes and its possible suitability as a national body for the provision of ethical advice to policing in digital and data matters as they relate to use by law enforcement. Given that this piece of work was in progress, the NPCC and the Association of Police and Crime Commissioners (APCC) awaited the report from the College of Policing regarding the involvement of BFEG as a starting point. BFEG provided its services for the test case within the reporting period, giving ethical insights on a specific test case and practical insights, at an organisational level, into BFEG’s working practice (report provided May 2024).
The concept of BFEG being utilised for this purpose was first considered and explored at the December 2022 quarterly meeting of BFEG where the Justice and Home Affairs Committee report “Technology Rules? The advent of new technologies in the justice system” was discussed. It was acknowledged that BFEG could provide the ethical governance role recommended within the report (recommendation 23). BFEG’s Policy sponsor recommended BFEG conduct outreach to explain its function and independence to police commissioners.
BFEG conducted various outreach activities, such as presenting on their ethical principles to the College of Policing and inviting representatives of the APCC and the NPCC to present at the September 2023 quarterly meeting. Notably, BFEG held a forum session with a variety of stakeholders in September 2023 to discuss the need for a national ethical body and the requirements for such a body. A report on this forum, ‘A national advisory body for ethics in policing’, was published following the event.
After the forum meeting, BFEG set up a Working Group to progress this workstream in association with our stakeholders. Internally to BFEG this Working Group has been temporarily termed the ‘Police Ethics’ Working Group. BFEG continues to work with representatives of the NPCC and APCC to develop and implement the PEB. We hope to be able to provide more detail over the coming months and in our next annual report. This is a unique and innovative approach to tackling strategic ethical issues within policing; utilising the independent expert advice of BFEG in this process can only enhance our commitment to public safety and public confidence.
Engagement with the Accelerated Capability Environment (ACE)
This commissioning period has also seen BFEG engaging with the Accelerated Capability Environment (ACE), to better ensure consistency of ethical consideration in projects commissioned by the Home Office, and to enable ethical advice to be incorporated at an early stage.
ACE was established in 2017 as an arm’s length organisation to act as a conduit, supporting commissioning work both in the Home Office and across the wider public sector.
This engagement has already illustrated the significant impact that independent ethical and technical advice can have on generating efficiencies and assisting delivery that is fit for purpose. The feedback from submissions has been very positive so far. To ensure value is added to this process, BFEG has required all submissions to provide an evaluation of progress within a year of the original submission. As an independent ethics advisory body, it is important to know the advice BFEG provides is utilised accordingly and the impact is measured where possible. We will provide more information regarding our commitment to measuring impact over the coming months.
As mentioned in the introduction, BFEG has been invited by ACE onto the Deepfake Detection Challenge project they are conducting, and we will continue to provide updates on our webpage and in our next annual report. This growing area of work demonstrates BFEG’s capacity to address new and emerging technology in a robust and effective manner.
Self-commissioned work
In March 2023 BFEG published and disseminated an updated version of the BFEG Ethical Principles. The principles, which were first developed in 2018, are intended to support officials and practitioners in considering the ethics of a project or policy. There are eight high-level principles, which have been written with supporting annexes describing how they could be implemented.
Changes made in the refreshed principles reflect the need to consider ethical procurement and two new principles were added.
Overview of key activities
Meetings
The BFEG held six full committee meetings in the period covered by this annual report (July 2022 to March 2024), two of which took place in person. The minutes of these meetings are publicly available on the BFEG’s GOV.UK website.
While we continue to hold the majority of our meetings online to save cost and time and to reduce our carbon footprint, we look forward to an in-person meeting each year. The move by BFEG to give these in-person meetings a themed perspective – such as the forum on ethics in policing hosted in September 2023 – has encouraged engagement and discussion on relevant issues with a wide stakeholder catchment.
In addition to quarterly meetings, BFEG working groups have had several meetings relating to delivery of priorities. BFEG continues to meet on an ad hoc basis to address submissions that are of relevance to the whole panel due to their strategic application.
Across this reporting period, BFEG has also held three larger events to facilitate the delivery of its workstreams in a robust manner and to publicise its recommendations internally and externally. These included a report launch event, a large forum bringing together stakeholders from across the network, and two half-day evidence gathering sessions.
Timeline of activity
Key meetings and milestones for the reporting period of July 2022 to March 2024 are listed below:
September 2022
Quarterly meeting of BFEG - Minutes
- BFEG provided FINDS recommendations on the development of a research donor consent form for the development of a Y-STR reference dataset.
- BFEG expressed concerns and outlined ethical considerations for the development of the Biometric Self-Enrolment programme and second phase testing.
- Professor Mark Nixon attended and presented evidence to BFEG on the state of gait technology, and the group discussed possible ethical challenges with Professor Nixon.
December 2022
Quarterly meeting of BFEG - Minutes
- BFEG discussed with FINDS reporting and reduction of errors and the potential impacts from an ethical perspective.
- BFEG discussed the Justice and Home Affairs Committee report “Technology rules? The advent of new technologies in the justice system”, asking accountability questions of the BFEG Policy Sponsor. The possibility for BFEG to perform the activity of an independent national ethics body was discussed.
- BFEG reviewed and agreed the necessary additions to the BFEG Ethical Principles.
February 2023
BFEG presents on the Ethical Principles to the College of Policing.
March 2023
Quarterly meeting of BFEG - Minutes
- It was recommended by BFEG, in discussion with the Biometrics and Surveillance Camera Commissioner, that advice on how to effectively delete data and conduct a sufficient risk assessment was needed.
- BFEG raised concerns regarding the implementation of the Forensic Science Code of Practice and the impact on the digital forensics community. Policy took this forward as an action.
- BFEG provided recommendations to FINDS on the development of a consent form for the Vulnerable Persons’ DNA database.
- BFEG reviewed and agreed the necessary additions to the draft AI Ethics Charter.
Publication of the revised BFEG Ethical Principles accompanied by launch event
Refresh of the BFEG website including publication of an information sheet.
May 2023
BFEG Chair attends Home Office Science Advisory Committee (HOSAC) meeting
June 2023
Quarterly meeting of BFEG – Minutes
- BFEG raised concerns to Policy officials regarding the ethical implications of facial recognition technology and requested engagement from Policy officials regarding the progress, offering continued support by providing ethical advice and challenge in the development and implementation of the programme.
- BFEG discussed with Policy officials the impact and potential implications of the proposed reforms in the Data Protection and Digital Information Bill.
- BFEG provided guidance and challenge to the Home Office on the design of a new third-party material consent form.
July 2023
Submission of the BFEG AI Charter and AI triage recommendations to the policy sponsor team by the deadline:
- The Charter provided Home Office officials with a recommended high-level departmental framework to ethically and accountably assess the risks linked to the use of AI tools/systems.
- The Charter was designed to be immediately implementable by the department.
- The AI triage recommendations were a series of eleven recommendations for the design of a risk assessment process for the triage of AI tools/systems and eight recommendations to support the usability of a risk assessment triaging tool by officials.
August 2023
Announcement of re-appointment of Professor Watson-Gandy as BFEG chair.
September 2023
Quarterly meeting of BFEG - Minutes:
- BFEG provided guidance to FINDS on the collaboration with the International Commission on Missing Persons and the United Kingdom Missing Persons Unit.
- BFEG shared considerations and advice on the ethical design of a new automation project within the Disclosure and Barring Service.
- Presenters from ACE and the NPCC presented to BFEG on ways of maximising engagement.
Forum on a national advisory body for ethics in Policing:
- BFEG invited stakeholders from the broad policing network to discuss the requirements for the design of a national ethics body for data and technical matters within policing.
Announcement of the reappointment of 3 members of BFEG for a third term
November 2023
BFEG attended the NPCC’s Police National Ethics Committee to seek feedback and update on progress regarding the role of BFEG as a National Policing Ethics Body
December 2023
Quarterly meeting of BFEG - Minutes:
- An introduction took place between BFEG and the new Biometrics and Surveillance Camera Commissioner.
- BFEG offered their advisory services to Home Office in their work on AI safety and the proposed AI strategy.
- Home Office officials presented to BFEG the draft Science and Technology Strategy. BFEG provided guidance on engagement with academia and industry in the design of the strategy.
- Presenters from ACE, the NPCC and FINDS attended the meeting and provided updates.
January 2024
BFEG Chair attends HOSAC meeting.
Biometric recognition technology working group hold evidence gathering sessions on the current state of and ethical considerations for voice and gait technologies.
February 2024
BFEG and Policy sponsor jointly host a roundtable event to review Home Office priorities and consider the 2024/25 commissioning brief.
March 2024
BFEG Chair attends HOSAC meeting.
BFEG attended a workshop held by ACE to feed ethical considerations into the development of the Deepfake challenge.
The BFEG Chair presented on ‘Ethics at the Home Office’ at the Security and Policing conference.
Additional stakeholder engagements
In addition to the key milestones and engagements, BFEG members also attended external events as BFEG representatives.
Events attended from July 2022 to March 2024 are as follows:
- BFEG attended and held membership of the Advisory Board to the National Security Innovation and Technology Exchange (until 2024)
- BFEG attended and continues to hold membership of the Forensic Information Databases Service (FINDS) Strategy Board.
- BFEG presented at the Second International Conference on Forensic Justice 2023, held in India
- BFEG presented at the Envoy Immigration Day 2024
- BFEG presented at the Security & Policing Exhibition 2024
- BFEG attended the:
- Equalities and Human Rights Commission (EHRC) expert workshop on police adoption of new technologies
- Alan Turing Institute Data Ethics Committees in Policing Seminar
- All 3 days of the Security & Policing Exhibition
- Big Brother Watch Report Launch regarding Facial Recognition, held at the Palace of Westminster
- Justice Roundtable for the Independent Review of Disclosure and Fraud
As part of our commitment to independent ethical advice, BFEG has engaged more broadly with other cross-government committees where there may be considerable interest or overlap in work programmes. This includes initial contact with the Investigatory Powers Commissioner’s Office (IPCO) including their Technological Advisory Panel (TAP), the Office of the UK Statistics Authority and their Centre of Applied Data Ethics Committee, and the Police Chief Scientific Advisor and their Police Scientific Advisory Committee. Through this work, BFEG is working towards developing greater cross-governmental awareness of BFEG’s approach to ethics and increased consistency in the application of ethics across government.
Speakers
We have spent considerable time over the period of this commissioning brief developing wider stakeholder engagement, both internally and externally.
BFEG has welcomed speakers to quarterly meetings and working group sessions, as well as ad hoc events. This programme of engagement and inclusion will continue to develop and expand over the next few years.
Membership
Demissions
The following long-standing BFEG members concluded their final terms during this reporting period. BFEG is grateful for the dedication and longstanding commitment of these members in volunteering both their time and years of experience and expertise to the Home Office, holding the Home Office accountable and ensuring the delivery of public programmes is ethically considered.
- Dr Julian Huppert – March 2023 (co-opted to April 2024)
- Professor Nina Hallowell – May 2023
- Professor Louise Amoore – May 2023
- Professor Simon Caney – September 2023
- Professor Liz Campbell – September 2023
- Professor Mark Jobling – March 2024
Appointments
Due to the large volume of demissions anticipated during, and in the lead-up to, March 2024, BFEG launched a recruitment campaign for six new members. The campaign was launched and managed by the public appointments team, seeking individuals with specialist knowledge of law, social science, data ethics, artificial intelligence, biometrics and medical ethics (specifically the review or evaluation of ethics in regard to medical decision making). Interviews took place in late 2023. BFEG awaits confirmation of start dates for the new members in the summer of 2024. We will confirm details on our webpage as soon as possible.
In addition to launching a recruitment campaign, BFEG was keen to ensure both consistency and contingency in the delivery of its work programme. Following successful appraisals and Ministerial approval, BFEG was delighted with the announcement that Dr Peter Waggett, Professor Denise Syndercombe Court and Professor Thomas Sorell would be continuing their tenure on BFEG for an additional 3 years, commencing March 2024.
Finally, it was announced that BFEG would continue to be led by the Chairmanship of Professor Mark Watson-Gandy, who was re-appointed for a further 2 years.
The profiles of BFEG members are continually updated on our website. As part of our drive to improve accessibility and transparency, BFEG will continue to review the content and update as needed.
Full biographies for the individuals who held membership during the period of this report, July 2022 to March 2024, are provided at appendix 2.
Communications
Freedom of Information Act (FoI) requests
We make every effort to respond to any FoI request. However, we must point out that given we are an independent advisory body made up of public appointees, many of the issues directed to us fall within the remit of the Home Office to respond to and not BFEG directly. The importance of transparency is reflected in our commitment to be open and provide relevant information in relation to the volume of FoI requests we have received.
The BFEG received 32 requests for information under the Freedom of Information Act from July 2022 to March 2024.
Website activity
Details of BFEG’s work are available on its GOV.UK website. The website was viewed 2,035 times in the year between March 2023 and March 2024.
Following the launch event for the updated BFEG principles in March 2023, the principles were viewed 198 times between March 2023 and March 2024, and the BFEG website was viewed 274 times within that month, accounting for around 13% of the webpage’s viewership across the year.
Budget and expenditure
The BFEG’s members are not paid for their work but receive reasonable travel and overnight expenses.
During the extended 21-month reporting period of July 2022 to March 2024, BFEG held two in-person quarterly meetings, a report launch, two in-person evidence gathering meetings and a forum meeting. In addition, members attended various in-person events and board meetings as representatives of BFEG, expenses for which were covered by the Home Office.
Table 1 shows BFEG expenditure from July 2022 to March 2024. Values are inclusive of value added tax (VAT) where applicable.
The figures are indicative of expenses for up to 17 individuals (11 substantive members and 6 who demitted part way through the reporting period, at varying intervals).
Table 1: BFEG expenditure July 2022 to March 2024 (21-months)
Expense | Cost |
---|---|
Members’ expenses | £3,861.99 |
External venue hire | £4,729.14 |
Secretariat expenses (e.g. report printing) | £29.05 |
Hospitality (e.g. catering for full day, in person meetings/events) | £766.75 |
Total | £9,386.93 |
Future work
BFEG continues to develop and improve internal processes and structures alongside wider sustained engagement. The new commissioning brief received from our policy sponsor team has already generated a great deal of excitement and interest.
The BFEG has already identified the following priorities for the coming year:
- developing a work programme to reflect the Home Office priorities, either through our working groups or as ad hoc meetings;
- progressing the work on supporting the development of a national Police Ethics Body for technical ethics advice; and
- acting as an ethics advisory body for commissioning briefs submitted to ACE.
We intend to add to this through further internal and external engagement. BFEG plans to host internal engagements such as Lunch and Learn sessions for officials within the Home Office, to support and broaden familiarity across the department on what BFEG does and can offer. We will continue external engagements with other relevant ethics bodies within the policing and wider ethics community.
BFEG is grateful and excited to be welcoming six new members into its fold and taking the opportunity to review the current working groups, to ensure they are fit for purpose and reflective of our resources and expertise. BFEG is developing plans to commence the year by hosting an awayday to revitalise its working structures and practice. We intend to build on the achievements to date as well as expanding our engagement for the future.
The project and programme management approach to delivery that will underpin BFEG’s work will also support BFEG in providing a tangible, transparent and detailed review of our delivery in the next annual report and in the future. BFEG members are proud to undertake the demanding and significant role of providing independent ethical advice in a continually changing and complex environment. We look forward to providing as much detail as possible on our progress and achievements as we continue.
Appendix 1: Recommendations on the ethical design of projects involving novel data science applications
The remit of BFEG, and subsequently the breadth of the yearly commission, expanded in 2019 to include ethical considerations of projects involving large and/or complex datasets. BFEG created two working groups to carry out this work and provide the advice requested by the commission. These groups were the complex datasets working group (CDWG) and the data ethics advisory group (DEAG).
In the last four years these groups have provided a wealth of advice to the Home Office on how to ensure that projects involving novel data science applications are ethically conducted. BFEG is clear that ethical advice is best utilised when tailored and specific to a given project. BFEG wishes to use this annual report to summarise some recurring and general recommendations that can be applied to a data science project. In addition to considering the below recommendations, BFEG also recommends that officials within the Home Office review and consider the BFEG ethical principles during the design, implementation, and review of a project.
When considering the design of a system BFEG recommends:
- The use of free text data should be avoided and, where possible, replaced with fixed data inputs to increase objectivity. Where this is not possible, there should be indicators of the performance of the text analysis system.
- Only data which is computationally useful must be captured and a full assessment must be made of the datasets being used to ensure that only those datasets that are necessary and proportionate for the purpose outlined in the model requirements are included. In the process of ensuring the data used is proportionate and useful, care should be taken to consider how data used in the models may discriminate against certain groups. The data should be anonymised and the anonymisation processes documented. Where possible, documentation should be publicly inspectable. Legacy data should not be captured unless necessary for the function of the system (e.g., nationality) and, where data is being repurposed from another project, there should be a legal basis for doing so.
- An assessment must be made of the individual datasets to ensure that they are free from bias and are fair. The potential for a disproportionate level of investigation into certain groups should be considered (e.g., through the location of a project or project objectives), as should the inherent bias resulting from the method(s) of data collection. The highest available quality data should be used. Data should be recorded with qualifying data (e.g., addresses should be recorded with a date as people are likely to move) and working in silos avoided. Additionally, the accuracy of external sources of data feeding the model should be considered, especially where this cannot be completely assured.
- Justifications must be made for the selection of the appropriate AI or machine learning algorithms, and there should be assurance that there is adequate data for training and monitoring the results from the algorithm. It should be evidenced how algorithmic bias will be addressed, and the independent and dependent variables the algorithm(s) could be trained on should be identified at the outset. Automated decisions based on data analysis are to be avoided and data should not be used for predictions further to this. Human behaviour can influence the interpretation of results, so results which feed into a human decision-making model should be provided in a high-level manner (e.g., a “red, amber, green” (RAG) rating rather than precise numerical values); an illustrative sketch of such a banding follows this list.
- Successful outcomes and potential ‘negative’ outcomes (such as showing a proposed solution is not useful) must be identified and documented at the outset. These should be identified even in instances where the perceived ‘perfect result’ may not be possible.
- All decisions of the system must be explainable. The model must be transparent and explainable outside of the development team and the Home Office.
- Care should be given to the use of language and the way it may be interpreted by different individuals.
- Transparency and trust-building exercises should be built into the project, even if the information which is publicly shared is limited. In addition, various sources of public mistrust should be considered when designing the project, e.g.
  - volume of data
  - who has access to data
  - collection of trivial data
  - who is being targeted by data collection
  - exploitation of data for personal gain
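A minimal illustrative sketch of the RAG banding mentioned above is given below (Python). The score scale, thresholds and function name are hypothetical examples chosen for illustration only; they are not values or tooling used or recommended by BFEG or the Home Office.

```python
# Illustrative only: the score range and thresholds are hypothetical,
# not values used or recommended by BFEG or the Home Office.

def rag_rating(score: float, amber_threshold: float = 0.4, red_threshold: float = 0.7) -> str:
    """Map a precise model score (assumed to lie in [0, 1]) to a coarse
    red/amber/green band, so a human reviewer sees an indicative level of
    concern rather than a spuriously precise number."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score expected in the range [0, 1]")
    if score >= red_threshold:
        return "RED"
    if score >= amber_threshold:
        return "AMBER"
    return "GREEN"

if __name__ == "__main__":
    for s in (0.12, 0.55, 0.91):
        # The human decision maker would see only the band, not the raw score.
        print(f"model score {s:.2f} -> {rag_rating(s)}")
```

The point of the banding is that the precise score stays inside the system; the person making the decision only ever sees the coarse category.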
Further to the above recommendations, BFEG has identified a series of suggestions regarding best practice for completing a data protection impact assessment (DPIA).
DPIAs must be completed, and best practice would be for the:
- DPIA to be conducted early in the project lifecycle, before resources are committed to the project.
- DPIA to be independently reviewed.
- DPIAs to be specific and tailored, avoiding general and vague answers.
- Parts of the Data Protection Act that apply (GDPR, Law Enforcement Directive Part 3) to be identified.
- DPIA’s remit to be extended to cover ethical and equality issues as well as legal issues.
- Answers to questions in the DPIA to address the mitigation of the wider ethical implications that could reasonably be anticipated in implementing the proposal.
- DPIA to be revised when circumstances change.
- Analytical Quality Assurance (AQA) documents (Aqua Book, HM Treasury, 2015) to be treated as additional documents that supplement DPIAs and should not be considered as a replacement for the assessment of risks within DPIAs;
- AQAs and DPIAs to be completed by different staff where possible; and
- the purpose of the use case to be clearly defined and, where this is the prevention of crime, specific offences to be identified (as required by the rule of law principle of legality, the Human Rights Act 1998 and the Data Protection Act 2018).
Appendix 2: Member profiles
Professor Mark Watson-Gandy (Chair)
Professor Mark Watson-Gandy OBE is Chair of the Biometrics & Forensic Ethics Group.
He is a barrister at Three Stone Chambers, a leading commercial chancery chambers. A former Junior Counsel to the Crown, he was called to the English Bar in 1990. He is also a member of the Eastern Caribbean Bar and the DIFC (Dubai) Bar. He is one of the Legal Services Champions for the UK government’s “GREAT” campaign.
He is a Visiting Professor at the University of Westminster and at the Université de Lorraine.
Mark was the founder of Kids MBA, whose course teaches core business skills to 12 to 15-year-old children and is presently provided through its partnership with the awarding body ABE Global. Forbes Magazine recently singled it out as one of the “five leading global programmes supporting the next generation of entrepreneurs”.
Until September 2020, he was chair of Mental Health First Aid (MHFA) England, a spin-out from the Department of Health charged with raising the nation’s mental health literacy. At the time of his departure, MHFA was rated by the Financial Times as one of the top 1,000 fastest-growing SMEs in Europe, with 1 in 68 of this country’s adult population having undertaken Mental Health First Aid training.
He is Chancellor of the British Association of the Order of Malta. He is a past Master of the Scriveners Company. Since 2022, the Catholic Herald has included him on their list of the UK’s top 100 Catholic leaders.
Professor Louise Amoore (demitted May 2023)
Louise is a Professor of Human Geography at Durham University. Her research expertise focuses on the geographies of biometric and security technologies, with a particular interest in how contemporary forms of data, analytics and risk management are changing the techniques of biometric data collection and analysis. Louise is currently a Leverhulme Major Research Fellow investigating how the foundation of law, ethics and accountability is challenged by new methods of machine learning and automated recognition.
Professor Liz Campbell (demitted September 2023)
Liz is the inaugural Francine McNiff Chair in Criminal Jurisprudence at Monash Law, Australia, having previously been Professor of Criminal Law at Durham University. She is also an adjunct Professor at Queensland University of Technology School of Justice.
Liz is a global expert in corporate crime, organised crime, corruption and biometric evidence. Her research is socio-legal in considering the law in context and often involves a comparative dimension. Liz’s research has a significant impact outside academia. The Irish Supreme Court has cited her research, which has been relied upon in arguments before the UK Supreme Court. Her work has also been cited in reports of law reform commissions.
Liz sits on several editorial boards and is a member of the UK’s Arts and Humanities Research Council (AHRC) Peer Review College. Liz previously chaired Durham Constabulary’s Ethics Committee and served on the NHS Research Ethics Committee (Scotland).
Professor Simon Caney (demitted September 2023)
Simon is a Professor in Political Theory at the University of Warwick. He has worked on a wide range of topics including global poverty, equality, climate change, our obligations to future generations, the social discount rate, liberal neutrality, political perfectionism, multiculturalism, national self-determination, secession, sovereignty, human rights, resistance, humanitarian intervention, war, non-ideal political theory, realism in international relations, and democratic theory. He has engaged with policy makers at the World Bank, the Trades Union Congress, Oxfam America, and the UN, and is a member of the Nuffield Council for Bioethics.
Professor Anne-Maree Farrell
Professor Anne-Maree Farrell is Chair of Medical Jurisprudence at Edinburgh Law School. Her academic research expertise lies generally in health law, policy and ethics, with a specific focus on the law and ethics of human tissue, public health, online safety and redress, and mental health. Professor Farrell is also Director of the Mason Institute, an interdisciplinary research centre based at the University of Edinburgh, which focuses on ethics and law at the interface between health, medicine and the life sciences at a national and global scale. She has recently undertaken work for the UK Infected Blood Inquiry and the Scottish Covid-19 Inquiry, as well as providing expert advice on health policy and law reform issues for the UK, Scottish and Northern Ireland Departments of Health. Prior to entering academia, she was a lawyer in private practice and is admitted to legal practice as a solicitor in England & Wales, Ireland and Australia.
Professor Richard Guest
Richard Guest is Professor of Biometric Technologies in the School of Electronics and Computer Science at the University of Southampton.
He has an internationally recognised and sustained research record in applied artificial intelligence, biometric systems, security, image processing and feature pattern recognition, securing over £5M of external funding over the past 25 years. He is an appointed member of the UK Home Office Biometrics and Forensics Ethics Group advising the Government on the use of biometric technologies. He has had active involvement with ISO/IEC standardisation processes in the area of biometric data interchange, used in worldwide deployments.
He also has a sustained management record. Before joining Southampton, he held numerous senior leadership posts at the University of Kent including Deputy Divisional Director, Head of School, Deputy Head of School and Faculty Director of Internationalisation. He has significant research leadership experience as Principal Investigator of several large national and international projects, including EU Marie Skłodowska-Curie ITNs and EU and UKRI projects.
Professor Nina Hallowell (demitted May 2023)
Nina is a senior researcher at the Ethox Centre, Nuffield Department of Population Health, University of Oxford, where she is involved in a programme of research on ethical issues arising from the use of big data. She has over 20 years of experience of undertaking research on the social and ethical implications of the introduction of genetic and genomic technologies in medicine and has published widely in this field. She has qualifications in social sciences and medical law and ethics. She taught ethics at the University of Edinburgh and has been a member of a number of research ethics committees in Edinburgh, Cambridge and Newcastle.
Dr Julian Huppert (demitted March 2023, co-opted to April 2024)
Dr Julian Huppert is an academic and politician. His research looked at the structure and function of DNA beyond the double helix, and he then served as the Member of Parliament for Cambridge between 2010 and 2015. During this time, he served on the Home Affairs Select Committee for five years and was the Internet Services Providers’ Association UK’s (ISPA) Internet Hero of the Year 2013. He is now Director of the Intellectual Forum, a new interdisciplinary centre at Jesus College, Cambridge.
He is also a Director of the Joseph Rowntree Reform Trust Ltd and a Visiting Professor at King’s College, London. He was the first Chair of the Independent Review Panel for DeepMind Health and Deputy Chair of the NHS Cambridgeshire and Peterborough CCG.
Professor Mark Jobling (demitted March 2024)
Mark is a Professor of Genetics at the University of Leicester, specialising in human evolutionary genetics, forensics, genetic genealogy, ancestry testing and genetics in historical studies. He has held a series of three consecutive Wellcome Trust Senior Fellowships, is a senior editor of the ‘Annals of Human Genetics’, co-director of the Alec Jeffreys Forensic Genomics Unit and was the University of Leicester’s Research Excellence Framework academic lead for Biological Sciences in 2014 and 2021. Mark is lead author of the textbook ‘Human Evolutionary Genetics’ (Garland Science) and has published over 150 scientific papers, including recent work on new technologies in forensic DNA analysis.
Dave Lewis
Dave is a senior leader with a portfolio of interests in ethics, governance, regulation, and leadership. He manages a range of roles across sectors and undertakes freelance work.
He served for 30 years as a police officer, retiring as Deputy Chief Constable of Dorset. During his career he undertook many roles, including Senior Investigating Officer, Area Commander for East Berkshire, Chief of Staff at the Association of Chief Police Officers, and Regional Chief Officer for the Southwest of England.
Dave led the national Ethics portfolio and was Forensics Performance and Standards lead for the National Police Chiefs’ Council. In those roles, he oversaw the development of digital ethics in policing and the implementation of accreditation and quality standards across forensic disciplines. He was also a member of the Forensic Science Advisory Council.
More recently, he gave evidence to House of Lords Justice and Home Affairs Committee’s inquiry into emerging technologies, contributing to their 2022 report: ‘Technology rules? The advent of new technologies in the justice system.’ He has provided guidance to a national forensic organization on the development of a digital ethics framework and has been part of a team advising a European government on the development of a national forensic science strategy.
Dave currently heads up the Cricket Regulator for England and Wales and is a director of the Human Tissue Authority. He is a member of the Home Office Biometrics and Forensics Ethics Group, an Associate of the College of Policing, and chairs and facilitates leadership programs for the Windsor Leadership Trust.
Professor Sarah Morris
Professor Sarah Morris is a leading expert in Digital Forensics and the Deputy Head of the School of Electronics and Computer Science at the University of Southampton. With over 16 years of academic and investigatory experience, her interests include investigating unusual devices, digital data recovery, data privacy, and document analysis. She has performed digital investigations for a wide range of global clients including law enforcement, civil, corporate, media organisations, and celebrities.
An advocate for public engagement, Sarah champions inclusion in computing and electronics. She is also an active volunteer for charities and public initiatives supporting victims of control and bullying, and supporting mental health for digital forensic investigators. She serves on national and international advisory boards, including the Biometrics and Forensics Ethics Group and the UK Police Science Council under the Office of the Chief Scientific Advisor for Policing at the National Police Chiefs’ Council. Frequently featured in media outlets such as the BBC and CNN, Sarah is a recognised authority in her field, combining academic theory with investigative practice.
Dr Nóra Ní Loideáin
Dr Nora Ni Loideáin is Senior Lecturer in Law and Director of the Information Law & Policy Centre at the Institute of Advanced Legal Studies, University of London. Her research focuses on EU law, European human rights law, and technology regulation, particularly within the contexts of privacy and data protection. Nora holds BA, LLB, LLM (Public Law) degrees from the National University of Ireland, Galway and a PhD in law from the University of Cambridge.
Previously, she has held lecturing positions at King’s College London and the University of Cambridge and Visiting Fellowships at the Australian National University and University of Cape Town. In 2024, Nora was appointed Editor-in-Chief of leading journal International Data Privacy Law (Oxford University Press). Prior to her academic career, Nora was a Legal and Policy Officer for the Office of the Director of Public Prosecutions of Ireland and clerked for the Irish Supreme Court.
She has consulted widely on her areas of research and published on topics including AI digital assistants, health data, facial recognition, smart cities, and cross-border transfers. Her work has been cited by leading institutions including the Alan Turing Institute, the BBC, Chatham House, the United Nations, and the UK House of Lords.
Professor Niamh Nic Daeid
Professor Niamh Nic Daeid is an award-winning Chartered Chemist and Authorised forensic scientist. She is a Professor of Forensic Science and Director of the Leverhulme Research Centre for Forensic Science at the University of Dundee, developing robust scientific methods and science communication for the justice systems. She has been involved in forensic science education, research and casework for over 30 years. She is a past chair of the European working group for fire and explosion investigation and the INTERPOL forensic science managers symposium. She sits on the scientific advisory board of the International Criminal Court, is a strategic partner for the International Forensic Strategic Alliance and acts as an expert forensic advisor to the UN. She has published over 200 peer reviewed research papers and book chapters, 5 edited books and holds a research grant portfolio in excess of £30 million. She is regularly involved in the public communication and engagement of science making forensic science accessible across a wide range of audiences.
Professor Charles Raab
Charles Raab is Emeritus Professor, University of Edinburgh, formerly Professor of Government. Visiting positions at academic institutions abroad. Academic background in political science and public policy, within broad social-science and humanities frames of reference. Co-Director of Centre for Research into Information, Surveillance and Privacy (CRISP), former Fellow of the Alan Turing Institute (ATI), co-Chair of ATI Data Ethics Group. Member of Police Scotland’s Independent Ethics Advisory Panel, Digital Identity Scotland Expert Group, Europol Data Protection Experts’ Network. Co-Chaired Independent Digital Ethics Panel for Policing (IDEPP). Teaching and funded research on privacy, data protection, surveillance, governance and regulation, data ethics, ‘smart’ environments, AI, identity and identification, trust, accountability, law enforcement, FOI, and democracy. Author of many articles, chapters and books (see https://www.sps.ed.ac.uk/staff/charles-raab), General Editor of Routledge Studies in Surveillance book series. Advice and reports for many public bodies in the UK, EU and elsewhere. Evidence to UK parliamentary committees; Specialist Adviser to the House of Lords Constitution Committee (2007-2009) for the inquiry, Surveillance: Citizens and the State, HL Paper 18, Session 2008–2009. Fellow of the Academy of Social Sciences; Fellow of the Royal Society of Arts. Received the Outstanding Achievement Award of the Surveillance Studies Network in 2024.
Professor Tom Sorell
Trained in philosophy and has decades of experience in public engagement concerning ethics. Has a particular interest in ethical issues arising from the use of security, including, policing technologies. Author of 9 books and editor or co-editor of 13 collections of papers. Over 150 peer-reviewed papers in philosophy, interdisciplinary technology studies, moral theory and applied ethics.
Areas of expertise:
- AI ethics
- data ethics
- ethics of counterterrorism
- policing ethics
- research ethics
Other current roles:
- Member, West Midlands Police Data Ethics Committee
- Chair, West Midlands Police Ethics Committee
- Member, Information and Governance Committee, NHS Safe Data Environments
- Expert Fellow, Sprite Network Plus in Cybersecurity
Previous roles
- John Ferguson Professor of Global Ethics, University of Birmingham
- Co-director, Human Rights Centre, University of Essex
- Faculty Fellow in Ethics, Harvard University
- Tang Chun-I Visiting Professor in Philosophy at the Chinese University of Hong Kong (2013)
Professor Denise Syndercombe Court
Scientist, geneticist, statistician, academic, editor and author of a prize-winning medical textbook and published author of peer reviewed scientific publications. Trained in systematic reviews and evidence-based approaches of medical and scientific publications. From 1990 was Senior Lecturer in Forensic Haematology at Barts and The London School of Medicine and Dentistry and in 2012 she moved to King’s College London where she is now the Professor of Forensic Genetics.
She has more than thirty years of experience in scientific research, forensic evidence examination and DNA interpretation in relationship and criminal evidence with a sound knowledge of the civil and criminal justice process, including court presentation as an accredited expert witness. She runs an ISO17025 laboratory that specialises in kinship investigation and is an active researcher in new molecular techniques for human identification. She has an active interest in promoting science to a wider audience via television, radio and external lectures.
Dr Peter Waggett
Dr Waggett is an image processing expert and Director of Research at IBM UK Ltd. He was the biometrics and testing lead for the UK Visa Waiver Project delivered to the UK Government. He currently leads the IBM activity associated with the UK’s Hartree National Centre of Digital Innovation (HNCDI). The HNCDI delivers innovative projects and proof of concept systems for UK industry using quantum computing and Artificial Intelligence. Dr Waggett is editor of the ISO Biometrics Vocabulary project (ISO 2382-37) and chairs the International Organization for Standardization (ISO) SC17 Committee responsible for Cards and Devices for Personal Identification. SC17 is responsible for the standards governing bank cards, passports and driving licences worldwide.
Appendix 3: Glossary
Artificial intelligence (AI)
In computer science, AI refers to any human-like intelligence exhibited by a computer, robot, or other machine. AI is the ability of a machine to perform a task usually done by humans, such as recognising objects, understanding and responding to language, making decisions, and solving problems.
Biometrics and Surveillance Camera Commissioner
The Biometrics Commissioner and Surveillance Camera Commissioner roles were established by the Protection of Freedoms Act 2012, which introduced the regime to govern the retention and use by the police of DNA samples, profiles and fingerprints, and to promote appropriate overt use of surveillance camera systems by relevant authorities in England and Wales. Since March 2021, one full-time commissioner has undertaken these roles.
Data Protection Impact Assessment (DPIA)
A process to help identify and minimise the data protection risks of a project.
Digital forensic material
The information extracted from any digital system or data storage media, rendered into a useable form, processed, and interpreted to obtain intelligence for use in investigations, or evidence for use in criminal proceedings.
Facial recognition
Identifying or verifying a person from a digital image or a live video source by comparing it to selected facial features from a known source image.
Forensic Information Databases Service (FINDS)
The Home Office unit responsible for administering the National DNA Database, National Fingerprint Database and Footwear Database.
Forensic Information Databases Strategy Board (FIND SB)
The board providing governance and oversight over the National DNA Database and the National Fingerprint Database. It has a number of statutory functions, including issuing guidance on the destruction of profile records and producing an annual report.
Forensic Science Regulator (FSR)
The official responsible for ensuring that the provision of forensic services across the criminal justice system is subject to an appropriate regime of scientific quality standards. The Forensic Science Regulator Act 2021 makes provisions for a statutory Code of Practice for forensic science activity in England and Wales. The FSR would oversee compliance with this code.
Home Office Biometrics (HOB) programme
A programme running since 2014, that delivers services supporting fingerprints, facial images and DNA (the main biometric modalities currently extensively used in the UK public sector). It also develops capabilities across the Home Office, law enforcement and, where appropriate, more widely across the government.
Live facial recognition (LFR)
The automated one-to-many ‘matching’ of near real-time video images of individuals with a curated ‘watchlist’ of facial images.
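Purely as an illustrative sketch of the one-to-many comparison described in this definition (Python with NumPy): the example below compares a single probe embedding against a small watchlist using cosine similarity. The embeddings, watchlist size and threshold are invented stand-ins; operational LFR systems use proprietary face-encoding models and calibrated thresholds.

```python
# Illustrative only: face "embeddings" here are random vectors standing in for
# the output of a real face-encoding model; the threshold is arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# A curated "watchlist" of known face embeddings (one row per enrolled image).
watchlist = rng.normal(size=(5, 128))
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

# An embedding extracted from one face detected in a live video frame.
probe = watchlist[2] + 0.05 * rng.normal(size=128)   # simulate a near-match
probe /= np.linalg.norm(probe)

# One-to-many matching: compare the probe against every watchlist entry.
similarities = watchlist @ probe          # cosine similarity (unit vectors)
best = int(np.argmax(similarities))

THRESHOLD = 0.8                           # arbitrary value for this sketch
if similarities[best] >= THRESHOLD:
    print(f"possible match with watchlist entry {best} (score {similarities[best]:.2f})")
else:
    print("no watchlist match above threshold")
```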
Machine learning
A branch of Artificial Intelligence (AI) focused on building applications that learn from data. Machine learning algorithms improve their accuracy with experience without being explicitly programmed to do so.
Missing Persons DNA Database (MPDD)
A database containing STR DNA profile records of missing persons, relatives of missing persons (where a reference DNA profile is not available for the missing person), unidentified bodies and some crime stain DNA profile records that may be linked to missing persons or unidentified bodies (for example, a ‘no body’ murder case).
National DNA Database (NDNAD)
Established in 1995, the NDNAD is an electronic, centralised database holding STR DNA profiles taken from both individuals and crime scenes. The database can be searched to provide the police with a match to an individual or a match linking an individual to a crime scene and vice versa.
Short tandem repeat (STR)
DNA comprises four types of bases known as A, T, C and G. STRs are short sections of DNA that contain repeating sequences of bases (such as AATGAATGAATG). The number of times the sequence of DNA repeats (in this example three times) varies between individuals so it can tell people apart [see also STR DNA profile].
STR DNA profile
A DNA profile is created by counting the number of times a section of DNA repeats at specific areas in a person’s DNA [see short tandem repeat]. Pairs of numbers will be generated as a person has two copies of DNA (if the number of repeats was five on both copies the profile would be 5,5; if it was five on one strand and six on the other it would be 5,6). The number of areas of DNA looked at depends on the profiling chemistry used [see DNA-17]. The presence of the sex chromosomes, X and Y, is also tested. The numerical representation allows DNA profiles to be uploaded to a database and compared with other DNA profiles. (For further information, the Royal Society has published a primer on DNA analysis.)
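Purely for illustration, the short sketch below (Python) mirrors the counting and pairing described in the two entries above. The repeat unit, sequences and resulting pair are invented for the example and do not correspond to any real STR marker or profile.

```python
# Illustrative only: the repeat unit and sequences below are invented for
# this example; they are not real STR markers or profiles.

def count_consecutive_repeats(sequence: str, unit: str = "AATG") -> int:
    """Count how many times `unit` repeats back-to-back from the start of
    `sequence` (e.g. "AATGAATGAATG" contains the unit three times)."""
    count = 0
    position = 0
    while sequence.startswith(unit, position):
        count += 1
        position += len(unit)
    return count

# A person carries two copies of DNA, so each marker yields a pair of counts.
copy_one = "AATGAATGAATGAATGAATG"         # 5 repeats of AATG
copy_two = "AATGAATGAATGAATGAATGAATG"     # 6 repeats of AATG

profile = tuple(sorted((count_consecutive_repeats(copy_one),
                        count_consecutive_repeats(copy_two))))
print(profile)  # (5, 6) -- the numerical pair recorded for this marker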
Vulnerable Persons DNA Database (VPDD)
A database containing STR DNA profiles from vulnerable persons who are at potential risk of harm, such as people at risk from honour-based assault or forced marriage, sex workers and those potentially at risk of sexual exploitation, or where the police consider the individual at risk. When a vulnerable person volunteers to provide a DNA sample, consent is sought for the resulting profile to be retained on the VPDD and searched against the NDNAD under specific circumstances.
Y-STR profiling
A form of DNA analysis involving only DNA found on the Y-chromosome. Analysing Y-chromosome DNA can be useful where there is a mixture of DNA from a male and a female, as the Y-chromosome is only found in males. Y-STR analysis can also be used for relationship testing, as DNA on the Y-chromosome is passed down the male line (father to son).