Corporate report

BFEG meeting minutes: 7 December 2022

Updated 16 June 2023

Biometrics and Forensics Ethics Group

Notes of the 22nd meeting held on 7 December 2022, at 2 Marsham Street, London, and via videoconference.

1. Welcome and introductions

1.1. Mark Watson-Gandy, Chair, welcomed all to the 22nd meeting of the Biometrics and Forensics Ethics Group (BFEG) – see annex A for attendees and apologies.

2. Notes of the last meeting and matters arising

2.1. The minutes of the June meeting, the last meeting of the BFEG, had been circulated and agreed by the members. Following a final review, the minutes were to be published.

Action 1: Secretariat to publish the minutes of the previous meeting.

2.2. Regarding September 2022 Action 3 (Forensic Information Databases Service (FINDS) representative to review error rate data and update the BFEG): the FINDS representative explained that no trends had been identified for the reported gradual increase over the last 4 quarters and that the error rate had remained proportional at roughly 2%. The FINDS representative explained that the increase was likely due to an increase in the numbers loaded. The action was closed.

2.3. The list of open actions from previous meetings can be found in annex B. All other actions were complete.

3. FINDS strategy board update

3.1. The group had received a written update from the FINDS strategy board (FINDS SB) ahead of the meeting, which the FINDS representative summarised verbally for the BFEG. The main points were:

  • the board last met on 29 September 2022 for a strategic workshop; the next meeting was scheduled for 14 December 2022

  • at the September 2022 workshop the performance of FINDS was assessed as working within the defined statement of requirements; the BFEG was provided with a number of recorded metrics, such as a 5% increase in fingerprint loads in 2022-23 and a decrease in fingerprint IDENT1/PNC medium/high errors

  • much of the September workshop was focused on sharing the thoughts of members regarding the board’s activity and discussing areas the board should focus on in the short, medium and long term; the outputs of the discussions would be collated, discussed and presented at the next strategy board meeting

  • the Y-STR project was ongoing and the focus, at the time of the meeting, was developing a communication strategy and sampling plan within Policing and Home Office (HO) communities

  • the feedback received from the BFEG at the last meeting on the ‘Y-STR reference set’ consent forms, information pamphlet and survey sheet was being incorporated

  • FINDS had followed up with international partners regarding the data protection impact assessments (DPIAs) in place covering international biometric data sharing; the international partners were reviewing the DPIAs that had been developed; it had been agreed that FINDS would approach again once reviews were complete to request DPIAs be shared with the BFEG

3.2. A member of the BFEG queried what ‘crime stain error’ meant. The FINDS representative commented that all statistics reported to the FINDS SB were at a high level. DNA errors were split into subject errors (errors in samples taken from subjects in custody suites) and crime stain errors (errors in the sampling, processing or interpretation of DNA profiles obtained from a crime scene).

3.3. The FINDS representative clarified that errors were defined on a risk register using a ‘red, amber, green’ status, from which overall and red errors were reported to the FINDS SB and scrutinised. It was queried whether ‘red errors’ were the same as ‘high-risk errors’. The FINDS representative confirmed that this was the case for process errors but that the risk register also included further risks, such as risks to service availability. The FINDS representative offered to share examples of errors with different ‘red, amber or green’ statuses to support understanding.

Action 2: FINDS representative to share examples of errors recorded with red, amber or green status with members of the BFEG.

3.4. A member of the BFEG asked whether proficiency tests were completed by individuals collecting fingerprint samples. The FINDS representative commented that collaborative test exercises and proficiency tests were available. The representative noted, however, that the proficiency tests did not cover interaction with IDENT1 (administrative processing), where many of the errors took place. FINDS was reviewing whether it could support proficiency testing using IDENT1. It was clarified that errors occurred throughout the whole process/supply chain and did not result only from the technology.

3.5. A BFEG member asked the FINDS representative to clarify what the reported figure of 0.05% high-risk errors meant in real terms. The FINDS representative responded that the 0.05% of high-risk errors represented 26 ‘forms’. The FINDS representative explained that no misidentifications had been reported as a result of intelligence matching, and clarified that errors were typically identified through quality assurance processes before any impact was realised.

3.6. The FINDS representative informed the BFEG that error rates were recorded for each police service and that performance was reviewed quarterly by FINDS, the Forensic Science Regulator and the United Kingdom Accreditation Service.

3.7. Members of the BFEG shared some comments regarding the Prüm (EU) Step 1 Fingerprints exchange DPIA which FINDS had shared with the BFEG ahead of the meeting. A member of the BFEG asked what the safeguards referenced within the DPIA were, noting that hyperlinks would be beneficial. The FINDS representative agreed to find out.

Action 3: FINDS representative to identify the safeguards and guidance referenced within the Prüm (EU) Step 1 Fingerprints exchange DPIA (at section 2.4).

3.8. A member of the BFEG asked the FINDS representative what interaction, if any, FINDS had with individuals who may have been impacted by any errors. The FINDS representative responded that FINDS did not interact directly with individuals but would support the relevant police service with mitigation and monitoring, and in identifying the cause of an error. The FINDS representative noted that the police service could interact directly with any affected individuals.

3.9. A member of the BFEG queried whether there was independent oversight of the monitoring or reporting process. The FINDS representative noted that any errors requiring a form of amendment or update had to go through FINDS and that there was a notification system in which errors were highlighted before reaching FINDS.

4. Policy update

4.1. The group had received a written update from Data and Identity Policy ahead of the meeting. The main points were:

  • The Police, Crime, Sentencing and Courts (PCSC) Act powers for the extraction of information from electronic devices, and the associated code of practice, came into force on 8 November 2022. Policy were working with the National Police Chiefs’ Council and the College of Policing to support the implementation, production and delivery of training packages to support the powers in the PCSC Act.

  • The Data Protection and Digital Information Bill, introduced in Parliament in July, was awaiting a date for its 2nd reading. The bill includes clauses that would transfer the statutory functions of the Biometrics and Surveillance Camera Commissioner roles to other bodies.

  • An Artificial Intelligence (AI) and Data Ethics policy was being drafted by the new AI Policy and Data Ethics Team. This would cover the life cycle of powerful data-driven technology projects and bring together knowledge, procedures and guidance around ethical requirements, standards, legislation, risks and benefits, and examples of best practice. As the work progressed, the new BFEG working group would support the development of a risk triage process and would assist with the management of a register of algorithms for the Department.

4.2. A member of the BFEG asked whether the United Kingdom (UK) Framework was likely to be updated in line with the proposed changes taking place at the European Union (EU) level (in relation to the proposed EU Prüm II Regulation and the 2 proposed EU Advance Passenger Information Regulations) to expand the framework to facial recognition and other biometric categories. The policy sponsor agreed to speak with colleagues in other business areas in order to provide an answer.

Action 4: Policy to speak with colleagues in the international directorate regarding whether the UK Framework will be updated in line with the proposed changes taking place within the EU.

4.3. The BFEG were supportive of the introduction of the AI and Data Ethics Policy team. A member of the BFEG questioned a policy representative on how the team would work and what its function was. A policy representative explained that the new AI and Data Ethics Policy team, which had received sponsorship from the HO Chief Scientific Advisor, would be responsible for developing the departmental policy on AI and data ethics, which would outline the expectations on any team looking to innovate with large datasets and AI.

4.4. A representative from policy explained that the interaction between the BFEG and the new AI and Data Ethics Policy team would be more clearly defined over the coming months. The representative explained that, from initial conversations, the asks of the BFEG would be to support the development of a register of projects using AI in the HO and to advise the policy team on a risk triage process to develop the approach to the policy.

4.5. A member of the BFEG asked policy whether the Home Office was able to implement recommendations from the “Technology rules? The advent of new technologies in the justice system” report (written by the Justice and Home Affairs select committee) which were rejected in the government’s response. The policy representative responded that the government agreed with the spirit of the report but was not confident in the strength of the evidence behind the recommendations.

4.6. Members of the BFEG raised concerns that the government’s response, given in the House of Lords Debate (held on Monday 28 November 2022), was not clear in its reasoning for rejecting recommendations.

4.7. A member of the BFEG commented on the number of institutions involved in developing standards for using AI and recommended utilising resources which already exist. A representative from policy noted a data ethics landscape map was being developed to identify relevant parties and stakeholders.

Action 5: Policy to share the data ethics landscape map with the BFEG members for additional suggestions.

4.8. A member of the BFEG commented on how many algorithms were developed and informed using police intelligence and experts. The member posed the question of how much weight algorithms gave to human expertise, which might be biased, and how far algorithms went beyond policing expertise that pre-dated AI. The policy sponsor noted that the policy team worked closely with policing and the police Chief Scientific Advisor, and that there was no intention to influence policing and how it works.

4.9. Reflecting on the recent report by the Justice and Home Affairs committee, a member of the BFEG asked the policy sponsor if there would be work taking place in parallel at the Ministry of Justice (MoJ) and, if so, how much interaction the HO would have. The policy sponsor responded that the report by the Justice and Home Affairs committee was focused largely on policing and therefore, while the MoJ was sighted on the response, it was published by HO ministers. The policy sponsor was not aware of similar work taking place in the MoJ but reflected that there was a commitment to maintaining regular and ongoing engagement with the recently appointed ethics lead at the MoJ.

5. Review and agree BFEG principles

5.1. The BFEG developed a set of ethical principles in 2018 (updated last in 2020) which described a set of high-level governing principles that should be applied to the development and use of biometric and forensic technologies.

5.2. Nominated members of the BFEG reviewed the principles and an updated version was circulated to attendees ahead of the meeting.

5.3. No objections were raised to the proposed updates. On further reflection, the BFEG agreed that the principles should also be updated to include a dedicated statement on ethical procurement.

Action 6: BFEG ethical principles are to be updated to include a dedicated statement on ethical procurement.

5.4. As the principles mentioned UK law, it was discussed whether there was a need to refer to reports produced in the devolved nations of Scotland and Northern Ireland, or whether it would be necessary to specify the jurisdiction of the BFEG principles. It was concluded that the remit of the BFEG, and by extension the principles, would reflect that of the Home Office.

Action 7: Policy to develop standard lines on the jurisdiction of the BFEG principles for inclusion in the BFEG ethical principles.

5.5. The BFEG members debated whether economic considerations should be included in the ethical principles. The group noted that procurement of cheaper systems could be an ethical issue; however, it was concluded that economic considerations should not be included in the principles.

6. Discussion of the House of Lords debate on the Justice and Home Affairs Committee report: ‘Technology rules? The advent of new technologies in the justice system’

6.1. A representative from policy introduced the item, commenting on the intention to hear observations and comments from the BFEG on the Justice and Home Affairs Committee report, “Technology rules? The advent of new technologies in the justice system”, and the subsequent House of Lords debate.

6.2. Members of the BFEG reflected on a workshop they had attended on the Justice and Home Affairs Committee report, hosted by the Turing Institute, and suggested policy engage with the hosts as good insight had resulted from that workshop. The policy sponsor confirmed they had attended the workshop.

6.3. BFEG members reflected on the underwhelming nature of the government’s response and the lack of urgency in implementing recommendations.

6.4. A member of the BFEG commented that, from their perspective, it seemed policing had more appetite for governance/regulation than the government.

6.5. A member of the BFEG commented on the recommendation within the report for governance and reflected that it should still be debated where the governance should sit (centralised or within each police service). It was noted this could be considered by the BFEG.

6.6. A member of the BFEG commented that they had spoken with the College of Policing and the association of crime commissioners to explain the function of the BFEG and how it was reviewing the use of technology and AI. The BFEG member shared concerns that work was being duplicated and noted that there was scope for the BFEG to provide expert advice to these organisations.

6.7. It was discussed that the BFEG already worked in the policing space through existing working groups and that there was scope to offer capacity or resource to interested parties within policing. It was agreed that a new or separate working group of the BFEG for policing would not be necessary.

6.8. The policy representative commented that it would be beneficial for BFEG to conduct outreach to explain its function and its independence to stakeholders, specifically police commissioners.

7. Workplan update and working group review

7.1. The working groups for the two new commission items on AI and on voice and gait technology were confirmed.

7.2. Remaining working group updates were provided over email due to time limitations.

8. AOB

8.1. A BFEG member raised that they had been approached by the Accelerated Capability Environment (ACE) and noted that BFEG was listed as a group they intended to engage with. It was discussed that it would be beneficial for the BFEG to hear a presentation from ACE on their work and remit.

Annex A – list of attendees and apologies

Present – in person

  • Mark Watson-Gandy – Chair
  • Ann-Maree Farrell – BFEG Member
  • Richard Guest – BFEG Member
  • David Lewis – BFEG Member
  • Sarah Morris – BFEG Member
  • Nóra Ni Loideain – BFEG Member
  • Niamh Nic Daeid – BFEG Member
  • Denise Syndercombe Court – BFEG Member
  • Juliette Verdejo – FINDS Unit, HO
  • Jennifer Guest – BFEG Secretary, HO
  • Kimberly Stiff – BFEG Secretariat, HO

Present – via videoconference

  • Nina Hallowell – BFEG Member
  • Julian Huppert – BFEG Member
  • Mark Jobling – BFEG Member
  • Charles Raab – BFEG Member
  • Thomas Sorell – BFEG Member
  • Peter Waggett – BFEG Member
  • Alex MacDonald – Data and Identity Unit, HO
  • Cheryl Sinclair – Data and Identity Unit, HO

Apologies

  • Liz Campbell – BFEG Member
  • Simon Caney – BFEG Member
  • Louise Amoore – BFEG Member

  • Biometrics and Surveillance Camera Commissioner

Annex B – review of open actions from previous meeting

September 2022

Action 4: (Secretariat to confirm with the policy sponsor the parameters of the new commission and desired outcomes). The secretariat was developing terms of reference for new working groups and co-ordinating initial meetings with working group members.

October 2021

Action 7: (Secretariat to develop a template to provide to presenters based on the BFEG ethical principles). The ethical principles were reviewed during this BFEG meeting on 7 December 2022. Progress could now be made on this action.

March 2020

Action 3: (Complex Datasets working group to produce general guidance on ethical issues in binary classification systems). Action ongoing.