Corporate report

BFEG meeting minutes: 21 September 2022

Updated 16 June 2023

Biometrics and Forensics Ethics Group

Notes of the 20th meeting held on 21 September 2022, via videoconference.

1. Welcome and introductions

1.1 Mark Watson-Gandy, Chair, welcomed all to the 20th meeting of the Biometrics and Forensics Ethics Group (BFEG) – see annex A for attendees and apologies.

2. Notes of the last meeting, action log, matters arising and Chair’s update

2.1 The minutes of the June meeting, the previous meeting of the BFEG, had been circulated and were agreed by the members, subject to a minor amendment.

Action 1: Secretariat to publish the minutes of the previous meeting.

2.2 A review of open actions from previous meetings can be found in annex B.

2.3 October 2021 meeting, action 2: FINDS to share the Data Protection Impact Assessment (DPIA) for the International Exchange Policy with the BFEG if possible. FINDS had approached the international partners to ask if the DPIA could be shared, and the Chair sought an update on progress. The FINDS representative responded that there was no further update and that they would follow this up.

2.4 All other actions were complete.

2.5 The Chair informed the BFEG he had attended the 35th International Congress of Genealogical and Heraldic Sciences in Cambridge, in August 2022, and the 1st meeting of the National Security Technology and Innovation Exchange in September 2022.

3. Home Office policy, update and new commission

3.1 The group had received a written update from Data and Identity Policy ahead of the meeting. The main points were:

  • The Data Protection and Digital Information Bill had been delayed at the second reading stage. The Bill proposed repealing the existing Biometrics and Surveillance Camera Commissioner roles, with the statutory functions of the Biometrics Commissioner moving to the Investigatory Powers Commissioner. The other functions, including oversight of surveillance cameras, would be transferred to the Information Commissioner.

  • The Data Protection and Digital Information Bill, introduced in Parliament in July, would receive its second reading in the autumn. The Bill would update legislation to reflect the fact that the Forensic Information Database Service has oversight of the National Fingerprint Database (IDENT1), in addition to the National DNA Database.

  • The government’s response to the House of Lords Justice and Home Affairs Committee report, titled ‘Technology rules? The advent of new technologies in the justice system’, was published on 23 June (House of Lords Paper 180). The government’s response confirmed the commitment that people, not machines, would continue to take key decisions within the Criminal Justice System. The use of technology would take place within a principles-based legal framework that was flexible enough to cope with rapid technological development.

  • A new Artificial Intelligence (AI) Policy and Data Ethics Team had been created, to enable the Home Office to employ powerful data-driven technologies in a way that maintained public trust and avoided unintentional harm.

  • The Forensic Science Regulator was formally placed on a statutory footing in July. The consultation on the statutory Code would close on 31 October.

  • The public consultation on the Code of Practice for Extracting Information from Digital Devices concluded on 19 July and the Home Office were analysing the responses.

3.2 The policy representative was asked how the new AI Policy and Data Ethics Team would work with the BFEG, and what assistance the BFEG could provide. The Data and Identity Policy representative explained that the new team was carrying out an audit of Home Office projects that made use of AI technologies, in order to develop a policy framework to support Home Office departments in the use of AI technologies. The representative suggested the BFEG could support the new AI Policy and Data Ethics Team by setting out what should be covered in the policy framework and by reviewing its application to projects.

3.3 The representative was also asked who, following the legislative changes, would be responsible for updating and overseeing the Surveillance Camera Code. The policy representative responded that it would be for the Information Commissioner to decide on the regulatory strategy for the oversight of surveillance cameras, and that the existing Code may be retained or amended. When asked if the Investigatory Powers Commissioner’s Office would receive any additional resources to undertake the additional functions held by the Biometrics and Surveillance Camera Commissioner, the policy sponsor answered that this had not been decided and that the main purpose of the Bill was to simplify the overall regulatory landscape.

3.4 The policy sponsor provided the BFEG with an overview of the 2022/2023 commission. This built on the previous year’s commission and took into account programmes of work proposed by the BFEG. The 2022/2023 commission would be published on the BFEG website.

Action 2: Secretariat to publish the 2022/2023 commission on the BFEG website.

3.5 A member asked if the policy sponsor had any comments on the Ryder Review. The Ryder Review was an independent legal review of the governance of biometric data in England and Wales, commissioned by the Ada Lovelace Institute. The policy representative provided the BFEG with a brief overview of the recommendations from the Ryder Review, the Ada Lovelace Institute report, and the Justice and Home Affairs Committee report, which were:

  • to widen the scope of ‘biometric data’ as set out in the Data Protection Act to include biometric data used for classifying individuals

  • to rationalise the roles of government departments in biometrics and consolidate governance structures

  • to develop new principles-based legislation, supported by detailed regulations setting minimum standards

  • to create a new, statutory, national data ethics body and ethics forums at a local level

3.6 The BFEG agreed the overview was very useful as it highlighted significant issues. It was suggested that the BFEG review this again once the new Data Protection framework was developed.

4. FIND Strategy Board update

4.1 The group received an update from the Forensic Information Databases Strategy Board (FIND SB). The update had been circulated to the BFEG prior to the meeting, and the main points were:

  • the next FIND Strategy Board meeting would be a workshop held on 29 September 2022; the workshop would review the workplan for the next 2 years and update the FIND SB terms of reference

  • performance of FINDS had been assessed as working within the National Police Chiefs’ Council (NPCC) defined statement of requirements

  • FINDS had compiled a full 12 months of data on force errors, combining data from forces, Forensic Service Providers and FINDS internal integrity checks; a gradual increase in errors had been observed and FINDS were working with Forensic Service Providers to reduce errors in the supply chain

  • FIND SB had obtained legal advice from the NPCC to confirm that, under the Protection of Freedoms Act 2012 (PoFA), DNA samples could be retaken due to human or laboratory errors

4.2 The FIND SB representative was asked about the reason for the reported increase in the error rate; the representative would seek further information on this for the members.

Action 3: FINDS representative to review the error rate data and update the BFEG.

4.3 The BFEG was provided with an update on the consent form for the Y-STR (short tandem repeat (STR) on the Y-chromosome) reference data collection.

4.4 The issue of a UK Y-STR database was first raised by the FIND SB in the September 2017 BFEG meeting. At that meeting it was noted that Forensic Service Providers (FSPs) in the UK were using a worldwide, subscription-free Y-haplotype reference database (YHRD) to facilitate Y-STR analysis (haplotype is the DNA inherited from one parent, in this case the father).

4.5 This database was populated with profiles from across the world, and although there were a large number of UK profiles in the database, these still represented a relatively small proportion of the total. Also, the worldwide YHRD lacked independent validation and did not allow the UK to validate its functionality or develop new applications, such as alternative statistical interpretation modules.

4.6 Consequently, there was interest amongst FSPs for a UK-specific YHRD to be developed. FINDS had collaborated with the Forensic Capability Network (FCN) to undertake an exercise to collect DNA samples; the resulting DNA profiles would be used to build a Y-STR reference data set for the UK population.

4.7 A research donor consent form had been drafted, and the views of the BFEG were sought on the form. The following points were raised:

  • The Participant Information Sheet provided to the donors used complex language and contained many acronyms. It was suggested that more explanation could be included.

  • Clarification was sought over the question asking if the donor had previously given a DNA sample for the YHRD. It was explained that this was aimed at employees of FSPs who may have already submitted their DNA profiles to the YHRD, but it was agreed this could be clearer.

  • The exclusion of northeast African as an option in the question on family background was questioned, as the northeast African population was genetically distinct from the north African population that was included. The FINDS representative responded that this was the same ethnicity data required by the other databases, for example the DNA database, but this would be followed up.

4.8 The comments from the members would be considered by the Y-STR database project team, and a follow-up meeting was agreed to discuss readability issues with the forms.

5. Biometrics and Surveillance Camera Commissioner update

5.1 The Biometrics and Surveillance Camera Commissioner provided the BFEG with an update. The main points were:

  • the Commissioner had continued to engage with Parliamentarians and Ministers on the ethical and security issues surrounding the procurement of surveillance camera systems

  • on 14 June 2022, the Biometrics and Surveillance Camera Commissioner, in conjunction with the Centre for Research into Information, Surveillance and Privacy (CRISP), hosted a facial recognition event at the London School of Economics

  • the Biometrics and Surveillance Camera Commissioner would be delivering, with the Scottish Biometrics Commissioner, the CRISP Annual Lecture, hosted at the University of Stirling, on ‘The Future of Face Recognition Technology: When Technologies Collide’

  • the Biometrics and Surveillance Camera Commissioner annual report was soon to be published

5.2 The Biometrics and Surveillance Camera Commissioner was asked about frameworks and guidance on biometrics and facial recognition. The Commissioner noted that the draft European Union AI Act was relevant and helpful. The Commissioner had collaborated with Europol, Eurojust, and the Fundamental Rights Agency to develop accountability principles for use in the policing and security sectors. A survey had been conducted as part of the Accountability Principles for AI (AP4AI) Project. The survey asked 6,000 individuals across the EU member states, Australia and North America about their concerns regarding police use of AI. The survey found that 80% of participants agreed it was important that all law enforcement agencies using AI should follow the same accountability framework. A framework was being developed that would include 12 principles for individuals and organisations using AI, including pre-procurement and post-procurement checks.

5.3 Regarding the use of surveillance technology from overseas state-controlled companies (such as Hikvision facial recognition products) by UK public authorities, the Biometrics and Surveillance Camera Commissioner requested that, in light of evidence now available, the BFEG review and revise its previous position. The BFEG had previously informed the Biometrics and Surveillance Camera Commissioner that taking a position would be outside the remit of the BFEG (as recorded in the minutes of the October 2021 meeting).

5.4 The Chair thanked the Biometrics and Surveillance Camera Commissioner for their update.

6. Data Ethics Advisory Group update

6.1 The Chair of the Data Ethics Advisory Group (DEAG) provided the BFEG with an update.

6.2 The DEAG continued to provide ethical advice to a team from the Law Enforcement Portfolio (LEP) on a proposed proof of concept study.

6.3 The DEAG’s advice had also been sought on a new proposed study. An initial meeting had been held with the project team and discussions were ongoing.

7. Biometric self-enrolment technology testing

7.1 A representative from the Identity Security team presented an update to the BFEG on the Biometric Self-Enrolment programme and the second phase of technology testing.

7.2 The presentation provided a background to the Biometrics Self-Enrolment programme. The aim of the programme was for non-visa customers to be able to provide their biometrics without having to travel to a physical enrolment centre.

7.3 The representative shared a high-level breakdown of the results of the 2021 Biometric Self-Enrolment Feasibility Trials and explained that the future biometrics team was planning the next phase of self-enrolment testing; this would include a Kiosk Pilot and a new round of trials looking at self-enrolment of fingerprints using a smartphone app.

7.4 The Kiosk Pilot would aim to explore the technology’s performance in the operational environment and was planned to run in parallel with the live immigration application system.

7.5 The members of the BFEG who were present at the 2021 Biometric Self-Enrolment Feasibility Trials asked the team whether the potential usability issues relating to disability or sex, which were observed at those trials, would be considered. The Identity Security team highlighted that user-research expertise had been sought to support the evaluation of the bids to host the self-service kiosk pilot. The team were looking for a solution that was universally usable.

7.6 The Identity Security team representative noted that the BFEG had raised the provenance of the technologies used as an area of consideration at the 2021 trials, and that this had been built into the second phase of trials as an assessment area.

7.7 The Identity Security team were working on the next phase of fingerprint smartphone app testing. The BFEG were informed that, before a second round of large-scale feasibility trials with members of the public was held, benchmark testing would take place to understand how the suppliers’ apps had developed since the 2021 feasibility trials, while also assessing new entrants. Progression to large-scale trials would only take place once a set of success criteria had been met.

7.8 The BFEG noted that careful planning would be needed when collecting demographic data for future technology testing.

7.9 The BFEG questioned what successful equality testing would look like. The Identity Security representative shared that increased age and reduced mobility were the 2 factors observed in the 2021 trials that resulted in decreased usability. As a result, the Home Office had set a target operating point for all capabilities to meet requirements across all demographics.

7.10 The BFEG requested that any points of concern encountered by the Identity Security team be shared, so that the BFEG could provide insight and support the ethical design of the trial.

8. Gait analysis

8.1 A speaker from the School of Electronics and Computer Science at the University of Southampton joined the meeting to present to the group on gait analysis.

8.2 The presentation highlighted that gait technology could identify an individual based on their shape and how they move. Gait analysis could be useful for identification in instances of obscured view or low-resolution video footage.

8.3 The speaker noted that, as the field progressed, there was an increasing shift from handcrafted approaches (using small databases and a mathematical chain approach) to deep learning (large datasets where knowledge is expressed within the data). Deep learning provided higher performance than more traditional approaches.

8.4 Gait technology could be combined with soft biometrics (recognition from human descriptions), such as age or height, to develop labels. These labels, combined with a database of images, could be developed into a ranked structure resulting in recognition. The main advantage of this technology was that it could identify subjects from low-resolution (such as long-distance) and obscured images.

8.5 The BFEG questioned whether deep learning was less explainable than the traditional handcrafted approach. The speaker agreed that deep learning was currently less explainable and noted that few researchers add error bars (which show the variation of the data around the true or average value) to deep-learning outputs. The BFEG were informed that studies were underway which aimed to explain these results.

8.6 The BFEG asked if any specific research had been produced on the use of gait analysis in the scanning of crowds. The speaker responded that this was a challenge for the scientific community, as consent for data recording within a crowd would be difficult to obtain. The speaker noted that, as identification requires an identifying label (for example a name), there is an argument that any given individual within a crowd remains anonymous.

8.7 The BFEG queried the level of data in the forensic gait community on the repeatability and reproducibility of measurements, such as false positive and false negative rates, and whether there was enough data in the databases to determine frequencies. The speaker noted that there were performance datasets; however, gait recognition was not yet as established as, for example, facial recognition. The group debated the evidence available supporting the field of forensic gait analysis, and there was a discussion regarding the Royal Society/Royal Society of Edinburgh report. At the end of the discussion, the speaker noted that, as stated in the report, forensic gait analysis concerns “the direct visual comparison of 2 or more video recordings to establish whether they are of the same individual” and is not focused on biometrics. It was suggested that a follow-up debate could be held.

8.8 The speaker noted that gait recognition could pose ethical issues in relation to identifying medical concerns in individuals, but commented that, in their view, there were no ethical concerns specific to gait recognition that did not also apply to other biometric recognition tools.

8.9 The BFEG asked how, for example, the police could safeguard against an individual who might attempt to alter their gait. It was noted in response that there had not been much research in this area.

9. Workplan update and working group review

9.1 The group reviewed the workplan and updates were shared for each of the working groups.

9.2 A new working group was to be established to take on the new commission to provide advice on the ethical issues in the use of novel biometric technologies, including gait and voice recognition.

9.3 It was queried how the new commission would interface with the work of the complex datasets working group (CD WG). It was agreed that there would be overlap and that it would be of benefit to maximise expertise by forming a working group whose members sit on a range of other working groups.

9.4 The BFEG discussed that it would be beneficial to have ‘use cases’ to review as part of the commission in order to narrow the scope.

Action 4: Secretariat to confirm with the policy sponsor the parameters of the commission and desired outcomes.

9.5 The group agreed a new working group would be established to work on the new commission to provide advice on AI governance.

Action 5: Secretariat to develop terms of reference for the new AI working group and share these with members who have expressed interest.

10. AOB

10.1 The next meeting was scheduled for December 2022.

Annex A – List of attendees and apologies

Present – all via videoconference

  • Mark Watson-Gandy – Chair
  • Louise Amoore – BFEG member
  • Simon Caney – BFEG member
  • Ann-Maree Farrell – BFEG member
  • Richard Guest – BFEG member
  • Nina Hallowell – BFEG member
  • Julian Huppert – BFEG member
  • David Lewis – BFEG member
  • Sarah Morris – BFEG member
  • Nóra Ni Loideain – BFEG member
  • Niamh Nic Daeid – BFEG member
  • Charles Raab – BFEG member
  • Thomas Sorell – BFEG member
  • Denise Syndercombe Court – BFEG member
  • Peter Waggett – BFEG member
  • Juliette Verdejo – FINDS Unit, HO
  • Fraser Sampson – Biometrics and Surveillance Camera Commissioner
  • Alex MacDonald – Data and Identity Unit, HO
  • Jordan Webster – Border Security and Identity Unit, HO
  • Sam Barnett – Border Security and Identity Unit, HO – observer
  • Mayuris Davdra – Border Security and Identity Unit, HO – observer
  • Bristi Gogoi – Border Security and Identity Unit, HO – observer
  • Professor Mark Nixon – Southampton University
  • Nadine Roache – BFEG Secretariat, HO
  • Jennifer Guest – BFEG Secretary, HO
  • Kimberly Stiff – BFEG Secretariat, HO

Apologies

  • Liz Campbell – BFEG member
  • Mark Jobling – BFEG member

Annex B – review of open actions from previous meeting

October 2021

Action 2: (FINDS to share the Data Protection Impact Assessment (DPIA) for the International Exchange Policy with the BFEG if possible). Action ongoing.

Action 7: (Secretariat to develop a template to provide to presenters based on the BFEG ethical principles). Action ongoing.

March 2020

Action 3: (Complex Datasets working group to produce general guidance on ethical issues in binary classification systems). The secretariat was working with relevant stakeholders to identify useful areas for general guidance. Action ongoing.