Impact assessment

Public Sector Equality Duty assessment for Data Protection and Digital Information (No.2) Bill

Updated 20 December 2023

Introduction

1. The UK’s current data protection framework is set out in the UK GDPR[footnote 1] and the Data Protection Act 2018 (DPA 2018). The legislation protects the rights of individuals by setting out key principles that all organisations must follow when processing personal data. It also obliges organisations to comply with other specific rules, for example, when transferring personal data abroad, reporting data breaches to the Information Commissioner’s Office (ICO) or demonstrating accountability in relation to the fundamental principles. It also sets out certain rights individuals have in respect of their data, for example: seeking access to it; objecting to its processing; and seeking its rectification or erasure. The legislation is enforced by the Information Commissioner, whose powers of investigation and enforcement are also set out in the DPA 2018.

2. In some circumstances, the general rules on the processing of personal data set out in the UK GDPR and the DPA 2018 are supplemented by the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). PECR builds on the general rules by setting out additional and more specific rules in relation to electronic communications and other matters. For example, it sets out rules on marketing by electronic means, including marketing texts and emails.

3. The provisions within this Bill have been developed to deliver a data protection framework that is pro-innovation and pro-growth while maintaining high standards of data protection for individuals.

4. The provisions will make additions and changes to the UK’s data protection framework to:

a. Remove barriers to responsible innovation using personal data;

b. Enable the use of personal data to deliver better public services;

c. Reduce burdens on businesses and deliver better outcomes for people;

d. Boost trade and remove barriers to personal data flows;

e. Reform the Information Commissioner’s Office.

5. This document records the analysis undertaken by DSIT and other contributing government departments to enable Ministers to fulfil the requirements placed on them by the Public Sector Equality Duty (PSED) as set out in section 149 of the Equality Act 2010. The PSED requires the Minister to pay due regard to the need to:

a. eliminate unlawful discrimination, harassment and victimisation and other conduct prohibited by the Act;

b. advance equality of opportunity between people who share a protected characteristic and those who do not; and

c. foster good relations between people who share a protected characteristic and those who do not.

6. The characteristics that are protected in relation to the PSED are age; disability; gender reassignment; pregnancy and maternity; race; religion or belief; sex; and sexual orientation. Marriage and civil partnership are also protected characteristics under the Equality Act but they are not covered by the duty.

7. The analysis for the data reform provisions within the Bill is split into two sections. Section 1 covers the PSED analysis undertaken prior to the publication of the ‘Data: a New Direction’ consultation; Section 2 covers the updated PSED analysis undertaken following consideration of consultation responses.

8. The Data Protection and Digital Information (No.2) Bill also contains the following measures separate from data reform; analyses by the relevant teams are included in this document:

  • Section 3 - Changes to Part 3 and Part 4 of the Data Protection Act 2018 - enhancing law enforcement use of data through improved consistency across data regimes; maintaining high data protection standards that preserve confidence in the use of data for law enforcement and national security; and reforming biometrics legislation to simplify the oversight framework for police use of biometrics and overt surveillance, and to introduce a statutory code of practice. These measures are led by the Home Office.
  • Section 4 - Digital Identity - making digital identities as trusted and secure as physical documents such as passports or driving licences by establishing robust governance, including independent accreditation and certification, and enabling data checking against more trusted datasets. These measures are led by the Digital Identity team in DSIT.
  • Section 5 - Smart Data - enabling delegated access of an individual’s data (e.g., held by communications or financial product providers such as BT, Nationwide) through third party intermediaries to enable services such as personalised market comparisons, automatic switching to better deals and wider, cross-sector user-centric control of data. This measure is led by the Department for Business and Trade.
  • Section 6 - Open Data Architecture - introducing new enabling powers for the Secretary of State to prepare and publish standards for IT products and services used in the health and adult social care sector in England, and requiring the suppliers of those products to comply with these standards. This measure is led by DHSC.

Data reform

Summary of policy

9. The data reform proposals will update and simplify the UK’s data protection framework, including the role of the Information Commissioner’s Office, underpinned by a focus on protecting individuals’ data rights and generating societal, scientific and economic benefits from a future data protection regime.

10. On data reform the Data Protection and Digital Information (No.2) Bill will include proposals to:

a. Remove barriers to responsible innovation - We will support innovators and enable greater data access and sharing in ways that preserve public trust. For example, through clarifying rules on use of AI technology to facilitate scientific breakthroughs.

b. Reduce burdens on businesses and deliver better outcomes for people - We will reform the current UK GDPR framework, which unnecessarily prioritises process and high volumes of paperwork over the effective protection of personal data.

c. Boost trade and remove barriers to data flows - We will create an autonomous UK international transfers regime which supports international trade and eliminates barriers to cross-border data flows, making the most of post-Brexit freedoms to establish and expand international data flows.

d. Deliver better public services - We will permit better and more innovative use of data by government and other public bodies in the public interest, by providing clarity on how personal data can be shared and extending existing data sharing powers where required.

e. Reform the Information Commissioner’s Office - We will free up the ICO’s resources by shifting its focus away from high-volume, low-value work towards more upstream work, supporting organisations to innovate responsibly and solve problems before they occur, and helping them comply with the law.

Section 1 - Evidence and analysis (Pre-consultation analysis)

11. This section of the document outlines the PSED analysis undertaken by DCMS prior to the launch of the data reform consultation. This analysis of the equality impacts of specific data reform proposals in the Bill follows the same order as the consultation paper. The consultation paper is split into the following five main chapters: (1) Removing barriers to responsible innovation; (2) Reducing burdens on businesses and delivering better outcomes for people; (3) Boosting trade and removing barriers to data flows; (4) Delivering better public services; (5) Reforming the Information Commissioner’s Office.

Chapter 1 - Removing barriers to responsible innovation

12. This section of the consultation paper invited views on addressing aspects of the legislation that can create barriers to responsible innovation, particularly in the area of research. The specific proposals are listed below and discussed in more detail in the consultation paper.

13. This chapter of the consultation focussed on the following proposals:

Proposal 1: Creating a single legislative chapter for research

Proposal 2: Incorporating a clearer definition of ‘scientific research’ into legislation

Proposal 3: Incorporating broad consent for scientific research into legislation

Proposal 4: Providing more clarity on how personal data can be further processed for scientific or historical research purposes, archiving in the public interest or statistical purposes

Proposal 5: Creating a limited, exhaustive list of legitimate interests for which businesses can use personal data without applying the balancing test; for example, processing of personal data necessary for the purposes of ensuring bias monitoring, detection and correction in relation to AI systems would constitute a ‘legitimate interest’ under Article 6(1)(f)

Proposal 6: Making it clear that the existing derogation in paragraph 8 of Schedule 1 to the Data Protection Act 2018 can be used for processing of sensitive personal data for the purpose of bias monitoring, detection and correction in relation to AI systems

Proposal 7: Creating a new condition within Schedule 1 to the Data Protection Act 2018 which specifically addresses the processing of special category data for the purpose of bias monitoring, detection and correction in relation to AI systems

Proposal 8: Clarifying that further processing for an incompatible purpose may be lawful when based on a law that safeguards an important public interest, or when the data subject consented initially or has re-consented

Proposal 9: Bringing UK GDPR Recital 26 onto the face of the legislation

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

14. The government has considered to what extent the proposals in this chapter will help to eliminate discrimination, harassment, victimisation or any other conduct prohibited by the Equality Act 2010 against individuals with protected characteristics.

15. The government is of the view that proposals 5 and 7, which seek to make it easier for organisations to process personal data for the purpose of bias monitoring, detection and correction in relation to AI systems, will actively help to reduce discrimination. The Centre for Data Ethics and Innovation has noted that “the algorithmic systems we use have the potential to amplify, accentuate and systemise our biases on an unprecedented scale, all while presenting the appearance of objective, neutral arbiters.”[footnote 2] The Joint Committee on Human Rights recognised similar issues in chapter 5 of its 2019 report on ‘The Right to Privacy (Article 8) and the Digital Revolution’.[footnote 3] The government considered the views of consultation respondents before any legislative proposals were finalised, as discussed further in the ‘monitoring and evaluation’ section of this document.

16. Creating a new condition for processing sensitive personal data under Proposal 7, and, albeit to a lesser extent, Proposal 6, may result in a disproportionate increase in the processing of personal data of individuals with protected characteristics. The more an individual’s personal data is processed, the greater the likelihood of intrusion into their private and family life (Article 8 of the European Convention on Human Rights) and the greater the risk of a breach involving their personal data. However, the purpose of this proposal is to support organisations to monitor harmful bias and eliminate discrimination, so any detrimental impact is considered justifiable. Furthermore, the more representative the data that an AI system is trained on, the more the system will reflect a broader cross-section of the population. This in itself is likely to mitigate bias and discrimination, and the government assesses that unjustified discriminatory AI systems are likely to generate greater adverse impacts on individuals with protected characteristics than a data breach or a potentially greater level of intrusion.

17. The other proposals in the chapter are unlikely to help to eliminate discrimination, but the government has nonetheless considered whether any engage the equalities legislation through potential indirect effects on individuals with protected characteristics.

18. Proposals 1-4 and 6-9 are principally designed to make the legislation simpler for organisations to comply with: by providing more focused guidance from the regulator, replicating recitals in the operative provisions of the legislation, and making existing practices more straightforward by bringing scattered existing legislation into one place. As the majority of these do not make significant changes to the current data protection legislation, the government’s view is that they will create no negative impacts on individuals with protected characteristics. However, proposal 3 (which enables scientific research to rely on broad consent for processing) could result in a shift away from other lawful grounds, such as legitimate interests, when processing data for research purposes. The existing legitimate interests ground for processing contains safeguards for individuals (particularly children), so the shift could have a disproportionately detrimental impact on some individuals in relation to the protected characteristic of age. The government will separately consider whether any additional safeguards need to be adopted under broad consent, in addition to the ability to withdraw consent that already exists.

19. Proposal 5 introduces a limited, exhaustive list of legitimate interest tasks for which organisations can use personal data without applying the balancing test (balancing the organisation’s interests against the individual’s rights). The aim of this proposal is to give organisations more confidence to process personal data without unnecessary recourse to consent. The government is of the view that some of the listed activities will help reduce acts of harassment and victimisation. For example, one of the listed tasks permits an organisation or an individual to report criminal acts or safeguarding concerns to the appropriate authorities as a legitimate interest, without weighing up whether their interests outweigh those of data subjects. This proposal should give organisations more confidence to take action and may help to protect individuals with protected characteristics (e.g. where the matters being reported involve harassment on the grounds of race, sexuality, religious belief or disability).

20. Other processing activities listed as legitimate interests in the Bill that will not require a legitimate interests assessment include: activities that help to reduce bias in algorithmic decision-making; activities that allow non-public bodies (such as mobile phone companies) to direct public health messages to particular groups in society; and activities that permit websites to use audience measurement cookies without consent to improve content for web users. Again, the government considers this could have positive impacts for particular groups if the algorithms being improved, communications being sent or websites being improved relate to individuals with protected characteristics.

21. The government is mindful that Article 6(1)(f) of the UK GDPR requires data controllers to have particular regard to the rights of children when relying on the legitimate interests lawful ground for processing data. It therefore proposes to maintain the balancing exercise in respect of children’s data, irrespective of whether a processing activity appears on the newly created list of legitimate interests. The government considers this will mitigate any indirect impact on children, but also sought views on further possible safeguards via the consultation process, as discussed further in the ‘monitoring and evaluation’ section of this document.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

22. The government has considered whether the proposals in this chapter remove or minimise disadvantages suffered by persons who share a relevant protected characteristic, take steps to meet the needs of persons sharing a protected characteristic, or encourage them to participate in public life. No impacts are identified in relation to most of the proposals.

23. For many of the same reasons provided in the previous section, the government views proposal 5 as providing positive outcomes in this area.

Fostering good relations between people who share a particular protected characteristic and people who do not share it

24. The government also considered whether the proposals outlined in this chapter of the consultation help tackle prejudice and promote understanding between different groups. No obvious impacts are identified in relation to these proposals.

Chapter 2 - Reducing burdens on businesses and delivering better outcomes for people

25. The UK remains committed to high standards of data protection, but the government wants a regulatory regime that delivers effectively without unnecessary burdens. The current legislation is based on a model that prescribes a series of activities and policies that organisations must adopt to be considered compliant. There is a risk that this approach encourages a largely one-size-fits-all approach from organisations regardless of the relative risk of their data processing activities and potentially discourages innovation in how to achieve the actual goals of using data responsibly and protecting individuals’ rights.

26. This chapter of the consultation paper focuses on the following proposals:

Proposal 1: Reform of the Accountability Framework so that it is based on a privacy management programme

Proposal 2(i): Understanding the impact of subject access requests - introduction of a cost ceiling, similar to the Freedom of Information Act 2000 fee regime, to address organisations’ capacity constraints

Proposal 2(ii): Understanding the impact of subject access requests - amending the threshold for response by applying provisions similar to those in the Freedom of Information Act 2000 relating to vexatious requests

(The consultation document also noted the previous approach under the Data Protection Act 1998, where a small nominal fee could be charged for processing a subject access request.)

Proposal 3: Making the ‘soft opt-in’ in relation to direct marketing activities available to a wider range of organisations

Proposal 4(i): Working with regulators and the telecoms companies to tackle nuisance and fraudulent calls - updating the ICO’s enforcement powers so that it can take action against organisations based on the number of unsolicited direct marketing calls ‘sent’, rather than just on calls which are ‘received’ and connected

Proposal 4(ii): Working with regulators and the telecoms companies to tackle nuisance and fraudulent calls - introducing a ‘duty to report’ on communication service providers so that they have to inform the ICO when they have identified suspicious traffic transiting their networks

Proposal 5: Bringing PECR’s enforcement regime into line with the UK GDPR and the Data Protection Act 2018

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

27. The government has had due regard to the need to eliminate discrimination, harassment, victimisation and other conduct prohibited by the Equality Act 2010 during the development of the proposals in this chapter. Some of the measures are specifically designed to protect the interests of individuals and may help to eliminate discrimination against individuals in society who share protected characteristics. For example, there is evidence that older people, people with mental health issues and people with other vulnerabilities (i.e. with the protected characteristics of age and disability) are disproportionately adversely affected by nuisance calls and more likely to fall victim to scams.[footnote 4] Proposals 4 and 5 will strengthen the ICO’s enforcement powers and telecoms companies’ obligations in relation to nuisance calls, to ensure they are effective, proportionate and dissuasive, and may therefore directly benefit these groups.

28. We have not identified the potential for the other proposals in this chapter to help to eliminate or reduce discrimination, victimisation or harassment, but the government has considered whether proposals might indirectly advantage or disadvantage individuals with protected characteristics.

29. Proposal 1 will reduce burdens on organisations by removing some of the unnecessary requirements they must put in place to demonstrate compliance with the data protection legislation. The government has considered whether any aspects of these proposals will engage the equalities legislation.

30. None of the sub-options set out in the consultation paper have an obvious differential impact on individuals with protected characteristics. However, the government has considered whether the removal of requirements to complete Data Protection Impact Assessments (DPIAs) in relation to high risk processing activities and to consult with the ICO if those assessments identify risks that cannot be mitigated might engage the equalities legislation.

31. Data processing can be riskier in nature if it involves “special categories” of data (which includes data about protected characteristics such as race, ethnicity, religious beliefs, sexual orientation). Requirements in relation to DPIAs and prior consultation with the ICO were designed to help organisations identify and mitigate risks, but the small number of consultations with the ICO suggests that organisations are not engaging openly or fully with the regulator, possibly due to fear of enforcement action.

32. To reduce burdens on organisations and increase meaningful collaboration between organisations and the ICO, the government proposes to remove mandatory requirements in relation to DPIAs and prior consultation with the ICO. Organisations will still be encouraged to undertake risk assessments and seek advice from the ICO when appropriate. The ICO will also be required to set out a list of processing activities that it considers to be high risk to ensure clarity for organisations. The government considers that these measures will help to mitigate any indirect impacts on individuals with protected characteristics and might lead to improved outcomes if constructive engagement with the ICO were to increase. The consultation paper sought views on these proposals, including their potential impact on individuals with protected characteristics.

33. Proposal 2 is focused on reducing the burden on organisations of high-volume and vexatious subject access requests. The government has considered whether the proposals could disadvantage individuals who make subject access requests. These proposals are not intended to create a barrier or undermine an individual’s right of access to personal data. The government recognises that this proposal may disproportionately impact those less able to express themselves due to age or disability, by resulting in their requests being treated as ‘disproportionate’ or ‘vexatious’; it assesses that this may be mitigated by the fact that a third party can raise a subject access request on their behalf. The government has also considered the need for a safeguard in the Data Protection Act 2018 similar to the one provided under section 16 of the Freedom of Information Act 2000 (for public bodies), which would help data subjects by providing advice and assistance to avoid discrimination, and is inviting views on the necessity of this safeguard. In addition, the consultation paper invited views on the equality impact of these proposals, on whether these or other safeguards are needed, and on whether there are any impacts the government has not identified. This is discussed in the ‘monitoring and evaluation’ section of this document.

34. Proposal 3 aims to extend the ‘soft opt-in’ for direct marketing purposes. The soft opt-in currently allows businesses to contact customers by email or text if they have formed a relationship with them via a previous purchase. This proposal will extend the soft opt-in to non-commercial organisations (such as political parties and charities). The extension will mean that individuals who have an established relationship with a particular charity or political party, and who have not opted out of receiving direct marketing material, could be contacted. The proposal could promote democratic engagement and a healthy third sector, which in turn could have valuable societal benefits. It is possible that there could be an indirect differential impact on particular protected groups. Charities may be contacting people with particular protected characteristics (e.g. race, sexual orientation, sex, age) due to the focus of their work. Although political opinion is not a protected characteristic, political parties may be processing special category data revealing political opinions. Some groups in society (e.g. older people, people with mental health issues) may be more concerned than others by emails, messages or texts from people they do not know well.

35. The government considers that any indirect equality impacts could be justified in light of the societal benefits outlined above, but in any event it has purposefully designed proposals with safeguards in mind to mitigate any risk. Safeguards include ensuring that non-commercial organisations are subject to exactly the same rules as commercial organisations in terms of respecting a person’s right to opt out and making it easy for them to do so. It also sought views through the consultation on whether there are any other safeguards that should be put in place.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

36. The government has considered whether the proposals in this chapter remove or minimise disadvantages suffered by persons who share a relevant protected characteristic, take steps to meet the needs of persons sharing a relevant characteristic, or encourage them to participate in public life or in any other activity in which participation by such persons is disproportionately low. No impacts are identified in relation to most of the proposals in Chapter 2, but for the same reasons as set out in paragraph 33 above, the proposals on reducing the burdens of subject access requests may disproportionately impact those less able to express themselves, for example due to age or disability.

Fostering good relations between people who share a particular protected characteristic and people who do not share it

37. The government also considered whether the proposals in this chapter of the consultation help tackle prejudice and promote understanding between different groups, but there are no obvious ways in which these specific proposals could meet these objectives.

Chapter 3 - Boosting trade and removing barriers to data flows

38. The government aims to create an autonomous UK international transfers framework which reflects the UK’s independent approach to the protection of personal data and supports its wider objectives for trade and security. There is an opportunity to explore more flexible, innovative and reliable mechanisms for protecting the cross-border transfer of personal data. This includes more effectively using the UK data adequacy framework, which empowers the government to permit free flows of personal data where it is satisfied with the data protection standards in another jurisdiction, and improving the alternative tools that facilitate protected personal data flows. This will help support domestic businesses to connect with foreign markets, while attracting investment from abroad by businesses which can have confidence in the UK’s responsible use of data.

39. The specific proposals in the consultation are summarised in the table below:

Proposal 1: UK adequacy - take a risk-based approach to adequacy assessments

Proposal 2: UK adequacy - amend the requirement to review adequacy regulations every four years

Proposal 3: UK adequacy - amend legislation to be clear that either form of redress (administrative or judicial) is acceptable, as long as the redress mechanism is effective

Proposal 4: Alternative mechanisms for international transfers of personal data - amend legislation to reinforce the importance of proportionality when using alternative transfer mechanisms

Proposal 5: Alternative mechanisms for international transfers of personal data - amend legislation to empower organisations to create or identify their own alternative transfer mechanisms in addition to those listed in Article 46 of the UK GDPR

Proposal 6: Alternative mechanisms for international transfers of personal data - create a new power for the Secretary of State to formally recognise new alternative transfer mechanisms

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

40. The government has considered to what extent these proposals will help to eliminate discrimination, harassment, victimisation and other conduct prohibited by the Equality Act 2010. The government’s view is that none of these proposals advantage or disadvantage any particular group with protected characteristics in a differential way, and they therefore do not engage the Equality Act. The government considered the views of consultation respondents before finalising the legislative proposals.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

41. The government has considered whether the proposals in this chapter remove or minimise disadvantages suffered by persons who share a relevant protected characteristic, take steps to meet the needs of persons sharing a relevant characteristic, or encourage them to participate in public life. No direct or indirect impacts are considered likely.

Fostering good relations between people who share a particular protected characteristic and people who do not share it

42. The government has also considered whether the proposals in this chapter of the consultation help tackle prejudice and promote understanding between different groups, but there are no obvious ways in which these specific proposals could help to meet these objectives.

Chapter 4 - Delivering better public services

43. The UK’s experience of fighting the COVID-19 pandemic has demonstrated the power of personal data used responsibly in the public interest, and the benefits of collaboration between the public and private sectors. Some of the challenges of collecting, using and sharing personal data to deliver better public services are well known and this section of the paper invited views on how some of the known legal barriers could be addressed. The government also recognises that maintaining public trust is essential if more personal data is to be used for the purposes of improving the delivery of government services. Much of the personal data processed by government departments and other public authorities is sensitive in nature and the public rightly expects it to be processed fairly, transparently and securely.

44. This chapter of the consultation paper included the following proposals:

Proposal 1: Extending data sharing powers contained in Part 5 of the Digital Economy Act 2017 - creating data sharing provisions to support businesses when trying to start up or applying for government grants

Proposal 2: Allowing private companies to rely on Article 6(1)(e) of the UK GDPR when processing personal data at the request of public authorities, to help them deliver public tasks

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

45. In developing the proposals in this chapter, the government has had due regard to the need to eliminate unlawful discrimination, harassment, victimisation and other conduct prohibited by the Equality Act 2010. It considers that proposal 2 engages the legislation via indirect effects on individuals with protected characteristics.

46. Proposal 2 could shift processing currently based on legitimate interests under Article 6(1)(f) of the UK GDPR (where children’s rights and interests are specifically considered) to processing based on public tasks under Article 6(1)(e) of the UK GDPR (where there is no such consideration). The protected characteristic of age may therefore be engaged by the proposal, but the government considers there are unlikely to be any adverse effects on children or other individuals with protected characteristics. That is because the circumstances in which non-public bodies could rely on Article 6(1)(e) will be very narrow. They will be limited to cases where public authorities have specifically requested non-public bodies to process data on their behalf and have identified their own legal basis for collecting and processing the data. Further safeguards are being taken forward; for example, non-public bodies carrying out the processing activity will not be allowed to continue to rely on Article 6(1)(e) once the task is complete, or to reuse the data for other purposes.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

47. The government has considered whether the proposals in this chapter remove or minimise disadvantages suffered by persons who share a relevant protected characteristic, take steps to meet the needs of persons sharing a relevant characteristic, or encourage them to participate in public life. No impacts are considered likely in relation to these proposals.

Fostering good relations between people who share a particular protected characteristic and people who do not share it

48. The government has also considered whether the proposals in this chapter of the consultation help tackle prejudice and promote understanding between different groups, but there are no obvious ways in which these specific proposals could help meet these objectives.

Chapter 5 - Reforming the Information Commissioner’s Office

49. The Information Commissioner’s Office is the independent supervisory authority with responsibility for monitoring and enforcing the application of data protection legislation in the UK. The Information Commissioner is accountable to Parliament and may be called to give evidence to the DSIT Select Committee. DSIT is the sponsoring government department of the ICO and the Secretary of State for DSIT is responsible for the ICO in Parliament.

50. The government will improve the legislative framework that underpins the ICO by setting new and improved objectives and a clearer strategic vision, improving accountability mechanisms, and refocusing the ICO’s activities away from handling a high volume of low-level complaints and towards addressing the most serious threats to public trust in data and inappropriate barriers to responsible data use. In the future, the ICO should devote more resources to support those organisations that want to innovate responsibly and tackle poor practices by those that do not meet the UK’s high standards for data protection. Mechanisms will also be put in place to ensure the ICO continues to work closely with other regulators to achieve better regulatory outcomes in digital markets. This will be done in relation to the government’s Digital Regulation Plan.[footnote 5]

51. This chapter of the consultation paper included the following proposals:

Proposal number Policy Name
1 Strategy, objectives and duties: Overarching objective
2 Strategy, objectives and duties: Duty to have regard to growth and innovation
3 Strategy, objectives and duties: Duty to have regard to competition
4 Strategy, objectives and duties: A duty for the ICO to cooperate and consult with other regulators
5 Strategy, objectives and duties: Introducing a power for the DSIT Secretary of State to periodically prepare a statement of strategic priorities
7 Governance Model and Leadership: Introducing legislation to establish an independent Board and a Chief Executive
8 Governance Model and Leadership: Establishing an appointments process for the Chair, Board and CEO
9 Governance Model and Leadership: Setting the Information Commissioner’s salary
10 Accountability and Transparency: Introducing a requirement for the ICO to develop and publish comprehensive and meaningful Key Performance Indicators
11 Accountability and Transparency: Mandating certain transparency requirements so that the ICO is mandated to publish key strategies and processes
12 Codes and Guidance: Giving the Secretary of State for DSIT the power to require the ICO to set up a panel of persons with expertise when developing complex or novel codes of practice and guidance
13 Codes and Guidance: Making impact assessments and enhanced consultation mandatory when developing complex or novel codes of practice and guidance
14 A more proportionate regulatory approach to complaints: Introducing a requirement for the complainant to attempt to resolve their complaint directly with the relevant data controller before lodging a complaint with the ICO
15 A more proportionate regulatory approach to complaints: Requiring data controllers to have a simple and transparent complaints-handling process in place to deal with data subject complaints
16 A more proportionate regulatory approach to complaints: Introducing criteria by which the ICO can decide not to investigate a given complaint.
17 Enforcement powers: A power to commission Technical Reports
18 Enforcement powers: A power to compel witnesses to interview
19 Enforcement powers: An amendment to the statutory deadline for the ICO to issue a penalty following a Notice of Intent

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

52. The government has been mindful of the need to eliminate discrimination, harassment, victimisation and other conduct prohibited by the Equality Act 2010 when developing the proposals in this chapter.

53. Most of the proposals in this chapter are unlikely to have a differential impact on any particular protected group, but the following proposals may engage the equalities legislation through indirect effects on protected groups:

a. Proposals 14 and 16, relating to creating a more proportionate regulatory approach to complaints;

b. Proposal 18 relating to ICO enforcement powers.

54. The impacts of these proposals, and possible mitigations are discussed in more detail below.

55. Proposals 14 and 16 aim to create a more proportionate regulatory approach to complaints. Proposal 14 introduces a requirement for the complainant to attempt to resolve their complaint directly with the relevant data controller before lodging a complaint with the ICO. This has the potential to adversely impact people who share protected characteristics (e.g. children, older people or people with disabilities) who might be less able or less confident to raise complaints with the data controller directly, and is therefore likely to engage equality legislation. The government considered that risk in developing the proposal and suggested safeguards in the consultation paper. Mitigation of these impacts may include ensuring the policy is combined with a series of exemptions, allowing, for example, children and vulnerable groups to proceed directly to the ICO with their complaint(s). Additionally, all data subjects will be permitted to proceed to lodge their complaint with the ICO if the data controller causes undue delay in responding to their complaint.

56. Proposal 16 proposes introducing criteria by which the ICO can decide not to investigate a given complaint. This policy has been carefully designed to avoid giving the ICO too much discretion not to investigate significant data protection complaints. The government sought views on these proposals and how these impacts may be mitigated. The government is also exploring additional safeguards for this proposal so that the ICO is always required to investigate complaints from certain data subjects who share protected characteristics (e.g. children or people with disabilities) to the extent necessary.

57. Proposals 17 - 19 on enforcement powers aim to extend the ICO’s powers where appropriate to ensure enforcement provisions are fit for purpose, i.e. that the regulator has tools appropriate to both promote compliance and to impose robust, proportionate and dissuasive sanctions where necessary.

58. It is possible that proposal 18, which states that the ICO should be able to compel witnesses to interview, may engage equality legislation. This is because it is a wide-ranging power with implications for individuals’ rights and freedoms. The granting of this power has been carefully considered with due regard to how it will affect individuals and to the circumstances in which it is designed to be used. Being called to interview by the regulator could potentially be intimidating for individuals, particularly for individuals in protected groups. The government is therefore looking to ensure that use of this power is limited to situations where it is necessary to an investigation and the information could not be elicited by other means. The government has developed provisions to ensure its use remains proportionate, and has set clear parameters to prevent misuse or overuse. For instance, it will apply only to individuals with a formal connection to the investigation, in most cases employees or former employees.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

59. The government has considered whether the proposals in this chapter remove or minimise disadvantages suffered by persons who share a relevant protected characteristic, take steps to meet the needs of persons sharing a relevant characteristic, or encourage them to participate in public life or in any other activity in which participation by such persons is disproportionately low. No impacts are identified in relation to most of the proposals in Chapter 5.

60. However, proposal 14 relating to creating a more proportionate regulatory approach to complaints potentially poses a risk in hindering equality of opportunity between people who share a particular protected characteristic and people who do not share it. This is because people who share certain protected characteristics (e.g. children, older people or people with disabilities) may feel less confident or be less able to first approach the relevant data controller with their complaint, prior to proceeding to the ICO. The government is working to create safeguards and exemptions which will prevent any barriers being created between the ICO and data subjects who share a particular protected characteristic, allowing them to proceed directly to the ICO with their complaint(s).

Fostering good relations between people who share a particular protected characteristic and people who do not share it

61. The government also considered whether the proposals in this chapter of the consultation help tackle prejudice and/or build understanding between different groups. There are no obvious impacts considered likely in relation to these proposals.

Section 2 - Monitoring and evaluation (Updated analysis post consultation)

62. The equality impacts of the data reform proposals have been given due regard throughout the policy development process and Ministers have received advice when required and appropriate. The ‘Evidence and analysis’ section of this document outlined the analysis undertaken ahead of publishing the Data: a New Direction consultation.

63. Following the close of the consultation the government considered the consultation responses carefully, including any comments on how specific proposals might affect individuals with protected characteristics.

64. In light of consultation responses the government updated its equality duty analysis on the following proposals:

Description of proposal Nature of indirect impact on groups with protected characteristics Proposed mitigation Updated analysis following consultation response
Reform of the accountability framework based on introduction of ‘privacy management programme’ - specifically the removal of the requirements for the completion of DPIAs to identify processing risks and for prior consultation with the ICO about any risks that cannot be mitigated The removal of prescriptive requirements to complete DPIAs and consult the ICO before embarking on high risk processing, including processing involving special category data, could mean that potential disproportionately detrimental effects of the processing for individuals with protected characteristics may not be identified. Under this proposal, organisations will still be required to have risk management processes in place. While these will allow greater flexibility than DPIAs, the underlying requirement for risk assessment and mitigation remains in place.

Replacing the mandatory requirement with a voluntary incentive to consult the ICO with regards to high risk processing is intended to encourage more dialogue with the ICO and a more collaborative approach.
Our pre-consultation analysis considered whether this proposal could have disproportionately detrimental effects for individuals with protected characteristics.

Concerns were raised by consultees that removing the requirement to undertake DPIAs will result in increased uncertainty and unmitigated impacts on individuals, particularly where the processing relates to groups which share protected characteristics or involves the use of special categories of data. However, under the PMP proposals, organisations will still be required to consider risk through the use of risk assessment tools and implementation of their risk-based privacy management programme, taking into consideration the volume and sensitivity of the personal data they handle. Therefore this in itself is likely to mitigate any risk of impacts on individuals with protected characteristics not being identified. The use of DPIAs will also remain a valid tool in achieving the new outcomes-focused requirement under the PMP proposal. The ICO will be expected to produce updated guidance on risk assessments and the use of impact assessments, and we will continue to work with them to ensure there is clear guidance.
 
Introduce a fee regime, similar to the Freedom of Information Act 2000 fee regime, to address organisations’ capacity constraints, which would introduce a cost ceiling, and amend the threshold for response by applying provisions similar to those in the Freedom of Information Act 2000 relating to vexatious requests.

The consultation document also noted the previous approach (Data Protection Act 1998) where a small nominal fee could be charged for processing a subject access request.
Reducing the burden on organisations of high-volume subject access requests and vexatious requests may disproportionately impact those less able to express themselves due to age or disability by resulting in their requests being treated as ‘disproportionate’ or ‘vexatious’.

Introducing a nominal fee creates an affordability threshold. This might affect the ability of some groups with protected characteristics to raise a subject access request if they do not have the means to pay (e.g. on account of age, disability etc.)
Third parties can raise a subject access request on behalf of an individual. We are considering whether a safeguard (similar to the one provided under Section 16 of the Freedom of Information Act (for public bodies)) is required to help individuals by providing advice and assistance to avoid discrimination, and we are inviting views on the necessity of this safeguard.

The consultation document noted that additional safeguards may be needed, and invited views on their necessity.

As for the nominal fee, to mitigate the affordability issue, the consultation asked for views on the desirability, and reasonable level of a fee. We also invited views from consultees on possible safeguards we should consider.
Subject Access Requests (nominal fee): Our pre-consultation analysis considered whether the re-introduction of a nominal fee for Subject Access Requests (SARs) could harm or disadvantage individuals who make these requests, and this was highlighted in a small percentage of consultation responses as having a potential impact on individuals with protected characteristics, particularly age and disability. The government will not re-introduce a nominal fee for subject access requests and will not proceed with introducing a cost ceiling. The government also considered whether introducing a similar safeguard in the Data Protection Act 2018 as the one provided under Section 16 of the Freedom of Information Act (for public bodies) to help data subjects by providing advice and assistance is necessary. The government deems the current duty to facilitate data subject rights under Article 12(2) UK GDPR and Section 52(6) of the DPA 2018 to be sufficient. Potential revised guidance from the ICO would further mitigate any impact on groups with protected characteristics. The government is satisfied that proposals in this area will not disproportionately impact groups with protected characteristics.

Subject Access Requests (cost ceiling): The key issue we have identified in relation to the Public Sector Equality Duty is that the cost ceiling could negatively impact vulnerable individuals or groups with protected characteristics, such as those with a disability or long-term medical condition, who may be more likely to submit requests of fundamental importance to their lives. Due to the complexity or high volume involved in complying with such requests, these may be more likely to reach the cost limit threshold, leaving those individuals unable to find out how their personal data is being processed. These proposals are not meant to undermine an individual’s right of access to personal data, and the government is not proposing to introduce a cost ceiling for SARs.

Subject Access Requests (advice and assistance): In the consultation, we also considered the need for a similar safeguard in the Data Protection Act 2018 as the one provided under Section 16 of the Freedom of Information Act (for public bodies) to help data subjects by providing advice and assistance, and invited views on the necessity of this safeguard. Following further policy development, we are no longer proposing to introduce a separate duty to assist as we are satisfied the current duty to “facilitate the exercise of a data subject’s rights” (UK GDPR Art 12(2) and s.52(6) DPA 2018) together with the possibility of revised guidance and/or a code of practice from the ICO will be sufficient. This duty may also mitigate any concerns in respect to people with protected characteristics being disproportionately impacted as a result of these proposals. We will no longer take this proposal forward.
 
Permitting organisations to place cookies on a user’s device for other legitimate purposes where the impact on the privacy of the user is likely to be minimal. Some people might have concerns about websites processing increased volumes of data without consent, especially if it relates to children or people with disabilities or mental health issues (i.e. individuals with the protected characteristics of age and disability) who are less able to raise concerns. The proposal has been purposefully designed narrowly to permit only low-risk processing activities without consent. More invasive data processing, such as sharing data with third parties and the micro-targeting of individuals for advertising purposes, would not be permitted by these proposals. Safeguards, including the anonymisation of personal data, restrictions on data sharing, and the right for users to opt out of the use of such cookies, will be required. Our pre-consultation analysis recognised that some people might have concerns about websites processing increased volumes of data without consent, especially if it relates to children or people with disabilities or mental health issues. Concerns were raised by some respondents about the importance of not undermining the Age Appropriate Design Code (AADC) standards, notably the need for a high level of transparency when children’s data is being collected. The proposals that the government will take forward (i.e. permitting audience measurement and some other non-intrusive cookies without consent) will be carefully designed with safeguards to protect the rights of individuals, such as limiting any information that is processed for audience measurement purposes to aggregate statistical information and not using the data for more intrusive purposes.
A move from an opt-in to an opt-out consent model for websites would only take place once Ministers are content that users have access to technology that supports them to effectively manage their preferences on how their data is processed.  
Removing consent requirements for all types of cookies This proposal had the potential to have a disproportionate impact on some groups with protected characteristics, if not designed carefully enough. For example, an individual with a gambling addiction who visits a betting website could have his/her data shared with thousands of other gambling websites in milliseconds if consent requirements are removed without alternative safeguards being in place. Similarly, we recognise that the public might be concerned if the proposals led to increased microtargeting of children with inappropriate advertising content. We have been mindful of these risks when designing the new proposals, and consent requirements will not be switched off until automated technologies are more widely available and sufficiently understood by web users so that they can set their online preferences at browser level. Even then, the changes will not apply to websites that are likely to be accessed by children. We will also ensure that if regulations are used in the future to switch off consent requirements for more intrusive types of cookies, consideration of relevant safeguards is baked into the design of the provisions. Therefore, this proposal is now less likely to have a disproportionate impact on groups with protected characteristics. The vast majority of respondents disagreed with removing the consent requirement for all types of cookies, particularly more intrusive varieties which collect personal data for the purposes of real-time bidding and the micro-targeting of advertisements. Many respondents argued that web users should be given clearer information about these types of cookies, so that they could exercise their right to reject them where appropriate.

There was also support for websites respecting individuals’ preferences set through their browser, software applications or device settings. Many respondents highlighted that it would improve user experiences and benefit trust in those using services, but some highlighted risks relating to healthy competition and queried whether different platforms and services would have the technological capability to provide legally compliant technology.

Following consideration of responses, the government intends to legislate to remove the need for websites to display cookie banners to UK residents. In the immediate term, the government will permit cookies (and similar technologies) to be placed on a user’s device without explicit consent, for a small number of other non-intrusive purposes. These changes will apply not only to websites but connected technology, including apps on smartphones, tablets, smart TVs or other connected devices.
 
Making the soft opt-in in relation to direct marketing activities available to a wider range of organisations, such as political parties and charities. Some groups in society (e.g. older people, people with mental health issues) may be more concerned than others by emails, messages, texts from people they don’t know well. We proposed safeguards in the consultation paper around ensuring that people have the opportunity to opt out of receiving communications. Organisations will be required to make it easy for individuals to opt out. We also invited views from consultees on whether there are other safeguards we should consider. Our pre-consultation analysis recognised that this proposal would mean that some people will receive direct marketing material that they would not have received previously. This reform could promote democratic engagement and a healthy third sector, which in turn could have valuable societal benefits. It is possible that there could be an indirect differential impact on particular protected individuals. Charities may be contacting people with particular protected characteristics (e.g. race, sexual orientation, sex, age) due to the focus of their work. Although it is not a protected characteristic, political parties may be processing special category data on the bases of political opinions. Some groups in society (e.g. older people, people with mental health issues) may be more concerned than others by emails, messages, texts from people they do not know well. Ministers received advice on this proposal on 4 February. To mitigate the risks identified here, this proposal has been designed so that non-commercial organisations are subject to exactly the same rules as commercial organisations in terms of respecting a person’s right to opt out and making it easy for them to do so.  
A more proportionate regulatory approach to complaints: Introducing a requirement for the complainant to attempt to resolve their complaint directly with the relevant data controller before lodging a complaint with the ICO Some groups in society (e.g. children and vulnerable adults) might be less able/ willing to take up complaints directly with the data controller. Specific safeguards were suggested in the consultation paper, including allowing children and vulnerable groups to proceed directly to the ICO with their complaint(s). Additionally, all data subjects would be permitted to proceed to lodge their complaint with the ICO if the data controller causes undue delay in responding to their complaint. We are no longer pursuing a blunt requirement for complainants to attempt to resolve their data protection complaint with the relevant data controller prior to lodging a complaint with the ICO. Rather, we will be pursuing the proposal to introduce criteria in legislation by which the ICO can decide not to investigate a given complaint (this proposal and associated PSED concerns are outlined in the next section). This will: i) enable the ICO to refuse to investigate a complaint if, in its opinion, the data subject has not made reasonable efforts to resolve it with the controller/processor first; ii) enable the ICO to refuse to investigate vexatious complaints. Concerns were raised by some respondents that requiring data subjects to complain to the relevant controller first would create a (real or perceived) barrier between complainants and the ICO. In pursuing this latter policy, we will also introduce safeguards to mitigate this risk (see next section).  
A more proportionate regulatory approach to complaints: Introducing criteria by which the ICO can decide not to investigate a given complaint. This will: i) enable the ICO to refuse to investigate a complaint if, in its opinion, the data subject has not made reasonable efforts to resolve it with the controller/processor first; ii) enable the ICO to refuse to investigate vexatious complaints. A decision not to investigate could have a disproportionate impact on some groups with protected characteristics, such as children and vulnerable adults. Also, some groups in society (e.g. children and vulnerable adults) might be less able/willing to take up complaints directly with the data controller. Specific safeguards have been proposed in the consultation paper; for example, the ICO could always be required to investigate complaints from data subjects who share protected characteristics to the extent necessary. Concerns were raised by some respondents that requiring data subjects to complain to the relevant controller first would create a barrier between data subjects and the ICO, and prevent data subjects from being able to exercise their rights to complain or seek redress. To mitigate these risks, we will combine this proposal with several safeguards. As such, the ‘reasonable efforts’ requirement for data subjects to resolve the complaint with the relevant controller/processor first will be discharged if: i) the data subject has given at least 30 days for the controller/processor to respond; ii) they receive a response they consider to be unsatisfactory; iii) a data controller has not provided the data subject with contact details to raise a complaint; iv) a data subject has attempted resolution with the controller but found the process too time-consuming; v) for any other reason in the ICO’s judgement, the complaint should be considered by the ICO directly.
Moreover, the ICO, in exercising its new discretion, will be required in legislation to consider the vulnerability of the data subject, in particular their age or other characteristics under the Equality Act 2010. We will also retain current statutory accountability mechanisms to ensure that the ICO does not use its discretion too freely with regard to data protection complaints. Article 78 of UK GDPR confers the right of judicial remedy where the supervisory authority ‘does not handle a complaint or does not inform the data subject within three months on the progress or outcome of the complaint’ and this will be maintained.  
Permitting processing of special category data for the purpose of bias monitoring, detection and correction in relation to AI systems This proposal is likely to lead to an increase in the processing of sensitive personal information and the potential for intrusion on individuals with protected characteristics. The purpose of this proposal is to support organisations to monitor harmful bias and eliminate discrimination, so any detrimental impact is thought objectively justifiable. In our pre-consultation analysis we considered how our proposal on creating a new condition for processing of sensitive personal data for bias monitoring and correction in relation to AI systems would impact the public sector equality duty under section 149(1) Equality Act 2010.

Our analysis of consultation responses was largely congruent with our pre-consultation analysis that this condition would lead to increased processing of sensitive personal data of individuals with protected characteristics.

The purpose of this proposal is to support organisations to monitor harmful bias and eliminate discriminatory outcomes, so any detrimental impact is considered justifiable on this basis. Furthermore, the more representative the data that an AI system is trained on, the more the system will reflect a broader cross-section of the population. This in itself is likely to mitigate bias and resulting discrimination against individuals with protected characteristics.

Finally, we are developing safeguards that reflect concerns raised by stakeholders that organisations would hold greater amounts of sensitive personal data. These include appropriate impact assessments, technical limitations on re-use, and the implementation of state of the art security and privacy preserving measures when processing for this purpose.
 
Future-proofing Article 22

This proposal seeks to clarify Article 22, making it more effective to ensure high-risk decisions are fully captured in its scope and appropriate safeguards taken.
The proposals around reforming Article 22 could potentially lead to an increase in automated decision-making, including profiling, which would result in an increase in the number of legal or similarly significant decisions made about individuals. The proposals to revise Article 22 to affirm its function as a right to be invoked will make clear that low-risk (beneficial) automated decision making is not captured within Article 22; this could also potentially lead to an increase in this type of processing. The government acknowledges that historically automated decision making has had a disproportionately detrimental effect upon people with protected characteristics, for example on the basis of race.

If left without further mitigation, this could perpetuate inequalities by increasing the number of decisions made about people based on their protected characteristics.
The government believes this risk is mitigated by the approach to bias mitigation set out in the national policy position on AI governance, which will be detailed in the White Paper later this year, and by the other AI reforms proposed to enable organisations to test AI-driven automated decision-making for potential biases and to ensure appropriate steps are taken to mitigate risks associated with bias. Controllers carrying out automated decision-making remain subject to duties under the Equality Act 2010 and the Human Rights Act 1998.

Consultation responses raised concerns over safeguards for automated decision-making, and particularly that removing the right to human review could have a disproportionately negative impact on people with protected characteristics, for example on the basis of their sex or race. A frequently cited example of this was the 2020 A-Level results algorithm, which respondents felt discriminated unfairly against pupils. Though precautions were taken to prevent bias based on protected characteristics, the profiles of those attending different schools inevitably led to outcomes differing based on pupils' protected characteristics, including race and sex. In this example, there was uncertainty over the appeal mechanisms pupils could access. Our proposals retain human review as currently required under Article 22 but will ensure that a data subject has access to clearer safeguards for any significant decision made without meaningful human involvement.
Setting out a list of processing activities where data controllers do not have to weigh up whether their interests outweigh those of data subjects before embarking on the processing activity

The current legitimate interests test requires data controllers to weigh up whether their interests outweigh the rights of data subjects, having particular regard to the rights of children. ICO guidance recommends that the outcome of the ‘legitimate interests assessment’ is documented. The UK GDPR and DPA 2018 already set a high bar for data controllers who wish to process data belonging to children. It is the government’s view that the existing safeguards are sufficient. In addition to the existing safeguards, clause 5 permits the Secretary of State to make changes to the draft provision should the need arise to place restrictions on certain processing activities.

The consultation proposed safeguards which would see the balancing test maintained in relation to children’s data.
Our pre-consultation analysis noted the potential indirect impacts of the legitimate interests proposals on groups with protected characteristics. This was supported by consultation respondents, who warned that removal of the balancing test could disadvantage children or vulnerable groups in society who are less able to complain to the regulator if their data has been used without careful forethought about the risks. Following consideration of consultation responses, Ministers are minded to adopt a cautious approach in implementation of the policy. The balancing test will only be removed for a narrow range of processing activities where there are strong public interest reasons for the processing to occur without delay. This is likely to include processing which is necessary for national security purposes, the prevention of crime and safeguarding purposes. Although the balancing test will be removed in these scenarios, other safeguards will remain - for example, controllers will still have to demonstrate that processing is necessary, they will still need to comply with the data protection principles in Article 5 of the UK GDPR and the conditions and safeguards in relation to the processing of special category data in Article 9 of the UK GDPR and Schedule 1 to the DPA 2018.
Allowing non-public bodies to rely on the ‘public tasks’ lawful ground instead of ‘legitimate interests’ when carrying out activities at the request of government departments

Currently, controllers relying on legitimate interests would have to weigh up whether their interests outweigh the interests of data subjects, giving particular regard to the interests of children. The consultation proposals have been designed so that non-public bodies would only be able to rely on public interest tasks in a narrow set of circumstances, e.g. where a public authority has specifically asked them to disclose information and has identified its own lawful ground for processing the data. Our pre-consultation analysis noted that this proposal could result in processing currently based on legitimate interests under Article 6(1)(f) of the UK GDPR (where children’s rights and interests are specifically considered) being undertaken under Article 6(1)(e) of the UK GDPR instead (where there is no such consideration). It is possible that this is permitted currently, but it is not entirely clear, so controllers may be relying on Article 6(1)(f). The protected characteristic of age may therefore be engaged by the proposal, but the government’s view remains that there are unlikely to be any adverse effects on children or other individuals with protected characteristics. That is because the circumstances in which non-public bodies could rely on Article 6(1)(e) would be very narrow. They would be limited to cases where public authorities have specifically requested non-public bodies to process data on their behalf and have identified their own legal basis for collecting and processing the data. Respondents to the consultation suggested that any new legislative provisions should make it clear that non-public bodies carrying out a processing activity under Article 6(1)(e) could no longer rely on that ground once the task was complete or reuse the data for other purposes.
They also said any new legislation should clarify whether a data subject’s rights to object should be directed to the non-public body or the public body. We will consider the most effective ways of clarifying these issues when we draft any new legislative measures.  
Defining the meaning of ‘substantial public interest’ for the purposes of processing special category data and/or adding new situations to the list of activities where processing of such data would be permitted

As above, any measure which permits processing of special category data in a wider range of circumstances is likely to engage the equalities legislation. Any new measures will need to be formulated carefully to protect the rights of individuals. We will need to consider evidence submitted via the consultation process to determine whether new provisions are justified. We received very few suggestions from respondents for changes to the substantial public interest exemptions in Schedule 1, but some faith organisations said the legislation could be clearer in relation to the processing of religious data for everyday administrative tasks (e.g. correspondence from a ‘Revd’ or a ‘Bishop’ which identifies a religious affiliation) or when compiling electronic prayer lists. Sporting governing bodies asked for greater clarity on the circumstances in which they could process special category data (e.g. about disability or health) when assessing the eligibility of athletes for sporting competition. One political party asked for greater clarity about the circumstances in which elected representatives can process data about a person’s political opinions for political campaigning purposes. It seems unlikely that legislation of this nature could eliminate discrimination, advance equality of opportunity or foster good relations between different groups, but it would be welcomed by the stakeholders who made these suggestions. If Ministers were minded to proceed with changes of this nature, they could be taken forward via a statutory instrument using powers in section 10 of the DPA 2018. A more detailed equality assessment would be completed if/when that legislation were pursued.
Changes to increase legal certainty for organisations and individuals processing personal data without consent for the purposes of democratic engagement. In particular, relevant organisations will no longer have to apply the ‘legitimate interests’ balancing test before processing personal data without consent

Although the measures could help to advance equality of opportunity and/or foster good relations between different groups by allowing relevant organisations and individuals to actively engage with those that do not always participate in the electoral process, there are also some risks to certain groups with protected characteristics (e.g. age and disability). For example, if the measures resulted in children, older or vulnerable people receiving more unwanted emails, texts or telephone calls from people they did not know, this could disproportionately increase anxiety and distress amongst groups with protected characteristics as compared to those without. We have been mindful of these risks when designing the proposals, which is why we are proposing to limit the use of these measures to registered parties, elected representatives, candidates who have been formally nominated, permitted participants in referendum campaigns and recall petitioners. We will be excluding third-party campaign groups and individuals who announce an intention to stand in an election but have not yet been formally nominated. We will also be making it clear that relevant organisations cannot contact children under the age of 14 for the purposes of democratic engagement in reliance on this provision, although alternative lawful grounds may be available in certain circumstances. (14 years is the age in some parts of the UK at which children are eligible to register to vote.) Also, if any child under the age of 18 objects to the processing, then continued processing would not be permitted. There were mixed views from respondents about giving political parties, elected representatives and others greater freedom to process personal data without consent for the purposes of democratic engagement.

Of the respondents that agreed, some highlighted the importance of clarifying the law to promote a healthy democracy and increase voter turnout.

Some of those who disagreed emphasised the need for continued case-by-case consideration to protect the rights of individuals, and voiced concern about the impact of removing the balancing test on levels of public trust.

In recognition of the mixed views, the government intends to pursue the proposal, but it will be designed with safeguards to limit the scope of the provision (as set out above).

We will also create a regulation-making power, so that additional conditions and safeguards could be added, if the provisions did not operate as intended. Any changes made via regulations would be subject to parliamentary scrutiny.
 
Secretary of State approval of ICO statutory codes and guidance

No10 requested the inclusion of a Secretary of State (SoS) approvals process for ICO statutory codes of practice and guidance. The SoS will be required to consider a final version of a statutory code submitted by the ICO and decide whether to approve it. Where the code is approved, it must then be laid before Parliament for final approval. Where the code is not approved, the SoS will be required to publish a statement setting out the rationale for not approving the code. The Commissioner must then revise the code in the light of the statement and submit the revised code to the Secretary of State for approval before it may be laid before Parliament. We have considered this policy in light of the PSED and we consider it unlikely that the proposal will have a differential impact on a protected group. As with the majority of the proposed reforms relating to the ICO, no impacts are identified in relation to advancing equality of opportunity and fostering good relations. The majority of respondents disagreed with this proposal, mainly highlighting concerns about the risk to the ICO’s independence. The government believes that the new requirements on the ICO to carry out impact assessments and to set up expert panels will ensure that codes and guidance procedures are more robust. However, the government also believes it is important for democratic accountability that such guidance is approved by the DSIT Secretary of State, as a final safeguard in this process, before the guidance is laid in Parliament.
Clarifying that the definition of processing for scientific research purposes applies to research carried out as a commercial activity

This proposal is principally designed to clarify that scientific research carried out as a commercial activity is included in the definition of processing for scientific research. It is not considered a widening of the definition as it currently stands, as already outlined in ICO guidance. Therefore, as the change is not viewed as making a significant change to current data protection legislation, the government’s view is that this proposal will create no negative impact on individuals with protected characteristics. Not applicable - the proposal is viewed as having no negative impact on individuals with protected characteristics; therefore, there is no requirement for specific mitigations. Engagement with stakeholders was undertaken after the consultation period to identify where burdens to businesses could be reduced further. Stakeholders sent in proposals, which were triaged by DSIT policy officials for feasibility. This proposal was selected as a result of that process.
Reducing record-keeping requirements to only require the minimum amount of information to be recorded

Broadening the exemption will reduce burdens on organisations by removing some of the paperwork they must put in place to demonstrate compliance with the data protection legislation. In particular, organisations who employ more than 250 individuals and whose processing is not likely to be high-risk will no longer have to keep a record. The government has considered whether any aspects of these proposals will engage the equalities legislation. The proposal to broaden the exemption and tie it to ‘high-risk’ processing does not have an obvious differential impact on individuals with protected characteristics.

However, data processing can be riskier in nature if it involves “special categories” of data (which include data about protected characteristics such as race, ethnicity, religious beliefs and sexual orientation). The government does not consider this to impact individuals with protected characteristics, as such processing is considered ‘high-risk’ in nature and will therefore still need to be recorded.
Not applicable - no mitigations are required, as we do not view this proposal as having a negative impact on individuals with protected characteristics. Engagement with stakeholders was undertaken after the consultation period to identify where burdens to businesses could be reduced further. Stakeholders sent in proposals, which were triaged by DCMS policy officials for feasibility. This proposal was selected as a result of that process.
List a series of activities, such as direct marketing or ensuring network and information security, that may be in the legitimate interests of organisations handling data

This proposal aims to clarify the types of processing activities that are likely to be in the legitimate interest of the data controller. Data controllers would still need to conduct the legitimate interest assessment (balancing test) under this proposal.

We intend to use the listed activities from the recitals to the GDPR (recitals 47, 48 and 49). In effect, this proposal simply confirms in law what has already been indicated in the recitals (as an interpretative tool), but it would, if implemented, give greater confidence to data controllers to use legitimate interests as a basis for processing if their activity is listed. Given that controllers will still need to carry out the legitimate interests balancing test, that the amendments will only indicate that the activities “may” constitute a legitimate interest, and that the activities are already listed in the recitals, we do not view this proposal as having a negative impact on individuals with protected characteristics.
Not applicable - no mitigations are required, as we do not view this proposal as having a negative impact on individuals with protected characteristics. Engagement with stakeholders was undertaken after the consultation period to identify where burdens to businesses could be reduced further. Stakeholders sent in proposals, which were triaged by DCMS policy officials for feasibility. This proposal was selected as a result of that process.
Ensuring businesses are able to continue to seamlessly use their pre-Bill existing transfer mechanisms - those which meet the required level of protection under the current framework - without a requirement for further checks or repapering

The matters considered in this amendment to the Data Protection and Digital Information (No.2) Bill, to allow for the seamless use of pre-Bill transfer mechanisms in Article 46(2) and (3) UK GDPR which meet the required level of protection under the current framework, do not raise any issues relevant to the public sector equality duty under section 149(1) of the Equality Act 2010, because the amendment does not result in any positive or negative impact on people who share protected characteristics. The aim of this amendment is to reduce burdens on all UK businesses relying on alternative transfer mechanisms to transfer personal data internationally. Not applicable - no mitigations are required, as we do not view this proposal as having a negative impact on individuals with protected characteristics. Engagement with stakeholders was undertaken after the consultation period to identify where burdens to businesses could be reduced further. Stakeholders sent in proposals, which were triaged by DCMS policy officials for feasibility. This proposal was selected as a result of that process.
Requiring controllers to consider, among other things, the extent to which a decision has been taken on the basis of profiling when establishing whether or not human involvement has been meaningful

This amendment will ensure controllers consider, among other things, the extent to which a decision has been taken on the basis of profiling when establishing whether or not human involvement has been meaningful. This will help to clarify the circumstances in which safeguards apply to significant decisions that are taken about individuals on the basis of profiling.

The Government acknowledges that automated decision-making about individuals, which often relies on profiling, may have a disproportionately detrimental effect on people with protected characteristics, for example on the basis of race, national origin, or sex.
The Government considers that the safeguards in Article 22, which provide data subjects with the right to be informed about, contest, and seek human review of automated decisions, will mitigate any potential negative impact. Further, controllers carrying out automated decision-making remain subject to duties under the Equality Act 2010 and the Human Rights Act 1998. Engagement with stakeholders was undertaken after the consultation period to identify where burdens to businesses could be reduced further. Stakeholders sent in proposals, which were triaged by DCMS policy officials for feasibility. This proposal was selected as a result of that process.

65. The government will continue to monitor the potential equality impacts of the data reform proposals as the Bill goes through both Houses of Parliament.

Section 3 - Changes to Part 3 and Part 4 of the Data Protection Act 2018

Introduction

1. The UK’s current data protection framework is set out in the UK GDPR and the Data Protection Act 2018 (DPA 18). Processing by law enforcement and the intelligence services is governed by Parts 3 and 4 of the DPA 18 respectively.

2. This document sets out the Equalities Impact Assessment undertaken by the Home Office as part of its policy development of proposals to update data protection law in relation to processing by law enforcement and the intelligence services. The Department for Digital, Culture, Media and Sport has undertaken a similar exercise for the proposals affecting other parts of the data protection legislation.

3. This assessment forms part of the Government’s analysis undertaken to enable Ministers to fulfil the requirements of the Public Sector Equality Duty (PSED) as set out in s.149 of the Equality Act 2010. The PSED is an ongoing duty which will continue to be monitored and reviewed. There are three limbs to the PSED as outlined below:

a. Eliminate discrimination, harassment, victimisation, and any other conduct that is prohibited by or under the Equality Act 2010;

b. Advance equality of opportunity between persons who share a relevant protected characteristic[footnote 6] and persons who do not share it;

c. Foster good relations between persons who share a relevant protected characteristic and persons who do not share it.

4. This document considers the proposals individually and assesses whether they meet the requirements of the PSED.

Summary table of Home Office proposals:

Policy Team Proposal Impacts/Discrimination Identified
Eliminate discrimination, harassment, victimisation Advance equality of opportunity Foster good relations
Part 3/4  Subject Access Requests (SAR)  None None None
Part 4  Co-operation across data boundaries  Indirect: Sex None None
Part 3  Introduce a national security exemption into Part 3  None None None
  Introduce a ‘Legal Professional Privilege’ Exemption  None None None
  Introduce a definition of ‘consent’ to Part 3  None None None
  Introduce a power to allow bodies representing Part 3 controllers and processors to produce ‘Codes of Conduct’  None None None
  Remove the need to log the ‘justification’ for consulting/disclosing data  None None None
  Introduce the ability to actively review automated decisions  None None None
International Transfers  Reforms of the Adequacy/Removal of the Review Period/Reform to the Appropriate Safeguards Mechanisms None None None
  Clarifying use of s. 76 of DPA. None None None
  Reform subsequent transfers provision (s. 78 of DPA) None None None
Biometrics Oversight Reform  None None None
General Register Office (GRO) Amend Births & Deaths Registration Act 1953 remove the requirement for paper birth & death registers moving to an electronic register None None None
International Law Enforcement Alerts Platform (I-LEAP) Statutory Code of Practice Indirect: Race, Age, Gender Reassignment, Sex Advance equality for all None

Reforms where no or limited impact on discrimination have been identified after considering the three limbs

66. We acknowledge that these reforms will have some impacts on data subjects more broadly, and where this is the case, we have considered the public interest for the measure. We do not however consider that they will adversely impact or discriminate against those groups with protected characteristics.

Parts 3 and 4

67. Subject Access Request (SAR): Currently all SARs under Part 3 (Law Enforcement) and 4 (Intelligence Services) need to be actioned within one month. Unlike the UK GDPR, Parts 3 and 4 of the DPA 18 do not recognise and allow for a proportionate time period for dealing with particularly complex requests. The proposal is to mirror an existing UK GDPR provision within Part 3 and 4 of the DPA 18 that permits a two-month extension to a SAR time period when a request is particularly complex. This will introduce greater consistency across the legislation. The proposed reforms do not seek to limit data subject rights, but simply better reflect the complexities of cases that are often received in a law enforcement or national security setting. By bringing the time frames for responding to SARs under Parts 3 and 4 in line with UK GDPR, data controllers will have a better opportunity to consider and respond to a request. Whilst it is acknowledged that this extension of time may lead to an extended delay to data subjects receiving responses to their SARs, it is considered reasonable as it better reflects the time that can be needed to ensure more comprehensive responses to complex requests. The evidence gathered to come to this conclusion came from discussions with operational stakeholders and legal teams.

Part 3

68. Introduce a ‘Legal Professional Privilege’ Exemption to Part 3: In the UK GDPR there is an exemption for processing that is subject to ‘Legal Professional Privilege’. This protects communications between lawyers and their clients and exempts the requirement for the controller to provide the data subject with information about processing and access to their data. However, this exemption does not apply to data processed under Part 3.

69. This proposal seeks to replicate that exemption in Part 3 in order to bring consistency and clarity between the regimes. Controllers and processors under Part 3 must currently rely on ad hoc restrictions contained within s.44 (Right to be informed) and s.45 (Right of access) DPA 2018 which need to be evaluated and justified even though the restriction will almost certainly always be applied, and stakeholders have indicated that this can be challenging. This amendment therefore seeks to simplify an existing process for information that would already be legally privileged.

70. Evidence received from stakeholders does not indicate that introducing an exemption for LPP is likely to lead to an increase in its use. We therefore consider it unlikely that it will discriminate, cause harassment or victimise any individuals belonging to a particular protected characteristic.

71. We do not consider that the introduction of a “Legal Professional Privilege” exemption will have any significant impacts, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

72. Introduce a definition of ‘consent’ into Part 3: The ‘consent’ of a data subject is an available legal basis for processing under Part 3 of the DPA 2018. There is a risk (albeit low) that it may be interpreted incorrectly in the absence of a clear definition. Therefore, this proposal seeks to replicate the UK GDPR definition of consent in Part 3, thereby bringing clarity for Part 3 controllers and consistency across the regimes. The evidence gathered from the Data Reform consultation responses indicated agreement with the standardisation of definitions across the data protection regimes.

73. Evidence received from the police, Her Majesty’s Revenue and Customs, the National Crime Agency and the Crown Prosecution Service indicates that consent is rarely used as a basis for processing under Part 3 and that providing a definition is unlikely to lead to an increase in its use. We therefore consider it unlikely that it will discriminate, cause harassment, or victimise any individuals belonging to a particular protected characteristic.

74. We do not consider that the introduction of a definition of consent into Part 3 will have any significant impacts, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

75. Introduce a power to allow bodies representing Part 3 controllers and processors to produce ‘Codes of Conduct’: In the UK GDPR, codes of conduct can be produced by representative bodies (for example, trade associations) to clarify the application of data protection laws in particular sectors; these are then approved by the ICO. There is no equivalent power under Part 3 DPA 2018, and stakeholders have indicated that this could be a useful tool to future-proof their data use. This proposal aims to extend the power to the law enforcement sector, enabling similar representative bodies to create codes of conduct for Part 3 under the purview of the ICO.

76. As the creation of Codes of Conduct will assist data controllers in their compliance with the data protection requirements under Part 3, we consider it unlikely that it will discriminate, cause harassment, or victimise any individuals belonging to a particular protected characteristic.

77. We do not consider that the introduction of the power to create Codes of Conduct will have any significant impacts, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

78. As this provision is designed to future-proof LEA compliance with Part 3 data protection requirements, any future Codes of Conduct will be required to comply with the public sector equality duty.

79. Remove the need to log the ‘justification’ for consulting/disclosing data: Currently, law enforcement agencies (LEAs) are required to keep logs of several processing activities that they carry out. This proposal seeks to remove the requirement to record a ‘justification’ in the logs of consultation and disclosure. It is technologically challenging for LEAs to automatically log a ‘justification’ as it requires human input to ‘justify’ the reason for accessing/disclosing data; and from feedback we have received, it holds limited value in maintaining accountability, especially in police misconduct investigations. This is because an individual misusing the database is unlikely to record an honest justification. An important element of our assessment of the impact of this measure is that we are only removing the ‘justification’ element, which is considered of low value; the other requirements to monitor compliance will remain in legislation.

80. The removal of this information is unlikely to have an effect on any individuals belonging to a particular protected characteristic, as information contained within logs of consultation and disclosure is designed to be used for internal review and audit. Since the information about data subjects will maintain its fidelity, we consider it unlikely that it will discriminate, cause harassment or victimise any individuals belonging to a particular protected characteristic.

81. We do not consider that the removal of the need to log the ‘justification’ will have any significant impacts, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

82. Introduce the ability to actively review automated decisions: Currently, LEAs are required to inform a data subject as soon as is reasonably practicable where a decision which produces an adverse legal effect is made which is based solely on automated processing. The purpose of this is to allow the data subject to then request that a human either reconsiders that decision or takes a fresh decision which is not based solely on automated processing.

83. The police have stated that this can cause them difficulties. For example, where automated decision making (ADM) is used to match an individual to a record on a dataset, the police must then either inform the data subject that they are under investigation (thereby tipping them off that they are of interest) or, alternatively, ensure that the decision is reviewed by a human (thereby removing the need to inform the data subject but running the risk that by the time the human review had been completed, it would be too late to act).

84. This proposal will provide an alternative option for LEAs: a human may actively review the decision after it has been taken, as soon as is reasonably practicable, thereby removing the need to notify the data subject at the time. It effectively builds in the remedy that the data subject would have had were they notified that a decision had been made based solely on automated processing. However, to ensure that the new power is only used when necessary, LEAs will only be able to use it if informing the data subject would engage one of the grounds set out under section 44(4) of the DPA (i.e. to avoid obstructing an official or legal inquiry, investigation or procedure etc.). This change will ensure that the rights of data subjects who are subject to ADM continue to be protected whilst improving the ability of the police to tackle crime, ensure public safety and bring offenders to justice. It contributes to the Home Office priority outcomes of reducing crime and the risk of terrorism to the UK and UK interests overseas. Whilst the use of ADM is likely to have both a direct and an indirect effect on individuals with a particular protected characteristic, the purpose of this provision is only to provide an alternative option for LEAs, allowing a human to actively review the decision after it has been taken as soon as is reasonably practicable, thereby removing the need to notify the data subject at the time.

85. Therefore, whilst the use of ADM could potentially discriminate against individuals, it is unlikely that this particular provision will discriminate against, harass or victimise individuals with a particular protected characteristic, because we are only addressing when, or if, a data subject is notified about an ADM decision, not the parameters of why they were subject to ADM.

86. We do not consider that the introduction of the ability to actively review automated decisions, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, will have any significant impacts regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

International transfers

87. Reform of adequacy decisions, removal of the review period and reform of the appropriate safeguards mechanisms: these proposals are led by DSIT, and the Home Office is mirroring selected reforms into the Data Protection Act 2018 (DPA). The aim is to codify existing practice on UK adequacy decisions, remove the legislatively prescribed review period for those decisions, and alter the alternative transfer mechanisms. As these reforms address the international transfers framework, we believe they will not result in any direct or indirect discrimination against any of the groups with protected characteristics. Data transferred internationally for law enforcement purposes via the reformed transfer mechanisms will not impact the rights of data subjects, regardless of their background, because in the law enforcement context all data subjects are treated equally. Additionally, while these reforms will allow data to be transferred more efficiently and in greater volume at a time, the data in question would already have been transferred under the current legislation. On that basis no impact was identified.

88. Clarifying use of s. 76 of the DPA: this reform should clarify that controllers can use s. 76 to transfer larger volumes of law enforcement data, decreasing the likelihood of non-compliance. Similarly to the other international transfer reforms, data transfers made on the basis of the reformed s. 76 would also have been conducted under the current regime, only at a different speed. We have not extended the groups of data subjects affected, nor changed the manner in which their data will be handled. Therefore, we conclude that this reform has no impact on persons with protected characteristics.

89. Reform of the subsequent transfers provision (s. 78 of the DPA): this proposal introduces only a very narrow exception where there is an immediate and serious threat to life and authorisation for onward transfer of data cannot be obtained in good time, but maintains the general principle that authorisation must be sought in all other circumstances. We do not assess the conditions imposed to be targeted at, or to disproportionately affect, any data subjects.

Biometrics

90. Oversight Reform: Changes in this area aim to simplify the oversight regime for the police use of biometrics and overt surveillance. The current oversight arrangements include several different, overlapping oversight bodies (the Information Commissioner, Investigatory Powers Commissioner, the Surveillance Camera Commissioner and Biometrics Commissioner). To help clarify and simplify the oversight functions, we aim to repeal the statutory roles of the Biometrics Commissioner (BC) and Surveillance Camera Commissioner (SCC), transferring the BC’s casework functions to the Investigatory Powers Commissioner (IPC) and leaving the ICO as the key regulator.

91. We assessed the application of the three limbs to the oversight reforms. Through this assessment, we did not identify any changes which would directly alter the way either the oversight bodies or the data controllers involved interact with individuals of different protected characteristics. The oversight bodies (BC, SCC, IPC and ICO) focus on law enforcement and primarily work with individual police forces, local authorities and other data controllers, rather than directly with the public. The oversight regime, which includes compliance with data protection, human rights and equality legislation, applies to all law enforcement use of biometrics and overt surveillance regardless of protected characteristics. To inform this assessment, we considered the impact of the different oversight bodies and their activities, particularly since the establishment of the Biometrics and Surveillance Camera Commissioners in 2012. In addition, we also took into account the views of various stakeholders and our consultation responses as evidence that these changes would not have any direct or indirect impact on individuals with protected characteristics.

GRO

92. Amend the Births and Deaths Registration Act 1953 to remove the requirement for paper birth and death registers, moving to an electronic register: The purpose of this proposal is to reform the way in which births and deaths are registered in England and Wales, moving from a paper-based system to registration in an electronic register (Registration Online, known as RON). Since 2009 all birth and death registrations have been captured electronically on RON in parallel with the paper registers, leading to duplication of processes. Once a registration is complete, the RON system generates the paper register page, which is signed by the informant(s) and the registrar and is the formal record of the event. The registrar puts the signed register page into a loose-leaf register and is required, initially, to keep the register safe in a box provided by the Registrar General. This proposal will remove the current duplication whereby births and deaths are registered both electronically and in paper registers, creating a more secure system for the maintenance of birth, still-birth and death registers and for the processing of data held in those registers. In developing the policy proposals, we consulted the National Panel for Registration.

93. Births and deaths will continue to be registered through personal attendance at the register office, by a person qualified to provide the information, in the sub-district in which the birth or death occurred. As registrars currently use the RON system to register births and deaths the informant attending to register a birth or death will not see any difference in service. We do not consider there are any equality issues with the removal of paper registers and the move to the RON system.

94. We do not consider there are any equality issues with the removal of paper registers in relation to discrimination, harassment, victimisation and any other conduct prohibited by the Equality Act.

95. The Bill includes a regulation-making power for the Minister to make regulations, which would have to be approved by both Houses of Parliament, to provide that if a person complies with specified requirements at the time of registering a birth or death they are to be treated as having signed the register in the presence of the registrar. This may include requiring a person to sign something other than the register or requiring a person to provide specified evidence of identity.

96. When the new affirmative regulations are drafted, following Royal Assent of the Bill, due consideration will be given to the requirements of the Equality Act when determining the specified evidence of identity an informant will need to provide when registering a birth or death.

97. Full consideration will be given when the affirmative regulations are drafted following Royal Assent to ensure the provisions in those regulations do not introduce any negative impacts in relation to discrimination, harassment, victimisation or any other conduct prohibited by the Equality Act.

Reforms where impacts or discriminations have been identified after considering the three limbs

Part 3

98. Introduce a national security exemption into Part 3: Although Part 3 does provide LEAs with the ability to restrict the rights of data subjects in order to protect national security, unlike Parts 2 & 4, it does not include a standalone exemption for processing national security data. Mirroring this exemption into Part 3 will help strengthen the police’s ability to tackle threats to national security. It will assist closer working between LEAs and the intelligence services and provide greater legal certainty for international transfers involving national security data.

99. The national security exemption is not targeted at particular groups; we therefore believe that it will not lead to any direct discrimination. However, a study of arrest data[footnote 7] shows that men are nearly seven times as likely to be arrested as women, so there is a potential for this group to be subject to indirect discrimination. We consider that any indirect discrimination caused is proportionate to the legitimate policy aims of keeping the public safe, bringing criminals to justice and maintaining national security.

100. We do not consider that the introduction of a national security exemption under Part 3, in relation to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, will have any significant impacts regarding the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard.

Part 4

101. Co-operation across data regime boundaries: Introduce a power that would allow the Secretary of State to issue a notice authorising a law enforcement body to process data under the Intelligence Services regime in Part 4 of the DPA 2018 in specified circumstances. 

102. We believe that this reform will not result in any direct discrimination amongst persons of a particular age, ethnic group, sex, or people who have undergone gender reassignment. The evidence gathered to come to this conclusion came from discussions with operational stakeholders and legal teams.

103. There is a risk of indirect discrimination, in that statistically, males are more likely to commit crime and as such will be more likely to have data processed by law enforcement bodies. As a result of this reform therefore, there may be an increased likelihood that males will have their data processed by law enforcement and/or intelligence bodies under this provision. Any indirect discrimination caused is proportionate to the legitimate policy aims of keeping the public safe, bringing criminals to justice and for the maintenance of national security.

104. Statistically, Asian/British Asian and Muslim individuals have been disproportionately affected by terrorism legislation relative to the total percentage of Asian/British Asian and Muslim individuals in the total population. However, this reflects the terrorist ideologies currently present within the UK (Islamic Extremism) and therefore the overrepresentation of some groups within scope of this policy will reflect the nature of terrorism in the UK at any given point. Therefore, any indirect discrimination on Religion or Belief is proportionate to the legitimate policy aims of keeping the public safe, bringing criminals to justice and for the maintenance of national security.

105. We believe that this reform will not result in any direct or indirect discrimination against groups with any of the following characteristics: disability, pregnancy and maternity, marriage or civil partnership, and sexual orientation. There is no association between an individual being in one of these groups and the likelihood of having their data processed by a law enforcement body and/or intelligence service.

I-LEAP

106. Statutory Code of Practice: the International Law Enforcement Alert Platform (I-LEAP) improves UK law enforcement agencies’ access to international alerting information. It does this by enhancing the use of INTERPOL and, in time, by creating new international data sharing arrangements with partner countries. I-LEAP will enhance equality of opportunity by enhancing UK Law Enforcement (UKLE) capabilities to keep the public safe, and to better protect those of particular religions or beliefs who have historically suffered disproportionately from crime.

Eliminate unlawful discrimination, harassment, victimisation and any other conduct prohibited by the Equality Act

107. We believe that this reform will not result in any direct discrimination against persons of a particular age, ethnic group or sex, or people who have undergone gender reassignment.

108. There is potential for indirect discrimination against young people (on the grounds of age), who are more likely to engage in criminal activity and therefore more likely to be subject to an international alert. This will also result in an increased likelihood of action being taken against them by UKLE. Any indirect discrimination caused is proportionate to the legitimate policy aims of keeping the public safe and bringing criminals to justice.

109. We believe that this reform will not result in any direct or indirect discrimination against groups with any of the following characteristics: disability, pregnancy and maternity, religion or belief, marriage or civil partnership, and sexual orientation. There is no correlation between an individual being in one of these groups and the likelihood of an alert being issued against them, with corresponding action taken by UKLE.

110. There may be a risk of indirect discrimination against people who have undergone gender reassignment due to the inclusion of historical subject details and images. Any indirect discrimination caused is proportionate to the legitimate policy aims of keeping the public safe and bringing criminals to justice.

111. There is a risk of indirect discrimination on the grounds of nationality for individuals whose country of origin either uploads disproportionately more alerts to INTERPOL than other countries, or has secured bilateral alert exchange with the UK via I-LEAP. A country is also disproportionately likely to issue alerts on its own nationals, which means UKLE have access to more alerts relating to certain nationalities and are therefore more likely to take action against individuals of those nationalities. Where the country in question’s population is largely composed of a particular ethnicity, this also carries a risk of indirect discrimination on the grounds of ethnicity. Those arrested and convicted of criminal offences are more likely to have an alert issued on them: ethnic minorities make up roughly 13% of the UK population but account for 23% of arrests, which creates a risk of indirect discrimination based on race. Any indirect discrimination caused has been considered and is proportionate to the legitimate policy aims of keeping the public safe and bringing criminals to justice.

112. There is a risk of indirect discrimination in that males are more likely to commit crime and, in turn, to have an alert issued on them; UKLE are therefore more likely to take action against them based on an alert. If no image of the suspect is provided, the sex may be wrongly assumed by the user based on other characteristics, such as name. Any indirect discrimination caused is proportionate to the legitimate policy aims of keeping the public safe and bringing criminals to justice.

Advance equality of opportunity between people who share a protected characteristic and people who do not share it:

113. No age group has specific needs that could be affected by I-LEAP, nor will any age group be placed at a particular advantage or disadvantage through I-LEAP as a result of being a particular age. Young people aged 12-17 are particularly vulnerable, and by providing more circulations on missing persons, I-LEAP will advance equality of opportunity by increasing the likelihood of their being found.

114. Those with disabilities do not have specific needs that could be affected by I-LEAP, nor will they be placed at a particular advantage or disadvantage through I-LEAP as a result of being disabled. Those with physical (visible) disabilities, mental health conditions or non-visible disabilities could be particularly vulnerable, so better provision and quality of alerts relating to missing or vulnerable people could help advance equality of opportunity. Disabled adults were significantly more likely to have experienced crime in the last year (23.1%) than non-disabled adults (20.7%). Data source: Disability and crime, UK – Office for National Statistics (ons.gov.uk). Therefore, if I-LEAP helps to keep the public safe by bringing criminals to justice, this advances equality of opportunity, as disabled people will feel safer and more likely to participate in public life.

115. I-LEAP advances equality of opportunity by helping to protect a group of people who are often victimised in hate crimes. Those who identify with a different sex to the one registered at birth accounted for 30.7% of victims in 2020, compared to 20.8% for those who identify as the same sex they were registered at birth. Data source: Crime in England and Wales: Annual Trend and Demographic Tables (ons.gov.uk). I-LEAP helps to keep the public safe by bringing criminals to justice and therefore advances equality of opportunity, as people who have undergone gender reassignment will feel safer and more likely to participate in public life.

116. Those who are pregnant do not have specific needs that could be affected by I-LEAP, nor will they be placed at a particular advantage or disadvantage through I-LEAP as a result of being pregnant. However, being pregnant may put women at increased risk of abuse, although the data available on the prevalence of domestic abuse amongst pregnant individuals is limited. Some studies suggest that as many as 40%-60% of pregnant women experience abuse during pregnancy; I-LEAP may therefore improve equality of opportunity.

117. I-LEAP will enhance equality of opportunity by enhancing UKLE capabilities to keep the public safe, and so help to better protect ethnic minorities, women, members of religious groups or belief systems and members of the LGBTQ+ community, who have historically suffered disproportionately from crime. With regard to race, black people are more likely to be victims of crime (13% of victims, despite a smaller share of the population). Data source: Statistics on Race and the Criminal Justice System 2018 (publishing.service.gov.uk). I-LEAP will increase the volume of information available to law enforcement from additional partner countries, making available alerts relating to nationalities that could otherwise be missed by other diffusion matching sources.

Foster good relations between people who share a protected characteristic and persons who do not share it:

118. With regard to age, disability, pregnancy and maternity, race, religion or belief, sex, sexual orientation, gender reassignment, and marriage and civil partnership, we have not identified any impacts as regards the need to foster good relations between people who share a protected characteristic and people who do not: the policy is neutral in this regard. No mitigations are required or identified for these categories at this time, as impacts will depend on the types of alerts countries want to share with us; for example, some may want to include vulnerable and missing people while others may want to focus on wanted people.

Section 4 - Digital Identity

Summary of policy

119. The Digital Identity measures provide people with an additional choice of how to prove things about themselves but do not remove any current methods or mandate a new approach. These measures have no identifiable adverse or negative impacts in relation to the first limb of the Equality Duty.

120. Instead, these measures ought to have a positive impact on promoting equality. As discussed below, without intervention it is likely that a digital identity market may develop which negatively impacts those with protected characteristics. The trust framework aims to tackle this by setting rules for trust-marked private sector organisations (who are not themselves necessarily bound by the public sector equality duty).

121. These initiatives can help advance equality of opportunity between people who share a protected characteristic and those who do not share it. For example, the fact that a digital identity can be based on a wider range of attributes could help someone with a disability prove something about that disability more seamlessly than is currently possible. Further iterations of the trust framework will contain information around sex and gender to give guidance on information sharing for people who have undergone, intend to undergo or are currently undergoing gender reassignment so they can limit excessive or unnecessary disclosure.

Evidence and analysis

122. Although a digital identity market already exists, it is not developed to its full potential and it presents some key flaws which may exclude minorities or those with protected characteristics. For example:

a. When setting up a digital identity, individuals have highlighted that the process usually requires a sequencing of tasks which can be difficult for people who are, for instance, digitally excluded or neurodiverse.

b. The digital identity system tends to be rather rigid, therefore excluding people whose circumstances differ from the expected social structure, such as those wishing to manage two bank accounts at the same bank from one mobile phone.

123. Research was commissioned from Royal Holloway, which informed consideration of the Equality Duty.

124. The digital identity measures, by promoting the growth of the digital identity market in an inclusive way, provide the opportunity to use a digital alternative, giving excluded individuals an easier option for proving their identity or eligibility. For example, those who cannot afford a passport may instead opt for a digital identity product based on their data or a ‘vouch’. A vouch is a declaration from someone who knows the user, which can be used as evidence of identity.

125. Inclusion is explicitly mentioned in the UK digital identity and attributes trust framework. Although signing up to the trust framework is not compulsory, organisations will need to be certified against it to prove that their products or services meet the UK Government requirements for checking government-held records of identity-related data.

126. The framework aims at improving inclusivity by:

a. Stating that all identity service providers should ensure no one is excluded due to their ‘protected characteristics’. There are exemptions to this, for instance restricting the availability of a product or service to an individual due to their age (e.g. businesses cannot sell alcohol to underage individuals).

b. Giving examples of ways organisations can increase inclusivity. For instance, when choosing a system for facial recognition, digital identity and attribute providers should ensure that the chosen system is built in an inclusive way. A system which was tested with a small sample of white men risks excluding users of other genders and ethnicities, therefore excluding minorities or those with protected characteristics from being able to use the service.

c. Requesting both public and private sector organisations to meet appropriate accessibility standards. For instance, those that operate in Wales should offer products and services in Welsh.

d. Requiring organisations that sign up to the framework to submit an annual inclusion report.

127. The inclusion report aims to provide information on the routes service providers offer users for acquiring a digital identity. The report will also allow organisations to provide evidence of their efforts to improve their level of inclusivity. This information will give the governance function an overview of all of the avenues available across the market, so that it can determine whether any intervention is needed to encourage a diversity of avenues across the market. The government will not mandate that organisations collect information solely for the purposes of reporting.

Decision making

128. On 10 March 2022 the government published its response to the digital identity and attributes consultation. Alongside this response, the government published a De Minimis Assessment (DMA), assessing the economic impact of the measures set out in that consultation. The DMA contained an Equalities Impact Assessment and analysis of the interaction between those measures and the public sector equality duty. As set out above, there are no negative public sector equality duty concerns inherent in these proposals and several positive impacts these interventions may have in the advancement of equality of opportunity have been identified.

129. On the basis of this analysis, Ministers decided to publish the consultation response, De Minimis Assessment, and to proceed with the digital identity measures.

Monitoring and evaluation

130. The legislation requires the Secretary of State to prepare and publish an annual report on the operation of the digital identity measures. Part of this report will deal with inclusion and the public sector equality duty will be duly considered.

Section 5 - Smart Data

Explanation of the policy

131. Smart Data is the secure and consented sharing of customer data with authorised third-party providers. These providers then use this data to provide innovative services for the consumer or business, such as more efficient switching and account management, for example via account aggregation. This saves time, money and effort for customers who can more easily find and choose better-suited deals.

132. The overarching objective is to improve data portability between suppliers / service providers, customers, and relevant third parties in order to:

a. Help overcome information asymmetry between suppliers and customers.

b. Empower customers to make better use of their personal data, e.g., enabling accurate tariff comparisons and providing access to better deals.

c. Supercharge competition to ensure all customers will benefit from lower prices and higher quality goods and service delivery.

d. Unleash innovation, providing new services in and across the sectors to help consumers save and manage their money and services.

133. Open Banking, a live Smart Data scheme enabled via an Order from the Competition and Markets Authority (CMA),[footnote 8] has already had strong user uptake: as of February 2022 Open Banking had over 5 million regular consumer and business users.[footnote 9] It took 10 months to grow the number of users from 1 million to 2 million in 2020, whereas it took just four months to grow from 4 million to more than 5 million. This demonstrates the increasing appetite for services to move, manage and make the most of customers’ data and money. It also demonstrates the gradual growth of a scheme, a trend that could be expected if Smart Data is implemented in other sectors.

134. The Smart Data provisions look to build on the powers conferred by the Enterprise and Regulatory Reform Act (ERRA) 2013. These powers were enacted to mandate Smart Data schemes where voluntary schemes did not readily emerge.

135. The ERRA 2013 powers are no longer sufficient to enable effective Smart Data schemes. Specifically, ERRA powers do not include several key technical provisions that could underpin a scheme. They do not:

  • cover product and performance data, which may include valuable information on tariff prices, or customer ratings
  • allow for sub-delegation of rulemaking, which is necessary in particular for IT and security related matters as Smart Data schemes will function in a fast-paced IT environment with a consequent need for standards, specifications and technical requirements which can be regularly changed to keep pace with IT
  • require the collation and retention of data, which may result in data holders not keeping the data that the Smart Data scheme intends to be disclosed
  • provide backstop powers to regulate the use of data by recipients – it is not intended to impose significant requirements on use of data, but it is prudent to have the power to do so in order to ensure compliance with the UK GDPR or to provide similar protections for sensitive data which does not fall under the UK GDPR
  • have adequate charging and spending provisions to ensure schemes are self-funding, minimising the cost of Smart Data schemes to the taxpayer
  • have adequate provisions relating to enforcement and redress to ensure customers are sufficiently protected; the powers in ERRA are largely limited to orders or notices requiring compliance which may not be a sufficient deterrent to infringements

136. This analysis builds on the Smart Data Impact Assessment, which has been signed off and cleared by Minister Scully. This Impact Assessment has been scored Green (fit for purpose) by the Regulatory Policy Committee.

137. The key Public Sector Equality Duty (PSED) issue that arises from these proposals is the risk that not all demographics will benefit from Smart Data. For example, the less digitally engaged may not use Smart Data enabled services. Categorised by the subsections of section 149 of the Equality Act 2010, the key impacts and mitigations are:

a. Eliminating unlawful discrimination, harassment, victimisation, and other conduct prohibited by the 2010 Act

i. The impacts of Smart Data schemes are assumed to be generally positive. However, as people with certain characteristics may benefit less from, or even be disadvantaged by, a digital proposal, there is a risk that the benefits of Smart Data may not be spread evenly across all groups in society. This could lead to discrimination against groups not using Smart Data, for example price discrimination.

ii. To mitigate this for each individual scheme, departments making secondary legislation should undertake sector-specific analysis of demographic information to understand which groups in a sector are least engaged and most at risk of digital exclusion, and should complete PSED assessments showing regard for the regulations’ impact on equality. In addition, existing UK GDPR protections should mitigate the misuse of data, and the Secretary of State will need to meet statutory preconditions for secondary regulations, which in practice will mean having regard to (and seeking to minimise and mitigate) the risk of discrimination against people who do not use Smart Data tools or services, and improving competition.

b. Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

i. Smart Data is expected to support consumers with protected characteristics, especially in cases where protected characteristics link with characteristics of financial vulnerability. Not only is this support expected to occur through dedicated use cases, but Smart Data is also expected to improve general engagement with markets and increase customer empowerment. This could go some way to relieving the £3.4 billion a year ‘loyalty penalty’, a harm that disproportionately affects certain groups within the race, age and disability protected characteristics. Recent Open Banking adopter profiles found that many users tend to be less financially confident than the national population: 27% rate themselves as having low financial confidence.[footnote 10]

ii. As a mitigating measure to ensure Smart Data advances equality of opportunity, BEIS plans to commission independent research to identify the different areas of vulnerability and how Smart Data could benefit people affected by them. Further work will allow schemes to be designed in a way that best achieves the objectives of our legislation for all customers and minimises any relative disadvantages.

c. Fostering good relations between people who share a particular protected characteristic and people who do not share it

i. The government has also considered whether Smart Data proposals help tackle prejudice and promote understanding between different groups, but there are no obvious ways in which these specific proposals could meet these objectives.

138. Smart Data primary legislation does not introduce any Smart Data schemes, but provides the legislative framework for other government departments (OGDs) to make sector specific schemes.

139. As the main equality impacts will occur as a result of secondary regulations, at the primary stage, we do not consider any potential negative impacts on equality to be disproportionate. Nonetheless, in the Smart Data Impact Assessment (IA), we have recommended to OGDs that in creating secondary legislation they carefully consider the equality implications of any proposed Smart Data schemes in a given sector, building on our primary stage analysis. As suggested in our IA, this may include OGDs conducting demographic analysis to understand how consumers with protected characteristics will engage with a Smart Data scheme. OGDs may also consider interventions targeted at consumers who may not directly benefit from Smart Data as a result of a protected characteristic.

140. As we support OGDs with sector specific Smart Data schemes, we will be supporting the development of monitoring and evaluation plans and considering the implications on equalities. As the Smart Data ecosystem develops, the Council will work to coordinate the mitigation of PSED issues.

Eliminating unlawful discrimination, harassment, victimisation and any other conduct prohibited by the 2010 Act

141. Smart Data is expected to improve outcomes for consumers and businesses and is not expected to result in unlawful discrimination. However, the focus on Smart Data providing online services carries some risks.

142. There is a risk that the use of Smart Data by only some groups of consumers may lead to less equal outcomes within markets; those who use new Smart Data enabled services will stand to benefit, whilst consumers who are digitally excluded in the market may not directly benefit.

143. Ofcom recognises digital exclusion as comprising three interconnected aspects:

a. Access to the internet, e.g. having the technology such as a smartphone.

b. Ability to use the internet, e.g. digital skills and confidence.

c. Ability to afford access to the internet (in relation to the affordability of service tariffs).

144. Ofcom’s latest data (June 2021) suggests the number of households without internet access has fallen from 11% to 6%.[footnote 11] However, in October 2021 Ofcom estimated that about 2 million households were experiencing affordability issues with their fixed broadband, their smartphone, or both.[footnote 12] In addition, the ONS reports that lack of skills is the second most common reason for not having access to the internet.[footnote 13]

145. Evidence suggests that certain demographics are more likely to be digitally excluded, including older people and those who live with a condition that limits or impairs their use of communication services. According to the ONS, since 2011 adults over the age of 65 have consistently made up the largest proportion of adult internet non-users, and over half of all adult internet non-users were over the age of 75 in 2018. Moreover, across all age groups, disabled adults make up a large proportion of adult internet non-users.[footnote 14] These characteristics are reflected in the protected characteristics of age and disability.

146. As a result of digital exclusion, consumers of a certain age or disability are at risk of continuing to face the effects of adverse market trends such as the ‘loyalty penalty’, where customers who do not switch service providers often end up paying more than customers who do regularly switch. Smart Data is intended to drive switching rates by empowering consumers with their data; however, the innovations arising from this data will not directly benefit those who are not online.

147. For instance, in 2019, being on the best energy prepayment or ‘pay as you go’ tariff was still £131 more expensive than the best online-only tariff.[footnote 15] Those less able to use the internet are less able to access online-only tariffs and could consequently pay more for their energy than a more digitally engaged consumer.

148. As a digitally based measure, Smart Data policy is not expected to cause any new harms, but there is a risk that existing consumer detriments would persist for certain demographics, potentially worsening the inequality in outcomes in a sector.

149. However, where a Smart Data scheme is introduced to a sector, it is reasonable to assume that consumer outcomes as a whole could improve through a trend in improved prices and quality of products. While digitally excluded groups may not realise the direct benefits of a scheme, for example easier switching, the scheme may deliver net positive effects to the whole market. This could include applying pressure on suppliers to offer more competitive deals to all consumers, not only those engaged with Smart Data products and services.

150. Ofcom’s personalised pricing for communications report details how consumers could benefit from lower average prices if price personalisation, which is the practice of charging different customers different prices, leads to intensified competition.[footnote 16]

151. Additionally, over 300 Open Banking firms have registered in the UK in the last 4 years, testament to the market opportunity of Open Banking and its potential to increase competition.[footnote 17]

152. Additionally, overall digital access and capability is growing in the UK. Smartphone penetration among those in the UK aged 16-54 currently stands at 85%,[footnote 18] and the smartphone is the device most commonly used to go online, with 85% of internet users using a smartphone for this purpose. Smartphone penetration and usage are likely to increase further, providing an ever-growing opportunity for more people to be included in Smart Data schemes.

153. One mitigating measure to be explored in further analysis, e.g. analysis underpinning sector specific secondary legislation impact assessments, is analysing demographic information to understand what groups in a sector are least engaged and most at risk of digital exclusion. This type of analysis was conducted by the Department for Business, Innovation and Skills (BIS) for a Smart Data scheme, midata, in their proposal for energy bills to be printed with QR codes to increase consumer engagement.[footnote 19] By doing this, sector specific schemes can adapt legislation to minimise digital exclusion.

154. Aside from age and disability there are no immediate or obvious discrimination, harassment, or victimisation impacts for any other protected characteristics.

155. One potential secondary impact of increased data sharing as a result of Smart Data is firms having more data on consumers, which they could use to profile or discriminate against certain groups of consumers, or to discriminate against people who do not use Smart Data tools or services. Data holders could charge higher prices to customers who they know do not use Smart Data and are therefore ‘sticky’ customers.

156. However, existing GDPR protections should mitigate the misuse of data as a result of increased data sharing. The principles of the GDPR require that all personal data be processed in a lawful and transparent manner, and that organisations collect and process only the personal data necessary to fulfil specific purposes.[footnote 20] If the GDPR is breached, the ICO has various powers to take action, including issuing fines of up to £17.5 million or 4% of annual worldwide turnover for serious breaches.[footnote 21]

157. In addition, before the Secretary of State can lay secondary regulations, departments should complete a PSED assessment and show regard for the regulations’ impact on equality. Moreover, the Secretary of State will need to meet statutory preconditions for secondary regulations, including having regard to the benefits for customers and potential customers, and the effect of the regulations on competition. In practice, this would also mean having regard to (and seeking to minimise and mitigate) the risk of discrimination against people who do not use Smart Data tools or services, and improving competition.

Advancing equality of opportunity between people who share a particular protected characteristic and people who do not share it

158. The government is committed to exploring how Smart Data can be used to empower consumers who may identify as vulnerable.

159. The FCA defines a vulnerable customer as someone who, due to their personal circumstances, is especially susceptible to harm, particularly when a firm is not acting with appropriate levels of care.[footnote 22] In practice, being vulnerable increases the likelihood of consumers not being engaged in a market and paying more due to a reduced ability to find the best deals.

160. The FCA also note that all customers are at risk of becoming vulnerable, but this risk is increased by having characteristics of vulnerability. These could include poor health, such as cognitive impairment; life events, such as new caring responsibilities; low resilience to cope with financial or emotional shocks; and low capability, such as poor literacy or numeracy skills.[footnote 23] The FCA estimate that 53% of all adults in the UK show a characteristic of vulnerability.[footnote 24]

161. The link between vulnerability and protected characteristics may not always be explicit; further research may be required to better understand this. There may be links between disability, race and low resilience to financial shocks as a driver of vulnerability (e.g. low income): poverty rates are highest for people in households where the head of the household is from the Pakistani and Bangladeshi ethnic groups (and lowest for those from White ethnic groups), and rates are also highest among families where at least one member is disabled (compared to families where no one is disabled).[footnote 25]

162. In addition, Citizens Advice reported that there is a cyclical link between financial difficulty and mental health (which is included under the disability protected characteristic), with 72% of people experiencing mental health problems stating that their poor mental health made their financial situation worse, and 86% reporting that their financial situation made their mental health problem worse.[footnote 26]

163. We expect Smart Data schemes to help vulnerable consumers in a range of circumstances through tailored use-cases arising from industry innovation. Using the above example, as explained by Citizens Advice in their Super-Complaint, those in vulnerable states find it particularly difficult to engage with essential service markets. As a result, particular vulnerable groups such as those on low incomes and those with lower levels of education, but also those with protected characteristics including older people and those with health problems, are particularly likely to struggle with shopping around and switching.[footnote 27] They found that low income and vulnerable customers were often on the worst deals.[footnote 28]

164. Citizens Advice also reported that over 8 in 10 over-65s pay the loyalty penalty in at least one market, and that 15% of those who have experienced a mental health problem in the last 12 months thought it was too difficult to switch contracts in essential markets.[footnote 29] These people could therefore benefit from improved switching services reducing the ‘loyalty penalty’ they pay, and from holistic money management tools that could in particular help low income groups stretch their income further.

165. According to the University of Bristol’s Personal Finance Research Centre, in 2016 low-income households were found to be paying an extra £478 for essentials such as energy, credit and insurance.[footnote 30] This is also known as the ‘poverty premium’. Additionally, Citizens Advice analysis shows that for those on the lowest incomes, the loyalty penalty could comprise 8% of consumers’ annual expenditure.[footnote 31] These findings highlight that low-income households (which may correspond to the protected characteristics of race and disability) pay more for essentials, and this expenditure is proportionally larger relative to their income when compared with other income brackets, exacerbating the negative impact.

166. Smart Data could help tackle this in a couple of ways. Firstly, use cases of Smart Data schemes could help improve money management, credit provision and debt advice. For example, Open Finance could provide improved credit profiling of those with traditionally ‘thin’ credit files, and holistic money management tools could help low income consumers better manage their money and allow for regular payments (e.g. not paying for energy by direct debit could cost up to £143 more per year). Secondly, Smart Data is expected to improve engagement with the market and encourage switching. This could go some way to relieving the £3.4 billion a year ‘loyalty penalty’.[footnote 32]

167. The Kalifa Review[footnote 33] recommended that government and regulators should consider offering incentives to FinTechs to focus on particular demographics or areas to improve financial inclusion. FinTechs are ideally placed to serve these areas quickly, but there is currently no incentive for them to offer low-cost credit or to target areas in need of steps to bolster financial inclusion.

168. The government will explore opportunities to support vulnerable consumers through the Smart Data Challenge Prize that BEIS intends to deliver over this Spending Review period. This will help encourage innovative services aimed at providing tangible solutions to meet consumers’ needs.

169. Similar sector specific challenge funds already exist, such as Open Banking for Good (OB4G),[footnote 34] a £3 million challenge fund aimed at helping to create and scale Open Banking apps and online services to benefit customers who are on low incomes or otherwise financially vulnerable. A report from Bristol University suggests OB4G largely met its expectations and enabled innovations that tackled real issues for people who were ‘financially squeezed’.[footnote 35] OBIE representatives estimated that Open Banking could save overstretched consumers as much as £287 per year, or 2.5% of annual income.[footnote 36]

170. As a mitigating measure to ensure Smart Data advances equality of opportunity, BEIS plans to commission independent research to identify the different areas of vulnerability and how Smart Data could benefit people affected by them.

Fostering good relations between people who share a particular protected characteristic and people who do not share it

171. The government has also considered whether Smart Data proposals help tackle prejudice and promote understanding between different groups, but there are no obvious ways in which these specific proposals could meet these objectives.

Section 6 - Open Data Architecture

Introduction/Background/context (explanation of the policy)

172. Having due regard to the Public Sector Equality Duty as set out at section 149(1) of the Equality Act 2010, the Government has considered at a high level the potential impacts on equalities that may arise as a consequence of the provisions of the Bill.

173. The Act prohibits discrimination in the exercise of public functions and requires decision-makers to have due regard to the need to eliminate unlawful discrimination, harassment, victimisation and any other conduct prohibited by the Act; and advance equality of opportunity and foster good relations between persons who share a relevant protected characteristic and persons who do not share it.

174. Having due regard to the need to advance equality of opportunity involves having due regard, in particular, to the need to:

  • remove or minimise disadvantages suffered by persons who share a relevant protected characteristic that are connected to that characteristic
  • take steps to meet the needs of persons who share a relevant protected characteristic that are different from the needs of other persons
  • encourage persons who share a relevant protected characteristic to participate in public life or other activities where their participation is disproportionately low

175. Having due regard to the need to foster good relations between persons who share a relevant protected characteristic and persons who do not share it involves having due regard, in particular, to the need to:

(a) tackle prejudice, and

(b) promote understanding.

176. Currently, service users and their care teams cannot easily access or share, in real time, all the health and/or social care information that is relevant to their care. This is in part because IT suppliers are not uniformly providing products and services based on shared principles that incorporate or enable interoperability, so that data can easily be shared between organisations that use different systems.

177. The proposed open health and care architecture provisions will make it clear that the Secretary of State for Health and Social Care’s power, under the Health and Social Care Act 2012, to prepare and publish standards relating to the processing of information includes technical standards. The provisions will also ensure that suppliers of IT products and services to the health and adult social care sector in England can be required to comply with these standards.

178. The provisions are aimed at ensuring that the health and adult social care sector’s IT products/services are built on principles of a unified system architecture, open data standards and interoperability. This in turn is intended to remove barriers to data flows, providing the technical ability to share a person’s care data (in a standardised form) across and between health and care professionals to provide optimal and safe care; timely data to run and operate health and care services in local areas; and the necessary data for local places to manage population health and reduce health inequalities (where there is a legitimate and lawful basis to do so).

179. The Department for Health and Social Care set out key findings of the direct benefits of interoperability to staff, patients and service users in the Tech Vision 2018.[footnote 37] They cite the Wachter Review, which finds that interoperability reduces regional disparities in the quality of health and care provision and increases scope for empowering patients and their caregivers to be involved in key decisions about their care.[footnote 38] Also cited is a report from the National Information Board, which details the role of interoperability in facilitating care that is genuinely personalised.[footnote 39] Personalised care and patient empowerment benefit patients with protected characteristics by serving to ensure that care is tailored to meet any needs arising as a result of these characteristics. The reports cited do not identify any respects in which improved interoperability would disadvantage people with protected characteristics and do not envisage any detrimental impact. Interoperability improvements merely remove practical impediments to actions that are already authorised, but in practice difficult to achieve. So, such improvements do not create new powers that could disadvantage people with protected characteristics. These findings remain applicable.

Consideration of any equality impact and mitigating actions

Age

180. Older people are more likely to have multi-morbidities and consequently make greater use of health and social care services (for instance, 65.6% of people receiving long term adult social care in 2020/21 were over 65).[footnote 40] As a result, the health and care system may hold more information about older people. This means that they may be more likely to benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability that enables health and social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of age.

181. Interoperability facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. On balance, we consider that greater interoperability can facilitate effective use of data to benefit the health and care system and may identify issues which could be addressed to improve inequalities in access, experience and outcomes of health and care services.

182. In addition, the likelihood of developing multi-morbidities as an individual ages means that individuals with this characteristic are more likely to benefit from improvements in data use. Multi-morbidities require complex care plans, produced by multi-disciplinary collaboration between a range of care professionals. Greater interoperability enables such interaction by facilitating the sharing of relevant information between professions.

183. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of age and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of age and others; however, in general, an improvement in services for persons with a protected characteristic is likely to promote understanding between those persons and others.

Disability

184. People with physical and learning disabilities may access health and care services more than people without disabilities. For instance, it is likely that the vast majority of the 289,695 adults under 65 accessing long term social care in 2020/21 will have a disability.[footnote 41]

185. Therefore, it is likely that people with disabilities will benefit from the improvements to health and adult social care provision which we anticipate, through improved interoperability enabling health and adult social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of health and adult social care. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of disability.

186. Service users with disabilities will in many cases benefit directly from improvements in interoperability (standardising the way in which organisations collect, store and share data). Interoperability facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. Service users with disabilities who require complex care plans, taking into account information from a large range of sources, will therefore benefit significantly from interoperability improvements.

187. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will remove practical impediments and have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of disability and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of disability and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Gender reassignment

188. We do not have evidence that persons sharing the protected characteristic of gender reassignment are more likely to use health and care services. However, it is likely that such persons will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability enabling health and social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of gender reassignment.

189. Service users with the shared protected characteristic of gender reassignment will in many cases benefit directly from improvements in interoperability as it facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. Service users with the shared protected characteristic of gender reassignment who require sometimes complex care plans, taking into account information from a large range of sources, will therefore benefit significantly from interoperability improvements. Such improvements could also increase the likelihood that service users with particular needs resulting from this protected characteristic are treated in line with their preferences, reducing the need to reiterate them each time they interact with a different service provider. We also consider that greater interoperability and standardisation of information can facilitate effective use of data to benefit the health and care system and may identify issues which could be addressed to improve inequalities for those sharing the protected characteristic of gender reassignment in access, experience and outcomes of health and care services.

190. Open health and care architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of gender reassignment and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of gender reassignment and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Marriage and civil partnership

191. There is no foreseeable differential impact of the proposals for people with these protected characteristics. Generally, it is likely that such people will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability enabling health and social care systems to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. On balance, we consider that greater interoperability can facilitate effective use of data to benefit the health and care system and may identify issues which could be addressed to improve inequalities in access, experience and outcomes of health and care services.

192. Open health and care architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination.

Pregnancy and maternity

193. Currently, pregnant women must bear most of the burden of ensuring that relevant information is available at the point of care: they are advised to carry their maternity notes with them at all times in case they need urgent medical care. Interoperability can directly benefit pregnant women and mothers, by ensuring that relevant information is available at the point of care and by facilitating multidisciplinary interaction between relevant care providers.

194. It is likely that people sharing the protected characteristic of pregnancy and maternity will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability enabling health and social care systems to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of pregnancy and maternity.

195. More widely, improved sharing of data will enable research into specific issues which may impact on individuals with protected characteristics, and better planning and commissioning of health and care services to deliver improved and more tailored outcomes and experiences of healthcare.

196. On balance, we consider that greater interoperability can facilitate effective use of data to benefit the health and care system and may identify issues which could be addressed to improve inequalities in access, experience and outcomes of health and care services.

197. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of pregnancy and maternity and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of pregnancy and maternity and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Race

198. Access to primary care health services is generally equitable for ethnic minority groups, but this is less consistently so across other health services. However, people from ethnic minority groups are more likely to report being in poorer health and to report poorer experiences of using health services than their white counterparts.[footnote 42]

199. It is therefore likely that people sharing the protected characteristic of race will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability, enabling health and adult social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of race.

200. Service users who share the protected characteristic of race will in many cases benefit directly from improvements in interoperability (standardising the way in which organisations collect, store and share data). Interoperability facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. Service users who share the protected characteristic of race will therefore benefit from interoperability improvements that allow for information to be taken into account from a large range of sources.

201. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will remove practical impediments and have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of race and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of race and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Religion or belief

202. It is likely that people who share the protected characteristic of religion or belief will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability, enabling health and social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of religion or belief.

203. Interoperability facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. The interoperability provisions will enable better information sharing between health and care professionals and could increase the likelihood that service users with particular needs resulting from their religious beliefs are treated in line with their preferences and reduce the likelihood of their need to reiterate these preferences each time they interact with a different service provider.

204. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of religion or belief and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of religion or belief and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Sex

205. Persons who make greater use of health and care services will have more information held by different organisations that they interact with and receive services from.

206. Women are more likely to be in receipt of long-term adult social care than men (57.4% in 2020/2021).[footnote 43] With respect to health care, in 2019/20 women constituted 57.8% of outpatient attendance, and in 2022, 49% of patients registered at a GP practice.[footnote 44] [footnote 45] Though this may simply reflect the demographic make-up of those aged 65 and over (who are most likely to access social care services), it does mean that women are more likely to benefit from the improvements to health and adult social care provision. We anticipate that improving interoperability will enable health and adult social care professionals to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of health and adult social care for those with the protected characteristic of sex. In particular, research into sex-specific issues may have a positive impact on such individuals.

207. Service users with sex-specific needs will in many cases benefit from interoperability that facilitates multi-disciplinary collaboration between care professionals to ensure that the needs of patients sharing the protected characteristic of sex are considered and met in a holistic manner. We consider that greater interoperability can facilitate effective use of data to benefit the health and adult social care sector and may well identify issues which could be addressed to reduce inequalities between the sexes in access, experience and outcomes of health and care services.

208. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will remove practical impediments and have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of sex and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of sex and others; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

Sexual orientation

209. We do not have evidence that persons sharing the protected characteristic of sexual orientation are more likely to use health and care services. However, it is likely that people of all sexual orientations will benefit from the improvements to health and adult social care provision which we anticipate through improving interoperability, enabling health and social care systems to better communicate with each other, thereby supporting treatments, improving health and social care service planning, commissioning and delivery, and enabling research to deliver improved and more tailored outcomes and experiences of healthcare. In particular, research into specific issues may have a positive impact on individuals sharing the protected characteristic of sexual orientation.

210. Interoperability facilitates multi-disciplinary collaboration between care professionals to ensure that a patient’s needs are considered and met in a holistic manner. On balance, we consider that greater interoperability can facilitate effective use of data to benefit the health and care system and may identify issues which could be addressed to reduce inequalities between persons sharing the characteristic of sexual orientation and others, in access, experience and outcomes of health and adult social care services.

211. Open health and care data architecture and the interoperability improvements remove practical impediments to actions that are already authorised, but in practice are difficult to achieve. So, the provisions do not involve discrimination and the resultant data architecture improvements will have a positive impact on advancement of equality of opportunity between persons sharing the protected characteristic of sexual orientation and others. There is limited scope for the policy to foster good relations between persons who share the protected characteristic of sexual orientation and persons who do not share it; however, in general, an improvement in services for persons is likely to promote understanding between those persons and others.

  1. At the end of the transition period the EU GDPR was retained by the European Union (Withdrawal) Act 2018. The retained GDPR was modified by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations (DPPEC 2019) (as amended by the Data Protection, Privacy and Electronic Communications (Amendments etc)(EU Exit) Regulations 2020) and renamed the UK GDPR. 

  2. CDEI Landscape Summary on Bias in Algorithmic Decision-Making 

  3. The Right to Privacy (Article 8) and the Digital Revolution 

  4. National Trading Standards report (“sucker lists”) 

  5. Digital Regulation Cooperation Forum: Plan of work for 2021 to 2022 

  6. These are Age, Disability, Pregnancy and Maternity, Race, Religion or Belief, Sex, Sexual Orientation, Gender Reassignment and Marriage and Civil Partnership. 

  7. Arrests - GOV.UK Ethnicity facts and figures (ethnicity-facts-figures.service.gov.uk) 

  8. Retail Banking Market Investigation Order 2017, CMA, gov.uk 

  9. Open Banking (February 2022): “Open banking passes the 5 million users milestone” 

  10. Open Banking Impact Report (October 2021): Profile of adopters 

  11. Ofcom (April 2021): Digital exclusion review 

  12. Ofcom (Feb 2022): Affordability of Communications Services 

  13. ONS (March 2019): Exploring the UK’s digital divide 

  14. ONS (March 2019): Exploring the UK’s digital divide 

  15. Fair By Design: “Low income consumers pay a poverty premium equivalent to three months’ worth of food” 

  16. Ofcom (August 2020): “Personalised pricing for communications” 

  17. TrueLayer (January 2022): “PSD2 4 years on: why open banking is a success – and how to judge it” 

  18. Ofcom (April 2021): “Adults’ media use and attitudes report 2020/21” 

  19. BIS (January 2014): “QR code use in energy sector: midata programme study”. QR codes are machine-readable codes used for storing website addresses or other information, read using the camera on a smartphone. 

  20. Information Commissioner’s Office: “The Principles” (‘lawfulness, fairness and transparency’), (‘purpose limitation’) and (‘data minimisation’) 

  21. Information Commissioner’s Office: “What are the ICO’s enforcement powers?” 

  22. FCA (July 2021): “Guidance for firms on the fair treatment of vulnerable customers” 

  23. FCA (July 2021): “Guidance for firms on the fair treatment of vulnerable customers” 

  24. FCA (February 2021): “Financial Lives 2020 survey: the impacts of coronavirus” 

  25. House of Commons Library (April 2022): “Poverty in the UK: statistics” 

  26. Citizens Advice (2018): “Excessive prices for disengaged consumers: A super-complaint to the Competition and Markets Authority” 

  27. Citizens Advice (2018): “Excessive prices for disengaged consumers: A super-complaint to the Competition and Markets Authority” 

  28. Citizens Advice (October 2020): “The loyalty penalty in essential markets – two years since the super-complaint” 

  29. Citizens Advice (2018): “Excessive prices for disengaged consumers: A super-complaint to the Competition and Markets Authority” 

  30. Fair By Design: “Low income consumers pay a poverty premium equivalent to three months’ worth of food” 

  31. Citizens Advice (April 2021): “Finishing the job on the loyalty penalty: the mortgage and mobile handset markets” 

  32. Citizens Advice (September 2020): “The loyalty penalty in essential markets: Two years since the super-complaint” 

  33. The Kalifa Review of UK FinTech (February 2021) 

  34. Open Banking for Good - “How OB4G works” 

  35. Collard and Evans, University of Bristol (March 2021): “Open Banking for Good: Making a difference?” 

  36. OBIE Independent Representatives (2019): “Consumer Priorities for Open Banking” 

  37. The future of healthcare: our vision for digital, data and technology in health and care 

  38. Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England 

  39. Personalised Health and Care 2020 

  40. Adult Social Care Activity and Finance Report, England - 2020-21 

  41. Activity and Finance Overview - NHS Digital 

  42. The health of people from ethnic minority groups in England - The King’s Fund (kingsfund.org.uk) 

  43. Adult Social Care Activity and Finance Report, England - 2020-21 

  44. Summary Report - outpatient appointments by gender - NHS Digital 

  45. Microsoft Power BI – NHS Digital