Consultation outcome

Data: a new direction - government response to consultation

Updated 23 June 2022

Introduction

The government launched its consultation ‘Data: a new direction’ on 10 September 2021 to inform its development of proposals to reform the UK’s data protection laws, to secure a pro-growth and trusted data regime as part of the UK’s National Data Strategy.

As the government set out in the National Data Strategy, personal data is a huge strategic asset and the driving force of the world’s modern economies. It fuels innovation in businesses large and small, drives scientific discovery and has been a lifeline during the global coronavirus pandemic. This government’s ambition on data is clear: we will establish the UK as the most attractive global data marketplace.

We want to create a framework which empowers citizens through the responsible use of personal data. Our reforms will give individuals greater clarity over their rights and a clearer sense of how to determine access to and benefit from their own data. Research organisations given a platform to innovate can make medical breakthroughs enabling better care for individuals. The UK’s independent data protection regulator, the Information Commissioner’s Office (ICO), being given more effective powers to deter nuisance calls will alleviate a source of stress for some of the most vulnerable in our society. These reforms will create a data rights regime which delivers not only economic benefits but wider societal benefits alongside personal benefits to citizens.

The reforms we are taking forward will help the UK realise the benefits of greater personal data use. We are reducing the burdens on businesses that impede the responsible use of personal data. By giving businesses the opportunity to protect personal data in the most proportionate and appropriate way, we will make them more efficient, meaning higher productivity rates and more jobs. We will also make it easier for businesses to use automated decision-making tools responsibly - by freeing workers to focus on more productive, higher-growth activities, these tools can further fuel economic growth.

The reforms proposed in the consultation provide an opportunity for the UK to reshape its approach to regulation outside of the EU, and seize opportunities with its new regulatory freedoms. This includes the use of repatriated ‘adequacy’ powers from the EU to remove inappropriate barriers to the flow of UK personal data overseas in support of trade, scientific collaboration and national security and law enforcement cooperation. Globally, we are working with the wider bloc of like-minded, democratic economies which support greater interoperability of regulatory frameworks on data and more stable principles for trusted government access to data. These areas of work are mutually reinforcing, designed to make the UK the best place for businesses and scientific institutes to undertake data-driven activity. Our reforms will support the UK’s international commitments on the free flow of data.

Our reforms will mean that UK scientists are no longer impeded by overcautious, unclear EU-derived rules on how they can use people’s personal data. We will provide scientists with the clarity and confidence they need to get on with life-enhancing and life-saving research. We will simplify the legal requirements around research so scientists can work to their strengths. Having legal clarity on what they can and can’t do means they can pour more effort and resources into innovating. Indeed, our reforms will give charities, who fund so much vital research, more opportunities to raise funds, further boosting the UK’s research output.

About the consultation

The current UK data protection regime consists of the UK General Data Protection Regulation (UK GDPR), the Privacy and Electronic Communications Regulations (PECR) and the Data Protection Act 2018 (DPA). This consultation presented proposals that build on the UK’s current regime, such as its data processing principles, its data rights for citizens, and its mechanisms for supervision and enforcement.

We are committed to maintaining these important principles:

  1. We will ensure that high standards of data protection, and the UK’s historic commitment to upholding them to maintain public trust in the use of personal data, continue to be at the heart of our regime, while providing organisations with greater flexibility to find the most effective and proportionate way of protecting people’s personal data.

  2. The UK’s data protection regime will be future-proofed, by enabling organisations to focus on investing time and effort in delivering what matters - important privacy outcomes - rather than ticking boxes. This will enable our laws to keep pace with changes to the technological landscape without disrupting regulatory certainty.

  3. Almost all organisations that comply with the UK’s current regime will comply with our future regime. The limited number of new requirements reflect existing good or best practice that many businesses already have in place.

  4. The UK’s data protection regime will deliver concrete advantages for the UK while preserving data subjects’ rights and the independence of our regulator, creating a net benefit for businesses and society as a whole.

  5. The reforms to the ICO will ensure effective, risk-based and preventative supervision, improving its governance, accountability and transparency in line with best regulatory practice.

This consultation ran for 10 weeks, closing on 19 November 2021. The consultation received 2,924 responses, 684 via email and 2,240 via our survey platform. Responses were received from the Information Commissioner’s Office and organisations which represent a cross-section of the UK economy and society, as well as from overseas organisations.

During the consultation period, the government engaged with a range of stakeholders, including through over 40 roundtables with academia, tech and industry bodies, and consumer rights groups, which provided a wide range of views.

This document is the government’s response to the ‘Data: a new direction’ consultation.

Executive summary

The proposals in this response are arranged into 30 headings across 5 chapters:

  1. Reducing barriers to responsible innovation. Chapter 1 relates to providing clarity and certainty to businesses on the interpretation of current laws, definitions and requirements relating to personal data processing. The corresponding chapter in the consultation paper contained a range of proposals to increase confidence in personal data processing through use of the legitimate interest ground and enabling greater personal data access and personal data sharing for research and other purposes. The government also proposed reforms to create more certainty for organisations about when and how they can responsibly use personal data with the development of cutting-edge data-driven technologies.

  2. Reducing burdens on businesses and delivering better outcomes for people. Chapter 2 relates to reducing disproportionate burdens on businesses and delivering better outcomes for people in relation to the processing of personal data. The corresponding chapter in the consultation paper contained a range of proposals to help the UK move to an outcomes-based compliance regime for data rights. This will strengthen accountability requirements while providing organisations with the opportunity and flexibility to find the most effective and proportionate means of meeting the outcomes needed to protect individuals’ rights. It also included reforms to reduce disproportionate impacts of subject access requests on organisations, and ways to limit unnecessary cookie banners by altering rules in the Privacy and Electronic Communications Regulations.

  3. Boosting trade and reducing barriers to data flows. Chapter 3 relates to boosting trade and reducing barriers to personal data flows. The corresponding chapter in the consultation paper contained a range of reforms to create an autonomous UK international transfers regime, which supports international trade and eliminates unnecessary obstacles to cross-border personal data flows.

  4. Delivering better public services. Chapter 4 relates to delivering better public services through improved use of and access to personal data. The corresponding chapter in the consultation paper contained proposals to make better use of data sharing gateways under the Digital Economy Act to facilitate more joined-up, responsive public services; apply lessons learned from COVID-19; increase the transparency of government processing activities by ensuring that clear information is provided on the use of algorithms; and, simplify the legal framework in relation to the police’s collection, use and retention of biometric data.

  5. Reform of the Information Commissioner’s Office. Chapter 5 relates to the reform of the ICO, the UK’s independent data protection regulator. The corresponding chapter in the consultation paper contained proposals to implement a new, modern governance framework, with an independent board, and require the ICO to account for the impacts of its activities on growth, innovation and competition.

Summary of responses

Overall, responses indicated support for the government’s proposals in many areas, including:

  • changes to research provisions, especially the proposals to consolidate research-specific provisions, to create a statutory definition of ‘scientific research’ and to change notification requirements
  • removal of consent requirements in relation to audience measurement cookies
  • the principle of proportionality outlined in the reform agenda across adequacy and Alternative Transfer Mechanisms (ATMs)
  • reforming the ICO, and emphasis on the importance of maintaining its regulatory independence
  • standardising the terminology and definitions used across the data processing regimes
  • increasing the clarity of the existing rules on police collection, use and retention of biometric data, in order to improve transparency and public safety
  • extending powers under section 35 of the Digital Economy Act 2017 to include businesses, as this could support more joined-up public services

There were some potential concerns raised about:

  • introducing a nominal fee for subject access requests
  • whether the government should have a role enabling the activity of responsible data intermediaries
  • removing the need for data controllers to carry out the legitimate interests balancing test for specified activities if children’s data were involved
  • removing the right to human review of automated decisions
  • whether to exclude political parties and charities from rules on direct electronic marketing
  • removing requirements for Data Protection Impact Assessments (DPIAs) and Data Protection Officers (DPOs)
  • the potential impact of reforms on the ICO’s independence

As well as this policy-specific feedback, several themes emerged consistently across all the chapters in the consultation. A summary of these areas can be found below and a detailed discussion of views in each of these areas can be found in the relevant chapters of this document.

Respondents highlighted the importance of maintaining data subject rights. The government agrees with the importance of maintaining data subject rights and the reforms presented below deliberately build on the key elements of the current UK General Data Protection Regulation (UK GDPR), such as its data processing principles, its data rights for citizens, and its mechanisms for supervision and enforcement. These key elements remain sound and they will continue to underpin a high level of protection for people’s personal data and control for individuals over how their data is used. This view also aligns with the government’s position to incentivise businesses to invest more effectively in the governance, policies and tools that protect personal data, so we can have even greater confidence that personal data is being used responsibly.

Respondents made clear the benefits they saw from the effective use of personal data that our reforms would deliver, while emphasising the need for this to be done responsibly. The government agrees with this position and we recognise the power of the effective use of personal data to drive innovation and boost the economy, while continuing to protect people’s safety and privacy.

Respondents raised the importance of data flows with the EU, and how our reforms will affect this - in particular with respect to the UK’s EU data adequacy decision. Many respondents recognised the benefits of the government’s approach to making these reforms within the existing framework, and some went as far as expressing a preference for these reforms to be mirrored in the EU General Data Protection Regulation (EU GDPR). As the government made clear in the consultation, we believe it is perfectly possible and reasonable to expect the UK to maintain EU adequacy as it designs a future regime. The UK is firmly committed to maintaining high data protection standards - now and in the future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation, and underpins the trustworthy use of data. EU adequacy decisions do not require an ‘adequate’ country to have the same rules, and our view is that reform of UK legislation on personal data is compatible with maintaining flows of personal data from Europe.

The consultation asked for views on a sliding scale of ‘strongly agree’ through to ‘strongly disagree’. While the quantitative figures are a useful indicator to understand the overall level of support or opposition for each proposal, they do not allow for any nuance in terms of why respondents responded in a certain way, and are therefore not necessarily the best basis for assessing the robustness of the proposals. The qualitative evidence the government received was critical to developing the policy detail and ensuring the competing views were effectively balanced while delivering on the government’s policy intentions.

Throughout this response when the opinions of respondents are referenced, this refers to those who chose to answer that specific question, and is not representative of the total number of respondents to the consultation or necessarily the UK public.

The remainder of this document sets out the government’s response to the consultation in more detail, including its planned next steps in each area. As announced in the Benefits of Brexit policy paper on 31 January, this will include introducing legislation to reform the UK’s data protection regime.

Chapter 1: Reducing barriers to responsible innovation

1.1 Summary

In chapter 1 of the consultation, the government highlighted the importance of innovative, responsible uses of personal data to help drive scientific discovery, enable cutting-edge technology, and deliver real benefits to our economy and society.

Clear and consistent rules for the use of personal data will support the adoption of new data-driven technologies. The government’s proposed reforms are designed to create more certainty for organisations about when and how they can responsibly use personal data as an important step in ensuring our laws keep pace with the development of cutting-edge data-driven technologies. The development and deployment of data-driven technologies will give British businesses an internationally competitive edge. These technologies will make businesses more efficient, and thus more productive - and these benefits will be felt by British consumers, through new innovative products, more jobs or lower prices.

1.2 Research purposes

The UK is ranked second in the world for science and research, and 54% of our research output is world leading. Personal data lies at the heart of a wide range of research activities across many sectors, and the importance of personal data in research activities to our economy and wider society is reflected in the UK GDPR.

The existing legislation already has specific provisions and derogations to facilitate processing for research purposes. However, the laws around the use of personal data for research purposes are complex and the relevant rules are spread across different pieces of legislation, making it harder to establish legal certainty for vital and innovative research. Furthermore, some aspects of the existing framework can place unnecessary barriers in front of researchers, slowing or even stopping their progress.

To address this, the government invited views on several proposals that would restructure the legislation, improve its clarity, and unlock more personal data for responsible researchers to use.

Definition of ‘scientific research’ (questions 1.2.2 and 1.2.3)

The majority of respondents agreed with the proposal to create a statutory definition for scientific research, with respondents arguing that a definition should improve clarity for researchers and provide more certainty. There was some disagreement with the proposal on the basis that creating a new statutory definition would either reduce the scope of processing that was already possible, or extend the scope to include other kinds of processing activities under the guise of scientific research.

The government also asked if the current text of Recital 159 was a suitable basis for any new statutory definition of scientific research. There were mixed views on this. Around a third of respondents agreed that Recital 159 was a suitable basis, on the grounds that it provided a clear and broad base that was understood by researchers. Around a third of respondents disagreed, stating that the current definition is too vague, and a similar proportion did not have a view.

Moving the definition from the recitals to the operative text of the UK GDPR will provide greater clarity and visibility to both researchers and data subjects about what constitutes scientific research. The current wording found in Recital 159 provides a suitable base to do this: research is a constantly evolving field, and maintaining a broad definition, while preserving the regulator’s capacity to provide guidance, ensures that the definition remains future-proofed. In addition, the government will add definitions for historical research and statistical purposes to the legislation, based on the existing recital language, to ensure consistency and clarity across the fields of research currently recognised in the legislation.

Consolidating research provisions (question 1.2.1)

The government proposed consolidating and bringing together research-specific provisions. The majority of respondents agreed with this proposal on the basis that this would increase the clarity of the legislation, with some respondents noting that they would welcome further guidance on this from the ICO. There was some disagreement with the proposal, mostly from respondents who believed that the same effect could be better achieved by guidance from the ICO or that the current framework was sufficiently clear.

The ICO has recently launched a consultation on guidance about data processing for research. The draft guidance provides a table mapping the current research provisions; this guidance-based approach garnered support from respondents both for and against the consolidation proposal. With this guidance in mind, the government will take a more targeted approach to simplifying the legislation by moving only certain sections of the legislation rather than creating an entire chapter.

Lawful basis for research purposes (questions 1.2.4, 1.2.5, 1.2.6, 1.2.7)

The government sought views on difficulties faced by scientific researchers trying to use the existing lawful bases found in Article 6 of the UK GDPR and the appetite for a separate lawful basis for research. The majority of respondents agreed that identifying the most appropriate lawful basis did not create barriers to research.

There were mixed views regarding whether to create a new lawful basis for research purposes, with supportive respondents noting that a new lawful basis would help researchers who currently struggle to identify the most appropriate lawful ground. Respondents who disagreed with the proposal argued this could be open to misuse, especially in conjunction with the broad definition for scientific research currently found in Recital 159.

The government does not see a need to take forward proposals that establish a new lawful basis for research at this time, as the evidence suggests that researchers are currently comfortable in using the existing lawful bases for processing personal data.

There were mixed views on clarifying the use of broad consent, which allows scientific research to use a less specific form of consent when it is not possible to fully identify the purpose of the processing at the point of data collection, with supportive respondents welcoming the flexibility, innovation and clarity this would provide. Respondents who disagreed with the proposal cited concerns that it could be open to abuse due to a lack of certainty.

It was also noticeable that, likely due to its location in the recitals, many respondents were not currently aware that broad consent is already a concept in the UK GDPR. The government believes this demonstrates the need for greater clarity and transparency in relation to broad consent in this context.

To help reduce uncertainty and concerns around misuse, as well as to improve awareness of broad consent’s potential and use among organisations and individuals, the government will seek to make the provisions concerning broad consent in the recitals more prominent by putting them on the face of the legislation.

Clarifying the rules on further processing (question 1.2.9)

Almost half of respondents agreed with the proposals to clarify the existing rules on re-use or further processing of personal data for research, including the establishment of a lawful basis. Around a third disagreed. As with broad consent, some of those opposed to the proposal were concerned about the lack of transparency and the risk that organisations may use personal data in unexpected ways under the guise of research. However, those in favour saw the benefits of adding clarity to further processing.

As outlined in the section on a new lawful basis for research, the government does not see the need to take forward proposals to establish a lawful basis for research. This is in light of the evidence that, in general, identifying an appropriate lawful basis does not cause barriers for researchers. However, to support transparency and clarity of the law for both organisations and individuals, the government will take forward wider proposals on clarifying further processing. These reforms are further outlined in the section on further processing (section 1.3).

Exemption to provide information required by Article 13 (questions 1.2.10, 1.2.11)

The majority of respondents disagreed with the proposal to introduce an exception to the requirement for controllers to provide data subjects with the information required by Article 13 UK GDPR. This exemption would apply where personal data has been obtained directly from data subjects and is re-used for research purposes, but only when providing the information required by Article 13 would involve disproportionate effort. Opponents of this proposal cited concerns about a reduction in transparency, and apprehension about individuals losing control of their personal data. Those in support of the measure argued that the information provision requirements in Article 13(3) can create barriers for researchers, particularly those carrying out longitudinal studies. For instance, in studies of certain medical conditions (such as degenerative neurological conditions), researchers can hold a mix of directly and indirectly collected pseudonymised data, and it may be impossible or nearly impossible to recontact data subjects.

To ensure that research is not prevented in situations where recontacting data subjects would constitute a disproportionate effort, the government will replicate the exemption currently found in Article 14(5)(b), but only for research purposes. This change does not exempt controllers from providing all of the relevant information in Article 13(1) and (2) to data subjects at the point they collect the data. However, it will allow personal data being used for a research purpose which differs from the original purpose to be exempt from the requirement to re-provide information under Article 13(3). It will apply only in circumstances where providing such information would involve a genuinely disproportionate effort for the controller.

The government will clarify what constitutes a disproportionate effort by bringing language currently found in Recital 62 of the UK GDPR into the operative text. Alongside the existing ICO guidance in this area, this legislative change will provide clarity about when it is appropriate to use the exemption.

1.3 Further processing

Re-use (also known as ‘further processing’) of personal data can provide economic and societal benefits through facilitating innovation. Clarity on when personal data can lawfully be reused is important: data subjects benefit from transparency, data controllers benefit from certainty, and society benefits from unlocking the opportunities of re-use.

In recognition of the value of re-use of data in certain circumstances, the UK GDPR sets out rules for when further processing of personal data is considered compatible with the purpose for which it was collected. In the consultation, the government identified areas of uncertainty and set out proposals to improve clarity in the legislation and thereby facilitate innovative re-use of personal data and transparency for data subjects.

Clarity regarding further processing (question 1.3.1)

There were mixed views regarding whether the current provisions on further processing were confusing. Those who found the legislation confusing explained that the uncertainty among organisations’ compliance and legal teams regarding the rules on the re-use of personal data had led to delays in advancing research efforts or innovations within their organisations. There were different interpretations from controllers and data subjects alike about when further processing is currently lawful, with some expressing the view that it is completely prohibited, and others highlighting that they find the recitals a much clearer guide than the operative text.

The government judges that these responses are indicative of insufficient clarity in the current framework, and that there is a need for action. The government will simplify the legislation to address this, making it clear to organisations how personal data can be re-used lawfully, and giving transparency to data subjects to understand how their data may be reused.

Clarifying further processing for an incompatible purpose when based on a law that safeguards important public interest (question 1.3.2)

The government sought views on various elements of further processing that cause potential confusion, such as clarifying further processing for an incompatible purpose when based on a law that safeguards an important public interest. There were mixed views regarding whether greater clarification was required. Some respondents expressed concern about further processing for a purpose that does not meet the explicit compatibility test, believing it to be always prohibited; however, it is permissible in certain cases under the current legislation.

This reaffirms the need for greater clarification on the rules and permissions of data re-use and the need for greater transparency. Therefore the government will proceed with this proposal.

Clarifying further processing when there has been a change of controller (question 1.3.3)

The government sought views on whether it would be beneficial to clarify whether or not a change of controller should be considered as further processing. The majority of respondents supported this proposal, and agreed that clarification would be beneficial. Around a third of respondents disagreed that clarification was necessary. Some respondents believed that if there is a change of controller, processing should be classified as new processing and not further processing. Some respondents emphasised the importance of utilising third party expertise by permitting third parties to re-use the personal data collected by the original data controller.

The government plans to legislate to provide further clarity on the distinctions between new processing and further processing.

The government asked whether there was a need for clarification on further processing when the original lawful basis is consent. The majority of respondents agreed that clarification was needed, and around a third disagreed. There was strong support from respondents that the concept of consent must be respected with regards to data re-use.

The government plans to codify that further processing cannot take place when the original legal basis is consent, other than in very limited circumstances.

1.4 Legitimate interests

Data controllers are currently required to complete a 3-part test when relying on the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR.

They must:

  1. identify a legitimate interest;

  2. demonstrate the processing is necessary for the intended purpose and cannot be achieved through less intrusive means; and

  3. weigh up whether their interests in processing personal data outweigh the rights of data subjects.

The third part of the legitimate interests assessment is sometimes known as the ‘balancing test’. ICO guidance recommends that the outcome of the 3-part legitimate interest assessment is documented to help demonstrate compliance.

Prior to consulting, the government was aware that some organisations were concerned about the time and effort required to complete and record their legitimate interests assessments. Others perceived using the legitimate interests lawful ground as more complicated and risky than other grounds. They were concerned that they might not complete the balancing test correctly and as a consequence they could find themselves being investigated by the ICO, which could lead to potential enforcement action for breaching rules. Organisations tended to seek individuals’ consent to reduce worries about liability, but that could lead to inappropriate reliance on consent, and ‘consent fatigue’ for some customers and service users.

Exemptions for legitimate interest balancing test (questions 1.4.1, 1.4.2, 1.4.3, 1.4.4)

To address these issues, the government proposed to create a limited, exhaustive list of legitimate interests for which organisations could use personal data without applying the balancing test and without unnecessary or inappropriate recourse to consent. It suggested several processing activities for consideration as part of such a list.

Some were included because of the public interest nature of the activity, such as:

  • processing that is necessary for the purposes of crime prevention and safeguarding
  • enabling non-public bodies to deliver statutory public communications and public health and safety messages

Others involved non-intrusive uses of personal data for commercial purposes, for example:

  • processing of customer data for the purposes of installing security updates on a device
  • processing personal data for internal research and development purposes
  • processing personal data for business innovation purposes that are aimed at improving services for customers

There were mixed views from respondents on whether creating a limited, exhaustive list would give organisations greater confidence to process personal data. Of the respondents that agreed, some suggested that further everyday business activities, such as human resources (HR) functions or fraud detection, should be added to the list. Many proponents of such reforms nonetheless highlighted the importance of safeguards, such as a precise definition of relevant processing activities, in order to avoid misuse. Some of those who disagreed emphasised the need for continued case-by-case consideration to protect the rights of individuals, and voiced concern about the impact of removing the balancing test on levels of public trust.

The vast majority of respondents agreed that the balancing test should be maintained for children’s data, with very minimal disagreement.

In recognition of the mixed views, the government intends to pursue the proposal in relation to an initially limited number of carefully defined processing activities. This is likely to include processing activities which are undertaken by data controllers to prevent crime or report safeguarding concerns, or which are necessary for other important reasons of public interest. The government proposes to create a power to update the list of activities in case other processing activities are identified that should be added to the list. Any changes would be subject to parliamentary scrutiny.

The government recognises how important it is to ensure this approach continues to protect individuals’ rights. For those activities where the balancing test is removed, the government will consider whether any additional safeguards are needed for children’s data. This may not be necessary where an activity has been added to the list to ensure that the legislation does not deter the swift reporting of abuse or other steps to protect the welfare of children. In addition, data controllers would still need to demonstrate that processing for one of the listed activities complied with all of the data protection principles and met the necessity test in Article 6(1)(f) of the UK GDPR. Where the processing involved special category data, the controller would also be required to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the DPA 2018.

For any processing activities that do not feature on the list, data controllers will continue to be required to undertake the balancing test.

1.5 AI and machine learning

As set out in the consultation document, when used responsibly, data-driven artificial intelligence (AI) systems have the potential to bring incredible benefits to our lives. The development of AI and machine learning applications is contingent on data, and places demands on its collection, curation and use. The government’s consultation focused specifically on the interplay of AI technologies with the UK’s data protection regime.

Following the publication of the National AI Strategy in September 2021, the government plans to bring forward a white paper on AI governance. This will set out how the government plans to build on the UK’s leadership in AI, across a range of governance issues arising from the emergence of AI technologies.

The consultation explored how reforms can help organisations to build or deploy AI systems responsibly, and to innovate with care, while ensuring risks are managed and individuals can trust that their data rights are respected.

Fairness in an AI context (questions 1.5.1, 1.5.2, 1.5.3, 1.5.4)

Whilst AI governance is a live debate, the government expects AI systems (and their use) to be fair. As ‘fairness’ is a highly context-specific concept, and concepts of fairness exist in a variety of legislative frameworks, navigating fairness in the context of AI is a complex exercise. Fairness has an evolving meaning in the context of the use of machine learning and AI, and there is a question of how unfair outcomes resulting from the use of AI systems can be prevented.

The government sought views on whether the current legal obligations regarding fairness are clear (both in terms of developing AI-driven technologies, and within the existing data protection regime). The government also sought views on what legislative regimes and regulators should play a role in the assessment of concepts associated with fairness (especially in outcomes).

There were mixed views on whether the current obligations are sufficiently clear with regards to fairness when developing or deploying an AI system. Similarly, views were mixed regarding the overall clarity of the fairness requirements the existing data protection regime places upon organisations developing and deploying AI systems. Some respondents noted a degree of confusion around how fairness obligations under data protection legislation interact with related, though distinct, legal concepts of non-discrimination. Others considered the development of a ‘substantive concept of outcome fairness’ to be outside the scope of data protection law. Around a third of respondents agreed that the development of a substantive concept of outcome fairness within the data protection regime would pose risks. There was some disagreement that this would pose risks, with respondents noting that data protection law is not able, or best placed, to cover all possible AI use cases across the economy simply because personal data is involved.

The consultation also asked a range of open questions about AI and fairness.

Respondents highlighted the importance of measures and safeguards which exist in the wider realm of fairness, though are traditionally outside of the data protection remit, such as:

  • guidance to support AI practitioners and developers to use current fairness and bias-mitigation tools in the AI development process
  • periodic evaluations of the individuals and entities involved in AI development
  • participation of affected stakeholder groups in the wider development process
  • conducting and publishing Algorithmic Impact Assessments

There was support for regulators playing a role in substantive assessments of fairness; many supported this being the ICO as well as sector-specific regulators like the Financial Conduct Authority (FCA), although there was not a definitive consensus. Respondents mentioned the need for regulatory cooperation between authorities and regulators, to improve clarity.

There was no consensus on whether data protection legislation would be the right place to address broader AI governance fairness issues. Responses did indicate that:

  • issues of fairness in data protection need to be considered as part of a holistic approach to AI governance
  • issues of fairness also arise from the use of non-personal data which are not in scope of data protection regulation
  • the role of the ICO should be situated in the wider regulatory landscape

The government will consider the role that fairness should play in wider AI governance as part of the white paper on AI governance, but does not currently plan to legislate on this.

Building trustworthy AI systems (questions 1.5.5, 1.5.6, 1.5.7, 1.5.8, 1.5.9)

In the consultation document, the government asked a range of open questions about whether organisations experience challenges with identifying lawful grounds when processing personal data for the purpose of developing AI systems, and whether organisations have difficulty navigating data re-use limitations as well as relevant research provisions in the current data protection regime. Respondents gave a range of views but there were no clear themes. Some respondents stated that they would benefit from further clarity on the types of organisations to which provisions on processing personal data for research purposes apply, while others suggested that the existing guidance on lawful grounds is sufficient.

The use of AI-powered automated decision-making is likely to increase substantially in the coming years. It is important that AI-powered services are a force for good and do not inadvertently harm consumers. The government recognises that organisations working on AI tools need the space to experiment without causing harm. The consultation asked for views on whether organisations should be able to use personal data more freely for the purpose of testing and training AI. The majority of respondents disagreed that there is a need, at present, to change the current requirements on the use of personal data for training AI, with respondents indicating that the existing provisions already enable organisations to experiment.

Bias mitigation in AI (questions 1.5.10, 1.5.11, 1.5.12)

Another key aspect of a trustworthy AI system is ensuring organisations are aware of potential biases in the datasets. As set out in the consultation document, bias monitoring and correction can involve the use of personal data.
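
By way of illustration only, the minimal sketch below (not drawn from the consultation; the data, group labels and metric are hypothetical) shows why bias monitoring can require processing personal data: computing even a simple selection-rate disparity needs the protected characteristic of each individual, which may itself be sensitive data.

```python
# Purely illustrative sketch: a simple selection-rate disparity check.
# The decisions, group labels and metric choice are hypothetical.
from collections import defaultdict

decisions = [
    {"protected_group": "A", "approved": True},
    {"protected_group": "A", "approved": True},
    {"protected_group": "A", "approved": False},
    {"protected_group": "B", "approved": True},
    {"protected_group": "B", "approved": False},
    {"protected_group": "B", "approved": False},
]

totals, approvals = defaultdict(int), defaultdict(int)
for decision in decisions:
    totals[decision["protected_group"]] += 1
    approvals[decision["protected_group"]] += decision["approved"]

# Approval rate per group; the protected characteristic itself must be processed
# to compute this, which is why bias monitoring can involve sensitive personal data.
rates = {group: approvals[group] / totals[group] for group in totals}
disparity = max(rates.values()) - min(rates.values())
print(rates, round(disparity, 2))
```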

The government sought views on whether processing such data for the purpose of mitigating bias should be included in the list of legitimate interests that organisations can rely on to carry out processing, without applying a balancing test. The government also sought views on whether further legal clarity is needed, and whether paragraph 8 of Schedule 1, Part 2 DPA 2018 is a sufficient gateway or if a new condition should instead be created to support the processing of sensitive data for the purposes of bias monitoring and correction, subject to appropriate safeguards.

The majority of respondents agreed that there should be additional legal clarity on how sensitive data can be lawfully processed in relation to bias monitoring and correction in AI systems. Some respondents provided insight into the merits of introducing a new condition for the processing of sensitive data for bias monitoring and correction, noting that this would enable the processing of sensitive data that are not included within paragraph 8 of Schedule 1. Many respondents highlighted concerns around the potential for loopholes to be created if sufficient safeguards were not implemented, and the government recognises these concerns.

The government plans to introduce a new condition to Schedule 1 of the DPA 2018 to enable the processing of sensitive personal data for the purpose of monitoring and correcting bias in AI systems. The new condition will be subject to appropriate safeguards, such as limitations on re-use and the implementation of security- and privacy-preserving measures when processing for this purpose.

The majority of respondents disagreed with the proposal to include bias monitoring in a list of legitimate interests, although there was some support for the proposal. Organisations see value in ensuring that the personal data being processed is proportionate to the potential harm of bias within a particular system.

As indicated in section 1.4 (legitimate interests), the government intends to remove the balancing test only in relation to an initially limited number of carefully defined public interest activities. Bias monitoring and correction in AI systems will not be included in this list. This means that organisations processing data for these purposes could still rely on the legitimate interests lawful ground where appropriate, but would need to undertake the balancing test in the normal way.

Rights in relation to automated decision-making and profiling (Article 22) (questions 1.5.14, 1.5.15, 1.5.16, 1.5.17)

As outlined in the consultation document, the trustworthiness of AI systems hinges on specific design features, but also on effective safeguards where they matter - like human oversight to ensure accountability and intelligibility of decisions taken. Article 22 of the UK GDPR contains provisions on automated decision-making and profiling. Not all AI systems trigger the application of Article 22 - only decision-making that is based solely on automated processing and ‘produces legal effects concerning him or her or similarly significantly affects him or her’ (as set out in Article 22(1) UK GDPR). The government sought views on clarifying the limits and scope of what constitutes a ‘decision made solely on automated processing’ and ‘legal or similarly significant effects’ - as well as whether Article 22 is sufficiently future-proofed, practical, and proportionate.

Consultation responses demonstrated that the current operation and efficacy of Article 22 is subject to uncertainty. Respondents noted confusion with the term ‘solely’ and the ambiguity over the requirements for human involvement needed for a decision not to count as ‘solely’ automated decision making and profiling. Others expressed that it can be highly context-specific whether a decision causes a significant effect (falling within scope of Article 22), with some sectors such as healthcare being much more likely to produce significant effects. There was no clear consensus on how to clarify the limits and scope, with respondents providing mixed views on whether Article 22 is sufficiently future-proofed so as to be practical and proportionate whilst retaining meaningful safeguards.

The vast majority of respondents opposed the proposal to remove Article 22, with respondents noting that the right to human review of an automated decision was a key safeguard. Some respondents argued that the complete removal of Article 22 would damage the reputation of the United Kingdom as a trustworthy jurisdiction for carrying out automated decision-making. The government recognises the importance of appropriate safeguards, and will not pursue this proposal.

The government is considering how to amend Article 22 to clarify the circumstances in which the Article must apply. The government wants to align proposals in this area with the broader approach to governing AI-powered automated decision-making that will be set out as part of the upcoming white paper on AI governance. Reforms will cast Article 22 as a right to specific safeguards, rather than as a general prohibition on solely automated decision-making. Reforms will enable the deployment of AI-powered automated decision-making, providing scope for innovation with appropriate safeguards in place.

Public trust in the use of data-driven AI systems (questions 1.5.18, 1.5.19, 1.5.20)

The consultation asked open questions about the effectiveness and proportionality of data protection tools; provisions and definitions to address profiling issues and their impact on specific groups; what, if any, legislative changes the government could consider; and whether data protection is the right legislative framework to evaluate collective data-driven harms for a specific AI use case. The consultation recognised that data protection is not the only relevant legislative framework in this fast-evolving landscape.

The consultation generated mixed views on whether data protection is sufficiently effective in helping individuals understand how automated decisions are made about them. Some respondents noted that the safeguards provided by current legislation are sufficient, but stated that guidance would be beneficial, particularly if developed through collaboration between regulators. The government will consider further the approach to explainability and intelligibility of AI-powered automated decision-making, including the role of data protection legislation within that, through the white paper on AI governance.

1.6 Data minimisation and anonymisation

UK data protection legislation requires that personal data is adequate, relevant and limited to what is necessary in relation to the purposes for which it is processed. This is commonly known as the data minimisation principle. Data minimisation techniques, such as pseudonymisation, can be applied to personal data sets to safeguard them, which in turn may allow for such data to be used and shared in safer ways. The difference between pseudonymisation and anonymisation is important, as pseudonymised data falls within the data protection legislation, whereas anonymous data does not.
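
The following minimal sketch (not drawn from the consultation; all names, values and the keying scheme are hypothetical) illustrates that distinction: a direct identifier is replaced with a keyed hash, but because the controller retains the key, re-identification remains possible, so the output is pseudonymised personal data rather than anonymous data.

```python
# Purely illustrative sketch of pseudonymisation; names and values are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"re-identification-key-held-separately"  # retained by the controller

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash (a pseudonym)."""
    token = hmac.new(SECRET_KEY, record["name"].encode(), hashlib.sha256).hexdigest()
    return {"pseudonym": token, "condition": record["condition"]}

patient = {"name": "Jane Doe", "condition": "degenerative neurological condition"}
print(pseudonymise(patient))
# The controller still holds SECRET_KEY, so the record can be re-linked to the
# individual: it is pseudonymised personal data, not anonymous data, and remains
# within the scope of the data protection legislation.
```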

Clarifying data regarded as anonymous (questions 1.6.1, 1.6.2, 1.6.3)

The government put forward a proposal to clarify when data would be regarded as anonymous and therefore outside the scope of data protection legislation. The consultation proposed legislating to confirm that the test for whether anonymous data can be re-identified is relative to the means available to the controller to re-identify the data.

The majority of respondents agreed that greater clarity in legislation would be beneficial, although many were cautious about defining anonymisation in a way that set an impractically high standard to achieve or would not stand the test of time. Some respondents supported the proposal to confirm the test is a relative one, and a limited number of respondents disagreed with the proposal.

The government therefore intends to clarify in legislation: when a living individual is identifiable and therefore within scope of the legislation; that the test for identifiability is a relative one; and that the test should be based on the wording set out in the explanatory report to the Council of Europe’s Convention 108+.

This could be where a living individual is identifiable by the controller or processor by “reasonable means”, taking into account, among other things, the technology available at the time of the processing, and technological developments. Or this could be where the controller or processor knows, or ought reasonably to know, that passing the data to another data controller or processor is likely to result in re-identification, taking account of the means available to that organisation.

By confirming this test for anonymous data as relative, and incorporating the wording from the explanatory report to Convention 108+ which focuses on the means that are available to the controller at a particular time, the government intends to avoid setting an impossibly high standard for anonymisation.

Privacy-enhancing technology (question 1.6.4)

The consultation asked an open question about whether there is more that the government could do to promote privacy-enhancing technologies (PETs). The need for continued investment into research and development was highlighted, as well as a greater focus on public education and guidance. The ICO guidance on anonymisation and PETs and the Centre for Data Ethics and Innovation’s PETs adoption guide are helpful in this regard, but the government recognises that further guidance tailored directly to the public may be useful to explain the benefits of PETs and improve trust in their use.

The government also noted that some respondents advised caution about overpromising on what PETs can achieve. They suggested PETs could be promoted as part of a holistic privacy management programme, but should not be seen as a substitute for wider organisational measures that can help reduce privacy risks. The government will continue to engage with the ICO to ensure the adoption of PETs is encouraged as part of organisations’ approach to privacy management.

The government is also keen to explore opportunities with its international partners to promote research and development of innovative technologies. For example, it recently announced plans to work with the United States to collaborate on bilateral innovation prize challenges focused on advancing privacy-enhancing technologies (PETs).

1.7 Innovative data sharing solutions

Responsible data sharing solutions can help drive growth and boost innovation, and the government aims to encourage new and innovative ways that confidential (personal or other) data can be shared. One way of doing so is via data intermediaries. Data intermediaries can help steward confidential data between those holding it and those using it, in a responsible and efficient manner.

This industry is nascent and the government sought views on its role in enabling the activity of data intermediaries. A range of open questions were asked about what lawful grounds might be applicable to data intermediary activities, and about conferring data protection processing rights and responsibilities on data intermediaries.

Data intermediaries (question 1.7.1)

The consultation asked some open questions on the potential of data intermediaries for innovative data sharing solutions. Around three-quarters of the respondents that expressed a view on data intermediaries regarded them as offering positive potential for innovative data sharing solutions.

There were mixed views on whether the government should take a role in enabling the activity of responsible data intermediaries, with no majority either way. When factoring in responses to the open questions, around half of respondents that took a view believed the government should take a role in enabling the activity of data intermediaries. Some respondents considered that the government could play a helpful role in engendering confidence in intermediary activity and addressing barriers to their development, including through the clarification of rights and obligations under the existing framework, as well as setting up new governance mechanisms.

Of the respondents that did not agree with the government taking an active role to enable the activity of data intermediaries, many expressed opposition to the creation of new solutions to encourage personal data sharing, arguing that there should not be increased sharing of personal data. Others argued that data intermediaries could be a potential target for malicious actors.

Views on appropriate intervention varied depending on the type of intermediary in question. Responses focused mainly on two groups of intermediaries: those that could facilitate personal data exchange, between specific parties or as part of a network; and those that could make it easier to manage access rights to confidential data. For the latter, many of the proposed interventions included measures that are also envisioned for regulating data recipients under future Smart Data schemes. These included the creation of common standards to support more frictionless personal data access, mandating data holders to share data immediately - in an interoperable format - upon a customer’s request, and creating accreditation schemes to ensure the responsible conduct of data recipients.

The government has a long-standing commitment to legislate to enable the development of Smart Data schemes. We are working to ensure that such legislation does not preclude a range of data intermediaries from offering services as data recipients under these schemes, and that it enables any risks associated with their potential participation to be appropriately managed through regulation.

The government will work to ensure any regulations establishing and setting out the framework for future Smart Data schemes are appropriately scoped to maximise the benefits of data intermediaries while mitigating any risks they pose. We will also monitor the development of intermediaries markets to consider the case for future legislation or other measures, including for example to mitigate risks associated with complex, high-volume data sharing that could be enabled by intermediaries operating across sectors. We will consider the case for non-legislative action, including clarifying intermediaries’ existing rights and obligations, and testing promising approaches to supporting intermediaries.

Lawful grounds regarding data intermediaries (question 1.7.2)

The consultation asked various open questions on whether lawful grounds, other than consent, might be applicable to data intermediary activities. Many respondents who agreed that other lawful grounds might be applicable referenced legitimate interest as an alternative lawful ground. Respondents who disagreed believed that consent should be the only lawful ground. The government does not have any immediate plans in relation to this question but will factor respondents’ comments into any potential future policymaking in this area.

1.8 Further questions

At the end of the chapter, the government asked for views on whether any of the proposals would impact on anyone with protected characteristics. There were no specific concerns related to the impact any proposals would have on those with protected characteristics.

The consultation posed some open-ended questions at the end of this chapter for respondents to provide any other comments on the proposed reforms. There were no significant issues raised by respondents in relation to the further questions posed at the end of this chapter.

Chapter 2: Reducing burdens on businesses and delivering better outcomes for people

2.1 Summary

In chapter 2 of the consultation, the government considered how to provide a flexible data protection regime that maintains high standards, whilst ensuring that businesses can benefit from an agile regulatory approach, as described in the government’s Plan for Digital Regulation.

The government highlighted the opportunity to develop a regime that incentivises organisations to invest more in governance, policies, tools, people and skills that lead to better outcomes for both businesses and individuals. The government’s proposed reforms aim to deliver greater flexibility and more proportionate and targeted compliance activities, which will unlock the value of data assets, rather than creating a disproportionate regulatory burden with limited value for individuals.

2.2 Reform of the accountability framework

The government views the principle of accountability as fundamental, but also recognises that current requirements for data controllers to demonstrate how they are complying with the data protection legislation can put a disproportionate burden on some organisations. The government put forward a proposal to introduce a more flexible accountability framework, underpinned by “privacy management programmes”. The steps an organisation would be required to take to implement an effective programme would reflect the volume and sensitivity of the personal data involved.

The privacy management programme approach would be based on a number of elements at the core of accountability, such as:

  • leadership and oversight
  • risk assessment
  • policies and processes
  • transparency
  • training and awareness of staff
  • monitoring, evaluation and improvement

To support the implementation of privacy management programmes, the government also proposed removing the existing requirements to:

(i) designate a data protection officer under Articles 37 to 39,
(ii) undertake data protection impact assessments under Article 35, and
(iii) maintain a record of processing activities under Article 30.

In their place, the government proposed complementary measures under the privacy management programme, such as:

(i) appointing a suitable senior individual to be responsible for the programme,
(ii) ensuring organisations implement risk assessment tools which help assess, identify and mitigate risks, and
(iii) a more flexible record keeping requirement.

Reform of the accountability framework (question 2.2.1)

The majority of respondents disagreed that the current framework should feature fewer prescriptive requirements and be more risk-based, with many considering the current legislation to be sufficiently flexible and risk-based already. However, around a third of respondents agreed, with supporters of reform arguing that a more flexible and risk-based approach would allow data controllers and processors to take a more thoughtful approach both to complying with the legislation and to demonstrating compliance.

However, when looking at responses from organisations only, the majority of respondents agreed that the framework should be more flexible and risk-based. Organisations responded that a more flexible approach to accountability would allow them to focus their resources more effectively.

Introduction of new privacy management programmes (questions 2.2.2, 2.2.3)

There were mixed views regarding whether organisations would benefit from being required to develop and implement a risk-based privacy management programme, and similarly on whether individuals would benefit. Some of those who opposed the proposal believed it risked causing confusion about how to comply and generating inconsistent outcomes for data subjects, and that further regulatory changes would be costly to implement. Those in support suggested the approach would allow resources to be assigned to the greatest areas of risk and enable organisations to take a pragmatic approach, and that privacy management programmes would help move away from a one-size-fits-all approach, which would be particularly helpful for small and medium-sized enterprises (SMEs).

The government plans to proceed with the requirement for organisations to implement privacy management programmes, with the legislation designed in a way that addresses concerns raised during the consultation process. In particular, the government acknowledges the concerns raised around the time and resources that organisations have invested to establish policies and processes in order to comply with the UK GDPR, and that any further regulatory changes would lead to further costs.

However, the requirement to implement privacy management programmes will allow organisations to integrate current accountability mechanisms as elements of a holistic approach to accountability. This means organisations that are currently compliant with the UK GDPR would not need to significantly change their approach to be compliant with the new requirements, unless they wanted to take advantage of the additional flexibility that the new legislation will provide. Furthermore, the government believes that replacing the requirements with improved, more flexible ones will lead to better outcomes for individuals, as organisations will be required to tailor their processing activities to meet the outcomes of the new regime, rather than following a tick-box process.

The principle of accountability is key for privacy management programmes, and responsible use of personal data will continue to be at the heart of the accountability framework under the new regime. A move to a framework based on privacy management programmes will enable organisations to take a more proportionate approach in meeting the requirements of the UK’s regime. It will help to reduce the prescriptive regulatory burdens faced by smaller organisations, while enabling many organisations to focus on the outcomes required to help demonstrate compliance to relevant stakeholders.

The government recognises from consultation responses that some respondents fear a potential lowering of standards. However, under the revised regime, organisations will have to implement a privacy management programme based on the level of their processing activities and the volume and sensitivity of the personal data they handle. Organisations that process highly sensitive data (i.e. special category data) or large volumes of high-risk data will therefore be expected to have the most robust approaches to accountability. The government believes that privacy management programmes will place greater emphasis on the principles at the core of accountability, such as organisational responsibility; risk management; transparency; training and awareness of staff; and continuous monitoring, evaluation and improvement of data protection management within an organisation.

To align with the existing penalties, the privacy management programme requirement will also be subject to the same sanctions as under the current regime, carrying maximum fines of the greater of £8.7m or 2% of annual worldwide turnover.
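For illustration only, the 'greater of' structure of this cap can be expressed as a simple calculation. The figures below are those quoted above; the function name and the example turnover are hypothetical and not part of the government's proposals.

    def maximum_fine_gbp(annual_worldwide_turnover_gbp: float,
                         fixed_cap_gbp: float = 8_700_000,
                         turnover_share: float = 0.02) -> float:
        """Illustrative sketch: the greater of a fixed cap or a percentage of
        annual worldwide turnover, as described in the paragraph above."""
        return max(fixed_cap_gbp, turnover_share * annual_worldwide_turnover_gbp)

    # Hypothetical example: an organisation with £1bn annual worldwide turnover
    print(maximum_fine_gbp(1_000_000_000))  # 20000000.0 -> the 2% element applies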

Removal of data protection officers (questions 2.2.4, 2.2.5, 2.2.6)

The majority of respondents disagreed with the proposal to remove the requirement to designate a data protection officer. Respondents mainly cited concerns that removal of the data protection officer requirement would result in a loss of data protection expertise, and that the lack of independence could lead to a potential fall in trust and reassurance to data subjects. Whilst many individuals felt their organisations would not maintain a similar role if it were no longer mandatory, most organisations said that they likely would. However, those who were in support of removing the requirement to designate a data protection officer mentioned that this would be beneficial for smaller businesses, and in particular those which do not process highly sensitive personal data. A small minority of respondents suggested that it is difficult to appoint someone who is truly independent.

The new requirement to appoint a senior responsible individual will shift the emphasis to ensure data protection is established at a senior level to embed an organisation-wide culture of data protection. For that reason, the government plans to proceed with removing the requirement to designate a data protection officer. Most of the tasks of a data protection officer will become the ultimate responsibility of a designated senior individual to oversee as part of the privacy management programme.

The designated senior individual’s role will include:

  • representing the organisation to the ICO and data subjects, or delegating a representative to do so
  • ensuring appropriate oversight and support is in place for the programme and appointing appropriate personnel
  • providing tailored training to ensure staff understand the organisation’s policies
  • regularly auditing the efficacy of the programme

The government acknowledges that organisations have different governance structures when it comes to the designation of the data protection officer. The privacy management programme offers flexibility which allows organisations that previously used a data protection officer to continue to do so, as long as there is appropriate oversight from the senior accountable individual. For example, some organisations that process large volumes of highly sensitive data might continue to appoint and resource data protection officers where they consider that is the best way to monitor and improve compliance.

Removal of data protection impact assessments (questions 2.2.7, 2.2.8)

The majority of respondents agreed that data protection impact assessment requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments. However, some respondents stated they would welcome a more flexible approach, which would allow data protection impact assessments to be tailored to the needs of the organisation. This is because data protection impact assessments can duplicate, in a more prescriptive form, other risk assessments performed within an organisation that achieve the same outcome; for example, compliance teams performing wider risk analysis can end up duplicating parts of the data protection impact assessment requirement.

Under the new privacy management programme, organisations will still be required to identify and manage risks, but they will be granted greater flexibility as to how to meet these requirements. For example, organisations will no longer be required to undertake data protection impact assessments as prescribed in the UK GDPR, but they will be required to ensure there are risk assessment tools in place for the identification, assessment and mitigation of data protection risks across the organisation. For that reason, the government plans to proceed with removing the requirement to undertake data protection impact assessments. The government understands from the consultation that organisations would benefit from more flexibility when carrying out such assessments; organisations may wish to continue to use data protection impact assessments, tailored to their processing activities. The consultation also made clear that existing data protection impact assessments would remain valid as a way of achieving the new requirement.

Removal of the record of processing activities requirement (questions 2.2.11, 2.2.16)

The government also sought views on whether the requirement to maintain a record of processing activities was burdensome, and proposed to replace it with a more flexible record keeping requirement under the privacy management programme.

The majority of respondents disagreed with the proposal to remove this requirement. Many respondents felt that the existing requirement is not burdensome and it allows them to easily understand what personal data they process and how sensitive it is. However, a similar number of respondents felt that the current requirement is burdensome to both create and maintain, and that it duplicates other documentation requirements in the legislation, and would therefore welcome greater flexibility to take a more tailored approach to record keeping.

Respondents were also asked whether or not elements of record keeping requirements are duplicative of Articles 13 and 14 without any particular benefit. Around half of respondents disagreed with this, and felt that the record keeping requirement provides the fundamental building block which then helps them to comply with the requirements under Articles 13 and 14 to inform individuals about how their data is going to be used and with whom it might be shared. Around a quarter of respondents agreed that there was duplication without any particular benefit.

Organisations will need to maintain personal data inventories as part of their privacy management programme, describing what personal data is held, where it is held, why it has been collected and how sensitive it is, but they will not be required to do so in the way prescribed by Article 30. For that reason, the government plans to proceed with removing the record keeping requirements set out in Article 30. Privacy management programmes will still require organisations to document the purposes of processing, but in a way which is more tailored to the organisation.
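By way of illustration only, a personal data inventory entry of the kind described above need record no more than what is held, where it is held, why it was collected and how sensitive it is. The field names and values below are hypothetical and are not prescribed by the proposals.

    from dataclasses import dataclass

    @dataclass
    class InventoryEntry:
        """Hypothetical sketch of a single personal data inventory record."""
        data_category: str     # what personal data is held
        storage_location: str  # where it is held
        purpose: str           # why it was collected
        sensitivity: str       # how sensitive it is, e.g. 'standard' or 'special category'

    # Hypothetical example entry
    entry = InventoryEntry(
        data_category="customer contact details",
        storage_location="CRM system (UK data centre)",
        purpose="order fulfilment and customer support",
        sensitivity="standard",
    )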

Overall, the new regime will continue to require organisations to maintain the same high standards for data protection, but they will be given flexibility to do this in a way that reflects the volume and sensitivity of the personal data they handle, and the type of data processing they carry out. The government believes that providing a new framework which encourages organisations to focus on the design of their privacy management programme, rather than meet a prescriptive, tick-box list, will also lead to better transparency practices. This is because organisations will need to focus more on the effectiveness and appropriateness of their risk assessment and communications programme, and on how they communicate their programmes effectively to relevant data subjects.

Prior consultation requirements (question 2.2.9)

The government wants to further encourage a more proactive, open and collaborative dialogue between the ICO and organisations, so that they can work together to better identify and mitigate risks. Under Article 36(1)-(3) of the UK GDPR, if organisations identify a data processing activity which poses high risks that cannot be mitigated, they must consult the ICO. The government is aware that compliance with Article 36 is low.

The consultation put forward a proposal to remove the mandatory requirement for organisations to consult the ICO prior to any high-risk processing activity, and instead make voluntary prior consultation with the regulator a mitigating factor which the ICO may take into account when taking any enforcement action against an organisation.

Some respondents felt that replacing the current requirement to consult the ICO with a voluntary incentive would result in better and more proactive conversations between the regulator and organisations prior to any high-risk processing. The majority of respondents agreed that organisations are likely to approach the ICO before commencing high-risk processing activities on a voluntary basis, if this is taken into account as a mitigating factor during any future investigation or enforcement action. At the same time, some felt that there would still be reticence for organisations to consult the ICO before commencing any high-risk processing, particularly in the context of new or innovative methods of processing. The government plans to proceed with this proposal and remove the mandatory requirement in favour of a voluntary mechanism.

Voluntary undertakings process (question 2.2.13)

The government sought views on whether to introduce a new voluntary undertakings process, which would be similar to Singapore’s Active Enforcement regime. This would mean that any organisation that has shown it has taken a proactive approach to accountability would be able to provide the ICO with a remedial action plan when they discovered an infringement - so long as that plan highlighted the likely causes and steps to solve the problem.

Around half of respondents agreed with the proposal and around a third disagreed. Concerns were raised that the process may be of limited use to small and medium-sized organisations given the prescriptive and resource-intensive nature of the requirements of the remedial action plan, and that organisations without documented prior engagement with the ICO might not benefit. Moreover, the current framework already requires organisations to document any personal data breaches and set out remedial action taken, in turn allowing the ICO to verify compliance under Article 33(5) of UK GDPR. As such, the government will not be pursuing this proposal.

Breach reporting requirements (question 2.2.12)

Breach reporting requirements are set under Article 33 of the UK GDPR. These requirements mean that an organisation must inform the ICO of a data breach, ‘unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons’. In practice, this has led to the reporting of some relatively minor breaches that do not always trigger a follow-up investigation by the ICO. The government therefore sought views on the impact of changing the threshold for reporting a personal data breach, so that only breaches which posed ‘material’ risks to individuals would have to be reported.

Responses to the consultation were mixed between those who thought it might reduce disproportionate burdens on businesses; those who thought it would make no difference at all (because they would still report breaches to ‘err on the side of caution’); and those who thought it could reduce protections for individuals. With regard to the latter, some organisations pointed out that the true severity of a breach is not always known at the time it occurs. Others said that what might seem like a minor breach to one organisation might be, or become, a more serious breach if the same mistake were repeated across a whole sector.

Some respondents, including the ICO, highlighted the indirect impact a legislative change in this area could have, if it led to a reduction of valuable intelligence data for the ICO and other agencies about potential cyber risks. In addition, many respondents thought that additional guidance from the ICO on when to report a breach would be more helpful in reducing burdens on them than legislative changes.

Upon considering these responses, the government will not pursue legislative change, but will continue to work with the ICO to explore the feasibility of clearer guidance for organisations on breach reporting.

The consultation proposed alternative standalone reforms if the government chose not to pursue the implementation of the privacy management programme. As the government plans to proceed with the introduction of the requirement to implement privacy management programmes, these will not be pursued.

2.3 Subject access requests

The right of access is one of the key rights in the data protection framework, and subject access requests are an important part of it. They allow data subjects to check the accuracy of data about them, learn more about how it is being used, and find out who it is being shared with.

While subject access requests are a critical mechanism to empower data subjects to have control over their data, dealing with requests can be time-consuming and resource intensive for organisations.

Resource impact of subject access requests (question 2.3.1)

The government asked an open question on the extent to which organisations found subject access requests time-consuming or costly to process. Responses showed that a variety of small and large organisations - including GP surgeries; local councils; telecoms providers; and property, financial and legal services - found dealing with subject access requests time-consuming. Respondents from various sectors (including charity and non-profit, legal, financial, public and health) explained that the level of resourcing required to respond to subject access requests could be very high, especially for SMEs. Other organisations noted that subject access requests can also be used as a means of circumventing strict disclosure protocols, to gain access to information on prospective litigation. Most responses, both in favour of and against changes to the subject access requests regime, emphasised the importance of data subjects having access to information held about them.

‘Manifestly unfounded’ threshold for subject access requests (question 2.3.2)

The government invited views on whether the ‘manifestly unfounded’ threshold for refusing to respond to, or charge a reasonable fee for, a subject access request was too high. There were mixed views on whether the threshold was too high. Respondents noted that the threshold is often perceived as too vague to encompass requests which are clearly unreasonable in nature and should be allowed to be refused, such as instances where employees leaving on bad terms used subject access requests to disrupt their former employer.

Consultation responses also noted that some claims management companies are using subject access requests to fish for opportunities, bringing claims against organisations if information relating to their clients has been omitted from their responses. Others noted that further guidance on the time limit for responding to a subject access request would be welcome. Many respondents indicated the need for appropriate infrastructure and processes to handle subject access requests, to alleviate the time and effort of responding to them.

Alignment with the Freedom of Information regime (introducing a cost limit and amending the threshold) (question 2.3.3)

The government sought views on whether to align the subject access request regime with the Freedom of Information regime by:
(i) introducing a cost ceiling, and
(ii) amending the threshold for refusing a request to ‘vexatious’,
in order to alleviate organisations’ capacity constraints when responding to subject access requests.

There were mixed views on these proposals. Organisations in favour of a cost limit argued that it would be particularly beneficial to SMEs and make complying with subject access requests more manageable. Those who were not in favour of a cost limit expressed concern that it could be detrimental to data subject rights, and could hamper the public’s understanding of how their personal data is being used.

The government recognises that the rights of all data subjects must be protected, including vulnerable people. Information provision, and the right of access, remains the default.

Taking into account views expressed as part of this question, as well as the question regarding the ‘manifestly unfounded’ threshold above, the government plans to proceed with changing the current threshold for refusing or charging a reasonable fee for a subject access request from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’, which will bring it in line with the Freedom of Information regime. The government does not intend to introduce a cost ceiling for subject access requests.

Introducing a nominal fee (question 2.3.4)

The government also asked whether there was a case for re-introducing a small nominal fee for processing subject access requests, akin to the previous approach in the Data Protection Act 1998. The majority of respondents disagreed with this proposal. Many respondents felt it could disadvantage more vulnerable people in society, although other respondents felt it may discourage vexatious and repetitive requests.

The government does not intend to re-introduce a nominal fee for processing subject access requests. The government, in line with existing guidance, also continues to encourage organisations to implement appropriate processes and infrastructure to respond well to subject access requests.

Finally, the government is considering how to address the specific sectoral needs raised in responses to the consultation (e.g. healthcare), as well as those of small and medium-sized businesses.

2.4 Privacy and electronic communications

Whilst the majority of the consultation focused on possible changes to the UK GDPR and the Data Protection Act 2018, the government also consulted on possible changes to the Privacy and Electronic Communications Regulations 2003 (PECR). PECR supplements the UK GDPR with specific rules relating to confidentiality of terminal equipment (e.g. cookie rules), unsolicited direct marketing communications (e.g. nuisance calls), and communications security (e.g. network traffic and location data). PECR’s rules protect companies as well as individuals.

Cookies (questions 2.4.1, 2.4.2, 2.4.3, 2.4.4, 2.4.5, 2.4.6, 2.4.7, 2.4.8)

Under the current legislation, cookies (and similar technologies such as tracking pixels) are not allowed to be placed on a device without the consent of the user.

There are currently only two limited exceptions from gaining consent, illustrated in the sketch after the list below. These are:

a) for purposes that are essential to provide an online service at someone’s request (e.g. to remember what’s in their online basket, or to ensure security in online banking); or
b) where needed to transmit a communication over a communications network.
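As a minimal, purely illustrative sketch of the rule and the two exemptions above (the function and parameter names are hypothetical and carry no legal weight):

    def consent_required_for_cookie(strictly_necessary_for_requested_service: bool,
                                    needed_to_transmit_communication: bool) -> bool:
        """Hypothetical sketch of the current rule: consent is required unless
        one of the two limited exemptions above applies."""
        exempt = strictly_necessary_for_requested_service or needed_to_transmit_communication
        return not exempt

    # e.g. an audience measurement cookie currently meets neither exemption,
    # so consent is required
    print(consent_required_for_cookie(False, False))  # True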

Consent is usually sought through a pop-up notice or banner which appears when a person visits a website. Some respondents explained that their ability to collect potentially useful information, such as how many people are visiting their websites and what pages they are looking at the most, is restricted by the current strict rules on consent. Individuals also find the number of cookie pop-ups a source of annoyance, and routinely accept the terms without reading them.

To address these issues, the government sought views on whether there were any other occasions when cookies (and similar technologies such as tracking pixels) should be permitted to be placed on a person’s device without their explicit consent. This could include, for example, cookies that are placed for audience measurement purposes, or for the purpose of detecting faults on an organisation’s website.

The majority of respondents agreed with the proposal that organisations should be able to use cookies and similar technologies without consent for a wider range of non-intrusive purposes, although there was some disagreement. The majority of respondents agreed that cookies that allowed an organisation to measure traffic to its webpage and improve its offering to users should be included.

The government also sought views on whether prior consent should be removed for all cookies; or alternatively, whether there was potential for innovative technologies such as browser-based solutions to manage a person’s consent preferences so they were not continually confronted by consent banners on every site they visited.

The vast majority of respondents disagreed with removing the consent requirement for all types of cookies, particularly more intrusive varieties which collect personal data for the purposes of real-time bidding and the micro-targeting of advertisements. Many respondents argued that web users should be given clearer information about these types of cookies so that they could exercise their right to reject them where appropriate.

There was also support for websites respecting individuals’ preferences set through their browser, software applications or device settings. Many respondents highlighted that this would improve user experiences and build trust among those using services, but some highlighted risks to healthy competition and queried whether different platforms and services would have the technological capability to provide legally compliant solutions.

Following consideration of responses, the government intends to legislate to remove the need for websites to display cookie banners to UK residents. In the immediate term, the government will permit cookies (and similar technologies) to be placed on a user’s device without explicit consent for a small number of other non-intrusive purposes. These changes will apply not only to websites but also to connected technology, including apps on smartphones, tablets, smart TVs and other connected devices.

In the future, the government intends to move to an opt-out model of consent for cookies placed by websites. In practice, this would mean cookies could be set without seeking consent, but the website must give the web user clear information about how to opt out. This would allow the government to realise its ambition to improve the user experience and remove the need for unnecessary cookie consent banners. The opt-out model would not apply to websites likely to be accessed by children.

Responses to the government’s consultation highlighted that users value privacy and want control over how their personal data is used. To address this, the government will work with industry and the regulator on browser-based and similar solutions that will help people manage their cookie and opt-out preferences. The government will take forward proposals that require websites to respect automated signals emitted by these technologies - and will move to an opt-out model of consent for cookies only when the government assesses these solutions are widely available for use.
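As a purely illustrative sketch of how a website might respect an automated opt-out signal of the kind described above: the consultation response does not name a specific signal, so the ‘Sec-GPC’ header (used by the Global Privacy Control initiative) is only an assumed example here, and the functions are hypothetical.

    def user_has_opted_out_via_browser(request_headers: dict) -> bool:
        """Hypothetical check for a browser-emitted opt-out signal; the
        'Sec-GPC' header is used only as an example of such a signal."""
        return request_headers.get("Sec-GPC") == "1"

    def may_set_non_essential_cookies(request_headers: dict,
                                      opted_out_on_site: bool) -> bool:
        """Under an opt-out model, non-essential cookies could be set unless
        the user has opted out, either on the site or via an automated signal."""
        return not (opted_out_on_site or user_has_opted_out_via_browser(request_headers))

    # Hypothetical example: a browser sending Sec-GPC: 1
    print(may_set_non_essential_cookies({"Sec-GPC": "1"}, opted_out_on_site=False))  # False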

Direct marketing (question 2.4.9)

PECR also regulates the use of electronic messages for direct marketing. Currently, businesses can contact individuals with whom they have previously been in touch during a sale or transaction with further marketing material about similar or related products, provided that the individuals were given the opportunity to opt out of such contact at the time they provided their details. This is known as the ‘soft opt-in’, as it does not require the customer’s explicit consent.
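For illustration only, the conditions of the soft opt-in described above can be read as a simple check. The function and its inputs are hypothetical and are not a statement of the legal test.

    def soft_opt_in_applies(details_obtained_during_sale_or_transaction: bool,
                            marketing_similar_or_related_products: bool,
                            opt_out_offered_when_details_collected: bool,
                            individual_has_opted_out: bool) -> bool:
        """Hypothetical sketch: the soft opt-in is available only if all the
        conditions above hold and the individual has not opted out."""
        return (details_obtained_during_sale_or_transaction
                and marketing_similar_or_related_products
                and opt_out_offered_when_details_collected
                and not individual_has_opted_out)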

The government asked for views on whether the soft opt-in should be extended to non-commercial organisations such as charities. The majority of respondents agreed with this proposal, although around a third disagreed. Support was stronger amongst organisations it would benefit, though privacy rights groups warned this could result in larger volumes of unwanted communications.

Following consideration of responses, the government intends to extend the soft opt-in to non-commercial organisations, but in parallel will take steps to make sure that appropriate safeguards are in place to protect individuals who do not wish to continue receiving communications.

Nuisance calls (questions 2.4.10, 2.4.11, 2.4.12, 2.4.13)

The consultation proposed measures to help the ICO to tackle rogue ‘direct marketing’ firms that are responsible for generating large volumes of nuisance calls, and asked a range of open questions on this issue.

There was support for measures to allow the ICO to take enforcement action against organisations on the basis of the number of calls they generate (rather than purely on the number that are connected, which is the position under the current legislation). There was also support for the introduction of a ‘duty to report’ on communications providers, to require them to inform the ICO of suspicious levels of traffic on their networks.

The government plans to proceed with both proposals. In addition, the government is not ruling out placing further requirements on telecoms companies to block a greater volume of nuisance calls at source, if the measures outlined here do not produce tangible results.

Bringing PECR’s enforcement regime in line with UK GDPR and DPA 2018 (questions 2.4.16, 2.4.17, 2.4.18)

The consultation asked for views on whether PECR should be amended to allow the ICO to levy fines of up to £17.5m or 4% of a business’s global turnover. The vast majority of respondents supported this proposal.
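As a purely illustrative calculation, mirroring the earlier sketch for accountability sanctions, the proposed higher cap takes the greater of the fixed amount and 4% of global turnover; the turnover figure below is hypothetical.

    # Illustrative only: the higher cap described above, for a hypothetical
    # business with £1bn global turnover
    print(max(17_500_000, 0.04 * 1_000_000_000))  # 40000000.0 -> the 4% element applies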

The consultation also asked for views on whether the ICO should be able to serve assessment notices and carry out audits on organisations suspected of infringing PECR, in line with the powers under UK GDPR and DPA 2018. The vast majority of respondents supported this proposal.

There was a general feeling amongst respondents that the current enforcement regime was not dissuasive enough, and that these proposals would help improve compliance and raise the profile of PECR with organisations. Some respondents to the consultation called for greater harmonisation between the enforcement provisions in PECR and those under the UK GDPR and DPA 2018.

The government agrees that there is a case for this, particularly as PECR’s current enforcement provisions are drawn from the DPA 1998, which can cause complexity for data controllers and the regulator alike. The government therefore plans to proceed with these proposals.

2.5 Use of personal data for the purposes of democratic engagement

The government invited views on how PECR supports the democratic process. Currently, communications from political parties are treated as direct marketing - for example, political parties should not make automated calls to, email or text the electorate unless they have obtained prior consent.

Communications for political campaigning (question 2.5.1, 2.5.2)

The government asked for views on whether communications from political parties should continue to be subject to the same rules as communications from commercial organisations, given the importance of democratic engagement to a healthy democracy.

The majority of respondents agreed that political communications should be covered by PECR’s rules on direct marketing. There were mixed views on whether to extend the soft opt-in to political parties and other political entities.

Those who supported the soft opt-in proposals noted that extending the communication channels for political messaging could allow greater numbers of people to be reached and therefore lead to better engagement with politics. However, those opposed noted that larger numbers of people may be encouraged to part with their personal data.

The government plans to consider further whether political communications should remain within the scope of PECR’s direct marketing rules for democratic engagement. It intends to add a regulation-making power to the legislation so that Parliament can make exceptions to the direct marketing rules in the future if necessary. It also intends to extend the soft opt-in rule so that political parties and elected representatives can contact individuals who have previously shown an interest in the activities of the party (for example, by attending a conference or making a donation) without their explicit consent, provided they have been given an opportunity to refuse such communications at the point of providing their details. Individuals would also have the same rights to opt out of receiving communications from political parties as they do in relation to marketing communications from commercial organisations. This would align political parties and elected representatives with commercial organisations, which can already use the soft opt-in.

Lawful grounds for processing personal data under UK GDPR and DPA 2018 (questions 2.5.4, 2.5.5)

The government invited views on how the UK GDPR and DPA 2018 can continue to support the democratic process. Section 8 of the DPA 2018 makes clear that necessary processing of personal data for democratic engagement purposes falls within the public task lawful ground under Article 6(1)(e) of the UK GDPR, but this is underpinned by Article 6(3) of the UK GDPR, which provides that organisations must also identify a separate legal basis in law. This has sometimes been difficult for political parties and elected representatives, so the government asked political parties for evidence about the barriers to effective political engagement and for any ideas for possible solutions.

The government also asked for views on paragraphs 22 and 23 of Schedule 1 to the DPA 2018, which set out two of the conditions under which sensitive data may be processed without consent. Paragraph 22 states that political parties can process personal data about people’s political opinions without consent, if it is necessary for the purpose of political activities and does not cause substantial damage or substantial distress. Paragraph 23 states that elected representatives can process sensitive data without consent when they have been asked to act on behalf of a constituent.

The majority of respondents disagreed that Article 6 of the UK GDPR impedes the use of personal data for the purposes of democratic engagement, though it was also noted that the requirements in Article 6(3) can cause some practical difficulties for political parties and elected representatives because some of their functions are not set out clearly in statute. One party said that it might be sensible if processing of personal data for the purposes of democratic engagement were given a firmer legal basis.

The majority of respondents disagreed that paragraphs 22 and 23 of Schedule 1 to the DPA 2018 impede the use of sensitive data by an elected representative. Those who agreed said the rules for elected representatives when processing personal data for political activities (rather than when carrying out constituency casework) could be made clearer.

Having considered these responses, the government is minded to make some changes to Article 6 of the UK GDPR, to clarify the lawful grounds that can be relied upon by registered political parties, permitted participants in a referendum and elected representatives when processing personal data for the purposes of democratic engagement. The government also plans to amend paragraph 22 of Schedule 1 to the DPA 2018 to make it clear that elected representatives may also process personal data about political opinions for the purposes of democratic engagement (e.g. a councillor or MP undertaking a survey of local residents). Currently that condition only specifically refers to political parties, which causes unnecessary confusion.

2.6 Further questions

At the end of the chapter, the government asked for views on whether any of the proposals would impact on anyone with protected characteristics. Many respondents were concerned that fees for subject access requests may harm the most vulnerable. As outlined above, the government is not seeking to re-introduce a nominal fee for subject access requests. The government recognises that the rights of all data subjects must be protected, including vulnerable people, and is not proceeding with the proposal to introduce a cost ceiling.

There was also some support for consolidating the UK GDPR, DPA 2018 and PECR, noting that it will lead to simplification of existing legislation. While the government recognises this might be a worthwhile project in future years, it is keen that swift progress is made now to clarify and improve areas of the current framework that are causing organisations the most difficulties.

Chapter 3: Boosting trade and reducing barriers to data flows

3.1 Summary

In chapter 3 of the consultation, the government set out that global networks of personal data flows are critical to the UK’s prosperity and modern way of life. The chapter also set out the importance of removing unnecessary barriers to cross-border data flows, including by progressing an ambitious programme of adequacy assessments.

The government intends to create an autonomous framework for international data transfers that reflects the UK’s independent approach to data protection, helps drive international commerce, trade and development, and underpins modern-day business transactions and financial institutions. The UK’s approach will be driven by outcomes for individuals and organisations, and continuing to ensure high standards of data protection will remain at the core of the future international transfers regime. Reforms will allow the UK government and businesses to take an agile approach, recognising that there are varying frameworks operating internationally that offer a high level of data protection to data subjects. Businesses and organisations will benefit from reduced burdens and clarity on how and where personal data can be transferred across borders.

A more agile approach to the international transfer regime will help domestic businesses to connect more easily with international markets, and attract investment from abroad by businesses that rightly have confidence in the responsible use of personal data within the UK. Individuals will reap the benefits of organisations, hospitals and universities being able to share personal data quickly, efficiently and responsibly for the public good. International data flows help people to stay emotionally and socially connected to friends, families and communities around the world.

3.2 Adequacy

The government recognises that organisations currently face challenges and uncertainty when transferring personal data internationally.

Risk-based approach to adequacy (question 3.2.1)

The government proposed underpinning the UK’s future approach to adequacy decisions with principles of risk assessment and proportionality.

Around half of respondents agreed with this proposal. Respondents thought that the proposal represented a pragmatic approach, with some articulating that the UK should be flexible and not prescriptive when making adequacy decisions. Many respondents made clear that an outcomes-based approach should not come at the expense of data protection standards.

The government will take forward reforms that better enable the UK to approach adequacy assessments with a focus on risk-based decision-making and outcomes, while continuing to support the UK’s commitments relating to data flows. The reformed regime will retain the same broad standard that a country needs to meet in order to be found adequate, meaning individuals’ data will continue to be well-protected by a regime that ensures high data protection standards. Where countries meet those high data protection standards, the law will recognise that the DCMS Secretary of State may also consider the desirability of facilitating international data flows when making adequacy decisions.

The government is respectful of countries’ sovereign rights, and the different cultural and legal traditions that can contribute to high standards of data protection. The reformed regime will recognise the contexts in which other countries operate, and take account of the different factors that play a part in protecting personal data.

Adequacy for groups of countries, regions and multilateral frameworks (question 3.2.2)

The government consulted on whether to make adequacy regulations for groups of countries, regions and multilateral frameworks. The majority of respondents agreed with this approach. Support was based on the greater interoperability that this approach would provide. Conversely there was some concern that this could be of limited use as adequacy decisions will still need to consider the laws of each country, as recognised in the consultation document.

The government will consider the implications of using this approach in the future, especially as it seeks to prioritise work on multilateral solutions for data flows, but does not intend to make any immediate legislative changes.

Relaxing the requirement to review adequacy regulations every 4 years (question 3.2.3)

The government also proposed investing in ongoing monitoring of adequacy regulations and relaxing the requirement to review adequacy regulations every 4 years. There were mixed views on this proposal, with some respondents believing that this would be both a more pragmatic and more effective approach, while others cited concerns that the change could present risks to data protection standards.

Adequacy regulations must be scrutinised in the face of constant developments to the legal landscapes of adequate countries. The government recognises the balance of views, and assesses that, if used appropriately, ongoing monitoring can safeguard data subjects more effectively than intermittent review points. A well-functioning, rigorous and ongoing monitoring process removes the need for a formal review. The government therefore intends to proceed with relaxing the requirement to review adequacy regulations every 4 years.

Redress requirements (question 3.2.4)

The government proposed clarifying that when assessing another country for adequacy, it is acceptable for that country to provide either administrative or judicial redress for UK data subjects, as long as the redress mechanism is effective. Around half of respondents agreed with this proposal. The most common view was that the effectiveness of redress is more important than its form.

The government intends to proceed with this proposal. Under the reformed regime, the government will not specify the form in which redress should be provided. Instead, when conducting adequacy assessments, the government will consider the effectiveness of redress mechanisms available.

3.3 Alternative transfer mechanisms

Alternative transfer mechanisms provide a route for cross-border transfers of personal data to countries that are not subject to an adequacy decision. Many organisations send data all over the world and often have complex infrastructure to support such data sharing.

Proportionality of appropriate safeguards (questions 3.3.1, 3.3.2)

To facilitate safe and effective international data sharing, the government proposed changes to reinforce the importance of proportionality when assessing risk for alternative transfer mechanisms. The government called for views on support or guidance that could help organisations assess and mitigate risks in relation to international transfers of personal data when using alternative transfer mechanisms.

The majority of respondents agreed with this proposal, noting the challenges organisations have faced in assessing the risks of international transfers using alternative transfer mechanisms. Respondents who disagreed cited concerns around a perceived risk of degrading data protection standards. Respondents showed strong support for more clarity and guidance on the requirements for alternative transfer mechanisms.

The government will take forward reforms which ensure that data exporters can act pragmatically and proportionally when using alternative transfer mechanisms, whilst maintaining a high standard of protection for data subjects.

Reverse transfers (question 3.3.3)

The government proposed to remove ‘reverse transfers’ from the scope of the international transfer regime. There were mixed views regarding this proposal. Those in support agreed that it could reduce unnecessary and disproportionate burdens on organisations without corresponding risks to data protection.

However, there were mixed views on the extent of the positive impact anticipated by respondents. Those who disagreed with the proposed reform did so on the basis of the potential risk to data subject rights.

It was also noted that this reform will be complex and may not lead to a reduction in complexity for data controllers. The government agrees with this appraisal and therefore will not pursue legislative reform.

Adaptable transfer mechanisms (questions 3.3.4, 3.3.5, 3.3.6)

The government proposed allowing organisations to create or identify their own transfer mechanism. The majority of respondents disagreed with this proposal, on the grounds that this reform would be too difficult to use and may generate uncertainty about what is required for safeguarding transfers. Some expressed concern that the proposal may risk data protection standards. Respondents who supported the proposal believed it would help organisations overcome complex and specific transfer requirements in some situations.

Given the absence of clear use cases for this proposal, the government will not pursue this approach.

A power to create alternative transfer mechanisms (questions 3.3.7, 3.3.8)

The government proposed creating a new power for the DCMS Secretary of State to formally recognise new alternative transfer mechanisms, which would allow the Secretary of State to create new UK mechanisms for transferring data overseas or recognise in UK law other international data transfer mechanisms, if they achieve the outcomes required by UK law.

There were mixed views on this proposal, with support for the opportunities for interoperability and future-proofing, but some concern about the potential risk that new transfer mechanisms may not maintain data protection standards.

The government recognises the mixed views on this issue, but assesses that this reform will help to future-proof the UK’s approach to international transfers by allowing the UK to respond rapidly to international developments. The government intends to take this reform forward and ensure that new mechanisms must meet the same high data protection standards as the other alternative transfer mechanisms.

3.4 UK certification schemes

Certification schemes are voluntary, market-driven frameworks of context-specific rules that, under the UK GDPR, can be used to demonstrate a high standard of compliance and to provide appropriate safeguards for international transfers. They are characteristically framed at the sectoral or industry level, defining data protection rules and practices covering specific products, processes and services within the context of that sector, industry or similar group.

However, certification schemes are complex measures that require significant time and resources to design, implement, maintain and demonstrate accountability for. The government considered modifications to the framework for certification schemes, to provide for a more globally interoperable, market-driven system that better supports the use of certifications as an alternative transfer mechanism.

Amending certification schemes to increase interoperability (questions 3.4.1, 3.4.2)

To facilitate compatibility with a wider range of personal data protection regimes, the government proposed two changes: firstly to allow certification to be provided for by different approaches to accountability, such as privacy management programmes; and secondly to clarify that certification bodies outside the UK can be accredited to run UK-approved international transfer schemes.

There were mixed views on both proposals. Respondents noted that the reforms could benefit UK businesses by facilitating easier personal data flows. Other respondents were less sure that these proposals would benefit them or their organisation directly.

Supporting interoperability is important for organisations. However, there remains potential within the current approach to use certifications more extensively. It is also unclear, at this time, whether the proposed reforms are the best way to deliver greater interoperability. The government will consider other approaches to meeting this goal.

3.5 Derogations

The consultation set out that the current overarching approach to the use of derogations will be maintained: they should be used only in situations where they are necessary and where neither adequacy nor other safeguards are appropriate. As outlined in this section of the consultation, technical changes may help to clarify the restrictions on using derogations.

Repetitive use of derogations (question 3.5.1)

The government proposed establishing a proportionate increase in flexibility for the use of derogations by making explicit that repetitive use of derogations is permitted. There was some opposition to this proposal, with respondents expressing concerns that it may negatively impact data subject rights. There was minimal support, with supportive respondents agreeing that it would represent a proportionate increase in flexibility for transfers. The government will not pursue this reform.

3.6 Further questions

At the end of the chapter, the government asked for views on whether any of the proposals would impact on anyone with protected characteristics. There were no specific concerns related to the impact any proposals would have on those with protected characteristics, nor were there significant issues raised by respondents in relation to the questions and topics in this chapter.

The consultation posed some open-ended questions at the end of this chapter for respondents to provide any other comments on the proposed reforms. Respondents highlighted the importance of protecting people’s personal data when these reforms come into force, particularly those with protected characteristics. The protection of people’s personal data is already at the heart of the UK’s data regime and will continue to be as the government embarks on its reform agenda.

Chapter 4: Delivering better public services

4.1 Summary

In chapter 4 of the consultation, the government highlighted the opportunities to build on the lessons learned from the COVID-19 pandemic, in relation to the power of using personal data responsibly in the public interest, and the benefits of collaboration between the public and private sectors in all parts of the United Kingdom.

There are currently some challenges to doing this effectively, including:

  • data infrastructure that is not interoperable
  • legal and cultural barriers to data sharing
  • inconsistent data capability in the workforce
  • financial disincentives that discourage investment

The government wants to create a joined-up and interoperable data ecosystem for the public sector across the whole of the UK, that will address the limitations outlined above, whilst ensuring high levels of public trust. The proposed reforms include ways to improve the delivery of government services through better use and sharing of personal data, including data sharing across the whole of the UK.

4.2 Digital Economy Act 2017

Part 5 of the Digital Economy Act 2017 allows personal data sharing to support services and positive interventions in public service delivery. To facilitate more responsive, joined-up public services, the government sought views in the consultation on extending the public service delivery powers under section 35 of the Digital Economy Act 2017 to business undertakings. This could benefit businesses in a number of ways, for example through the use of digital verification services or access to support when starting up new businesses.

Extending powers under section 35 of the Digital Economy Act 2017 (question 4.2.1)

There were mixed views on this proposal. Around a third of respondents agreed, recognising that businesses could benefit from joined-up public services targeted at them, such as identity verification services, accessing support when starting a new business, or applying for government grants and licences. Around a third of respondents disagreed, citing concerns that businesses were already well-supported by government bodies or that the proposal would allow the government to share citizens’ personal data with businesses. A similar number of respondents were neutral towards this proposal.

The government recognises the balance of views on this proposal. The government will continue to take this proposal forward; however the purpose will be to support personal data sharing within the public sector to improve public services, not to facilitate personal data sharing from the public to the private sector for other reasons. Any personal data sharing regulations made under the new provisions would be subject to further public consultation and parliamentary scrutiny.

4.3 Personal data use during the COVID-19 pandemic

During the COVID-19 pandemic, organisations across the public, private and voluntary sectors processed personal data in novel and responsible ways for the purpose of public health. The current legislation has generally not prevented such processing from taking place, but the consultation sought views on whether any aspects of the legal framework could be made clearer.

Non-public bodies delivering public tasks (questions 4.3.1, 4.3.2)

The consultation identified the under-reliance on the public task lawful ground (Article 6(1)(e) of the UK GDPR) as an area that would benefit from clarification, in particular when a non-public body is processing personal data in order to help a public body to deliver a public task or function. This might arise, for example, where a private body or a charity is asked by a public body to help it by providing information so that it can investigate a crime or deliver essential services to vulnerable people in a public health crisis. Due to a lack of clarity in the current law, the non-public body would usually rely on the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR in this situation, unless it was under a legal obligation to deliver the service or was acting as a data processor on behalf of the controller under a formal data processing arrangement.

To address uncertainty in this area, the government proposed to clarify that private organisations and individuals asked to carry out an activity at the request of a public body may rely on that body’s lawful ground for processing the personal data under Article 6(1)(e).

The majority of respondents agreed with this proposal, although respondents noted the importance of ensuring sufficient safeguards were in place to protect the rights of individuals. For example, several respondents asked if individuals would still have the right to object to the processing and, if so, to which controller their request should be directed. Some respondents cited responding to a pandemic or the prevention of crime as areas where the proposals could be applied, but others suggested the measures could be applied to any situation where a non-public body was asked by a central government department, local authority or other public body to help deliver a service.

Respondents who were opposed to the proposal said that legislative changes would not be required if public bodies entered into a formal controller-processor agreement with the non-public body and that, in the absence of such an agreement, the non-public body should continue to rely on the lawful ground of legitimate interests.

Having considered consultation responses, the government plans to introduce legislation to clarify which lawful grounds for processing are available to organisations under Article 6 of the UK GDPR when they are requested by a public body to help deliver a public task.

Processing health data in an emergency (questions 4.3.3, 4.3.4)

The consultation sought views on whether organisations outside of the healthcare sector were clear about what conditions in Schedule 1 to the DPA 2018 they could rely on when processing health data for the purposes of public health.

The majority of respondents agreed that greater clarity in the legislation would be helpful. However, a number of healthcare organisations and patients’ groups advised caution about widening the circumstances in which organisations outside the healthcare sector could process sensitive information about people’s health.

They pointed to the existing exemption in paragraph 3 of Schedule 1 to the DPA 2018, which already permits processing that is overseen by a healthcare professional or where a duty of confidence applies. This should be read in conjunction with ICO guidance which explains how a duty of confidence may arise. In addition, some respondents who were opposed to the proposal highlighted the role of Control of Patient Information Notices, which they reported gave them a clear legal basis for sharing personal data with the relevant healthcare authorities in certain situations during the pandemic.

Having considered the arguments for and against, the government does not intend to pursue this proposal.

4.4 Building trust and transparency

Public trust is vital to the delivery of better public services and outcomes for individuals. Through the reforms in the consultation, the government wants to empower public bodies to share and utilise algorithmic tools to improve efficiency and service delivery, while safeguarding the rights and interests of members of the public.

Transparency mechanism for algorithms (questions 4.4.1, 4.4.2, 4.4.3)

Increasing transparency of the use of algorithmic tools for decision-making in the public sector is critical for maintaining public trust. The government proposed introducing compulsory transparency reporting on the use of algorithms in decision-making for public sector bodies.

The majority of respondents agreed with this proposal, stating that they believed it would improve public trust. Respondents felt that increased transparency would also improve accountability, especially if the reporting included accountability mechanisms such as routes for appeal and redress. Respondents also stressed the importance of transparent information being presented in an accessible, interpretable and meaningful way to the general public. Responses included a wide range of recommendations on the content that should be included in the reporting requirement.

The recently launched Algorithmic Transparency Standard encompasses the categories recommended for transparency reporting during the consultation process. This includes providing information on:

  • how teams are using the algorithmic tool
  • why they are using the algorithmic tool
  • who owns and has responsibility for the algorithmic tool
  • the purpose of the tool
  • how the tool affects decision making
  • further information on the data used to train and deploy the tool
  • information about the risk and impact assessments conducted

The government does not intend to take forward legislative change at this time due to the early stage of this work, but remains strongly committed to algorithmic transparency and will continue to pilot and gather feedback on the standard and explore policy enforcement options in the future.

The consultation asked an open question about whether any exemptions should apply to the compulsory transparency reporting requirement. A range of views was expressed by respondents. Many stated that no exemptions should apply whatsoever, whereas others believed full or limited exemptions might be needed in areas such as national security and law enforcement.

The government will therefore continue to consider this issue in tandem with the piloting of the Algorithmic Transparency Standard, which will provide much greater information about, for example, the datasets being used, the technical specifications of the algorithms, and how ethical considerations (such as mitigating bias) are addressed. The government also recognises respondents’ views on whether exemptions should apply to the compulsory transparency reporting requirement, and will consider recommendations from the consultation response in further work in this area.

Processing in the substantial public interest (questions 4.4.4, 4.4.5, 4.4.6, 4.4.7)

As outlined in the consultation, sensitive personal data cannot be processed unless there is explicit consent from the data subject, or it is expressly permitted for purposes listed in the UK GDPR and Schedule 1 to the DPA 2018. It can be difficult to find the balance between ensuring provisions are sufficiently flexible to allow all necessary processing of sensitive data, and ensuring provisions are specific enough to give data subjects transparency and controllers certainty.

There were mixed views on whether there are situations involving the processing of sensitive data that are not adequately covered by the current list of activities in the UK GDPR and Schedule 1 to the DPA 2018. However, the majority of respondents agreed that it is difficult to distinguish processing that is in the substantial public interest from processing that is in the public interest.

To address this, the government proposed including in legislation a definition of ‘substantial public interest’ and proposed adding new situations to Schedule 1 to the DPA 2018 to permit certain activities on grounds of substantial public interest, falling within any new statutory definition.

The majority of respondents supported the introduction of a definition of substantial public interest. Respondents recommended that any definition should be neither too narrow nor too broad, as either would risk maintaining the status quo, and there was also recognition that the term could be very difficult to define.

The government has considered how to devise a definition of substantial public interest, but does not believe that a definition would add value or cover the range of activities that are required at this time. Furthermore, the government concluded that the substantial public interest test already fulfils its purpose without a definition. It already effectively compels data controllers to consider the specific benefits of their processing to the wider public and ensures controllers evaluate the necessity of their processing. Therefore the government has decided not to define substantial public interest at this time.

Around half of respondents agreed that there may be a need to add to, or amend, the list of specific situations in Schedule 1 to the DPA 2018, with some notable situations raised with a clear rationale. Respondents who disagreed believed that expanding the list in Schedule 1 to the DPA 2018 would pose too great a risk to privacy and data rights.

The government is currently giving further thought to these proposals, including evidence presented by religious organisations on the difficulty of identifying a relevant condition in Schedule 1 to allow them to carry out some routine administrative duties. The government is also considering proposals from sporting bodies that would permit them to fairly assess the eligibility of athletes for restricted category events. There was also a suggestion from one respondent that exemptions which currently permit the processing of race and ethnicity data to improve diversity at senior levels within organisations could be extended to other types of data relating to under-represented groups.

Clarifying rules on biometric data in policing (question 4.4.8)

Technologies such as DNA and fingerprint analysis, and, increasingly, facial image recognition, are important public safety tools for the police. The public rightly expects the police to use these tools within a framework that ensures use is fair, transparent and proportionate.

There is an existing legal framework for the police to use biometric technologies to protect the public, which includes data protection legislation. Taken as a whole, the framework is largely principles-based, which means it is substantially ‘future-proofed’ in its applicability to new technologies. However, it can be complex and difficult for both the police and the public to understand how different pieces of legislation interact to respect privacy, human and equality rights.

The majority of respondents supported improving the clarity of the existing legal framework. A minority felt that certain technologies, such as facial recognition, required specific legislation. However, others agreed that a principles-based approach remains appropriate bearing in mind the pace of technological change in this area, with some highlighting the risk that specific legislation could hinder law enforcement’s adoption of new technology to protect the public. Several respondents also said that there might need to be specific guidance on particular technologies or use cases.

Evidence from the consultation also suggested that implementation challenges are not limited to the use of biometric capabilities, but are common to the police’s adoption of all new major data-driven operational technologies. Reflecting the feedback from the consultation and wider stakeholder engagement, the government recognises the benefits for the police and public from greater clarity on how the police can and should develop these technologies. Transparency and public communication are critical in building public understanding, support and trust when these technologies are being developed and used.

The government will work with policing authorities to promote high standards and best practice in the responsible and effective use of new technologies, including supporting the development of policing-led guidance such as new codes of conduct, as set out in section 4.5 (public safety and national security).

A small number of respondents supported greater consistency between the rules around different types of biometrics. The government is continuing to work with policing authorities on more closely aligning the rules on the retention of custody images with those for DNA and fingerprints, with an expectation that similar rules would apply to any other biometrics in future.

4.5 Public safety and national security

The effective sharing of personal data for law enforcement and national security purposes is vital to the work of our law enforcement bodies and UK intelligence services, and is in the interests of public safety. To drive consistency across the UK GDPR, and Part 3 (Law enforcement processing) and Part 4 (Intelligence services processing) of the Data Protection Act 2018, the government proposed exploring whether it is possible to align key terms that are used across these different data processing frameworks.

The majority of respondents agreed with this proposal, stating that it would provide greater clarity and create consistency across the regimes. Respondents also highlighted that it is important to recognise where and how the two regimes deliberately impose differing obligations on those processing under them. The government will continue to take this proposal forward. For example, the UK GDPR contains a definition of consent, but there is no equivalent definition under Part 3 of the Data Protection Act 2018; the government is therefore considering introducing this to provide greater clarity.

Another example where the government could create consistency across regimes is in regard to codes of conduct. In the UK GDPR, these can be produced by representative bodies (such as trade associations) to clarify the application of data protection laws in particular sectors, which are then approved by the ICO. There is currently no equivalent power under Part 3 of the Data Protection Act 2018. The government is considering allowing the law enforcement sectors to produce such codes in the future, so that they have the same ability as other sectors. In line with the feedback received from respondents, the government will ensure that any changes are clear, to prevent further confusion.

4.6 Further questions

At the end of the chapter, the government asked for views on whether any of the proposals would impact on anyone with protected characteristics. There were no specific concerns related to the impact any proposals would have on those with protected characteristics, nor were there significant issues raised by respondents in relation to the questions and topics in this chapter.

The consultation posed some open-ended questions at the end of this chapter for respondents to provide any other comments on the proposed reforms. Respondents highlighted the importance of protecting people’s personal data when these reforms come into force, particularly those with protected characteristics. The protection of people’s personal data is already at the heart of the UK’s data regime and will continue to be so as the government embarks on its reform agenda.

Chapter 5: Reform of the Information Commissioner’s Office

5.1 Summary

In chapter 5 of the consultation, the government explored proposals to reform the Information Commissioner’s Office. The ICO plays a critical role in an increasingly data-driven world; a modernising reform agenda is an investment in its future success and will sustain its world-leading reputation. The proposed reforms sought to better equip the ICO in performing its function as an agile and forward-looking regulator, by ensuring strong leadership and governance, and creating a clearer mandate for a risk-based and proactive approach to its regulatory activities.

Specifically, the government set out how it intends to improve the legislative framework that underpins the ICO by:

  • setting new and improved objectives and a clearer strategic vision for the regulator
  • changing its governance model, improving accountability mechanisms, and extending its investigatory powers
  • refocusing its statutory commitments away from handling a high volume of low-level complaints, and towards addressing the most serious threats to public trust and inappropriate barriers to responsible personal data use

This will create a stronger regulator for a data-driven age. It will be more clearly required to take a risk-based and proactive approach, tackling the highest-risk data processing activities while helping organisations comply with the law from the outset, rather than coming in later to tell them what they have done wrong. The ICO’s activity and objectives need to be more transparent, so that Parliament and the public can more easily hold the ICO to account for whether it is meeting its responsibilities.

5.2 Strategy, objectives and duties

Currently, the UK GDPR does not provide the ICO with a clear framework of strategic objectives and duties against which to prioritise its activities and resources, evaluate its performance and be held accountable by its stakeholders. Instead, the ICO is obliged to fulfil a long list of tasks and functions, as set out in Article 57 of the UK GDPR, but without a strategic framework to guide its work.

The government proposed introducing a new, statutory framework that sets out the strategic objectives and duties that the ICO must aim to fulfil when exercising its data protection functions. A clearer set of statutory strategic objectives and duties for the ICO will offer greater clarity and stability to the ICO’s role and purpose, improve transparency, and strengthen accountability in line with best practice of other regulators.

A new statutory framework for the ICO (question 5.2.1)

Almost half of respondents agreed that the ICO would benefit from a new statutory framework for its objectives and duties. Respondents indicated that the reforms would provide additional clarity for the ICO, businesses and the general public on the ICO’s powers, duties and obligations. Respondents also noted that a new statutory framework would provide a stronger basis for the ICO to focus on transparent objectives, through which it can be held accountable via Parliament. There was some disagreement with the proposal, with respondents expressing concern around the potential risk, perceived or real, to the ICO’s independence, and arguing that the current framework is sufficiently clear.

The government recognises these views, particularly noting respondents’ concerns around independence, and has carefully considered the need to maintain the independence of the ICO in developing these proposals. The government plans to proceed with the proposal to introduce a new statutory framework of objectives and duties, but with some changes to the original proposals which are set out below.

To ensure clarity on how the objectives and duties are to operate alongside the ICO’s existing functions and tasks, the new framework will be designed in a way that ensures that the ICO will be able to uphold data subject rights and encourage trustworthy and responsible personal data use, while also having regard to growth and innovation, competition, and public safety.

Overarching objective (questions 5.2.2, 5.2.3)

The government proposed to introduce a new overarching objective for the ICO with 2 components that relate to upholding data rights and encouraging trustworthy and responsible personal data use. The majority of respondents supported this proposal.

Some of those who agreed highlighted that the overarching objective aligns with the ICO’s current mission, and enshrining it in law would be a welcome step, allowing the ICO to ensure that, on a statutory basis, individuals’ personal data is protected, whilst enabling organisations to use personal data effectively. Where respondents disagreed with the proposal, some argued that the ICO’s primary duty should be to uphold data rights only. Additional clarity from the government on what a new principal objective or duty would mean in practice was also encouraged.

The government plans to proceed with the proposal to introduce a new principal objective for the ICO. The government will also seek to ensure that the ICO takes a proportionate, risk-based approach to its regulatory activities.

Sitting below the principal objective, the government will introduce specific statutory duties that acknowledge the breadth of the ICO’s remit and the impact it has on other areas. These duties will strengthen the ICO’s existing obligations and ensure it is equipped to factor in interactions with other areas.

Growth, innovation and competition duties (questions 5.2.4, 5.2.5)

The government proposed to introduce new statutory duties to empower the ICO to take greater account of impacts in other domains as it supervises and enforces the UK’s data protection regime.

The majority of respondents disagreed with the proposal to introduce a growth and innovation duty. Respondents raised concerns that the duty would pose a conflict of interest for the ICO between its primary objective of upholding data rights, and having regard to growth and innovation. Respondents also felt that the ICO should not concern itself with impacts on growth and innovation, and should instead focus only on data protection priorities.

Similarly, the majority of respondents disagreed with the proposal to introduce a competition duty. As with the growth and innovation duty, concerns were raised that a competition duty would create a conflict of interest for the ICO regarding its primary objective.

However, in both cases, respondents who agreed with the proposals noted that the ICO already incorporates regard for economic growth, innovation and competition when discharging its functions, for example via the ICO’s existing collaboration with the Competition and Markets Authority, and that the proposals would formalise existing arrangements to ensure the ICO has sectoral intelligence, horizon scanning and effective regulatory sandbox activity. Those who agreed with the proposals also highlighted the importance of the overarching objective taking precedence if the ICO faces a conflict between its new duties and its data protection priorities.

The government sees the ICO’s remit as increasingly important for competition, innovation and economic growth, and therefore intends to ensure that the regulator is required to have regard to competition, growth and innovation.

Public safety duty (question 5.2.10)

There were mixed views regarding the proposal to introduce specific language recognising the need for the ICO to have regard to public safety when exercising its functions. Respondents suggested that the language used in a public safety duty would need to be clear on how the duty would be used and what impact it would have on the rights and freedoms of individuals.

The government will be introducing a duty to ensure the ICO also has regard to public safety.

Statement of strategic priorities (question 5.2.11)

The government proposed to introduce a power for the DCMS Secretary of State to prepare a statement of strategic priorities (SSP) for the ICO to have regard to when discharging its data protection functions. The majority of respondents disagreed with this proposal. Concerns were raised that this measure would pose a risk to the ICO’s independence and suggestions were made such as including a role for Parliament or limiting the parameters of the SSP so that it excludes matters relating to the public sector.

The government plans to proceed with the power to introduce an SSP as it will be a transparent way for the government to set out its priorities on data policy as well as providing the ICO with useful context. Given the government’s commitment to ensuring the ICO’s independence, the SSP will sit below the ICO’s primary objective and duties under the UK GDPR and the DPA 2018. While the ICO will be required to respond to the priorities contained in the SSP, the ICO will not be legally bound to act in accordance with the statement. Further, the SSP will be subject to parliamentary approval before it is designated.

Collaboration and enhanced information sharing gateways (questions 5.2.6, 5.2.7, 5.2.8, 5.2.9)

The majority of respondents agreed with the proposal to introduce a duty for the ICO to cooperate with and consult other regulators, with respondents highlighting that the proposal would help joined-up working and improve cooperation between regulators. Respondents who agreed with the proposal also highlighted that the duty would ensure working relationships with other regulators are on a statutory footing. Respondents who disagreed argued that the duty would complicate the regulatory landscape.

The government will amend the proposal to introduce a duty for the ICO to cooperate and consult with other regulators. The ICO already has mechanisms in place which allow for close cooperation with other regulators, primarily through Memoranda of Understanding. As a result, the government plans to introduce a requirement on the ICO to consult with relevant regulators and any other relevant bodies when exercising its duties to have regard to growth, innovation and competition.

There were mixed views on the proposal to introduce an enhanced information sharing gateway for the ICO. Some suggested that an improved information sharing gateway would help regulators adopt a more consistent regulatory approach. Others highlighted that the ICO already has mechanisms in place to do this and that the case for an enhanced information sharing gateway was unclear.

The government will not continue with the proposal to introduce an enhanced information sharing gateway as we believe the ICO’s existing information-sharing powers are sufficient.

The ICO’s international role (questions 5.2.12, 5.2.13)

As part of the new statutory framework, the government proposed the introduction of an international objective for the ICO to consider the government’s wider international priorities when conducting its own international activities. The majority of respondents disagreed with this proposal, with risks to the ICO’s independence consistently raised as the primary concern. Some respondents also cited knock-on risks to the ICO’s international reputation and the UK’s adequacy status.

The government will not continue with the proposal to include an international objective for the ICO. Instead, the government may state its international priorities on data policy in the SSP. Setting government priorities out in a public statement will provide greater clarity and context for the ICO and more effectively enable the ICO to prioritise its international work.

Around half of respondents supported the proposal to require the ICO to deliver a more transparent and structured international strategy, with some respondents arguing the proposal would enhance the ICO’s accountability. Respondents who disagreed with the proposal cited independence concerns.

The government will not continue with the proposal for the ICO to deliver a more transparent and structured international strategy. Instead, the government will pursue proposals for enhanced strategy setting and reporting for the breadth of the ICO’s work, as set out in section 5.4 (accountability and transparency), which outlines requirements for the ICO to publish the key strategies that guide its work. This approach will ensure improved strategy-setting across both the ICO’s international and domestic work.

5.3 Governance model and leadership

The ICO is currently structured as a ‘corporation sole’: an individual person who represents an official position as a single legal entity. The powers and responsibilities of the ICO lie solely with the Information Commissioner, without a chair or an independent board created by statute. This model can lead to a lack of the diversity, challenge and scrutiny which are critical to robust governance and decision making. Most regulators in the UK function as a ‘body corporate’, including a separate independent board created by statute, which provides direction to - and scrutiny of - the executive.

New governance model (question 5.3.1)

The government proposed moving away from the corporation sole structure and introducing a statutory board with a chair and chief executive. This change will bring the ICO in line with other UK regulators such as Ofcom and the FCA.

There were mixed views on the proposed change in governance model for the regulator. While some of those in support acknowledged that a change in the governance model represented good practice, those who opposed the proposal cited concerns over the ICO’s independence or that they saw no issue with the current model.

The government recognises the balance of views on this proposal and believes that having powers and responsibilities spread across a board, rather than resting with one individual, should ensure greater independence and integrity. The government does not see any erosion of the ICO’s independence in modernising its governance model. While the current model has been in place since the regulator’s establishment in 1984, the ICO has grown significantly in size and importance. The government wants to ensure that the ICO has greater diversity in its leadership and that its governance is in line with best practice, and therefore intends to proceed with this proposal.

Appointments process (questions 5.3.2, 5.3.3, 5.3.4)

The consultation outlined the proposed appointment processes for the statutory roles of the new governance model. This was to appoint the chair via the same process as that currently set out for the appointment of the Information Commissioner in the Data Protection Act 2018. This means that the chair would be appointed by Her Majesty by Letters Patent, following a recommendation from the government based on merit, after a fair and open competition. The government also proposed that the individual non-executive members of the ICO’s future board and its chief executive officer role would be appointed by the DCMS Secretary of State via the public appointment process.

There were mixed views on the proposed changes to the appointment process for the role of chair, the proposed appointment process for the non-executive members of the ICO’s board, and the proposed public appointment process for the chief executive role. Respondents believed that a fair and rigorous process was important, but concerns were raised over the impact the proposed appointment processes would have on the independence of the regulator, particularly for the role of chief executive.

Following the views outlined above, the government intends to mirror the current Information Commissioner appointment process (by Her Majesty by Letters Patent) for the new chair role so that, in that respect, there is consistency with the existing legislation. The government believes that the public appointment process for the non-executive members of the board has sufficient safeguards to protect the ICO’s independence, as ministers are accountable to Parliament for public appointments and the appointment processes are governed by the Governance Code for Public Appointments, regulated by the Commissioner for Public Appointments. This balances the importance of the ICO’s independence with appropriate oversight by the government and Parliament. The government therefore, in line with other regulatory peers, intends to appoint the non-executive members via the public appointment process.

Acknowledging that chief executive roles are more commonly appointed by the board, the government will not appoint the role of chief executive via the public appointment process as proposed in the consultation. Rather, this role will be appointed by the ICO’s board in consultation with the DCMS Secretary of State.

Information Commissioner’s salary (question 5.3.5)

The current legislation requires parliamentary approval to amend the Information Commissioner’s salary. Given the government’s planned changes to the ICO’s governance model, the government proposed to remove this requirement. There were mixed views on this proposal, with around a third of respondents being neutral. Respondents in favour felt this was in line with other regulators, whereas those who opposed felt that parliamentary scrutiny was helpful, and that removing this requirement would give government ministers too much control to increase or decrease the salary in direct response to the individual’s performance.

The government does not see the ability to amend the salary of any of the new statutory roles in the proposed governance model as a way to directly incentivise certain behaviours, but rather as a way to give the government the flexibility to ensure it can provide the remuneration necessary to attract and retain the best possible candidates. Given the change in governance model, the removal of this requirement would bring the ICO in line with other regulators, which do not require salary approval from the House of Commons. Public corporation salaries over £150,000 are already governed by HM Treasury’s Guidance for approval of senior pay, which the government believes provides sufficient safeguards to ensure value for money. The government therefore intends to proceed with the removal of this requirement.

Name of the Information Commissioner’s Office (not included in the consultation)

Given the change in governance model, the name of the regulator being the ‘Information Commissioner’s Office’ may no longer accurately reflect the organisation, as it will not be the office of one individual.

To ensure that the name of the body is not misleading, the government is considering options for a new name for the regulator.

5.4 Accountability and transparency

As set out earlier in this chapter, there is a lack of clarity around the ICO’s strategic priorities in the current legislative framework. This means that there are no clear objectives for the ICO to measure its performance against and report on. The government therefore set out proposals to introduce new reporting requirements for the ICO, to aid external scrutiny of the ICO’s performance.

New reporting requirements (questions 5.4.1, 5.4.2, 5.4.3, 5.4.4)

The majority of respondents agreed with the proposals to strengthen the ICO’s reporting requirements. Respondents felt this would increase the accountability of the ICO to the organisations it regulates, as well as to Parliament and the public. Respondents who disagreed believed that the ICO already had sufficient reporting requirements in place and that this could cause unnecessary bureaucracy.

The vast majority of respondents agreed with the proposal to require the ICO to publish the key strategies and processes that guide its work. Those in favour felt that this would increase the transparency of the ICO and align it with the best practice of other regulators. Respondents who disagreed noted that the ICO already publishes key strategies and argued that further publishing requirements for the regulator would add little value.

The majority of respondents supported the proposal to introduce a requirement for the ICO to develop and publish comprehensive key performance indicators. Respondents felt this proposal would allow for the performance of the ICO to be assessed publicly and provide evidence of how the ICO is meeting its statutory obligations. Respondents who disagreed believed the current reporting mechanisms in place are sufficient.

The government plans to proceed with introducing legislative requirements for the ICO to report on its approach and performance. In particular, the ICO will be required to publish:

  1. A strategy setting out how it will discharge its functions and deliver against its objectives. The ICO will be required to report annually on performance against its strategy.

  2. Key Performance Indicators (KPIs). The ICO will be required to report at least annually against its KPIs.

  3. Its approach to delivering its new objectives and duties framework. The ICO will be required to report annually on how it has discharged its functions in line with the new framework (discussed in section 5.2 (strategy, objectives and duties)).

  4. A response to the government’s Statement of Strategic Priorities explaining what it proposes to do as a consequence of this statement. The ICO will also be required to report annually on activities taken, as set out in its response to the statement (discussed in section 5.2 (strategy, objectives and duties)).

  5. Its approach to enhanced consultation and setting up expert panels with regard to codes of practice and statutory guidance. The composition of the panel and the rationale for the composition should be published in advance of the panel convening. A summary of the panel’s engagement should be published, as well as a justification for why the ICO has or has not adopted recommendations from the panel. Impact assessments produced in relation to codes of practice and statutory guidance should also be published (discussed in section 5.5 (codes of practice and guidance)).

  6. Its approach to exercising its discretion concerning complaints handling (discussed in section 5.6 (complaints)).

  7. Statutory guidance under section 160 of the Data Protection Act 2018, which will be extended to include the ICO’s new enforcement powers.

The ICO will also be required to report annually on its approach to enforcement and use of its powers, including the number of investigations undertaken and their nature, the enforcement powers used, the timeframes for all completed investigations and the outcome of the investigation process.

Independent review (questions 5.4.6, 5.4.7)

In certain circumstances, the government may wish to assess the ICO’s performance independently. Currently, the government can conduct reviews of the ICO with the agreement of the Information Commissioner but there is nothing in legislation that permits a fully independent review by a third party. Therefore, the government proposed to empower the DCMS Secretary of State to initiate an independent review of the ICO’s activities and performance.

There were mixed views on the proposal. Respondents who agreed highlighted the importance of the ICO being subject to scrutiny, while those who disagreed cited concerns around the impact on the ICO’s independence.

The government recognises the balance of views on this issue and the importance of proper accountability mechanisms for the ICO. The government plans not to proceed with the proposal to empower the Secretary of State to initiate an independent review, and will continue to utilise the current non-legislative mechanisms available. The process for this follows the Cabinet Office guidance on reviews of non-departmental public bodies.

5.5 Codes of practice and guidance

The DPA 2018 requires the ICO to prepare codes of practice on 4 specified data processing activities, in order to provide practical guidance on compliance and outline best practice for organisations. The DPA 2018 requires the Information Commissioner to consult the DCMS Secretary of State, and any other individuals and organisations considered appropriate by the Commissioner, before preparing or amending 3 of the codes.

The ICO is also required by law to publish statutory guidance on various areas, for example, its regulatory action, and is required to consult the Secretary of State and other appropriate persons when doing so. In addition to codes of practice and other statutory guidance, under its general functions, the Information Commissioner has powers to develop and publish non-statutory guidance on processing activities that relate to data protection.

The ICO produces codes of practice and guidance on complex and technical areas. For smaller organisations which do not have large compliance divisions, it can be particularly challenging to understand and comply with such codes of practice and guidance. Furthermore, codes of practice are required to be taken into account by the courts, and non-statutory guidance is a reliable indicator of what the Information Commissioner considers to be good practice and may also be taken into account by the courts. These documents are authoritative sources of information for regulated persons in practice. As such, it is crucial that the ICO’s codes of practice and guidance are accessible and enable regulated persons to comply with the legislation efficiently and easily.

Requirement to carry out impact assessments (question 5.5.1)

The government proposed creating a statutory requirement for the ICO to undertake and publish impact assessments when developing codes of practice and guidance on complex or novel issues. Whilst the ICO already carries out impact assessments for new codes of practice, this is only done as best practice and without statutory underpinning.

The majority of respondents supported the proposal to require the ICO to undertake impact assessments when developing codes of practice and complex or novel guidance. Respondents acknowledged that this would increase the accountability of the ICO whilst serving as an example of best practice.

To ensure the ICO’s codes of practice and significant guidance are effective and useful, the government will take forward the proposal to require the ICO to carry out impact assessments. This will apply to all codes of practice and statutory guidance unless exempt. Formalising an approach to impact assessments will ensure consistency in how the ICO considers the economic impacts and the potential cost to business when developing its products.

Requirement to set up expert panels (question 5.5.2)

The government proposed creating a statutory requirement for the ICO to set up a panel of experts to review a code of practice or guidance on complex or novel issues during its development. This would build on existing best practice, for example, the expert panel set up by the ICO to support the age-appropriate design code.

Almost half of respondents agreed with the proposal to require the ICO to set up expert panels, with respondents arguing that the establishment of expert panels would help ensure a full diversity of views was considered. There was some opposition to the proposal, with respondents raising concerns over membership of the expert panels and the need for transparency in the governance process.

To ensure the ICO’s codes of practice and statutory guidance are effective and useful, the government will take forward the proposal to require the ICO to set up expert panels. This will apply to all codes of practice and statutory guidance unless exempt. As the ICO is a cross-sector regulator, a broad and transparent consultation process with an expert panel will improve the ICO’s understanding of how legislation may apply to different sectors and data use cases. The ICO will be required to publish a rationale for the composition of the panel, a summary of its findings and a response explaining how, and to what extent, it has reflected these findings in the code or statutory guidance.

Approval of codes of practice and complex or novel guidance (question 5.5.3)

The government proposed that the requirements to carry out impact assessments and set up expert panels would be accompanied by a power for the DCMS Secretary of State to approve codes of practice and complex or novel guidance.

The majority of respondents disagreed with this proposal. Respondents mainly highlighted concerns about the risk to the ICO’s independence.

The government believes that the new requirements on the ICO to carry out impact assessments and to set up expert panels will ensure that codes and guidance procedures are more robust. However, the government also believes it is important for democratic accountability that such guidance is approved by the DCMS Secretary of State, as a final safeguard for this process, before the guidance is laid in Parliament.

The government therefore intends to proceed with introducing a process for the Secretary of State to approve ICO codes of practice and statutory guidance, unless exempt (for example, guidance on ICO internal processes). For transparency, the Secretary of State will be required to publish their rationale for approving or not approving a statutory code or statutory guidance produced by the ICO.

Applying these proposals more broadly (questions 5.5.4, 5.5.5)

The government also asked whether these proposals should apply to a broader set of the ICO’s regulatory products, and whether the ICO should be required to undertake and publish an impact assessment on each and every guidance product.

There were mixed views on whether to extend the proposals beyond complex or novel guidance and require an impact assessment on every guidance product. Respondents highlighted the burden that overextending the requirements could place on the ICO, as well as the impact on its independence. The government will not pursue extending the requirements, and will instead limit the scope to all statutory codes of practice and statutory guidance unless exempt.

5.6 Complaints

The current legislation forces the ICO to allocate a significant amount of its resources to handling data protection complaints, yet many such complaints could be resolved more effectively between the complainant and the relevant data controller or processor, prior to intervention by the ICO.

Under the UK GDPR and the DPA 2018, there is currently no threshold to make a complaint to the ICO. Internationally, this contrasts with other regimes such as the New Zealand Privacy Act (2020) which, whilst enshrining the right of data subjects to complain to the Commissioner, also provides guidelines outlining why the Commissioner may decide not to investigate a given complaint, including if the complainant has not made reasonable efforts to resolve the complaint directly with the data controller first.

Reform of the complaints framework (questions 5.6.1, 5.6.2, 5.6.3, 5.6.4)

The government set out proposals to create a more efficient and effective model that would require a complainant to attempt to resolve their complaint directly with the relevant data controller before lodging a complaint with the ICO, alongside a requirement on data controllers to have a simple and transparent complaints-handling process in place to deal with data subject complaints.

To further enable the ICO to allocate resources efficiently and effectively against the highest-risk data processing activity, and to provide clarity for data subjects and data controllers, the government also explored whether to introduce criteria by which the ICO can decide not to investigate a given complaint.

Almost half of respondents agreed that the ICO would benefit from a more proportionate regulatory approach to data protection complaints. Respondents highlighted some concern over the potential risk of undermining data subject rights in this area, in particular the need to maintain a clear pathway for data subjects to complain to the ICO or seek redress. At the same time, many respondents agreed that a more proportionate approach to data protection complaints would likely result in better outcomes and clearer expectations for all parties.

The majority of respondents supported the proposal to require complainants to reach out to the data controller before contacting the ICO. Respondents felt this would lead to more dialogue between data subjects and data controllers, and would reduce the volume of ‘premature’ complaints going to the ICO. Many respondents noted that this proposal also reflects current best practice, since the ICO will often encourage complainants to talk through their issue with the data controller prior to any further regulatory action.

Around half of respondents agreed with the proposal to set out criteria in legislation by which the ICO can decide not to investigate a complaint. Concerns were raised that the criteria should be clear and not too rigid or overly prescriptive on the ICO.

The vast majority of respondents agreed with introducing a requirement for data controllers to have a simple and transparent complaints-handling process to deal with data subjects’ complaints, although some respondents stressed that any such requirement must be proportionate and should avoid being burdensome on organisations.

The government intends to proceed with these proposals. The ICO will have the ability to use its discretion to decide when and how to investigate complaints. This will include clear discretion in legislation not to investigate certain types of data protection complaint, including vexatious complaints, and complaints where the complainant has not first attempted to resolve the issue with the relevant data controller. This will empower the ICO to exercise its discretion with confidence. In turn, data controllers will be required to consider and respond to data protection complaints lodged with them.

The government recognises the concerns raised alongside the broader support, and in proceeding with these proposals will ensure that data subject rights are properly safeguarded. The government will retain a clear pathway for data subjects to complain to the ICO if the issue cannot be resolved with the controller.

This new regulatory model will empower the ICO to take a more agile approach to complaints, in turn allowing its resources to be used in a more risk-based way.

5.7 Enforcement powers

The ICO is responsible for monitoring and enforcing the UK’s data protection regime. The ICO should be a strong, effective regulator that is equipped with the powers it needs to investigate compliance with the legislation and take appropriate action, where necessary, when organisations and individuals undertake unlawful personal data processing.

The government explored 3 areas where the ICO’s powers may need to be extended in limited circumstances, which focus on enabling the ICO to carry out more effective and efficient enforcement activity. These were:

  1. a power for the ICO to commission technical reports

  2. a power to compel witnesses to attend and answer questions at interview

  3. amending the statutory deadline for the ICO to issue a penalty following a notice of intent

The majority of respondents agreed that the current enforcement provisions are broadly fit for purpose, and that the ICO has the tools appropriate to both promote compliance and to impose robust, proportionate and dissuasive sanctions where necessary. Of those who did not agree, many felt that the ICO was too slow in taking enforcement action.

Power to commission technical reports (questions 5.7.2, 5.7.3, 5.7.4)

The vast majority of respondents supported the proposal to give the ICO the power to commission technical reports. Respondents felt that these powers would lead to better-informed ICO investigations. At the same time, respondents were keen to stress that the use of any new powers should be proportionate, combined with appropriate safeguards and should not adversely affect the timeliness of investigations.

The government plans to proceed with the power for the ICO to commission technical reports. While the ICO will be granted discretion to issue a technical report notice when it considers its use fair and reasonable, in making such decisions the ICO will be required to have regard to using alternative investigatory tools. The ICO will also need to take into consideration the relevant knowledge and expertise available to the controller or processor and the impact of the cost of producing the report.

A power to compel witnesses to attend and answer questions at interview (questions 5.7.5, 5.7.6)

The majority of respondents supported the proposal for a power to compel witnesses to attend an interview and answer questions.

The government plans to proceed with the power for the ICO to compel witnesses to attend an interview and to compel the witness to answer questions. While there was recognition from many respondents that other regulators (such as the Competition and Markets Authority and Financial Conduct Authority) had similar powers, some respondents wanted reassurance that appropriate safeguards would prevent this power from being misused by the ICO.

The ICO’s new power will not be established in a way which interferes with individuals’ rights not to self-incriminate, rights to legal professional privilege, or the procedural mechanisms which ensure that investigations and witness interviews are conducted in a way that is proportionate and fair.

Amend the statutory deadline for the ICO to issue a penalty following a notice of intent (questions 5.7.7, 5.7.8)

The consultation tested 2 options to grant the ICO more flexibility, where needed, on the time available to issue a penalty notice following a notice of intent: firstly, to extend the current statutory 6-month deadline to 12 months, and secondly, to introduce a provision to permit the ICO additional time beyond the 6-month deadline under certain circumstances. The majority of respondents supported both options. Respondents who supported these measures felt they would give the ICO greater flexibility, whereas respondents who opposed them were concerned about the regulator acting too slowly.

The government plans to proceed with the second of these options, as this will provide the ICO with greater flexibility in the course of conducting complex investigations but does not extend the deadline across the board.

Enhancing the ICO’s transparency regarding investigations (question 5.7.9)

The consultation proposed greater transparency mechanisms by placing a requirement on the ICO to set out anticipated timelines for the phases of an investigation to the relevant data controller at the beginning of an investigation.

The majority of respondents supported this proposal. Respondents felt that ICO investigations can sometimes be too opaque, and that this measure would help to tackle that. Those who disagreed argued that this would be an unnecessary process for the regulator.

The government plans to proceed with this proposal in order to provide greater clarity to data controllers once an ICO investigation is underway.

5.8 Biometrics Commissioner and Surveillance Camera Commissioner

The consultation outlined the need to simplify the oversight framework for police use of biometrics, and police and local authority overt use of surveillance cameras. The regulatory landscape is crowded and confusing, which can inhibit innovation and public confidence. There are overlaps in the roles of the Biometrics Commissioner, the Surveillance Camera Commissioner and the ICO. This can be particularly confusing where biometric and surveillance technologies converge, for example the use of live facial recognition.

The government recently simplified these arrangements by appointing 1 person to take on what were previously 2 part-time roles: the Biometrics Commissioner and the Surveillance Camera Commissioner. The consultation set out the potential for further simplification by absorbing the functions of those commissioners into the ICO, with the aim of creating a single route for advice, guidance and redress for data controllers and the public.

The majority of respondents were supportive of simplification. Of those who agreed with the proposal, the key reason was that the current oversight landscape is confusing. For example, some respondents highlighted the potential for greater efficiency in dealing with a single body as well as removing areas of overlap, such as similar guidance produced by the ICO and the Surveillance Camera Commissioner.

Some respondents disagreed with the proposal to absorb the functions of the Biometrics and Surveillance Camera Commissioners into the ICO. Some said that the ICO might deprioritise oversight, given the breadth of its regulatory scope. Others argued that it would be inappropriate to transfer to the ICO, as a regulator, the quasi-judicial functions of the Biometrics Commissioner, who reviews applications from the police to retain biometrics in very limited circumstances where people have not been convicted.

In light of this feedback and wider engagement, including with the current Biometrics and Surveillance Camera Commissioner and law enforcement partners, the government will take forward the proposal to simplify the oversight framework in this area.

The Biometrics Commissioner’s casework functions provide important independent oversight, particularly in relation to national security. Reflecting the feedback from the consultation, the government has decided not to transfer these functions to the ICO. Instead, we will consider transferring these functions to the Investigatory Powers Commissioner, acknowledging the relevant expertise they can provide in relation to national security.

Independent oversight of the use of surveillance technologies is also important to maintain public trust. The Surveillance Camera Commissioner provides independent oversight of police and local authority use, while the ICO has enforcement powers and oversees the use of these technologies by all users. This duplication was highlighted by respondents in the consultation, particularly in relation to guidance issued by both the ICO and the Surveillance Camera Commissioner. The government thinks that the police need clear and consistent guidance to best protect the public and maintain their trust, and therefore plans to simplify oversight by removing this duplication.

The government is considering whether the ICO or other existing bodies could carry out some of the Biometrics and Surveillance Camera Commissioners’ ancillary activities, such as the third-party certification scheme for surveillance camera operators. Bodies such as Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services and the College of Policing could also play a larger role.

5.9 Further questions

At the end of the chapter, the government asked for views on whether any of the proposals would have an impact on anyone with protected characteristics. No specific concerns were raised about the impact of any proposal on those with protected characteristics, nor did respondents raise significant issues in relation to the questions and topics in this chapter.

The consultation also posed some open-ended questions at the end of this chapter for respondents to provide any other comments on the proposed reforms. Respondents highlighted the importance of protecting people’s personal data when these reforms come into force, particularly those with protected characteristics. The protection of people’s personal data is already at the heart of the UK’s data regime and will continue to be so as the government embarks on its reform agenda.

Questions on analysis of expected impact

The consultation was published alongside a note that outlined the analytical approach to measuring the expected impact of the original policy proposals. In this note, we set out our methodologies, some initial results, and the areas where we believed evidence was lacking and could be gathered at the consultation stage to inform the final impact assessment.

Respondents were asked a series of questions about the compliance activities they undertake under the current regime, how much they estimate these activities currently cost them, and how they believe these costs would change if the reforms were implemented. These responses have been helpful in providing further insight into UK business compliance activities and strengthening the evidence base behind our modelling.

The consultation responses indicate that the stakeholders who responded to these questions are largely content with the analysis so far, including the methodological approach and the sources of costs and benefits of these reforms. Responses also indicated that the impacts might differ depending on the size of the firms to which the reforms apply. This has allowed us to strengthen our modelling and analytical methodology by conducting a sensitivity analysis on the results and by taking business size into account in our modelling. In this way we can account for the considerable diversity between affected firms and reflect the inherent uncertainty about impacts that respondents indicated to us. We will ensure these refinements are incorporated in the final impact assessment.

Over half of all respondents to these questions carry out all of the compliance activities listed in our analysis. The vast majority of these respondents highlighted that they spend time establishing a legal basis for processing personal data or acquiring consent, and that they currently spend time responding to subject access requests, preparing data protection impact assessments and keeping records. These figures highlight the scale of impact that changes to these processes will have on UK businesses.

Responses on the current cost of these activities for UK businesses varied; the consensus was that these costs are harder to calculate for smaller businesses and sole traders, where compliance often takes up only a proportion of a full-time equivalent (FTE) worker’s time, whilst larger firms face larger costs that are easier to measure. A number of firms also highlighted that although some compliance activities are costly in time, they are beneficial overall in creating a framework and can lead to cost savings in the long run.

Respondents had a mixture of opinions on how the proposed reforms may affect the compliance costs they currently face. Many respondents highlighted the potential benefits the reforms would bring in terms of the reduction in time spent processing personal data and responding to subject access requests. However, the majority of respondents were also concerned that, whilst there may be benefits, there would also be time and cost involved in familiarising themselves with the new framework. There were further concerns about the impacts on consumer confidence in privacy schemes and the impact these reforms may have on international relations. Of all the activities firms are involved in when processing personal data, the majority felt that compliance activities were most likely to be affected by the implementation of these reforms.

In terms of international transfers, the majority of respondents rely on either EU adequacy or Standard Contractual Clauses (SCCs) to transfer personal data across borders, and welcome a set of reforms that make these transfer mechanisms easier to use. The biggest concern raised in response to changes in transfer mechanisms is the risk of a change in the UK’s current adequacy agreements and the impact on businesses if adequacy were to be lost.

The government will be producing an impact assessment which will include updated analysis of the expected impact of the proposals. Following the policy changes made in light of the responses we have received, as well as the updates we have made to our methodology, we expect the results of this analysis to differ materially from the analysis that was published alongside the consultation.

Public sector equality duty analysis

The government has given due consideration to the public sector equality duty, under the Equality Act 2010, and the consultation exercise was especially informative in relation to the following areas:

Subject access requests. Our pre-consultation analysis considered whether the re-introduction of a nominal fee for subject access requests could harm or disadvantage individuals who make these requests, and this was highlighted in a small percentage of consultation responses as having a potential impact on individuals with protected characteristics, particularly age and disability. The government will not re-introduce a nominal fee for subject access requests and will not proceed with introducing a cost ceiling. The government also considered whether it is necessary to introduce into the Data Protection Act 2018 a safeguard, similar to that provided under Section 16 of the Freedom of Information Act (for public bodies), requiring controllers to help data subjects by providing advice and assistance. The government deems the current duty to facilitate data subject rights under Article 12(2) UK GDPR and Section 52(6) of the DPA 2018 to be sufficient. Potential revised guidance from the ICO would further mitigate any impact on groups with protected characteristics. The government is satisfied that proposals in this area will not disproportionately impact groups with protected characteristics.

Legitimate interests. Our pre-consultation analysis noted the potential indirect impacts of the legitimate interests proposals on groups with protected characteristics. This was supported by respondents, who warned that removal of the requirement to take individuals’ rights and interests into account in a wide range of situations when processing personal data could disadvantage children or vulnerable groups in society, who are less able to complain to the regulator if their personal data has been misused. Having considered consultation responses, the government is minded to pursue the proposal only in respect of a narrow range of processing activities where there are clear public interest reasons for the processing to occur. This could include, for example, processing that is necessary for crime prevention or safeguarding. Removal of the balancing test and associated compliance paperwork in these situations could encourage organisations to make the authorities aware of individuals who are at risk, without delay. This could have direct benefits for children and other groups with protected characteristics. Even if the balancing test were removed in these scenarios, data controllers would continue to be required to comply with data protection principles (for example, on lawfulness, fairness and transparency), which would further reduce the risks of any adverse impact on groups with protected characteristics.

Cookie proposals. Some respondents raised concerns about websites processing increased volumes of personal data without consent, especially where the data relate to children or people with disabilities or mental health issues. Concerns were raised by some respondents about the importance of not undermining the age-appropriate design code (AADC) standards, notably the need for a high level of transparency when children’s data is being collected. The proposals that the government will take forward (i.e. permitting audience measurement and some other non-intrusive cookies without consent) will be carefully designed, with safeguards to protect the rights of individuals, such as limiting any information that is processed for audience measurement purposes to aggregate statistical information, and not using the data for more intrusive purposes. A move from an opt-in to an opt-out consent model for websites would only take place once ministers are content that users have access to technology that supports them to effectively manage their preferences on how their data is processed.

Extending the soft opt-in to non-commercial organisations (such as political parties and charities). Our pre-consultation analysis recognised that this proposal would mean that some people would receive direct marketing material that they would not have received previously. Some groups in society (e.g. older people, people with mental health issues) may be more concerned than others by emails, messages or texts from people they do not know well. To mitigate the risks identified here, the government will design this proposal so that non-commercial organisations are subject to exactly the same rules as commercial organisations in terms of respecting a person’s right to opt out and making it easy for them to do so.

ICO complaints. Concerns were raised by some respondents that requiring data subjects to complain to the relevant controller before complaining to the ICO would create a barrier between data subjects and the ICO, and prevent data subjects from being able to exercise their rights to complain or seek redress. To mitigate these risks, we will combine this proposal with appropriate safeguards. The data subject will retain the ability to escalate their complaint to the ICO if, for example, they have not received an adequate response after a set time period, or if the data controller has not provided them with contact details for raising a complaint. We will also retain current statutory accountability mechanisms to ensure that the ICO does not exercise its discretion not to respond to complaints too freely. Article 78 of the UK GDPR confers the right to a judicial remedy where the supervisory authority ‘does not handle a complaint or does not inform the data subject within three months on the progress or outcome of the complaint’, and this will be maintained.

AI proposals. In our pre-consultation analysis we considered how our proposal to create a new condition for processing sensitive personal data for bias monitoring and correction in relation to AI systems would impact the public sector equality duty under section 149(1) of the Equality Act 2010. This proposal is likely to lead to an increase in processing of sensitive personal data about individuals with protected characteristics. The purpose of this proposal is to support organisations to monitor harmful bias and eliminate discriminatory outcomes, so any detrimental impact is considered justifiable on this basis. Furthermore, the more representative the data that an AI system is trained on, the more the system will reflect a broader cross-section of the population. This in itself is likely to mitigate bias, and the resulting discrimination against individuals with protected characteristics. Similar considerations applied to a closely related proposal to include processing of personal data for bias monitoring and correction in a list of legitimate interests for which organisations would not be required to take individuals’ rights and interests into account when relying on the “legitimate interests” ground to process personal data. The government is continuing to assess the merits of this related proposal as part of the proposed reforms to legitimate interests outlined in the paragraph on legitimate interests above.

Future-proofing Article 22. Consultation responses raised concerns over safeguards for automated decision-making, and particularly that removing the right to human review could have a disproportionately negative impact on people with protected characteristics, for example on the basis of their sex or race. A frequently cited example was the 2020 A-Level results algorithm, which respondents felt discriminated unfairly against pupils. Although precautions were taken to prevent bias based on protected characteristics, the profiles of those attending different schools inevitably led to outcomes that differed by protected characteristics, including race and sex, and there was uncertainty over the appeal mechanisms pupils could access. Our proposals retain human review as currently required under Article 22, but will ensure that a data subject has access to clearer safeguards for any significant decision made without meaningful human involvement, potentially including a justification of how a decision is reached, which may enable a data subject to more easily identify how protected characteristics have been factored into a decision.

Removal of prescriptive requirements to complete data protection impact assessments (DPIAs). Our pre-consultation analysis considered whether this proposal could mean that potentially disproportionate detrimental effects of processing on individuals with protected characteristics are not identified. However, organisations will still be required to consider risk through the implementation of their risk-based privacy management programme, which is likely to mitigate the risk of such effects going unidentified.

Overall, the government’s analysis remains that any negative impact of the proposals on individuals with protected characteristics would not be disproportionate.

Annex A: List of consultation proposals

Question Proposal Next steps
1.2.1 Consolidating research provisions into a single chapter A - The government plans to proceed with this proposal but in a more targeted way
1.2.2 Creating a statutory definition of scientific research A - The government plans to proceed with this proposal
1.2.8 Incorporating broad consent for scientific research into legislation A - The government plans to proceed with this proposal
1.2.6, 1.2.7 Creating a new lawful ground for processing for research purposes C - The government does not plan to proceed with this proposal
1.2.9, 1.3.1, 1.3.2 Clarifying that further processing for an incompatible purpose may be lawful when based on a law that safeguards an important public interest or when the data subject has re-consented A - The government plans to proceed with this proposal
1.3.3 Clarifying when further processing can be undertaken by a controller different to the original controller B - The government is considering this proposal further
1.3.4 Clarifying when further processing may occur when the original lawful ground was consent A - The government plans to proceed with this proposal
1.2.10, 1.2.11 Extending the “disproportionate effort” exemption on information provision requirements for further processing for research purposes of personal data collected directly from the data subject A - The government plans to proceed with this proposal
1.4.1, 1.4.2, 1.4.4 Creating a limited list of legitimate interests for businesses to process personal data without applying the balancing test A - The government plans to proceed with this proposal, but for a narrower range of situations than set out in the consultation paper
1.4.3 Including processing necessary for the purpose of ensuring bias monitoring, detection and correction in the list of “legitimate interests” proposed to be created under 1.4 C - This will not be included in the narrow list of activities for which the balancing test is not required.
1.5.11, 1.5.12, 1.5.13 Enable organisations to use sensitive personal data for the purpose of managing the risk of bias in their AI systems by clarifying that Schedule 1 Paragraph 8 can be used for processing necessary for the purpose of ensuring bias monitoring, detection and correction A - The government plans to proceed with this proposal
1.5.3 Developing a safe regulatory space for the responsible development, testing and training of AI C - The government does not plan to proceed with this proposal
1.5.1, 1.5.2 Addressing uncertainty about the scope & substance of fairness in the DP regime as applied to the development, deployment and outcomes of AI systems and the ICO’s regulatory reach B - The government is considering this proposal further
1.5.14, 1.5.15, 1.5.16, 1.5.17 Clarifying the limits and scope of Article 22 UK GDPR A - The government plans to proceed with this proposal
1.5.17 To remove Article 22 of UK GDPR and permit solely automated decision making where it meets a lawful ground in Article 6(1) (and Articles 9-10 (as supplemented by Schedule 1 to the Data Protection Act 2018) where relevant) and subject to compliance with the rest of the data protection legislation C - The government does not plan to proceed with this proposal
1.6.1, 1.6.2 Adopting the Council of Europe’s test for anonymisation into legislation A - The government plans to proceed with this proposal
1.6.3 Confirming that the test for anonymisation is a relative one A - The government plans to proceed with this proposal
2.2.12 Raising the threshold for when data breaches are notifiable to the ICO under Article 33 (1) of the UK GDPR C - The government does not plan to proceed with this proposal
2.2.2, 2.2.3, 2.2.14, 2.2.15 Require organisations to operate a privacy management programme A - The government plans to proceed with this proposal
2.2.5, 2.2.6 Replace the requirement to appoint a DPO with a requirement to designate a suitable individual to oversee the organisation’s DP compliance A - The government plans to proceed with this proposal
2.2.8 Remove the requirement for Data Protection Impact Assessments A - The government plans to proceed with this proposal
2.2.9 Remove the requirement for Prior Consultation with the ICO on high risk processing A - The government plans to proceed with this proposal
2.2.10 Introduce a new “voluntary undertakings” process C - The government does not plan to proceed with this proposal
2.3.3 Introducing a cost ceiling for complying with subject access requests C - The government does not plan to proceed with this proposal
2.3.1, 2.3.2, 2.3.3 To amend the threshold for refusing to respond to, or charging a reasonable fee for, a subject access request from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’ A - The government plans to proceed with this proposal
2.3.4 Introducing a nominal fee for subject access requests C - The government does not plan to proceed with this proposal
2.4.4 To remove the requirement for prior consent for all types of cookies (governed by Regulation 6 of PECR) A - The government plans to proceed with this proposal in the future for websites, once automated technology is widely available to help users manage online preferences
2.4.1 To remove the consent requirement for analytics cookies and similar technologies (governed by Regulation 6 of PECR); treat them in a similar way as “strictly necessary” cookies A - The government plans to proceed with this proposal
2.4.3 To remove the consent requirements in Regulation 6 of PECR for a wider range of circumstances where the controller can demonstrate legitimate interest for processing the data A - The government plans to proceed with this proposal
2.4.2 To remove the consent requirements in Regulation 6 of PECR when controllers are using cookies or similar technology in compliance with an ICO-approved sector code or regulatory guidance A - The government plans to proceed with this proposal
2.5.2 Exclude political parties and elected representatives from PECR’s rule on direct marketing by electronic means C - The government does not plan to proceed with this proposal
2.4.9 Extending the soft opt-in for direct marketing to communications from political parties A - The government plans to proceed with this proposal
2.5.3 Extending the soft opt-in for direct marketing to other political entities such as candidates and registered (with the Electoral Commission) third-party campaign groups A - The government plans to proceed with this proposal
2.4.9 Extending the soft opt-in for direct marketing to communications from other non-commercial organisations A - The government plans to proceed with this proposal
2.4.10 Empowering ICO to take action against organisations for the number of unsolicited direct marketing calls ‘sent’ as well as calls ‘received’ and connected A - The government plans to proceed with this proposal
2.4.11 Introducing a ‘duty to report’ on communication service providers to report suspicious traffic transiting their networks A - The government plans to proceed with this proposal
2.4.12 Other measures to help reduce the number of unsolicited direct marketing and fraudulent calls B - The government is considering this proposal further
2.4.12 Mandating communications providers to do more to block calls / texts at source B - The government is considering this proposal further
2.4.12 Mandating communications providers to provide free of charge services to block incoming calls not on an allow list B - The government is considering this proposal further
2.4.17, 2.4.18 Empowering ICO to impose assessment notices on companies suspected of PECR breaches A - The government plans to proceed with this proposal
2.4.6, 2.4.7 Requiring websites to respect preferences set by individuals through their browser A - The government plans to proceed with this proposal
2.4.16 Increasing fines under PECR to GDPR levels A - The government plans to proceed with this proposal
2.4.16 Other measures to help ensure the enforcement regime is effective, proportionate and dissuasive A - The government plans to proceed with this proposal
3.3.7 To reform the DCMS Secretary of State’s adequacy making power. A - The government plans to proceed with this proposal
3.2.2 Making adequacy regulations for groups of countries, regions and multilateral frameworks B - The government will continue to consider this approach
3.2.3 To remove the requirement for the DCMS Secretary of State to conduct a review of adequacy decisions every 4 years A - The government plans to proceed with this proposal
3.3.7 Clarifying that either judicial or administrative redress is acceptable for international transfers A - The government plans to proceed with this proposal
3.3.3 Exempt “reverse transfers” from the scope of the UK ITR C - The government does not plan to proceed with this proposal
3.3.4 Empowering organisations to create their own ATMs C - The government does not plan to proceed with this proposal
3.3.1, 3.3.8 Creating a new power for the DCMS Secretary of State to formally recognise new ATMs A - The government plans to proceed with this proposal
3.2.3 Reinforcing the importance of proportionality when using ATMs A - The government plans to proceed with this proposal
3.4.1 Allowing certification for international transfers to be provided for by different approaches to accountability C - The government does not plan to proceed with this proposal
3.4.1 Clarifying that prospective certification bodies outside the UK can be accredited to run UK-approved international transfer schemes C - The government does not plan to proceed with this proposal
3.5.1 Establishing a proportionate increase in flexibility for use of derogations by making explicit that repetitive use of derogations is permitted C - The government does not plan to proceed with this proposal
4.3.1, 4.3.2 Clarifying that organisations asked to carry out an activity on behalf of a public body may rely on that body’s lawful ground for processing the personal data under Art 6(1)(e) A - The government plans to proceed with this proposal
4.2.1 To extend powers under section 35 of the Digital Economy Act 2017 aimed at improving public service delivery to business undertakings, beyond the current scope of solely individuals and households A - The government plans to proceed with this proposal
4.4.5, 4.4.6 Defining “substantial public interest” C - The government does not plan to proceed with this proposal
4.4.7 Specifying new situations in sch 1 DPA 2018 to permit certain activities on grounds of substantial public interest. A - The government plans to proceed with this proposal
4.3.3, 4.3.4 Clarifying that health data can be lawfully processed when necessary for reasons of substantial public interest in a public health or other emergency without oversight by healthcare professionals C - The government does not plan to proceed with this proposal
4.4.8 Clarifying rules on the collection, use and retention of biometric data by the police A - The government plans to proceed with this proposal
4.5.1 Align key terms that are used across the different data processing frameworks to drive consistency. A - The government plans to proceed with this proposal
4.4.1, 4.4.2, 4.4.3 Introducing compulsory transparency reporting providing information on how public authorities and government use complex automated tools to support decision-making C - The government does not plan to proceed with this proposal
5.2.1 New statutory framework setting out the ICO’s strategic objectives and duties A - The government plans to proceed with this proposal
5.2.2 A new overarching duty for the ICO to uphold data rights and to encourage trustworthy and responsible data use A - The government plans to proceed with this proposal
5.2.4 New duty for the ICO to have regard to economic growth and innovation A - The government plans to proceed with this proposal
5.2.5 New duty for the ICO to have regard to competition issues A - The government plans to proceed with this proposal
5.2.6, 5.2.7 New duty for the ICO to consult with relevant regulators and any other relevant bodies when exercising its duties to have regard to growth, innovation and competition A - The government plans to proceed with this proposal
5.2.8, 5.2.9 Establish a new information sharing gateway to support regulatory cooperation C - The government does not plan to proceed with this proposal
5.2.12 Requirement for the ICO to deliver a more transparent and structured international strategy C - The government does not plan to proceed with this proposal
5.2.13 New statutory objective for ICO to consider wider HMG international priorities C - The government does not plan to proceed with this proposal
5.2.10 New duty for the ICO to have regard to public safety A - The government plans to proceed with this proposal
5.2.11 New power for the DCMS SoS to prepare a statement of strategic priorities which the ICO must respond to A - The government plans to proceed with this proposal
5.3.1 Establish an independent Board and Chief Executive for the ICO A - The government plans to proceed with this proposal
5.3.2 Appointing the Chair by the same process as that for appointing the IC under DPA 2018 A - The government plans to proceed with this proposal
5.3.3 Appointing the non-executive board members by the DCMS SoS A - The government plans to proceed with this proposal
5.3.4 Chief Executive to be appointed by the DCMS Secretary of State. Alternative recommendation is for this to be an ICO Board appointment, in consultation with the DCMS Secretary of State. C - The government does not plan to proceed with this proposal
5.4.1 Removing Parliamentary approval to amend IC’s salary A - The government plans to proceed with this proposal
5.4.2 Requirement for the ICO to develop & publish KPIs A - The government plans to proceed with this proposal
5.4.3 Requirement for the ICO to publish key strategies and processes guiding its work A - The government plans to proceed with this proposal
5.4.3 Requirement for ICO to publish other information to aid transparency A - The government plans to proceed with this proposal
5.4.6 A power for the DCMS Secretary of State to initiate an independent review of the ICO’s activities and performance C - The government does not plan to proceed with this proposal
5.5.3 A process for the DCMS Secretary of State to approve statutory codes of practice and statutory guidance ahead of laying them in Parliament A - The government plans to proceed with this proposal
5.5.1 A power for the DCMS Secretary of State to require the ICO to set up a panel of experts when developing all codes of practice and statutory guidance, unless exempt A - The government plans to proceed with this proposal
5.5.2 To require the ICO to undertake and publish impact assessments when developing all codes of practice and statutory guidance unless exempt A - The government plans to proceed with this proposal
5.6.2, 5.6.3 Requiring data controllers to have complaints-handling mechanisms A - The government plans to proceed with this proposal
5.6.2, 5.6.3, 5.6.4 Set out in legislation the criteria the ICO can use to determine whether to pursue a complaint in order to provide clarity and enable the ICO to take a more risk-based and proportionate approach to complaints. A - The government plans to proceed with this proposal
5.7.2, 5.7.3, 5.7.4 New power to commission a technical report to aid breach investigations A - The government plans to proceed with this proposal
5.7.7 Changing statutory deadline for ICO to issue a penalty from 6 to 12 months C - The government does not plan to proceed with this proposal
5.7.8 Introducing a provision to permit the ICO additional time beyond the 6-month statutory deadline to issue a penalty, under certain circumstances. A - The government plans to proceed with this proposal
5.7.9 Introducing a requirement for the ICO to set out to the relevant data controller/s at the beginning of an investigation the anticipated timelines for phases of its investigation. A - The government plans to proceed with this proposal
5.7.6 New power for the ICO to compel witnesses to interview during investigations and answer questions A - The government plans to proceed with this proposal
5.8.1, 5.8.2 Further simplification of the oversight framework for the police use of biometrics and overt surveillance A - The government plans to proceed with this proposal

Annex B: List of organisations that responded to the consultation

  • Advertising Association
  • Aberdeen City Council
  • abrdn plc
  • ABUKAI
  • Access Now (NGO working for the protection of human rights in the digital age)
  • Action Medical Research
  • ActionAid UK
  • Ada Lovelace Institute
  • ADR UK
  • ADS Group
  • Airbus
  • Airbus Operations Limited
  • Akrivia Health
  • Alan Turing Institute
  • Alternative Investment Management Association (AIMA) and the Alternative Credit Council (ACC)
  • Alzheimer’s Society
  • Amberhawk Training Limited
  • Anekanta Consulting
  • Aneurin Bevan University Health Board (NHS Wales for Gwent)
  • Anthony Nolan
  • AOP - Association of Online Publishers
  • APIMS Limited
  • AQA
  • Arc Data Limited
  • Archives and Records Association
  • Ardonagh Advisory
  • ARTICLE 19
  • Association for Interactive Media and Micropayments
  • Association of British HealthTech Industries
  • Association of British Insurers
  • Association of Medical Research Charities (AMRC)
  • AstraZeneca
  • ATI
  • Audit Wales
  • Avast Software
  • Awin
  • Azorus
  • BACP
  • BAE Systems Applied Intelligence Limited (ICO registration number: Z5470019)
  • Bail for Immigration Detainees
  • Battersea Cats and Dogs Home
  • Baycloud Systems
  • BCP COUNCIL
  • BCS - The Chartered Institute for IT
  • Behold.ai Technologies Limited
  • Best Companies Limited
  • Betting and Gaming Council
  • Big Brother Watch
  • BILETA
  • Biotechnology Innovation Organisation
  • Birmingham University
  • Bolton Council
  • BongoIT
  • BPE Solicitors LLP
  • Breakthrough Communications and Strategies Limited
  • Breast Cancer Now
  • Bird & Bird LLP
  • British American Business
  • British and Foreign Bible Society
  • British and Irish Law, Education and Technology Association (BILETA)
  • British Heart Foundation
  • British Irish Chamber of Commerce
  • British Medical Association
  • British Red Cross
  • British Retail Consortium
  • British Security Industry Association (BSIA)
  • BSA
  • BSI Group
  • BT Group
  • Buckinghamshire Council
  • CACI UK
  • Caerphilly County Borough Council
  • Cancer Research UK
  • Capita
  • Capsticks solicitors llp
  • Cardiff University
  • Catalyst Housing Limited
  • CEDPO
  • Center for AI and Digital Policy
  • Center for Data Innovation
  • Centre for Digital Trust and Society
  • Centre for Intellectual Property and Information Law (CIPIL)
  • Centrica plc
  • Cerner
  • CGI
  • Chartered Institute of Fundraising
  • Cheshire West and Chester Council
  • Chichester College Group
  • Child Poverty Action Group
  • Children 1st
  • Children’s Hearings Scotland
  • Church of England
  • Cielo
  • Cifas
  • CIPL
  • Citizens Advice
  • City of Edinburgh Council
  • City of London Law Society.
  • Clarion
  • Clean Up Gambling
  • CloudPay
  • Code Poets Limited
  • Colchester Borough Council
  • Colleges Scotland
  • Collingwood Insurance Company Ltd
  • Collinson
  • Confederation of British Industry
  • Connected Places Catapult
  • Consumer Credit Trade Association
  • Consumer Data Research Centre and Leeds Institute for Data Analytics, University of Leeds
  • Consumer Markets Authority
  • Cornwall Council
  • Coventry City Council
  • Crafted Media
  • Creative Privacy Ltd
  • Credit Services Association
  • CREST
  • CREST International
  • Crisp
  • Croud Inc Ltd.
  • CrowdStrike, Inc.
  • Curve OS Ltd
  • CVG Solutions Ltd
  • Data Improvement Across Government
  • Data Management Association (DAMA)
  • Data Marketing Association (DMA)
  • Data Privacy Advisory Service
  • Data Protection Education Ltd
  • Data Protection Network
  • Data Trusts Initiative
  • Data, Tech and Black Communities
  • De Montfort University
  • Debt Recovery Plus Ltd
  • defenddigitalme (children’s privacy group)
  • Dementia UK
  • Demondude Ltd
  • Demos
  • DENBIGHSHIRE COUNTY COUNCIL
  • Department for Education, UK Gov
  • Department of Finance (NI)
  • Department of Health NI
  • Derbyshire County Council
  • Devon and Somerset Fire and Rescue Service
  • Devon County Council
  • DFN UK
  • DIFC Authority
  • DIFC Authority - Commissioner of Data Protection
  • Digital Europe
  • Digital Health & Care Wales
  • Direct Line Group
  • Discover Learning Trust
  • DMG Media
  • Donr
  • Dorothy House Hospice Care
  • Dorset & Wiltshire Fire and Rescue Service
  • Dun & Bradstreet
  • Dyfed Powys Police
  • E.ON
  • East Riding of Yorkshire Council
  • EasyJet
  • EFL Trust
  • Eisai Europe Limited
  • Element
  • Elzware Ltd
  • EPUT
  • ESOMAR
  • Essex County Council
  • Ethicall
  • Euro parking Services Ltd
  • European Centre for Digital Rights
  • European Publishers Council
  • evalian
  • Evergreen Finance London Ltd
  • Experian
  • Fable Data
  • Facewatch Ltd
  • Faculty of Actuaries
  • Faculty of Advocates
  • Faegre Drinker Biddle & Reath LLP
  • Fair Trials
  • Farrer & Co LLP
  • Federation of Small Businesses
  • Finance and Leasing Association
  • Financial Conduct Authority
  • For Your Information
  • Forbes Solicitors LLP
  • Fundraising Regulator
  • Future Plc
  • Garmin (Europe) Limited
  • GBG PLC
  • GDPR In Schools Limited
  • GDS Analytics Limited
  • Gemserv
  • Gener8
  • General Council of the Bar
  • General Medical Council
  • Glasgow Kelvin College
  • Global Cyber Alliance
  • Global Data Alliance
  • Global Privacy Alliance
  • Google
  • GP Practices
  • Great Ormond Street Hospital
  • Great Western Air Ambulance
  • Greater Manchester Combined Authority and Transport for Greater Manchester
  • GreenNet Ltd
  • GSK
  • Hammersmith & Fulham Council
  • Hampshire and Isle of Wight Fire and Rescue Service
  • HD Media CIC
  • Healios
  • Health Data Research UK
  • Health Research Authority
  • Health Shield friendly Society
  • HelloDPO Ltd
  • Help for Heroes
  • Heriot-Watt University
  • Hermes Parcelnet Ltd
  • Higher Education Statistics Agency
  • Historic Environment Scotland
  • HIWFRS
  • HM Revenue and Customs
  • Hogan Lovells International LLP
  • HQBS Ltd
  • Huawei Technologies (UK) CO.,Ltd
  • HY Education Solicitors Limited
  • Hymans Robertson LLP
  • ICAEW
  • ICO
  • IGfL - Information Governance for London
  • ILF Scotland
  • Immuta
  • Impellam Group plc
  • Imperial College London
  • Independent Healthcare Providers Network
  • Information Accountability Foundation
  • Information and Records Management Society.
  • Information Governance for London
  • Information Governance, Betsi Cadwaladr University Health Board
  • Information Technology Industry Council (ITI)
  • Innoscite Limited
  • Inside Futbol Ltd
  • Institute of Development Professionals in Education (IDPE)
  • Institute of Economic Affairs
  • Interactive Advertising Bureau
  • International Committee of the Red Cross
  • International Regulatory Strategy Group (IRSG)
  • Internet Association
  • Investment & Life Assurance Group
  • Ionx Solutions
  • IPRS Group Ltd
  • Ipsos MORI
  • IQVIA
  • ISACA
  • ISBA
  • ITN
  • ITV
  • iVIEW
  • Japan Christian Link
  • Jisc
  • JP Legal Assist Limited
  • Judicium
  • Just Algorithms Action Group
  • Keep Our NHS Public
  • King Edward VI School
  • King’s College Hospital Charity
  • King’s College London
  • Kingston University
  • Lane, Clark and Peacock
  • LAUDIS Business Advisors
  • Law Society of England and Wales
  • Law Society of Scotland
  • Lawyers in local government
  • Liberal Democrats
  • Liberty
  • LimApp Ltd
  • Lincolnshire Partnership NHS Foundation Trust
  • Linklaters LLP
  • Local Government Association
  • Leicester, Leicestershire and Rutland Local Medical Committee
  • London and International Insurance Brokers Association
  • London Borough of Barnet
  • London Borough of Hounslow
  • London School of Economics
  • London Stock Exchange Group
  • London Strategic Information Governance Networks’ Forum
  • Londonwide Local Medical Committees
  • Look Ltd
  • Lowell Financial Ltd
  • LSE Law School
  • Macmillan Cancer Support
  • Marie Curie UK
  • Market Research Society
  • Mastercard
  • medConfidential
  • Medical Justice
  • Medical Research Council
  • Medtronic plc
  • MeiraGTx
  • Meta
  • Methods Analytics
  • Microsoft
  • Midland Heart Housing Association
  • Migrants’ Rights Network
  • Mind (mental health charity)
  • Moorfield Eye Hospital
  • More Partnership
  • Morgan Sindall Property Services
  • Mozilla
  • MSite
  • Multidimensional Analytics Solutions Ltd
  • Munich Reinsurance
  • My Data Protection World
  • Mydex Community Interest Company
  • mySociety
  • National Consumer Federation
  • National Data Guardian
  • National Deaf Children’s Society
  • National Fire Chiefs’ Council (NFCC) Information Governance Group
  • National Galleries of Scotland
  • National Highways
  • National Institute for Care Excellence (NICE)
  • National Library of Scotland
  • National Physical Laboratory
  • National Police Chiefs Council
  • Natural History Museum
  • NCC Group
  • NEC Software Solutions Limited
  • New College Durham
  • Newcastle City Council
  • News Media Association
  • News UK
  • Newsquest Media Group Limited
  • NHS Blood and Transplant
  • NHS Business Services Authority
  • NHS Confederation
  • NHS London Shared Service
  • NHS Scotland’s Information Governance Forum
  • NILGOSC
  • Norfolk County Council
  • North and Mid Wales Local Authority Data Protection Officers Group
  • North Wales Police
  • Northdoor plc
  • Northern Ireland General Practice Committee
  • Northumbria University
  • Northumbrian Water Ltd
  • noyb - European Center for Digital Rights
  • NPCC
  • Nuffield Council on Bioethics
  • Nuffield Health
  • Office for Nuclear Regulation
  • One West (part of Bath & North East Somerset Council)
  • Onfindo
  • Open Data Institute (ODI)
  • openDemocracy
  • Open Rights Group
  • Open University
  • Opendium
  • Orrick, Herrington and Sutcliffe (UK) LLP
  • Oxfam GB
  • Oxford University
  • Pact
  • Paddock Wood Community Advice Centre
  • Paper Frogs Ltd
  • Parity Technologies, Inc.
  • Parking Control Management (UK) Ltd
  • Parkingeye
  • PCC
  • PDSA
  • Personal Investment Management and Financial Advice Association (PIMFA)
  • Personify XP
  • Perth and Kinross Council
  • Peterborough City Council and Cambridgeshire County Council
  • Peterlee Town Council
  • PHG Foundation
  • Phillips 66 Limited
  • Plaid Financial Ltd
  • PomVom UK Ltd
  • Pool Data Foundation
  • Portsmouth City Council
  • Premier Park Ltd
  • PreterLex Ltd.
  • PrimeConduct Limited
  • Prisons and Probation Ombudsman
  • Pritchetts Law LLP
  • Privacy and Consumer Advisory Group
  • Privacy International
  • Privacy Partnership
  • PrivDash
  • Professional Publishers Association
  • Public CCTV Managers Association
  • Public Law Project
  • Racial Justice Network
  • RAF Association
  • Re:Cognition Health Limited
  • REaD Group Ltd
  • Redkite innovations
  • Reed Global
  • Refugee Trauma Initiative
  • RELX
  • REPHRAIN
  • Reply Limited
  • Reset
  • Revoke Limited
  • RGDP LLP
  • Ripon Spa surgery
  • Royal & Sun Alliance Limited
  • Royal and Sun Alliance Limited
  • Royal Botanic Garden Edinburgh
  • Royal Holloway University
  • Royal Mail
  • Royal Statistical Society
  • Royal Yachting Association
  • RSPCA
  • Russell Group
  • Sage
  • Salesforce UK & Ireland
  • Salford CVS and the Greater Manchester VCSE Information Governance Lead Group
  • Salvation Army
  • SAS Software UK
  • Sasa Marketing Ltd
  • SAT-7 UK
  • Save the Children
  • Scottish Chambers of Commerce
  • Scottish Fire and Rescue Service
  • Scottish Government
  • Scottish Water
  • Siccar
  • Sight Scotland and Sight Scotland Veterans
  • Skills Development Scotland
  • Sky
  • Smart DCC Ltd
  • Snapchat
  • Software & Information Industry Association
  • Software and Information Industry Association
  • Soha Housing Ltd
  • Solutionlabs Ltd
  • South East Open Source Solutions Ltd.
  • South Somerset District Council
  • Specsavers Optical Group
  • Sphere Data Protection
  • St Albans City and District Council
  • Staffordshire County Council
  • STAMMA
  • Statewatch
  • Station10
  • Stewardship
  • StopWatch
  • Strategic Communications (U.K.) ltd
  • STREETINVEST
  • Sue Ryder
  • Surrey MAISP Action Group
  • Takeback Software Ltd
  • TechUK
  • Telford & Wrekin Council
  • Tendring District Council
  • Terrafirma IDC Ltd
  • Tesco Mobile
  • Tewkesbury Borough Council
  • Thales UK
  • Thames Water
  • The 3 Million
  • The Age Verification Providers Association
  • The Alternative Investment Management Association and the Alternative Credit Council
  • The Association of British Investigators
  • The Association of the British Pharmaceutical Industry
  • The British Library
  • The Chartered Institute of Fundraising
  • The Chartered Institute of Marketing
  • The Citizens
  • The Coalition for a Digital Economy
  • The Common Framework
  • The Council for Advancement and Support of Education (CASE)
  • The DPO Centre
  • The Electoral Commission
  • The International Parking Community
  • The Joint Council for the Welfare of Immigrants (JCWI)
  • The Kubernesis Partnership LLP
  • The Law Society of England and Wales
  • The Legal Education Foundation
  • The Purbeck School
  • The Royal Society
  • The Salvation Army
  • The Scottish Children’s Reporter Administration
  • The Scouts
  • The Southern Co-Operative
  • The University of Bradford
  • The University of Manchester
  • The University of Sheffield
  • The Vale of Glamorgan Council
  • The Wrekin Housing Group Limited
  • Thurrock Council
  • TIGA
  • TISA (The Investing and Saving Alliance)
  • Tony Blair Institute for Global Change
  • TRACE Enforcement Group
  • Trades Union Congress (TUC)
  • TransUnion
  • trueCall Ltd
  • Trunomi
  • Trusthogen.com
  • UDL Intellectual Property
  • UK Car Park Management
  • UK Citizens
  • UK Ethics Pandemics Accelerator
  • UK Interactive Entertainment (Ukie)
  • UK Online Measurement Company
  • UKAS
  • UKSG
  • Ulster University
  • UniFida
  • UNISON
  • Unite
  • United Kingdom Lubricants Association
  • United Kingdom Security Vetting
  • Universities Superannuation Scheme
  • University College London
  • University Hospitals Sussex NHS Foundation Trust
  • University of Bradford
  • University of Leeds DPO office
  • University of Liverpool
  • University of Strathclyde
  • University of the Arts London (UAL)
  • University of Winchester
  • UNJUST C.I.C
  • Unlock Democracy
  • US-UK Business Council
  • use MY data
  • Veritau Group
  • Victoria and Albert Museum
  • Virgin Media O2
  • Virtualgo2 Ltd
  • Visa Europe
  • Vodafone
  • WaterAid
  • WEA (Workers Educational Association)
  • Wellcome Sanger Institute
  • Welsh Government
  • Welsh Revenue Authority
  • West London Welcome
  • West Midlands Fire Service
  • West Yorkshire Police
  • Which?
  • William Reed Business Media Ltd
  • Wise Parking Ltd
  • Workday
  • Xynics Data Solutions
  • YouGov
  • Your Homes Newcastle
  • Your Housing Group
  • Zorva Consulting Limited
  • 3 Billion Pairs
  • 360Giving
  • 5Rights Foundation