Background to the review
I am delighted to have the opportunity to talk to you about the review I have recently undertaken and grateful to the King’s Fund for inviting me to speak here today on the day that the review has been published.
The review makes recommendations on the matters of data security, consent and opt-outs across the health and social care system.
As many of you will know, this is not the first review I have undertaken on issues concerning information governance, confidentiality and consent.
Twenty years ago, in 1996, I was asked to chair a committee looking at how patient information was stored and transferred. This was just at the beginning of computers being used in a relatively widespread way across the NHS to manage health information. We produced what we hoped were clear and widely applicable principles to protect patients and support staff working in this changing environment. These became known as the Caldicott Principles.
Over a decade later in 2012 I undertook another review of how information should be managed and shared.
We found that in the intervening years there had been some change. The original report had been written in 1997 when the service was more paternalistic and much less patient-centred.
Our 2013 report found that citizens were much more interested in what happens to their health and care information: who has access to it, for what purposes it is used, and why it is not shared more frequently when common sense and good practice suggest it should be.
In the 3 years since that review, interest among the public has increased considerably. This is the case in all sorts of contexts, not just health and care.
People expect to be asked to choose their privacy settings when they sign up for social media. They expect their bank to provide additional security measures to enable them to transact online. They are aware of breaches to databases held by some commercial companies and expect to be given honest information about these.
Meanwhile, in the field of health, the opportunities for data analysis and insight that technology offers have developed quickly.
New technology and big data offers potential for improvements to care which can benefit all of us. And these advances have implications for how data must be safeguarded and used.
But the dialogue with the public and its understanding have not grown at the same speed as the capacity of technology.
We have an almost paradoxical situation: on the one hand, people expect the system to share their data when it does not (for example, when they attend a hospital appointment), while on the other, many are not aware of some of the routine uses of health and care data for purposes beyond care.
When someone arrives at A&E they expect the staff to have access to their medical history. We know that often they do not. When a patient is transferred after a stay in hospital to a care home, they expect the care team to know all about what has happened to them in hospital and what medicines they are now taking. We know this is by no means always the case.
So we have a gap between expectation and reality when it comes to the sharing of information for people’s direct care.
Meanwhile, there is much that can be and is done with people’s data for purposes beyond their care, such as gaining new insights into patterns of disease or monitoring the quality of care in a particular locality.
And yet evidence we have gathered over recent months during this review has shown us that there is very little awareness that individuals’ data might be used for such purposes – we have a gap between reality and expectation here too.
Unaddressed, these gaps will lead to a diminution in public trust. In the absence of information, understanding and trust, anxiety will grow about whether data, which can be deeply personal, is being held securely and treated with respect.
When we talk about issues concerning data security, technology, data sharing and consent, the conversation can quickly become focused on the complex technical and legal details.
These are important. But what must remain our guiding principle is trust. It is essential that the way the health and care system treats people's data, and the way conversations about it are conducted with people, engenders public trust and confidence.
We must be realistic that there is a wide spectrum of opinion about these matters and that wherever we draw the line to balance the interests of confidentiality and the benefits that can be gained by using data, some people will be unhappy.
But it is nevertheless essential that we have these conversations.
This is why I was so pleased to be invited in 2014 by the Secretary of State to become the first National Data Guardian and to be charged with acting as an independent champion for the public in this area.
And it is why I believe his commissioning of the review that we are publishing today is so timely.
In parallel with my review, the Secretary of State for Health also asked the Care Quality Commission to undertake a review of data security in the NHS.
He asked me to develop new security standards for health and social care.
He also asked me to develop a new consent and opt-out model. The goal was to produce a model that makes it clear to patients how their health and care information can be used, and in what circumstances they can opt out of it being shared for purposes other than their direct care.
A review based on evidence
My experiences leading the previous pieces of work to which I have referred have shown that achieving change depends not only on an engaged and supportive government, but also on ensuring that recommendations are rooted in strong evidence.
So for this review, it was absolutely vital to me that we examined the evidence and canvassed opinion as thoroughly and broadly as possible in the time available to us.
I am immensely grateful for the first-class support I have had in undertaking this work. An independent team was assembled under the leadership of Katie Farrington, who has since become the Director of Digital and Data at the Department of Health.
The review set to work to engage a huge range of stakeholders.
Our recommendations were developed iteratively with input from:
- patients and service users
- clinicians, Medical Royal Colleges and the BMA
- the Information Commissioner’s Office
- the various arm’s length bodies of the Department of Health
- the UK Caldicott Guardian Council
- service providers
- the research community and research charities
- civil society
- providers of IT systems
- data security experts
The CQC’s findings in its review of data security in the NHS were very important in influencing our work to develop new data security standards; there is a great deal of common ground between our reports and we make complementary recommendations.
Given its expertise in technology and data, the HSCIC, soon to become NHS Digital, has also provided invaluable support. While our review was not asked to consider implementation, we have nonetheless consulted the HSCIC closely over the feasibility of our proposals.
I have also been very pleased to be able to rely as ever on the sound advice of the panel of experts who support me in my role as National Data Guardian, and on the team in the NDG’s office ably led by Simon Gray.
Data security findings and recommendations
When we considered data security, we found, as did the CQC, that while there is commitment and much good work being undertaken, there is room for improvement.
The NDG review identified 3 key areas for action - people, processes and technology.
Where breaches do occur, these are often because staff who are motivated above all by providing excellent care find themselves having to find ‘workarounds’ to burdensome processes or outmoded technology.
We are calling in particular on leaders of organisations to take much greater responsibility for data security. The senior leaders in health and care should be giving data security the same level of priority as they do clinical and financial processes.
We have developed 10 new standards that should be applied across all health and social care organisations. These have been designed to be as applicable to small organisations, such as a GP surgery or small social care provider, as they are to big hospital trusts.
We are recommending that a redesigned IG toolkit is used to embed these standards.
It is important that there is internal and external scrutiny of how well organisations are meeting these standards. But we have also been at pains to ensure that compliance is assured in a way that is not burdensome and is appropriate to the different types of organisation.
For instance, the review recommends that the HSCIC works with primary care to provide the support needed to meet the standards, using the IG Toolkit to identify organisations requiring additional help and to enable peer support.
Under the first of our 3 themes of people, processes and technology, we are putting forward standards to ensure staff are equipped to handle information respectfully and safely, according to the Caldicott Principles.
The standards include that all staff complete data security training at least once a year which is appropriate to them and that they pass a mandatory test. This will be provided through the new toolkit.
As well as requiring proactive efforts to train and educate, the standards are designed to foster an environment where staff identify, acknowledge and learn from weaknesses.
We heard striking evidence from former members of the aviation sector about the importance of encouraging staff to speak up, and of listening to staff to derive valuable intelligence to enable a swift reaction to a potential threat.
We need organisations where near misses, hazards and insecure behaviours can be reported without fear of recrimination.
So under the processes theme, the standards set out that reviews must take place at least annually to identify and improve processes which have caused breaches or near misses, or which force staff to use workarounds which compromise data security.
The standards require action to be taken immediately following a data breach or a near miss, with a report made to senior management within 12 hours of detection.
In this area, organisations should respond to security advice from the new CareCERT service run by the HSCIC.
Under the technology theme, the standards look to ensure that technology is secure and up to date.
This includes that no unsupported operating systems, software or internet browsers are used within the IT estate and that a cyber security strategy is in place which is based on a proven cyber security framework.
The standards also demand that IT suppliers are held accountable via contracts for protecting the personal confidential data they process and meeting the standards.
Opt-out findings and recommendations
Turning to consent and opt-out, the review underlines how vital data is for high quality health and care services.
If the public trusts health and care services with their data, there can be huge benefits for all of us. “Information about me” can be combined to create “knowledge about us”, which is vital for a wide range of uses, from researchers making breakthroughs in developing life-saving medicine to regulators seeing problems promptly when things go wrong.
As I said earlier, we found that most people trust the NHS in particular to do the right thing, but there are still very low levels of understanding about data use and sharing.
When we opened up conversations with people about how data is used and could be used, we found that many express altruistic views about the use of their information. Where people can perceive a public benefit in their information being used, they will usually support this.
But they also expect to be told about what is happening and to be given a choice.
When it came to developing a new opt-out, the importance of simplicity became clear. Whether we were talking to patients, clinicians, service users, researchers or technological experts we heard that the opt-out had to be simple to explain, to understand and to implement.
We have therefore developed what we think is a simple opt-out and have recommended that it is applied across the whole health and social care system. It is described in the report as an 8-point model. We have also put forward some ways that this choice might be presented to the public, with suggested wording for leaflets or a website. The wording we have put forward has been tested, but it is important that it is subjected to further testing so that we get this right.
The opt-out model firstly reassures people that whether they opt-out or not, they are protected by the law. Personal confidential information will only ever be used where allowed by law. It will never be used for marketing or insurance purposes, unless someone has separately consented to this.
It also describes that information is essential for both care and for other beneficial purposes, as I have laid out already today.
It nonetheless offers the right to opt-out of personal confidential information being used for these other purposes beyond direct care.
The opt-out that we have designed covers both running the system and research.
The category of running the NHS and social care system would include surveys, work by regulators and care providers to check quality, and commissioning.
The category of research would include a university researching the effectiveness of a programme, and researchers writing to individuals to invite them to participate in a specific approved research project.
The choice could be presented as 2 separate opt-outs. Or there could be a single opt-out covering personal confidential information being used both in running the health and social care system and to support research and improve treatment and care.
We spent time testing both approaches and found support for both. We have recommended that there is further testing of both a 2-question and a single-question model with patients and professionals, to see if people would prefer to have more than one choice.
Our model states that the opt-out should be respected by all organisations that use health and social care information. We have also said that people should only need to opt out once, either online or in person, for their choice to be applied across the health and social care system. People will of course be able to change their mind.
Under the model, explicit consent will continue to be possible. People will be able to take part in specific research studies if they wish.
We also heard that the public was generally comfortable with their anonymised data being used.
The review is recommending that the opt-out does not apply to information anonymised in line with the established code laid down and monitored by the Information Commissioner’s Office as good practice.
We are also recommending that confidential data should flow, as the law permits, to the NHS’s statutory safe haven - the HSCIC - soon to be renamed NHS Digital.
This would mean a complete dataset would be held within the NHS family and NHS Digital could then anonymise and share this information with those that are authorised to use it.
The review took the view that this will be an effective way of incentivising the use of anonymised data, minimising the unnecessary use of personal confidential information. Under this model, NHS managers and researchers will have less need to use people’s personal confidential information and less justification for doing so.
Finally, the model also maintains the position that opt-outs will not apply where there is a mandatory legal requirement or an overriding public interest.
These will be areas where there is a legal duty to share information (for example a fraud investigation) or an overriding public interest (for example to tackle the Ebola virus).
Alongside this model, we are calling upon the government to consider introducing criminal sanctions for deliberate and negligent re-identification of individuals.
We are proposing also that there should be a new tool allowing people to see when and by whom their information has been accessed if they wish.
Next steps: the public conversation
As I have said, the central theme running through this review and my work as National Data Guardian is trust and the system demonstrating its trustworthiness.
Building public trust for the use of health and care data means giving people confidence that their private information is kept secure and only used in their interests.
I am convinced that we must have much fuller conversations and debate with the public about how its data is collected, managed and shared.
My review calls upon the government to conduct a full and comprehensive consultation as the first step in this process. The Secretary of State agreed to this earlier today.
My recommendations are far-reaching and so it is very important that this consultation takes place and we hear more about the views of the public, health and care professionals, researchers, data security and technology experts on the proposals. I hope that many of you in this room will play your part in this.
There will then need to be a great deal of work to implement the recommendations, and support will be required for the organisations and professionals who will be affected.
The review itself was not asked to look at implementation; however, I am very clear that neither the embedding of the new data security standards nor the application of the new opt-out model is a project that can take place overnight.
I hope that the review and its recommendations can attract support from across the health and care system and from government machinery, which will be vital for these changes to take place. And of course there will also need to be funding in due course to help their achievement.
In the same way, the conversation that I am talking about is not a single occurrence. It must be an ongoing dialogue. It will be about listening to the public, as well as speaking to people.
In the UK we are in a privileged position. Because such a large proportion of health services are delivered by the NHS, the picture we can gain from those using them has the potential to be comprehensive.
Undoubtedly there are huge insights in that data that we have yet to discover; patterns of disease that when we see them will enable us to treat people better, emerging health problems that we need to develop new ways to combat, and inevitably places where care is simply not good enough and must be improved.
But however noble the goal, it would be a mistake to take for granted the public’s agreement to our using individuals’ data to meet these goals.
Instead we need to address those gaps that have already grown between public understanding and agreement, and the reality.
Having that conversation will not always be easy. But it is essential to earning public trust and it cannot be short-circuited.
I hope the recommendations I have put forward to the Secretary of State, and which he has accepted, will be beneficial in taking this forward.