The National Data Guardian reflects upon the importance of maintaining public trust when innovations with data are taken forward to improve care
Today the Information Commissioner’s Office (ICO) announced the result of its investigation into the way that the Royal Free NHS Foundation Trust shared identifiable data about 1.6 million patients with DeepMind.
This is a matter which I have considered carefully with my advisory panel and on which I provided advice to the ICO at the Information Commissioner’s request. Like the ICO, my panel and I found problems with the way this data had been shared. In particular, the ICO asked my opinion on whether the Royal Free had used an appropriate legal basis for the initial data sharing. After careful deliberation with my panel, I came to the view that it had not.
Reflecting on this matter, I believe it is important to underline that as National Data Guardian (NDG) I am a strong advocate of work to develop new technology which can improve care and save lives. In this case, the Royal Free and DeepMind developed an app to alert hospital nurses and doctors to inpatients who might have acute kidney injury, a very serious condition, which can be hard to diagnose but can develop rapidly.
The issue that concerned my panel members and me was not that innovation was taking place to help patients affected by this condition. Rather, it was the legal basis on which the Royal Free had shared with DeepMind data that could identify more than 1.6 million patients.
Innovation within a legal framework
In this instance the Royal Free shared the information on the basis of ‘implied consent for direct care’. This is a legal basis that doctors, nurses and care professionals rely on every day to share information in order to make sure the individuals they are looking after receive the care they need. However, it is my view that this legal basis cannot be used to develop or test new technology, even if the intended end result is to use that technology to provide care.
I’m afraid that a laudable aim – in this case developing and testing life-saving technology – is not, on its own, a sufficient legal basis for sharing data that identifies people without asking them first. We need to reassure the public that there are always strong safeguards in place to make sure that confidential information will only ever be used transparently, safely and in line with the law and regulatory framework.
Getting the balance right
The tension between supporting and enabling innovation and acting in line with public expectation is not new. We wrestled with these issues 20 years ago during the Caldicott Committee’s Review of Patient Identifiable Information. This was commissioned when the NHS was just starting to move from paper to computerised records, with the result that information could be sent much more easily from one part of the health service to another.
Our 1997 report highlighted the benefits of what we called an ‘information explosion’ - more effective and efficient care for patients and a better basis for planning and monitoring services. It also acknowledged a tension between the needs of those running and planning services to use patient information on the one hand and, on the other, patients’ expectations about how information about them would be used.
That review said that managing this tension by “adhering to explicit and transparent principles of good practice” would “reassure patients and those treating them that confidentiality is safeguarded” and that “patients expect nothing less.”
If what we were seeing in 1997 was an information explosion, what we are seeing now is an explosion many times greater. The technology of today presents extraordinary opportunities to provide even better, safer, more individualised care.
The challenge continues
The technology may be different, and it may be developing more quickly than ever, but I believe we face a similar tension which we must address – between the reality that health and care data is often needed for innovation and public expectations about how such data will be used.
Now, as then, we have to work with where the public is. Confidentiality remains as important as ever. People need to be able to tell their doctor, nurse, or care worker things about themselves and their health and care needs in confidence. If such information is then used in a way that patients and service users do not expect, this precious trust will be undermined. The mantra of “no surprises” sums this up.
A conversation and safeguards to reassure
In the review I published last summer, I highlighted that we had found very low levels of public knowledge and understanding about how health and care data is used. Many people were altruistic about their data being used for research and innovation – but they wanted to know about it and to have a choice. In my report, I argued strongly that it is the responsibility of all health and social care, research and public organisations to make the case to the public for data sharing.
An ongoing conversation with the public is essential. This must be a two-way dialogue, in which people’s expectations are both listened to and informed. We will also need to reassure the public that there are strong safeguards in place to protect personal confidential data.
Just as was the case 20 years ago, I believe we can earn public support for the use of data in innovation, by “adhering to explicit and transparent principles of good practice” to “reassure patients and those treating them that confidentiality is safeguarded”. Now as then, the public rightly expects nothing less.