The NHS changes greatly over time, but there are few "big bang" changes that happen overnight without involving the patient. Your health context can change in the course of a single consultation, but the system does not change – only how you interact with it. Press releases may suggest that the NHS is rushing towards genomics and AI, but in practice it is much more of a slow stroll.
The publication of Caldicott 3 called for an "informed" "continuing conversation" about health data. We agree – the best way for a patient to understand how their data may be used next month is to be able to see how it was used last month. But if there are caveats that remain hidden from the public, a dishonest entry is worse than no entry.
Every patient has a personal lived experience of the NHS, and using that as the starting point for accountability of data use is vital. Data usage reports can give a patient information about how their data is used, in a context that directly relates to their personal experience of the NHS. Some of that use they were involved in; some of it is the system doing its thing and hoping no one notices.
Databases: poor and past?
Why are some patients being told to bring their passport to receive care, even though the NHS was there when they were born?
Databases that have benefits will receive public support for doing what they were supposed to do, but there is a widespread recognition that some past data choices by the NHS may have not been wise.
Whether that legacy will be repaired, or left to fester, is now up to the Department of Health, when it responds to the Caldicott Review. The Review left a number of hard questions unanswered, including the abuse of some patients that has been described as tantamount to "blackmail". Care.data was just one example. There are others that have hidden under a rock for some time, and followed care.data as if it were a guidebook.
The databases proliferate, yet there is almost no evidence for whether they are useful. Is the energy spent on them worthwhile? Is there a better way of delivering the goals they were designed to meet? There is an opportunity cost to doing anything…
There are many good reasons to use data, but just because a data collection has existed for decades, doesn’t mean it’s still the best way to deliver on the goals. Continued secrecy about the effectiveness of some data projects suggests that perhaps the claims of benefits are overblown, and are not supported by the evidence of what actually happened.
A continuing conversation requires ongoing evidence of reality, not political hyperbole.
Will patients be shown the benefits?
Will patients be provided with the evidence to show how their wishes have been implemented? What was the outcome of projects where their data was included?
What was the outcome of the “necessary” projects where dissent was ignored?
Will the Caldicott Consent Choice ignore the choices patients were previously offered?
In 2016, NHS Digital took the final preparatory steps towards telling patients how their data is used: first, keeping track (a side effect of beginning to honour objections), and now publishing a detailed data release register – with sufficient detail for you to work out where some of your data went, and why. Such a register allows for independent scrutiny of any data flow, and is a necessary prerequisite to a data usage report.
It does not tell an individual whether their data was used, nor what the knowledge generated was (e.g. see notices tab), but it is the key step. And while two thirds of data sold by NHS Digital does not honour your opt out, Public Health England sneak a copy of NHS data, refuse to honour objections, and hide those actions from their data release register. (As of December 2016, some administrators pretend that there was no opt out offered from “anonymised” hospital data… here’s the video from Parliament).
Digital, Deepmind, and beyond
How AI will support care is a choice for the future, but if there is going to be any move towards that world (and there already is), the transparency of all digital services must be fundamental, inviolable, and clear — it can include AI, but it can't include dodgy caveats.
If there is any secrecy about how patient data is used, NHS institutions may hope to be given the benefit of the doubt; Google, not so much. And if there is secrecy for NHS organisations, companies will try to sneak in too.
Similarly, if patients are to be offered digital services that they can use without fear, there must be an accountability mechanism showing when those services were accessed, which patients can view whenever they wish. Otherwise, the lowest form of digital predators will descend on health services like it's feeding time. It doesn't have to happen – unless there is a political decision that mistakes can be covered up.
When companies put out a press release, we often get called for comment and insight on what is actually going on. That’s a journalist’s job, and ours, because some good intentions come with too high a price.
Will the mistakes of the past begin to be rectified, creating the consensual, safe, and transparent basis for the (digital) health service of the future?
Demonstrations of delivery on promises
There will always be a demand to do more with data – but any framework has to respect that some things will not be permitted.
As Caldicott 3 recognised, telling patients how their data has been used is necessary for public confidence in the handling of data. If there is to be confidence in the system, allowing data to be used to its full potential, then there must be a recognition that when an individual objects to a use, that objection is respected.
We focus on health data, but this applies across the public sector, where there is a desire to make data great again in 2017…