medConfidential response to “If you don’t share data…”

At a conference a few weeks ago, NHS England admitted it still had to “make the case to a large enough number of people that sharing data is fundamental”, as it insists this will (amongst other things, eventually) help the health service identify areas of good practice and reduce variation in quality of care. “If we can’t make the case for that then we will be in a very difficult situation,” said Tim Kelsey.

The crude assertion is this: if we don’t “share” your individual data, diseases won’t get cured, they can’t run the NHS, terrorists will win, and you may suffer directly. The rhetoric is largely the same; the projects vary only a little. But maybe there’s an alternative: give people a choice, and actually ask them about what you’re planning to do. (A decade or more of evidence shows this works, e.g. for ethically-approved research using medical records.)

So what’s the justification?

DH’s long-awaited response to its “Accredited Safe Havens” consultation on where “shared” data can go will likely have to address comments from local authorities, some of whom feel morally obliged to take detailed individual-level data from the medical records of people in their area and “share” it with social landlords, ‘just in case’ someone isn’t claiming a benefit they could.

That ‘initiatives’ like this may make some people’s (most likely bureaucrats’) lives easier is probably true – it’s easier to run a system if you never actually have to talk to the people using it. But the distress and harm done when someone’s active choice not to reveal information is overridden, or when the wrong information gets used (and experience tells us no system is perfect), will most certainly not fall on those receiving the data; the impact will be on those to whose lives the information relates.

The same false comparison keeps being drawn by proponents of the institutional need for data sharing – transparent in their envy of what the commercial sector ‘gets away with’. People seem happy to “give” their data to Sainsbury’s or Facebook, they say – so why not some public authority?

Of course, no state actor would ever act against an individual’s wishes or best interests

Being a public servant means serving all of the public, including those you casually write off as “teenagers” – not just those who happen to agree with you. Sainsbury’s understands the commercial equivalent of this. Stores could refuse to serve individuals who don’t use their ‘loyalty’ cards, but they don’t. They recognise and (by and large) claim to respect individuals’ choices about which transactions they make using the loyalty card, and which they don’t.

The fact that this comparison is drawn time and again – most often out of naive misconception rather than any deliberate intent to mislead – shows a worryingly blinkered lack of appreciation for the fact that a supermarket can’t evict you from your home. It can’t cut off the financial lifeline of your social security, attach your earnings, deny you medical treatment, restrict your movements, exercise powers of entry, or detain you at Her Majesty’s pleasure.

And were Tesco, for example, to do something to annoy you, there are a whole range of other supermarkets (ditto social networks, etc.) available – a choice that simply doesn’t apply in the public sector.

Misplaced priorities

For NHS England to claim that without the “sharing” of bulk personal datasets it won’t know which hospitals to close may prove to be an exceptionally risky strategy. But, as we have seen, NHS England’s priorities can be utterly unconnected to the wishes of local communities. The credulous assumption that “NHS England knows best” didn’t work out too well last time; it is unlikely to work much better on any other issue – especially those that are already publicly contentious.

Mass “sharing” – though a better word might be transfer, or traffic, or trade – of bulk personal datasets between bodies and organisations allows very little scope for individual choice (at least, not yet). And entrenched departmental and institutional egos are often unlikely to respect – or trust – each other anyway.

So when patients are handed from Hospital into Social Care, they may be assessed for which services they will need and when by the NHS. But when they are ‘received’ by the Social Care system, the first thing that happens is a re-evaluation – and often a large downgrade in support – because the Social Care process (which cares about £££) doesn’t trust the NHS process (which cares about care).

If egotistical fiefdoms already don’t trust each other’s judgement and already won’t talk to each other, what makes you think more data will help? It’ll just be more stuff that gets ignored whenever it’s not in the direct interests of whoever looks at it, and abused whenever that serves a(nother) purpose.

A problem of trust

One of the features of Gov.UK Verify – the Government’s approved ‘identity assurance’ scheme – is the concept of “attribute exchange”. If there were genuine trust in the system, when a registered medical provider had given an individual an attribute relating to disability – in essence a digital token, or certificate – the DWP would simply honour it.

Will it? Or will DWP insist that it must “revalidate” the person, at great time and expense (for the person, for DWP and ultimately for the taxpayer) but under their control? What about the local council trusting the NHS? Or even NHS bodies creating a basis to trust other NHS bodies?
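In engineering terms, ‘honouring an attribute’ rather than revalidating the person is simply verifying a signature on a claim. A minimal sketch of that principle – not Gov.UK Verify’s actual protocol, and with all names, keys and claims invented purely for illustration, assuming issuer and verifier share a signing key:

```python
import hashlib
import hmac
import json

# Hypothetical shared key between issuer (e.g. a medical provider) and
# verifier (e.g. DWP) -- an assumption for this sketch, not the real scheme.
SHARED_KEY = b"demo-key-not-for-production"

def issue_attribute(subject: str, attribute: str, key: bytes = SHARED_KEY) -> dict:
    """Issuer signs a bare claim; no underlying medical detail is included."""
    payload = json.dumps({"subject": subject, "attribute": attribute}, sort_keys=True)
    signature = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def honour_attribute(token: dict, key: bytes = SHARED_KEY) -> bool:
    """Verifier checks the signature and honours the claim as-is -- no re-assessment."""
    expected = hmac.new(key, token["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["signature"])

token = issue_attribute("citizen-123", "eligible-for-disability-support")
print(honour_attribute(token))  # → True: a valid token is simply honoured
```

The point of the sketch is what the verifier does *not* do: it never re-examines the person, and it never sees the medical record behind the claim – only the signed assertion itself.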

Until trust within and between the silos is discussed and resolved, departments and bodies will continue to hoard bulk personal datasets in their own narrow bureaucratic interests, rather than in the interest of the individual.

Culture-change doesn’t happen overnight. And it certainly doesn’t happen if what’s imposed from the top, and modelled by so-called leaders, is some of the worst possible behaviour. So, unfortunately, it would seem that the point at which all of the various bureaucracies are themselves respecting (and being trusted to respect) individuals and their data is probably quite a way off.

But bodies that want to establish their trustworthiness, and to help individuals, can do something very simple: don’t start with a data grab.

Less about data, more about quality

The quality of a hospital does not necessarily relate to the individual, detailed medical records of each patient. That may be how Dr Foster designed its business, but it certainly doesn’t have to be the case.

In a system that has integrity, the data that should be openly published is aggregated counts of volumes and outcomes at relevant points along a pathway or across an institution – measuring that which is important. There are many metrics that should be used to determine the quality of a hospital; the obsessive prioritisation of a single metric (as with political target-setting) leads inevitably to ‘gaming’ of the statistics or, even worse, bending the service out of shape.

If what must be published are multiple, diverse (data-driven, but aggregate) standards, then the easiest way to improve your standing – to change your metrics – is not to hire consultants to help you massage your statistics, but to actually provide better care.
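The aggregate-only approach is simple enough to sketch. Assuming some invented example records – nothing here corresponds to any real NHS dataset or system – counts per pathway stage and outcome can be published with small cells suppressed, so no patient-level row ever leaves the institution:

```python
from collections import Counter

# Invented example records, standing in for individual-level data that
# never leaves the institution.
records = [
    {"stage": "referral", "outcome": "seen"},
    {"stage": "referral", "outcome": "seen"},
    {"stage": "referral", "outcome": "did-not-attend"},
    {"stage": "surgery", "outcome": "recovered"},
    {"stage": "surgery", "outcome": "readmitted"},
]

def aggregate(records, small_number_threshold=1):
    """Count volumes and outcomes per pathway stage; suppress small cells
    to reduce the risk of re-identification in the published figures."""
    counts = Counter((r["stage"], r["outcome"]) for r in records)
    return {cell: n for cell, n in counts.items() if n > small_number_threshold}

print(aggregate(records))  # → {('referral', 'seen'): 2}
```

Only the aggregate dictionary is published; the `records` list – the individual-level data – stays where it was collected.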

Scaremongering and coercion

Telling people that if they opt out of your open-ended ‘secondary uses’ database, their direct care may be affected and they may not be called for vital screening is both dishonest and malicious; quite possibly an abuse of public office. It’s certainly scaremongering worthy of the worst kind of institutional bureaucracy.

That the million or more patients who opted out at the beginning of 2014 are being told mid-way through 2015 that their opt-outs can’t be honoured – because, applying the strictest possible interpretation of some technical wording few patients ever saw, this would break the promise that their care wouldn’t be affected – was entirely avoidable.

NHS England made and then failed to correct its own error (probably due to a failure to fully appreciate the role the Information Centre plays) then, even when that error was pointed out in late 2013, relaunched anyway and kept the problem hidden for the rest of the year. When eventually it needed a further excuse for having done nothing but keep the (hospital) data flowing, NHS England unceremoniously dumped the problem onto HSCIC in November 2014, and it continues to refuse to authorise or resource the practical solution which HSCIC proposed pretty much straight away.

This is not the way to ‘build trust’.

For that you first need to show you are trustworthy which, as Baroness Onora O’Neill has said, means demonstrating competence, honesty and reliability in all that you do.

A way forward?

If data can be shared, the criteria for services can also be clearly written down. Just because a citizen does not wish you to do everything with their data, that does not mean you should refuse to do anything.

There is no reason that services as a whole should be impacted by some people choosing to exercise their right to restrict the use of their sensitive data. This may mean some services have to evolve and not take the easy approach of “collect it all” for every bulk personal dataset they can imagine. But to minimise risk and take only what is absolutely needed is not only common sense: it’s the law. And it’s (your) right.

In a health context, any one individual refusing consent for their data to be “shared” will have an infinitesimal impact on how quickly new treatments are developed, and it should most certainly never affect the choices or treatments available for their care. Bullying patients into surrendering their data with implied threats is no way to build trust.

medConfidential agrees with Tim Berners-Lee that you should know everywhere your data has gone, and why. The research world recognises that the data it needs carries risks, and that these risks cannot be mitigated completely, so other steps must be taken – such as keeping all individual-level data in a safe setting, and reporting back to patients. Do public bodies like NHS England think the problems of data handling that others have to deal with aren’t equally present for them?

Or will the various silos continue to act like Gollum, hoarding and hissing “my preciousssss” over bulk personal datasets that don’t actually even belong to them? As this version of the story plays out, it is obsessing over the ring of data that drives Gollum insane…