As the House of Lords Select Committee on AI looks at health data, it is only 2 weeks shy of the 20th anniversary of the 1997 ‘Caldicott Review’. In retrospect, the Review understood the world that was then coming, and it still holds up well for the future that is coming now. It said:
“Increasing adherence to the principles will reassure patients and those treating them that confidentiality is safeguarded. Such progress should be monitored and appropriately identified, and individuals held to account wherever patient-identifiable data is present in the Service. We believe that the principles outlined here should also be applied to information identifiable to individual patients concerned with their clinical care, and medical research. It is clear that patients expect nothing less.”
20 years on, patients still expect nothing less.
The Review could have said one other thing – that data hygiene should be treated as a part of clinical hygiene, thereby integrating information governance into clinical governance. That would have avoided a great many problems over the last few years, but it may also have made the original creation of Caldicott Guardians effectively impossible. It is a step that was operationalised by Caldicott 3. While it could have led to a very much more Digital NHS today, it is all too easy to forget that hospital hygiene, under the oversight of clinical governance, has had problems from failed incentives around outsourcing, infected by the profit motive.
Today, the demand from profit-seeking technical startups is even greater, the desire to skirt the rules more intense, and modern startups push a “lobbyist viable product” onto a cash-strapped NHS – selling patients’ data as an asset when a parent company the patient may never have heard of inevitably gets bought.
WhatsApp’s sharing of phone numbers with Facebook would breach the rules around patient confidentiality; but then lobbyists went to see NHS England, which changed the rules and put the burden on each clinician to choose…
WhatsApp still shares its data with Facebook to offer its “patients you may know” feature, shadow profiles, and the other creepiness. Any officially sanctioned messenger could never do this, but NHS England doesn’t care – it has taken a problem off its desk for free, and dumped it onto every clinician in the country.
As NHS Digital takes egregious actions in the name of “burden reduction”, NHS England increases the burden yet further, because doing so reduces the burden on it from lobbyists wanting a change.
Messaging in the NHS remains a problem unsolved at scale – so the lobbyists now swarm, offering their solutions to similar but different problems. NHS England, which runs no hospitals, caved. Pagers work because they are used for only one thing – when they go ding, the doctor is needed for something at a level of urgency that has been triaged; but the doctor can also ignore it while dealing with something else. Doctor judgement is supreme over the tools – the pager is a busy doctor’s bullshit blocker – ignore that feature at your peril. Properly end-to-end encrypted messaging is not the hard part; it is the easiest of the tasks.
If NHS England commissioned a messaging system, institutionally it would abuse it the same way it does email, and destroy any benefits due to its own worst institutional micromanaging impulses – a problem non-existent in the pager world. But those who would change the system would include changes that distract clinicians, so the status quo continues. Such perverse incentives were well understood by Dame Fiona Caldicott when she was writing 20 years ago.
As a result, Caldicott Reviews 1, 2 and 3, are all still relevant. The history is still relevant for designing consent, and designing out unethical and harmful behaviours. That NHSE abdicates any political responsibility does not mean NHS Digital may do so when designing technical systems.
For example, the “GP at hand” service, a rebadged “Babylon Health” product invested in by DeepMind’s founders, says it has the right to sell the medical records it holds as an asset of the company when Babylon gets bought (which it must, in order to pay back its investors). This is a model that Caldicott 1 and GMC/BMA guidance have previously made clear is not appropriate. But as with the dodgy deal with the Royal Free, it is the NHS institutions left holding the bag as the company takes what it wants. Why did NHS England approve that service with those conditions?
Companies will change their rules for profit, and use their public relations machines to argue the NHS “harms patients” by walking away from a disturbing deal the companies only offer on a take-it-or-leave-it basis.
Such a business model may be fine for a profit-focussed business with no sense of public purpose or accountability to anything beyond its bottom line, but the AI companies claim a higher standard… and yet, they don’t:
Q52 Lord Swinfen: In your view, do investors have a duty to ensure that artificial intelligence is developed in an ethical and responsible way? If so, how should they do this? Should such development be regulated?
Eileen Burbidge: I thought this was an incredibly insightful question when I saw it on the papers for the session. The stark and objective answer, strictly speaking, is that I do not think investors have a duty to ensure ethical and moral behaviour. Most investors sign up to a code of conduct and are authorised persons by the FCA because they have fiduciary responsibilities as a first and foremost point. That is simply the objective current situation.
…
To be quite clear, our obligation to our investors is to generate as strong a financial difference as possible.
…
It will be more social pressure and market pressure.
If the companies also refuse any social or market pressure from the NHS, then they have learnt no lessons at all.
It was with that insight that Caldicott 1 laid down strong and sustainable guidance for the handling of patient data. Every failure of the last 20 years has come because that guidance was ignored.
Dame Fiona Caldicott was prescient in her first report. That 1997 report can be applied to AI and genomics – two ideas that would have been almost inconceivable when it was written – and we are confident it will apply to whatever comes after AI and genomics.
Let us also hope that there doesn’t need to be a fourth part in this trilogy…