
medConfidential comment on the Government’s response to the Caldicott 3 Review

medConfidential’s comment on the Written Ministerial Statement responding to the Caldicott 3 Review

While more details will emerge over the next several weeks, and given this is only a response to Dame Fiona Caldicott’s Review (and not any of the work by NHS England which depends upon it), medConfidential is in the first instance cautiously positive.

Original statement: http://www.parliament.uk/business/publications/written-questions-answers-statements/written-statement/Lords/2017-07-12/HLWS41/

In summary, the Statement says a number of things:

  • Patients will be offered a digital service through NHS.uk that will enable them to see how their medical records are used: both for direct care, and secondary uses beyond direct care.
  • Existing opt-outs preventing patients’ data being extracted from GP practices are protected until at least 2020.
  • There will be further consultations on the details of any changes.
  • Patients who have opted out will be written to about the Caldicott consent model when implementation is finalised (but before changes take effect).
  • NHS Improvement will begin to take cyber security into account, as CQC now does.

Reflecting the very strong response from front-line clinicians and technical staff to the WannaCry ransomware outbreak, the Statement is very strong on cyber-security. Whether the analogue administrators that caused so much unnecessary hassle during that event have learnt lessons will become clear, next time…

With the newly-digital DCMS about to launch the Data Protection Bill, will the Government actually deliver on its commitment to a Statutory National Data Guardian?

Phil Booth, Coordinator of medConfidential said:

“We welcome the clear commitment that patients will know how their medical records have been used, both for direct care and beyond. This commitment means that patients will have an evidence base to reassure them that their wishes have been honoured.

“Some of the details remain to be worked out, but there is a clear commitment from the Secretary of State. The focus on digital tools shows the benefit to the whole NHS of the work towards NHS.uk. It is now up to NHS Digital and NHS England to deliver.

“The wait for consensual, safe, and transparent data flows in the NHS is hopefully almost over, and then new data projects can move forwards to deliver benefits for patients and vital research. Today’s announcement is about fixing what NHS England had already broken. The perils of a National Data Lake may lie ahead, but we hope lessons have been learnt, so we don’t end up back here in another 4 years.”

Google now tries to blame doctors and Snapchat for its unlawful behaviour

Responding to Google’s claims that doctors “use” Snapchat to send photos for a second opinion, coordinator of medConfidential Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about this. The Report blames doctors for hygiene, and the hospital for its IT systems – everyone but Google. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship – something Google clearly fails to understand.”

If the assertions are based on evidence acquired in the Review, that should have been reported to CQC – unless there was a see-no-wrong, hear-no-wrong policy in place. Google provided no evidence that doctors actually do this, just that they could install an app. They could also use any Google messaging tool (except that no one uses any of them). We fully expect DeepMind will “surprisingly” come out with a messaging app for doctors, which will be no better than email, and so solve none of the widely understood problems that mean fax machines are still useful.

Doctors are responsible for safely caring for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.

We’re mostly surprised that Google didn’t use this to kick Facebook; but perhaps they didn’t want to criticise another member of the Partnership on AI…

Original press release here: https://medconfidential.org/2017/medconfidential-initial-comment-on-the-google-deepmind-independent-reviewers-report/

medConfidential initial comment on the Google DeepMind Independent Reviewers’ report

UPDATE 2pm: responding to Google’s claims that doctors use secure messaging to send photos, Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about it. The report blames doctors for hygiene, and the hospital for its IT systems. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship.”

Doctors care for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.


The report answers none of the obvious questions that a supposedly independent Review of unlawful data copying should have answered.  

The ICO confirmed on Monday that DeepMind Health’s deal with the Royal Free had broken the Data Protection Act in at least 4 ways [1], and they have been given weeks to fix it. There is now a formal undertaking in place for correction of their project’s ongoing breaches of the Data Protection Act [2]. As of this week, DeepMind remains in clear breach of UK privacy laws. (page 7)

The National Data Guardian’s letter, referred to by the Review, shows clearly that DeepMind were aware of the unlawful nature of their processing last December [3], and the Review suggests they chose to do nothing about it.

In addressing “law, regulation and data governance”, the Reviewers say “We believe that there must be a mechanism that allows effective testing without compromising confidential patient information” (page 9, right column). Such mechanisms already exist, as many people agree – DeepMind just didn’t use any of them. It is unclear why the “Independent Reviewers” feel this is anyone but Google’s problem. (Here’s the sandbox for Cerner – which the Royal Free uses.)

If, as Prof John Naughton analogises, the Royal Free’s response to the ICO decision was “like a burglar claiming credit for cooperating with the cops and expressing gratitude for their advice on how to break-and-enter legally”, this report is DeepMind saying “It wasn’t me! Ask my mum…” thinking that’s an alibi.

DeepMind accepts no responsibility [4], and its Reviewers seem happy with that. Which, given DeepMind’s broad AI ambitions, should frankly be terrifying…

Responding to the Review, medConfidential Coordinator Phil Booth said:

“If Page 7 (right column) is accurate in its description of record handling at the Royal Free, then CQC must conduct an urgent inspection of data hygiene at the hospital; or was this just “independent” hyperbole to make Google look good?”

“The Reviewers’ way of not criticising DeepMind is to avoid looking at anything DeepMind did wrong. The Reviewers may think “this is fine”, but anyone outside the Google bunker can see that something has gone catastrophically wrong with this project.”

“Google DeepMind continues to receive excessive amounts of data in breach of four principles of the Data Protection Act, and the Independent Reviewers didn’t think this worth a mention. DeepMind did something solely because they thought it might be a good idea, ignorant of the law, and are now incapable of admitting that this project has unresolvable flaws. The ICO has forced both parties to fix, within weeks, problems they had ignored for approaching two years.

“DeepMind Health needs real senior management with experience of caring for patients, i.e. a Regulated Medical Professional as a Chief Medical Officer. The second paragraph on the inside front cover (which isn’t even a numbered page in the printed document, but page 2 in the PDF) shows how badly they have failed from the start.”

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on 07974 230 839 or coordinator@medconfidential.org

 

Notes to editors:

  1. Information Commissioner’s Office summary of their finding https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/07/royal-free-google-deepmind-trial-failed-to-comply-with-data-protection-law/
  2. The ICO requires that the Royal Free and DeepMind take actions within a month of the undertaking’s issuance – page 7. https://ico.org.uk/media/action-weve-taken/undertakings/2014352/royal-free-undertaking-03072017.pdf Many of these issues were highlighted to DeepMind by medConfidential last year, and have been repeatedly and systematically ignored.
  3. Sky News reported in May that the unlawful nature of the DeepMind data processing was first formally brought to the Royal Free & DeepMind’s attention in December 2016 by the National Data Guardian. http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142 Paragraph 4 of the letter from the National Data Guardian to the hospital clearly shows that they were first formally notified of their legal failings in December.
  4. Details of medConfidential’s complaint are available here:
  5. This complaint has now been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include [all emphasis added]:

    a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

    i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]
    ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]
    iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]
    iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

    b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

    c) Professor Hugh Montgomery, (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

    i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]
    ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”
    iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”
    iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

    d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

    …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

    e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

    i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”
    ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”


  6. DeepMind’s response to the ICO finding has been to blame everyone but themselves. As they begin to regularly refresh part of their Review board, perhaps Sean Spicer will be available to help.

 

-ends-

medConfidential Bulletin, 30th June 2017

So, we have a new Government (after a fashion). And, whatever else, there’s some continuity at the Department of Health…

Given this continuity, the completely unambiguous Conservative Manifesto commitment, and cross-party support for the National Data Guardian, it was a bit disappointing that a statutory footing for the NDG was absent from the Queen’s Speech.

We can’t help but note – with a Data Protection Bill on its way, arbitrary data-sharing powers available in the Digital Economy Act, and Theresa May threatening to roll back human rights – that it is protections such as these that underpin the privacy of all our medical records.


What just happened?

The election put a lot on hold, but you may remember a dodgy deal with the Royal Free Hospital that got Google DeepMind into a spot of trouble with the ICO and National Data Guardian when we complained about it.

The NDG’s formal view came out during the election period, and we await the ICO’s ruling – due any day now. We are therefore entirely unsurprised that DeepMind’s “Independent” Reviewers’ report is also delayed. One might question “independence” when a whitewash coincidentally comes out a day after the regulator’s critique…

What’s happening next?

We don’t comment on every press release about Google DeepMind’s future projects – their PR flacks cost many times our annual budget. But last week’s announcement that its next project will be to provide a hospital IT system for Taunton is worthy of some attention; the relevant detail is at the bottom of page 2 of this document.

Of course companies provide the NHS with IT systems – GPs and hospitals buy in systems all the time. But accepting ‘gift horses’ from aggressively data-seeking US info corps already known for not playing by the rules may not necessarily be wise. For one thing, as many have learned, if you’re not a paying customer you tend to end up being the product.

If, however, the decision is that the people of Taunton are most in need of better infrastructure – NHS England certainly felt they were, this area being one of the ‘pathfinders’ for the cancelled care.data scheme (more on its successor below) – then starting in Somerset is as good a place as any.

But this doesn’t mean you can ignore the regulatory implications. Or future cost.

As recently as January, DeepMind assured Regulators that its tools were not used for clinical decision making, yet in June it signed contracts to run a hospital using them. To be used in direct care, the central IT system of a hospital is a closely regulated system – these are, after all, the systems that run Intensive Care – although Google, chasing profits rather than patients, probably won’t choose to help those in most acute need.

Has Google started the Regulatory process to run that system, or is it trying ‘deployment via press release’? Does it want DeepMind to mark its own homework too?

The only way for patients to know if their data was used in such a programme is for everyone to know where, when and why their medical records have been accessed. Google says it won’t use patients’ data for other purposes; our concern is that minds change. After all, the company said it wouldn’t start building this system for 3 years – that was 7 months ago.

For as long as DeepMind Health is led by an entrepreneur – and has no Chief Medical Officer who is bound by the Hippocratic Oath – its position can change, purely for business reasons. Its corporate officers may stand on stage and say they won’t, but they say many things which they change their minds about. One can be an AI visionary, or run a health infrastructure service – but people have every right to be nervous when you try to do both, especially if you claim you aren’t doing so.

It is inevitable that the future model for this service will be ‘AI assistants’ offering hints and references to doctors via the Streams app; the principle of A&E triage, applied hospital-wide.

This being the case, if these AI systems are modular and compartmentalised for the delivery of care, then they can each be regulated separately. If, however, the individual systems are not interoperable and transparent, then the entire infrastructure must be regulated tightly. (Research, i.e. the development of such systems – including the justification, with evidence, of what data they actually need – is already regulated, by MHRA and other bodies.)

Until the situation is clear, questions as to whether DeepMind’s approach to Regulators is the same as Uber’s (they do, after all, share investors) will remain.

We should point out, as DeepMind buried it in the small print, that no money is changing hands here – and neither party is obligated to do anything. This may yet be just another Silicon Valley startup (the TV show, that is – not the place) that puts out a stream of press releases, delivering for investors over patients.

 

What’s happening where you live? And what can you do?

Wherever you live in England, there are changes coming to your local NHS.

The ever-so-subtly again renamed STPs (now “Sustainability and Transformation Partnerships”, not just Plans) and their further regional reorganisation – over “several years” – into Kaiser Permanente-style Accountable Care Organisations represent the Government’s and NHS England’s view of the future.

Bearing in mind the massive democratic deficit in the NHS, will accountability be to patients or to the analogue administrators?

Given that – most of the time at least – care records follow patients, one of the best ways to see how the NHS works is to look at the data trail that you leave behind you.

So if you have a login for your GP practice’s website, we encourage you to look at the letters that have been scanned into your record, and to simply count the logos. (If you don’t already have a login for online access, here’s how to get one.) Then, as your NHS changes over the next few years, keep count; over time do you see more commercial logos, or fewer?

While you’re at it, you might also want to check who’s accessed your GP record. Both EMIS and TPP have now switched on basic access to your GP record’s ‘audit trail’ – and as more and more people use it, this vital transparency feature should improve over time.

Things are clearly going to stay busy for a good while yet. Four years in, medConfidential exists entirely through your donations and the generosity of the Joseph Rowntree Reform Trust, to whom we are applying for a further grant. We appreciate all donations – and your support helps with other funding.

 

A digital strategy for the NHS: remember Martha’s Talisman

“Apply the following test. Recall the face of the poorest and the weakest, the most digitally-disengaged patient whom you may have seen, and ask yourself if the step you contemplate is going to be of any use to them? Will they gain anything by it? Will it restore them to a control over their own life and destiny? Will they have the information to make an informed decision?”

– with apologies to Gandhi and Martha Lane-Fox

Any strategy for a Digital NHS must account for the furthest first. And, while addressing their needs, it must also recognise the circumstances and humanity of all those whom the data is about, via user research. Wanting to help people is not the same as actually helping them – as recent NHS strategies have demonstrated.

An effective strategy must be short enough that people can both remember what it is, and hold it in their mind while thinking about the challenge in front of them. A 200-page PDF is not only indigestible, it is undeliverable; our attempt above is at a strategy people could remember.

What follows are guidelines on how not to misapply it.

Strategy

The handling of medical records must be underpinned by accountability – whether “handling” means digital services used by clinicians, by patients, or for secondary uses. If built on a basis of pervasive transparency on all data flows, flawed decisions can be identified and corrected, and progress made within an environment characterised by evidence rather than promises.

Some strands of the Five Year Forward View are mired in secrecy and political choices, which – while any one decision may work out well (or otherwise) for patients – is an unsustainable basis for the long-term, effective and efficient delivery of public health and care services at national scale.

High quality digital services are built with humility, by learning from the real world, with meaningful involvement in the process by patients and clinicians – and others who also contribute, e.g. researchers, administrators, and commercial providers.

There may well be an extremely narrow case for sharing a patient’s entire clinical treatment history with the NHS.UK website backend in order to personalise the front page of that website on an initial visit, but the harm of doing so without fully-informed choice and consent is far greater than the harm of not having that feature at all. And with every such decision arises the opportunity cost of those things (whether treatment or prevention) that will not subsequently be possible, due to the impact of such flawed priorities, and/or patient fears.

Only the NHS

Only the NHS connects people through their lives from cradle to grave – and can therefore tell people how they contributed to research, even long after the event.

Unlike, for example, shonky ‘public-private’ initiatives, hiding behind the NHS ‘brand name’, set up to profit from a ‘Bonfire of the Faxes’, the NHS proper doesn’t bodge it and scarper, leaving others to clean up its messes. It is the NHS that cleans up the messes created by others; thousands upon thousands of true public servants caring for people under their shared and lived understanding of the Hippocratic Oath: Do No Harm.

In the digital world, there is a Talisman that can direct every significant choice. It will not stop post-rationalisation or self-justification of pre-conceived ideas – that outcome is outwith any strategy, lying as it does in the hearts and minds of the strategists themselves. But if the Talisman helps, and is respected and used as a touchstone across the entire system, then it should stop incorrect ideas before they can go wrong at scale, and also encourage good work to flourish.

For if nothing else, this must be a fundamental goal of any (digital) strategy: to support and encourage positive innovation in care and prevention, while not killing people through ignorance, oversight or ideology.

Decision making by the Information Commissioner

The Information Commissioner’s Office operates on legal realities, i.e. “What is currently the case?”. This explains why the ICO may enforce at one minute past midnight on the day a programme comes into force, but not before. It can be infuriating, but that is what a regulator is empowered to do.

“Being legal” is a binary state – something is either legal or it isn’t.

If there is one way in which a situation or scheme or system is legal, and no ways in which it is illegal, even if there are many ways in which it is really creepy, it’s still legal. This is often infuriating in the private sector, but in the public sector there is a very different environment – because, most of the time, public sector bodies don’t get to operate in ‘stealth mode’. In the private sector, the ICO by and large regulates against dishonesty rather than for good data hygiene. The public sector is held to a higher standard.

Either way, before 00:01 on the first day of operation, the ICO operates only on scenarios, or possibilities.

You can in fact put a scenario to the ICO and, while its officials don’t necessarily like hypotheticals, they will offer an opinion based on what you have said.

What most people fail to understand is that ICO decisions are based exclusively upon the scenario (or evidence) as presented to it.

If you tell the ICO that you will do X, and its officials suggest that X is most likely legal, then that opinion will simply not apply if at 9:12 am on the second Thursday of the following month it turns out you instead do X plus Y; that is a different scenario.

Clearly, if you miss out critical information from the scenarios you present, then the ICO’s opinion cannot and does not reflect what you are actually doing; it only reflects what you say you are doing. Remember, the ICO operates on reality – which is why it can only enforce from 00:01 on the first day of operation.

Where the ICO issues “contradictory advice”, it is almost always because the information it was presented with changed.

In a hypothetical scenario, when the scenario changes, the ICO reserves the right to change its mind. What else would it do?

If ICO officials “change their minds” when presented with what is ostensibly “the same” information, it likely demonstrates that – in the ICO’s opinion – material information was omitted the first time.

For example, care.data’s communications programme collapsed because what NHS England told the ICO turned out to be incomplete – when other information was added, and checked against reality, what NHS England said it would do, and what it actually did, were shown to be materially different.

If you want to understand why the ICO changes its mind, the best place to start is with what you didn’t tell its officials, that someone else did.

A first look at the Manifestos

For the party manifestos, medConfidential had a single request:

Will patients know how their medical records are used?

How did the parties respond? (Remembering that the Conservatives are in Government, so should have more detail than the opposition parties.)


Conservative Manifesto

Quite a bit of good news, if the currently most likely next Government remembers what it said:

“We will put the National Data Guardian for Health and Social Care on a statutory footing to ensure data security standards are properly enforced.” (p80)

The NDG’s statutory footing should be based on Jo Churchill’s Bill (our view) which was published before the election. While the Government didn’t enable the Bill to go to Committee, putting it as a Part in the forthcoming Data Protection Bill (mentioned elsewhere in the manifesto) should not be controversial. Allowing the Data Guardian to ‘follow the data’ means that public health copies of NHS data are also covered, and therefore can be properly consented.

 

“We will give people new rights to ensure they are in control of their own data” (p79)

It’s impossible to control what you don’t see – so a citizen’s view of Government data use (or a patient’s view of the uses of their medical records, or a customer’s view of commercial data use) is a prerequisite for control.

Whether “control” means taking back control from those who would copy data “for the greater good” in secret, e.g. for “decommissioning”, or whether there will simply be transparency and accountability over where data is copied, it will be hard for anyone to argue that this line does not commit the Government to a single overarching opt-out from secondary uses of medical records – in line with Caldicott 3.

 

“to ensure the very best standards for the safe, flexible and dynamic use of data and enshrining our global leadership in the ethical and proportionate regulation of data” (p80)

While this isn’t quite consensual, safe, and transparent, it is a beginning. However, with the Data Controller in Chief believing there is no data use that could not be ‘proportionate’ – on the tautological basis that if it is being used, then it must be proportionate – this will likely lead to controversy. The scale of problems will be determined by the level of secrecy we refer to in our previous paragraph: will there be secrets?

We acknowledge that this is, however, an improvement over the current state of affairs – having the conversation is far better than not having it at all.

 

“To create a sound ethical framework for how data is used, we will institute an expert Data Use and Ethics Commission to advise regulators and parliament on the nature of data use and how best to prevent its abuse.” (p79)

While this may sound good in theory, in practice – as we’ve seen with Google DeepMind – such advisors often end up acting as a rubber stamp for deniable practices. That is, when they’re not ignored entirely. Whether this Commission will have teeth, or will have failings similar to those of the various other bodies created recently, will depend on the details.

We look forward to the consultations…

 

“…we shall roll out Verify, so that people can identify themselves on all government online services by 2020, using their own secure data that is not held by government. We will also make this platform more widely available, so that people can safely verify their identity to access non-government services such as banking. We will set out a strategy to rationalise the use of personal data within government, reducing data duplication across all systems, so that we automatically comply with the ‘Once-Only’ principle in central government services by 2022 and wider public services by 2025.”

Good. The Verify infrastructure and principles can be used to deliver consensual, safe, and transparent digital services – whether in the NHS, across Government, or beyond.

Alongside the commitment to safety, this suggests that the privacy protections of Verify can be used to solve the design failures of the pornography rules in the Digital Economy Act – although we don’t expect Verify to be renamed ‘PornID’ any time soon!

If the controversial proposal for showing ID at a polling station is shown to be necessary, Verify offers a digital mechanism for a non-centralised form of validatable ID, including full “same-day” voter registration, using only a mobile phone (including a pre-paid mobile phone, which can be used to create a Verify account, and then the credential to vote), for free, for everyone. This would be an improvement over the status quo.

The explicit rejection of “sweeping, authoritarian measures” such as the failed Home Office ID scheme is missing, but a wider rollout of Verify – along with services offered in G-Cloud 9, resulting from a privacy discussion with the DG of HMPO – should make any return to ID cards not only unnecessary, but shown to be motivated by other desires. (There’s also no reference to the 53 million genomes project – but, given the delays in the 100,000 genomes project, and the problems with that approach in the delivery of health care, that shouldn’t be a surprise.)

Especially around Verify, but also given the response to wider events, recent weeks have shown the failures of the current digital leadership in Government. Whether digital transformation will cease to come from Government, and instead again come to Government, remains unclear.

 

Will citizens, will patients, will customers, will users know what these changes mean for them in practice? Will they know how their data is used?

It’s all too easy to forget the human details when you’re working on “great challenges”. Which goes for everyone, at every level, however they claim to represent others. This manifesto (as do the others) contains many fine words, but aspirations aren’t actions. Promises must be delivered, and be seen to be delivered. And those who make decisions based on our data, and about our lives, must and will be held to account – by the people affected by those decisions.


Labour manifesto

Without access to the civil service, it’s hard for opposition parties to have details on unannounced Government policy – much of the Conservative manifesto quoted above is a delivery of existing work.

“Labour is committed to growing the digital economy and ensuring that trade agreements do not impede cross-border data flows, whilst maintaining strong data protection rules to protect personal privacy.”

That statement leaves very little space between Labour and the Conservatives on this topic.


Liberal Democrats

Despite lots of detail on many things, there is no clear policy from the Lib Dems on consent and data privacy, although in a section entitled “Defend Rights, Promote Justice and Equalities”, it says:

“As liberals, we must have an effective security policy which is also accountable, community and evidence-based, and does not unduly restrict personal liberty.”

This is the closest that we get to data. However, since this applies in the secret part of Government, it must also apply in the non-secret parts.


The Green Party & UKIP manifestos haven’t been published as of the time of writing.

medConfidential response to “technology company DeepMind” Press Release

For immediate release – Tuesday 28 February 2017

One year after first telling the public that “technology company DeepMind” [1] was going to help the NHS, it is still unclear whether Google’s duplicitous offer still includes forcing the NHS to hand over the medical history of every patient who has visited the hospital. [2]

It is no surprise that digital tools help patients, but is Google still forcing the NHS to pay with its patients’ most private data?

As the NHS reorganises itself again with the Secret Transformation Plans, [3] NHS England plans a ‘National Data Lake’ for all patient data [4] – of which this is one. In defending giving data on all its patients to Google, the Royal Free’s Chief Executive, David Sloman, said “it is quite normal to have data lying in storage”. [5]

Tomorrow the Government announces the UK’s new digital strategy, [6] including new money for the Artificial Intelligence in which DeepMind specialises. Is copying of data on a whim what the future holds?

Clause 31 of the Digital Economy Bill suggests precisely that [7] – data can be ‘shared’ (copied) to anyone associated with a public or NHS body [8] who can justify it as “quite normal to have data lying in storage”.

As Downing Street takes the Trump approach to health data, [9] does Google now say the ends justify the means?

Phil Booth, coordinator of medConfidential said:

“So toxic is the project that the latest press release doesn’t even use the word “Google”.

“It is good that 11 patients a day get faster care due to this tool; but Google will still not say why they wanted data on thousands of patients who visit the hospital daily.

“Until patients can see where their medical records have gone, companies will continue to predate upon the NHS to extract its most important resources.”

Notes to Editors

1) This is how Google’s wholly-owned subsidiary, DeepMind – based in the Google offices in London – was misleadingly described in this press release published by the Royal Free: https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/

2) ‘Google handed patients’ files without permission: Up to 1.6 million records – including names and medical history – passed on in NHS deal with web giant’, Daily Mail, 3/5/16: http://www.dailymail.co.uk/news/article-3571433/Google-s-artificial-intelligence-access-private-medical-records-1-6million-NHS-patients-five-years-agreed-data-sharing-deal.html

3) Hospital cuts planned in most of England: http://www.bbc.co.uk/news/health-39031546

4) medConfidential comments on NHS England’s National Data Lake: https://medconfidential.org/2017/fishing-in-the-national-data-lake/

5) The Government confirms that the bulk data copied by DeepMind, i.e. SUS, “are maintained for secondary uses” and not direct care: http://www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Lords/2016-12-07/HL3943

6) Due to launch on Wednesday, being now pre-briefed by the Minister: https://twitter.com/MattHancockMP/status/835835027611127809

7) Clause 31 of the Digital Economy Bill as currently drafted would allow any provider of a service to a public body (such as Google to the NHS) to share data with (i.e. provide a copy to) any other provider.

8) While the Draft Regulations for Clause 31 state that Department of Health bodies are excluded from the Clause, medConfidential has received confirmation that such bodies will be included in the final regulations after Parliament has considered the Clause without health included.

9) The NHS is being forced to release the names and addresses of vulnerable patients to the Home Office: http://buzzfeed.com/jamesball/trumping-donald-trump

Questions that remain unanswered from May 2016 include:

  • What was the basis for Google to get 5 years of secondary uses data on every patient who visits the hospital? Google is getting thousands of people’s data per day, yet the hospital admits it is helping only a small fraction of them.
  • Why did the app not simply access the data it could clinically justify, when it needed to display it? That would have provided all the benefits of the app to patients and clinicians, and not given Google the medical records of patients which it had no justification for receiving. Did Google even talk to the hospital’s IT provider about access to only the data it needed before demanding all the data the hospital held?

medConfidential made a complaint to the ICO and National Data Guardian about the project in June 2016. Google and the Royal Free Hospital have yet to provide satisfactory answers, and we understand the investigation remains ongoing.

-ends-

Digital Economy Bill: Part 5, Chapter 1, clause 30 and Part 5, Chapter 2 from a health data perspective

medConfidential asks Peers to:

  • Express support for Baroness Finlay’s amendment on Part 5 (NC213A-D)
  • Express support for either amendment to Part 5 Chapter 2 (Clause 39)
  • Oppose current Clause 30 of Part 5 in Committee and on Report

We attach a briefing, with a more detailed consideration of these points, but in summary:

In 2009, the then Government removed clause 30’s direct predecessor – clause 152 of the Coroners and Justice Bill – because the single safeguard offered then was ineffective. Bringing that back, this Government has not only excluded important aspects of Parliamentary scrutiny, it is trying to introduce “almost untrammeled powers” (para 21), that would “very significantly broaden the scope for the sharing of information” (para 4) without transparency, and with barely any accountability. The policy intent is clear:

“the data-related work will be part of wider reforms set out in the Digital Economy Bill. [GDS Director General Kevin] Cunnington said as an example, that both DWP and the NHS have large databases of citizen records, and that “we really need to be able to match those”. (interview)

While there is a broad prohibition on the use of data from health and social care for research further down on the face of this Bill, in Chapter 5, the approach taken in clause 30 is very different, and contains no such prohibition. Regulations (currently in draft) published under clause 36 simply omit the Secretary of State for Health from the list of Ministers – thereby excluding NHS bodies themselves, but not the copies of health data that others can require to be provided. This is another fatal flaw in clause 30.

medConfidential is deeply concerned that Chapter 2 of Part 5 contains no safeguards against bulk copying. We accept the case for a power to disclose civil registration information on an individually consented basis – a citizen should be able to ask the registrar to inform other bodies of the registration – but, just as clause 30 contains insufficient safeguards and is designed to enable bulk copying, so is Chapter 2. One of the amendments laid to Part 5 Chapter 2 should be accepted.

Governments have had since 2009 to solve the problems that clause 30 not only leaves unaddressed, but exacerbates. The Government should either heavily amend Clause 30 at Report stage, or ensure it is removed before Third Reading. This clause is a breeding ground for disaster and a further collapse in public trust, and it simply doesn’t have to happen.

While medConfidential is open to legislation that treats sensitive and confidential personal data in a consensual, safe and transparent manner, this legislation does not. Despite more than 2 years of conversations about accessing data through systems that respect citizens and departments (i.e. data subjects and data controllers) and the promises they make to each other, the Cabinet Office instead took a clause from 2009 off the shelf, and has been actively misleading about the process.


Briefing for Committee stage

Your hospital data is still being sold – and here’s why it matters

Every flow of health data should be consensual, safe, and transparent. The Wellcome Trust found that up to 39% of people would have concerns about the use of their hospital data (page 92). Those concerns are well founded, and the safeguards currently insufficient.

NHS Digital says that the “pseudonymised Hospital Episode Statistics” of each man, woman and child in the country are not “personal confidential information”, and so your opt-outs don’t apply. But the Hospital Episode Statistics are not “statistics” in any normal sense. They are raw data: the medical history of every hospital patient in England over the last 28 years, linked by an individual identifier (the pseudonym). This article explains what that means, and why it is important.

To understand the risk that NHS Digital’s decision puts you in, it is necessary to see how your medical records are collected, and what can be done with them when they have been collated.

A proper analogy is not to the theft of your credit card number, which your bank can easily fix if it is compromised, but to the publication of your entire transaction history. Your entire medical history cannot be anonymised; it is deeply private, and it is identifiable.

 

How do your treatments get processed?

Each hospital event creates a record in a database. Some large treatments create a single record (e.g. hip replacement); some smaller routine events create multiple records (e.g. test results).

The individual event may be recorded using a code, but the description of what each code means is readable online. As Google DeepMind asserted, this data is sufficient to build a hospital records system (we argued that they shouldn’t have built one; we agreed it was possible).
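To make that concrete, here is what a single coded event might look like. This is a hedged sketch only: the field names, codes and values are illustrative inventions, not the real HES schema.

    # Illustrative only: hypothetical field names and codes, not the real HES
    # schema. The public code tables (OPCS-4, ICD-10) are readable online.
    OPCS4 = {"W37.1": "Primary total prosthetic replacement of hip joint using cement"}

    event = {
        "pseudonym": "a3f9c2...",  # linking identifier, standing in for the NHS number
        "provider": "RXX",         # hospital trust code (dummy)
        "admidate": "2016-03-14",  # admission date
        "operation": "W37.1",      # coded procedure
    }

    print(OPCS4[event["operation"]])  # one lookup turns the code into plain English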

As for how millions of those single events get put together, here’s a screenshot of the commercial product “HALO Patient Analyser”, sold by a company called OmegaSolver, which uses the linking identifier (the pseudonym) to do just that:

OmegaSolver HALO Patient Analyser screengrab

 

The identifier links your records, and that’s the problem.

While a stolen credit card number might sell online for $1, a stolen medical history goes for more like $100.

The loss of a medical record is very different to losing a credit card. If your credit card is stolen, your bank can make you financially whole again, and give you a new credit card. A month later, the implications are minimal, and your credit history is clear. But if someone gets hold of information about your medical history, that knowledge cannot be cancelled and replaced – you can’t change the dates of birth of your children, and denial of a medical event can have serious health implications.

The Department of Health is correct that the identifier used to link all of an individual patient’s data together – the pseudonym, which you could equate to a credit card number – is effectively “unbreakable”, in the sense that it won’t reveal the NHS number from which it is derived. No one credible has ever argued otherwise. You cannot readily identify someone from their credit card number.
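For illustration, such a one-way pseudonym could be derived with a keyed hash. This is an assumption on our part – NHS Digital does not publish its actual mechanism – but it shows why the derivation is irreversible:

    import hashlib
    import hmac

    SECRET_KEY = b"held-only-by-the-pseudonymisation-service"  # assumed; never published

    def pseudonym(nhs_number: str) -> str:
        # Keyed one-way hash: the same NHS number always yields the same
        # pseudonym, but the pseudonym cannot be reversed to the NHS number.
        return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

    print(pseudonym("943 476 5919"))  # a dummy number in the right format

Note that the same input always produces the same output: that repetition, not any weakness in the hash, is what makes the records linkable.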

But that misses the point: there are plenty of ways to identify an individual other than their NHS number. This is not a new point, but it has never been addressed by NHS Digital or the Department of Health; in fact, they repeatedly ignore it. It was medConfidential that redacted the dates from the graphic above, not the company that published it on its website.

Whenever we talk to NHS Digital or the Department of Health, they repeatedly argue that their use of pseudonyms as linking identifiers keeps medical information safe, because it hides one of the most obvious ways to identify someone, i.e. their NHS number. We don’t disagree, and we agree that making the pseudonym as unbreakable as possible is a good idea. But this utterly fails to address the fact that it is the very use of linking identifiers that makes it possible to retrieve a person’s entire hospital history from a single event that can be identified.

Focussing narrowly on the risk that the linking identifier could be “cracked” to reveal someone’s NHS number misses the far more serious risk that if any one of the events using that pseudonym is identified, the pseudonym itself is the key to reading all the other events – precisely as it is designed to be. That multiple events are linked by the same pseudonym introduces the risk that someone could be identified by patterns of events as well as details of one single event.

In the same way that you cannot guess someone’s identity from their phone number alone, you won’t be able to guess someone’s identity from their linking identifier. But, just as with your partner’s phone bill, you could probably figure out who some of the numbers belong to from your knowledge of the person, plus call patterns and timings. And once you’ve identified someone’s number, you can then look at the other calls that were made…

Hospital Episode Statistics (HES) provides all that sort of information – and allows the same inferences – for the medical history of any patient who has been treated in an NHS hospital, about whom you know some information: information that may be readily accessible online, from public records, or from things people broadcast themselves on social media.
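Sketched in code – against hypothetical records shaped like the earlier example – the inference described above works like this; no cryptography is attacked at any point:

    def full_history(events, known_date, known_provider, known_code):
        # Step 1: find the one event you can match from outside knowledge
        # (a news story, a tweet, a birthday).
        matches = [e for e in events
                   if e["admidate"] == known_date
                   and e["provider"] == known_provider
                   and e["operation"] == known_code]
        if len(matches) != 1:
            return None  # ambiguous: use more outside knowledge to exclude
        # Step 2: that single event's pseudonym is, by design, the key to
        # every other event the same patient has ever had.
        key = matches[0]["pseudonym"]
        return [e for e in events if e["pseudonym"] == key]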

In the event of an accident that leads to HES being ‘published’, this is what NHS Digital says “could happen” – allowing people who know, or decide to find out something about you, to identify your medical history. This is how, in the event that one thing goes wrong, the dominoes destroy your medical privacy and (not coincidentally) the medical privacy of those directly connected with you.

Returning to the example of the phone bill – from a call history, you could infer your partner is having an affair, without knowing any details beyond what’s itemised on the phone bill.

Linking identifiers are necessary to make medical information useful for all sorts of purposes but, for reasons that should now be obvious, they cannot be made safe. That is why safe settings and opt outs are vital to delivering usable data with public confidence.

 

With 1.5 billion events to search through, what does this mean in practice?

Health events, or accidents, can happen to anyone, and the risk of most people being individually targeted by someone unknown is generally low – a risk the majority may be prepared to take for the benefit of science, given safeguards. But while it may be fair to ask people to make this tradeoff, it is neither fair nor safe to require them to make it.

As an exercise, look in your local newspaper (or the news section of the website that used to be your local newspaper) and see what stories involve a person being treated in hospital. What details are given for them? Why were they there?  Have you, or has anyone you know, been in a similar situation?

The annex to the Partridge Review gives one good example, but here are several others:

  • Every seven minutes, someone has a heart attack. That is around 205 heart attacks per day, spread across 235 A&E departments. If you know the date of someone’s heart attack (not something normally kept secret), the hospital they went to, and maybe something else about them, then using the Hospital Episode Statistics their entire history would be identifiable on sheer averages – the sketch after this list runs the numbers.
  • If a woman has three children, that is 3 identifiable dates on which medical events (most likely) occurred in a hospital. Running the numbers on births per day, 3 dates will give you a unique identifier for the person you know. Are your children’s birthdays secret?
  • If misfortune befalls someone and information about an incident that affects their health ends up in the public domain (e.g. a serious traffic accident), or the person is in the public eye, or has a public profile and publicly thanks the NHS for recent care (Twitter): how many events of that kind happen to that kind of person per day? The numbers are low, and the risk is very high.
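A back-of-envelope check of the first two examples, using the figures above plus round assumed rates (not official statistics):

    # Heart attacks: "every seven minutes", spread across 235 A&E departments.
    heart_attacks_per_day = 24 * 60 / 7            # ~205.7
    ae_departments = 235
    print(heart_attacks_per_day / ae_departments)  # ~0.88: under one per A&E
    # per day, so date + hospital alone is usually already a unique match.

    # Births: assume roughly 1,900 per day across England (a round figure).
    births_per_day = 1900
    # One known delivery date narrows the field to ~1,900 pseudonyms; the
    # chance that a *different* mother also matches the other two known dates:
    print((1 / births_per_day) ** 2)               # ~2.8e-07, i.e. three dates
    # are effectively a unique identifier.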

More information simply makes things more certain, and you can exclude coincidences based on other information – heart attacks aren’t evenly distributed around the country, for example, and each event contains other details. Even if you don’t know which of several heart attack patients was the person you know, it’s likely that you have some other information about the person, their location, their medical history, or other events, that can be used to exclude the false matches.

It only takes one incident to unlock your entire hospital history. All that protects those incidents is a contract with the recipients of HES to say they will not screw up, and the existence of that contract is accepted by the Information Commissioner’s Office as being compliant with its “anonymisation code of practice”, because the data is defined as being “anonymous in context”.

This may or may not be true, but relying on hundreds of companies never to screw up is unwise – we know they do.

All this goes to explain why the Secretary of State promised that those who are not reassured could opt out: