
What is NHS England’s National Data Lake?

A key metric for care, especially complex care, is how patients feel about it. An extremely expensive drug may delay an outcome, but that measures only money and time, not quality of life – which is difficult to quantify. The way the NHS usually does that is via a “Patient Reported Outcome Measure” — how did the treatment make you feel afterwards? Do the patients who had it feel it was worth the side effects? As a key metric, we would have expected some movement towards a digital “100% PROMs” (as covered here in point 3). A treatment may be possible, and easy for hospitals, but does it help patient outcomes?

As a result of NHS England’s neglect, the data lake will do nothing to improve patient measures (and some issues only appear in PROMs); it’s all accounting measures from NHS England’s accountants and Whitehall micromanagers. Their idea of a consultant is not someone in a white coat, but someone with a calculator. When NHS England talks about improving care, it’s clear their idea of engaging with patients has not improved since care.data was abandoned due to their failure.

The ‘Target Architecture’ document shows NHS England has learnt nothing. As always suspected (and denied) for care.data, NHS England now admits that it wants “near real time” access to medical records: letting their accountants and expensive management consultants in an office second-guess your nurse and your doctor who listen to you.

  • NHS England’s approach is still driven by its desire to do “near real time” monitoring of doctor and nurse performance – there will be no opt out of accountants looking at your records.
    • “125. Sensitive personal and confidential data (which is fully identifiable) will almost certainly be required to achieve interoperability and to facilitate precision medicine and case finding. The NDG Review opt out will not apply.”

  • NHS England clearly considers its micromanaging more important than either Accountable Care or CCG/STP access… See the middle box in Figure 2 (p14) and para 40 (p10); more on the fundamental problems with the use of ‘secondary datasets’ for operational purposes in this paper.

  • The only reason to make this “near real time” (Figure 2 – i.e. at least daily) is to force organisations to hand over operational data to those who have no operational role – ‘secret micromanaging of hospitals from afar’.

  • According to NHS England, patients will have to opt out again in May next year, under the GDPR, if they do not want their data used. (This suggests medConfidential will have to run an opt out process, since NHS England says it will not otherwise be available to you.)
    • The document does acknowledge that two GDPR opt-outs will have to be respected, but in the process it breaks the existing opt-out choice for patients.

  • “125… The NDG Review opt out will not apply. However the GDPR Right to object and the GDPR right to restrict processing will apply should a data subject wish to exercise that right and certain criteria are met.”
    • Clause 15 of the Data Protection Bill (as laid) gives the Government the ability, by Regulation, to remove those rights. Either way, the Caldicott Opt Out should be extended to cover the opt outs possible under GDPR. Since every patient is being written to, that can be made clear to everyone involved, and they can update their choices according to their wishes in the new environment.
Fundamentally, paragraph 125 is entirely in conflict with what the Department of Health implied in their response to Caldicott 3. However, it is NHS England who are expected to organise and fund the individual letters to patients, and those letters will have NHS England’s logo on them; possibly not a DH logo. Will DH put their logo on this?

  • The mishandling of the NHS internal market:
    • While section 4.3 in the Annex tries to muddy the waters, stating patient information will be “anonymised or is provided in aggregate views sometimes linked with wider information sources”, in practice these activities are consistently done using identifiable patient data under perpetually-renewed Section 251 ‘support’.

  • Lessons from care.data and Caldicott 3 have simply been ignored:
    • The document is dated 13 July – just one day after the Government’s response to Caldicott 3, on 12 July…

  • While nods towards “transparency” and “trust” are scattered throughout the document, with one whole section devoted to the notion of a “diameter of trust”, NHS England provides no indication of how it intends to deliver the Government’s commitments on transparency of access to patients; nothing appears in any of the ‘architecture’ diagrams, or in the text.
    • The model relies heavily on the use of “de-identified” data to avoid answering difficult questions that care.data ignored. Such data would clearly be linkable to the individual – how else “personalised care”? – so, unless NHS England proposes to also ignore the Government’s transparency commitments, patients will have to be told.
    • This also ignores the critical point that de-identified data can never be used for direct care.

  • “… To promote Better Health for all”, para 6 (p3). While prevention must clearly involve ‘behavioural change through information’, a failure to be honest (and transparent) about the two meanings inherent in “promotion” has contributed to this mess. That they lead off with this phrase may be good – the NHS needs to take a more preventative approach – but it still provides cover for those with more commercial / corrupting agendas.

  • Commercial reuse will continue, and expand. As NHS England has failed to convince anyone else that it should get a new data collection, it will instead expand the dataset collected under HES (which it has powers to do unilaterally).
    • These new datasets will flow into the “Interoperable regional data hubs” (July), aka “regional data ponds” aka “national data lake” (January), and from there can be drained by anyone who wants some data.
    • “Information Brokers” (which NHS Digital calls “information intermediaries”) have also expanded beyond the NHS, and now include selling access to the cancer registry.

The Department of Health has confirmed that it will write to every patient who has opted out about the new arrangements. NHS England’s long-standing refusal to match that for patients who are in the dark risks creating a perverse incentive for those who gave Jeremy Hunt the benefit of the doubt.

One of the changes since 2014 is that there should be a Doctor in charge of Information at NHS England – when the new Chief Clinical Information Officer is appointed, they should not inherit a toxic programme. Care.data contained catastrophic failures as a result of being designed by someone who had never spent a day at medical school.

As they plan the data lake (“regional data ponds” or “interoperable regional data hubs”), has the institutional disregard for patients meant that they’ve forgotten every lesson? They use the language of business, forgetting that no one benefits if the NHS just charges money to itself. There are alternatives.

‘data lake’ is not a clinical term, nor a clinical tool. It’s an IT term, for IT people to talk about large technology projects, and to sell the tools for those projects. The benefits to the NHS are entirely tangential: the point is to make a small number of techies in suits feel good about what they’re doing without the “burden” of talking to doctors or patients. Paddling in a data lake, they’re able to talk with confidence – despite their talk being irrelevant to solving the problems that front-line clinicians and their patients face. The data lake is a solution in search of a problem.

Care.data wanted to link all data and sell it in secret without consent; its new data lake links all data and sells it in secret without consent. What changed in the intervening 4 years?

Those who want to read your medical records have had weekly updates on this document since January; we saw it via a leak in August, despite being lied to repeatedly by a co-author about being shown a copy. It is very clear why. No one from NHS England was willing to put their name on the document (Will Smart’s name was on the January version). Who on NHS England’s Board will be expected to sign off on this mandate to STPs?

Despite the rhetoric from the Department of Health, it’s pretty clear none of the details in NHS England have changed. Even the Wellcome Trust – who saw weekly updates – expect the same problems from 2014 all over again.

We will be here…

Early October Update

What’s happening?

By next summer, we will have a new Data Protection law, and a new NHS opt out model.

We should have a good idea by the end of November what the details all look like. The Department of Health are still playing coy – until everything is final, nothing is final.

Decisions in recent weeks have moved from a “big bang” launch next March, into a more gradual rolling start, which can deliver when things are ready. This is a great improvement.

Whatever happens, as things continue to change, we’ll update our scorecard of loopholes to keep you informed. It was first published as part of our recent “annual report”, but things will move on as the process rolls on.

NHS data: The rolling start has begun

As the rules stand today, any existing opt out will be upheld automatically within the new system. You can go to your GP receptionist, with our existing form, and they will make the change on their system, which then takes effect. As a patient, how it works – which system is in use – shouldn’t matter to you.

Shortly, an NHS Digital website will appear, giving patients information on how data is used in general; later, a service will tell them how their own data is used.

At some point next year, hopefully after you can see how your current wishes have been respected, you can express new wishes (as you can now). But the rolling start added by the last Direction to NHS Digital makes this better and simpler: There is no big bang launch, but a steady rollout as things start. If one thing is delayed, the consequences are fewer.

Your consent choice should follow your data, and you should be able to see when and why your wishes were honoured, or not. There are legitimate exceptions, but there are no legitimate secret exemptions.

As progress rolls forward, our scorecard can keep you up to date on where things are.

What else might happen next?

Any future Direction from either the Secretary of State or NHS England must either leave the effects of your existing opt out in place, or explicitly take an action to remove it. Are the Department of Health or the National Data Guardian going to allow the removal of opt outs that NHS Digital has already begun to honour?

That would be a dramatic and novel change to public trust in a new system – undermining the point somewhat.

There is potential for a good outcome:

  • Single tick box, online, covering all secondary data uses in and outside the NHS
    • This includes commercial reuse of cancer data by Public Health England. The ICO is investigating our complaint on this topic, which boils down to a simple question: does PHE tell the truth? (evidence says no)
  • Existing care.data opt outs merged into the new one giving a clear path forwards
  • Letters to every patient about the new arrangements.

Any of these would undermine any other good work:

  • Undoing opting out that is already in place
  • Multiple forms being needed
  • Letters not going to every patient who did not opt out
  • Multiple steps (and digital dark patterns – paragraph 2) in the opt out process.

We do not yet know all the details – and we’ll tell you when there’s evidence in practice. But there is progress.

While the NHS is moving towards a rolling start, the road they’re on is akin to an ambulance going down a busy high street with lights flashing – there’s a good idea how long it should take, but if someone does something unwise in the belief that their goal is more important, it might take a little longer while an obstacle is removed. It’s been nearly 4 years since care.data collapsed. If it takes another few months, that’s ok.

But if the NHS data environment is like a normal street, the rest of Government is more like the Wacky Races.

What’s next?

In a couple of weeks, we’ll have an update on the National Data Guardian Bill, which is currently queued up in the House of Commons, and the Data Protection Bill, which is currently in the House of Lords.

While our main focus is on medical data, in our free time, we look at the rest of Government – both central and local.

They are themselves doing some thinking about how data is used, and while views vary, it mostly reflects the initial reactions to care.data in the NHS: that it couldn’t happen there, and why should they need to change anything?

The lesson from the last 4 years is that doing this properly takes time. We have taught the NHS this once, and will remain here to make every data flow in the NHS consensual, safe, and transparent.

It would be a surprise if the Government chooses to have worse data handling than the NHS. If it does, it will have only itself to blame.

Overview of Current Data Discussions – October 2017

Two weeks after our annual report and rest of government supplement, there are now a number of ongoing data consultations. We attempt to summarise them all here.

Data Protection Bill

The Data Protection Bill is passing through the House of Lords. Clause 15 is of significant concern, giving Ministers the ability to carve a hole in the Data Protection Act at will – something this Government claimed it wouldn’t do, as it was a key safeguard in the Digital Economy Act earlier this year. As written, it is a dramatic change from the data protection status quo, and gives the Government broad powers to exempt itself from the rule of law.

We have a briefing on the Bill for Second Reading in the Lords.

As the NHS moves towards transparency over medical records, the very information provided via transparency must be subject to the same protections against enforced subject access requests (SARs) as the records themselves. It’s unclear whether clause 172(1) does this sufficiently.

Implementing the Digital Economy Act: “Better Use of Data”

To plagiarise Baroness O’Neill, whose approach is very relevant here: better than what?

The Cabinet Office are consulting on the Digital Economy Act Codes of Practice. We have a draft response to that consultation, which goes into more detail on a number of issues raised in our rest of government supplement.

As for how that will be used in practice, the Cabinet Office are having meetings about updating their data science ethics framework, and the ODI is seeking views on their proposed data canvas. The canvas is better, but to qualify as science it can’t just be some Greek on a whiteboard: it must include a notion of accountability for outcomes, and falsifiability of hypotheses.

Otherwise, it’s not science, it’s medieval alchemy – with similar results.

Most interestingly, it appears that despite all its flaws, the current “data science ethics framework” is in use by Departments, and they do find it useful for stopping projects that are egregiously terrible. So while the framework allows unlawful and unethical projects through, preventing those was not its goal – the hidden goal was to stop the worst projects where every other “safeguard” has demonstrably failed. This is a good thing; it’s just a pity that the previous team denied it existed. The honesty from the post-reset team is welcome – the previous approach included denying to our face that a meeting like this one was taking place, after someone else had already told us the date.

Part 2 to follow

medConfidential on Life Sciences Strategy

The Government has launched its life sciences strategy.

The operative line which underlies all of this from an NHS perspective is:

“This may require some trade-off between trials infrastructure for nursing and for digital,”

Businesses want such trade-offs, but the NHS and patients will likely have something to say about that. Did DH agree to it?

medConfidential coordinator Phil Booth said:

“The missing piece in here is patient consent. While the strategy mentions Dame Fiona’s Review, it doesn’t actually say whether the human tissue they want to buy will be consented or not” (top of page 8)

“Until we see what the NHS itself is planning, there’s nothing in here that wasn’t on the life sciences wishlist 4 years ago from the flawed care.data scheme; and nothing to suggest they’ve learnt any lessons.

“The Government has confirmed that patients who have opted out will be contacted about the new arrangements; but what will those who trusted the NHS to do the right thing be told?”

Any Data Lake will fail; there is an alternative

We’ve added some new words to our front page.

Any attempt to solve problems of records following patients along a care pathway by putting all those records into a big pile will either fail – or first breach the Hippocratic Oath, and then fail.

A Data Lake does not satisfy the need for doctors to reassure their patients (e.g. over false positive tests), the need for doctors to hold information confidentially from others (e.g. in the case of Gillick competency, or at the request of a patient), or the situations where institutions cannot tell doctors relevant details, e.g. where there is “too much data, but no clear information”.

From the NHS’ national perspective, micromanagers at NHS England will get to reach into any consultation room and read the notes – especially in the most controversial cases. They might be trying to help, and while members of Jeremy Hunt’s Office itself might not reach in (to be fair, they probably wouldn’t), do you believe the culture at NHS England is such that some NHS middle-manager wouldn’t think that is what they were expected to do, urgently, under the pressure of a crisis?

This is also why any ‘blockchain approach’ to health (specifically) will fail. Such technologies don’t satisfy the clinical and moral need to be opaque – deniability is not a user need of your bank statement.

Just as every civil servant recognises aspects of Sir Humphrey in their colleagues, it is the eternal hope of the administrator – however skilled, and especially when more so – that if a complex system worked just as they think it should, everything would be eternally perfect.

Such a belief, whether held by NHS England, DH, or the Cabinet Office is demonstrable folly. If you build a better mousetrap, the system will evolve a better mouse; everything degrades over time.

It was a President of the Royal Statistical Society who talked about “eternal vigilance”. This is why, and it also provides the solution.

As we’ve outlined before, the alternative approach to a leaky Data Lake is to add accountability to the flow of data along a care pathway.

The system already measures how many patients are at each stage, and their physical transfers; it should give the same scrutiny to measuring how many records follow electronically. Where the patient goes, but their data doesn’t, should be as clear to patients as statistics on clinical outcomes – because access to accurate data is necessary for good clinical outcomes.

Interoperability of systems, in a manner that is monitored, is already being delivered by care providers up and down the country. Creating lakes of records is simply an administrator’s distraction from what we already know works for better care.


medConfidential takes donations

medConfidential comment on DCMS Data Protection “Statement of Intent”

DCMS’s intent is clearly to pay more attention to Civil Service silos than citizens’ data.

Sometimes you reveal as much in what you don’t say, as in what you do. Or in what you pointedly ignore…

The ‘Statement of Intent’ document suggests that the confidential information in your medical records deserves no better protection than your local council’s parking list. This is contradicted by both the Conservative Party Manifesto, and the pre-election commitment around Jo Churchill MP’s Bill in the last Parliament to put the National Data Guardian on a statutory footing. So why is DCMS saying no?

DCMS says it intends this to be a “world leading” Data Protection regime. Even if this weren’t the UK’s implementation of the General Data Protection Regulation, DCMS would know its intent falls short had its Ministers and officials paid any attention to what’s happening outside their own offices.

Three weeks ago, the Government and the NHS committed to telling data subjects when their NHS medical records have been used, and why; and multinationals such as Telefonica have argued clearly and cogently that full transparency to data subjects is the only way forwards with innovation and privacy, without pitchforks.

The Government, however, is doing the minimum legally necessary – and already failing to meet the promises that it was elected on.

Given the Government’s manifesto and the Government’s commitments elsewhere, it is entirely possible for the UK to use digital tools to implement a world class data transparency and protection framework… So why is DCMS saying no?

Summer reading: Data for Research and Statistics in 2017

We’ve previously published an overview entitled “Governance of a Digital Economy in the medium term: AI, blockchain, genomics, and beyond” and detailed answers to three questions we get asked. We now add a fourth, on current questions around research and statistics (and NHS England).

Those questions are:

  1. Should the UK sequence the full genome of the entire population? (pg 1) (no)
  2. Can there be innovative, speculative analysis of individual-level sensitive data in a way that is Consensual, Safe, and Transparent? (pg 2) (yes)
  3. Is there a need for “AI exceptionalism” in data handling and administrative data? (pg 3) (no)
  4. Implications for research and statistics on extending Secondary Uses to facilitate third party time-sensitive micromanagement of Direct Care.

Also related is the medConfidential response to the Code of Practice on Statistics consultation from the UK Statistics Authority.


Care Episode Histories: There will be a new dataset that replaces HES. The question is where that dataset will be copied, who will access it and on what terms, and whether dissent will be honoured for secondary uses.

The Government’s response to Caldicott 3 has made it very clear: patients will know about every access to their records, whether for direct care or secondary uses.

NHS England’s non-clinical staff look at it purely in terms of data protection; what about the medical profession’s obligation to confidentiality?

For PHE/CPRD, copying loopholes may remain, at least in theory, and it’s unclear whether they wish their activities to be consensual, safe, or transparent.

The NHS has said that it will use digital tools to tell individuals how data about them is used, and will have a public register of data sharing – both are necessary for trustworthiness. The Government, by contrast, still hasn’t committed to a Register of where it copies any data, including your medical information, under the Digital Economy Act; let alone mandated that its many digital services tell you how your data gets used.

Given the GDS/DCMS claims of digital leadership, being this far behind the NHS has got to get embarrassing. Given Government manifesto commitments, and the unknown hopes of a “Digital Charter”, we’ll see if anything is implemented.

medConfidential comment on the Government’s response to the Caldicott 3 Review

medConfidential’s comment on the Written Ministerial Statement responding to the Caldicott 3 Review

While more details will emerge over the next several weeks, and given this is only a response to Dame Fiona Caldicott’s Review (and not any of the work by NHS England which depends upon it), medConfidential is in the first instance cautiously positive.

Original statement: http://www.parliament.uk/business/publications/written-questions-answers-statements/written-statement/Lords/2017-07-12/HLWS41/

In summary, the Statement says a number of things:

  • Patients will be offered a digital service through NHS.uk that will enable them to see how their medical records are used: both for direct care, and secondary uses beyond direct care.
  • Existing opt-outs preventing patients’ data being extracted from GP practices are protected until at least 2020.
  • There will be further consultations on the details of any changes.
  • Patients who have opted out will be written to about the Caldicott consent model when implementation is finalised (but before changes take effect).
  • NHS Improvement will begin to take cyber security into account; CQC already does.

Reflecting the very strong response from front-line clinicians and technical staff to the WannaCry ransomware outbreak, the Statement is very strong on cyber-security. Whether the analogue administrators that caused so much unnecessary hassle during that event have learnt lessons will become clear, next time…

With the newly-digital DCMS about to launch the Data Protection Bill, will the Government actually deliver on its commitment to a Statutory National Data Guardian?

Phil Booth, Coordinator of medConfidential said:

“We welcome the clear commitment that patients will know how their medical records have been used, both for direct care and beyond. This commitment means that patients will have an evidence base to reassure them that their wishes have been honoured.

“Some of the details remain to be worked out, but there is a clear commitment from the Secretary of State. The focus on digital tools shows the benefit to the whole NHS of the work towards NHS.uk. It is now up to NHS Digital and NHS England to deliver.

“The wait for consensual, safe, and transparent data flows in the NHS is hopefully almost over, and then new data projects can move forwards to deliver benefits for patients and vital research. Today’s announcement is about fixing what NHS England had already broken. The perils of a National Data Lake may lie ahead, but we hope lessons have been learnt, so we don’t end up back here in another 4 years.”

Google now tries to blame Doctors and Snapchat for its unlawful behaviour

Responding to Google’s claims that doctors “use” Snapchat to send photos for a second opinion, coordinator of medConfidential Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about this. The Report blames doctors for hygiene, and the hospital for its IT systems – everyone but Google. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship – something Google clearly fails to understand.”

If the assertions are based on evidence acquired in the Review, that should have been reported to CQC – unless there was a ‘see no wrong, hear no wrong’ policy in place. Google provided no evidence that Doctors actually do this, just that they could install an app. They could also use any Google messaging tool (except no one uses any of them). We fully expect DeepMind will “surprisingly” come out with a messaging app for doctors, which will be no better than email, and so solve none of the widely understood problems that mean fax machines are still useful.

Doctors are responsible for safely caring for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.

We’re mostly surprised that Google didn’t use this to kick Facebook; but perhaps they didn’t want to criticise another member of the Partnership on AI…

Original press release here: https://medconfidential.org/2017/medconfidential-initial-comment-on-the-google-deepmind-independent-reviewers-report/

medConfidential initial comment on the Google DeepMind Independent Reviewers’ report

UPDATE 2pm: responding to Google’s claims that doctors use secure messaging to send photos, Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about it. The report blames doctors for hygiene, and the hospital for its IT systems. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship.”

Doctors care for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.


The report answers none of the obvious questions that a supposedly independent Review of unlawful data copying should have answered.  

The ICO confirmed on Monday that DeepMind Health’s deal with the Royal Free had broken the Data Protection Act in at least 4 ways [1], and they have been given weeks to fix it. There is now a formal undertaking in place for correction of their project’s ongoing breaches of the Data Protection Act [2]. As of this week, DeepMind remains in clear breach of UK privacy laws. (page 7)

The National Data Guardian’s letter, referred to by the Review, shows clearly that DeepMind were aware of the unlawful nature of their processing last December [3], and the Review suggests they chose to do nothing about it.

In addressing “law, regulation and data governance”, the Reviewers say “We believe that there must be a mechanism that allows effective testing without compromising confidential patient information” (page 9, right column). So many people agree that there are already such processes – DeepMind just didn’t use any of them. It is unclear why the “Independent Reviewers” feel this is anyone but Google’s problem. (Here’s the sandbox for Cerner – which the Royal Free uses.)

If, as Prof John Naughton analogises, the Royal Free’s response to the ICO decision was “like a burglar claiming credit for cooperating with the cops and expressing gratitude for their advice on how to break-and-enter legally”, this report is DeepMind saying “It wasn’t me! Ask my mum…” thinking that’s an alibi.

DeepMind accepts no responsibility [4], and its Reviewers seem happy with that. Which, given DeepMind’s broad AI ambitions, should frankly be terrifying…

Responding to the Review, medConfidential Coordinator Phil Booth said:

“If Page 7 (right column) is accurate in its description of record handling at the Royal Free, then CQC must conduct an urgent inspection of data hygiene at the hospital; or was this just “independent” hyperbole to make Google look good?”

“The Reviewers’ way to not criticise DeepMind is to avoid looking at all the things where DeepMind did anything wrong. The Reviewers may think “this is fine”, but anyone outside the Google bunker can see that something has gone catastrophically wrong with this project.”

“Google DeepMind continues to receive excessive amounts of data in breach of four principles of the Data Protection Act, and the Independent Reviewers didn’t think this worth a mention. DeepMind did something solely because they thought it might be a good idea, ignorant of the law, and are now incapable of admitting that this project has unresolvable flaws. The ICO has forced both parties to fix them within weeks, having ignored them for approaching 2 years.

“DeepMind Health needs real senior management with experience of caring for patients, i.e. a Regulated Medical Professional, as a Chief Medical Officer. The second paragraph on the inside front cover (which isn’t even a numbered page in the printed document, but page 2 in the PDF) shows how badly they have failed from the start.”

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on 07974 230 839 or coordinator@medconfidential.org

Notes to editors:

  1. Information Commissioner’s Office summary of their finding https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/07/royal-free-google-deepmind-trial-failed-to-comply-with-data-protection-law/
  2. The ICO requires that the Royal Free and DeepMind take actions within a month of the undertaking issuance – page 7. https://ico.org.uk/media/action-weve-taken/undertakings/2014352/royal-free-undertaking-03072017.pdf Many of these issues were highlighted to DeepMind by medConfidential last year, which they have repeatedly and systemically ignored.
  3. Sky News reported in May that the unlawful nature of the DeepMind data processing was first formally brought to the Royal Free & DeepMind’s attention in December 2016 by the National Data Guardian. http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142 Paragraph 4 of the letter from the National Data Guardian to the Hospital clearly shows that they were first formally notified of their legal failings in December.
  4. Details of medConfidential’s complaint are available here:
  5. This complaint has now been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

    a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

    i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]
    ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]
    iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]
    iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

    b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

    c) Professor Hugh Montgomery, (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

    i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]
    ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”
    iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”
    iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504­deepmind­radio4­pm.mp3?dl=1

    d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

    …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

    e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

    i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”
    ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

  6. DeepMind’s response to the ICO finding has been to blame everyone but themselves. As they begin to regularly refresh part of their Review board, perhaps Sean Spicer will be available to help.

-ends-