Category Archives: Press releases

medConfidential press releases

[PRESS RELEASE] Government’s new (draft) ‘data strategy’ for health boils down to “Data saves lives, so we’re going to sell it”

The Government’s new draft strategy, “Data Saves Lives: Reshaping Health and Social Care with Data” is published today. [1] 

The draft document – which took guidance on data and innovation from the same team as the OfQual A-levels mess last summer [2] – was developed by Matt Hancock’s ‘tech vision’ unit, NHSX (notably not an NHS body), best known for its first draft of the COVID-19 app, and for thinking that no-one cares about the privacy of their GP records any more.

The draft manages just one, single, passing reference to last month’s catastrophically mishandled and miscommunicated GP data grab, [3] cited in “Progress so far” [4] to be implemented “later in 2021”. (Is this the announcement of the delay beyond September?)

Aside from the stalled GP data grab, GPDPR, that same “Progress so far” list also mentions the highly secretive NHS England / NHSX COVID-19 Data Store, which feeds data to Palantir for Government’s favourite AI mercenaries, Faculty Science, to build dashboards. [5]

For over a year now, NHSE/X have refused to publish the list of projects and organisations to which they release data. The Government claims benefits, but shows none at all – if this is “progress”, then evidence, transparency and good governance are clearly out of the window.

One line that really matters is buried in section 3, “Supporting local and national decision makers with data”, [6] which says:

…we will use secondary legislation in due course to enable the proportionate sharing of data including, where appropriate, personal information for the purposes of supporting the health and care system without breaching the common law duty of confidentiality (ongoing)

[Further down the same section, it shows how they’ll give patients’ data to DWP…]

Does this Government really believe it can use “secondary legislation” to overturn the millennia-long trusted principle of doctor–patient confidentiality that lies at the very heart of healthcare?

The strategy regurgitates, almost unchanged, a set of “principles for data partnerships” [7] that were a Pharma-Bro-friendly Goat Rodeo [8] when they first surfaced two years ago, and which haven’t improved since. Its plaintive references to “transparency” and “citizen engagement” [9] don’t really hold water if – as medConfidential did – you participated in, gave expert testimony to, or sat on the advisory panel for these efforts.

We’ve also heard quite a lot of this “strategy” before. Compare, for example, the “commitment” at the bottom of the first section of 2021’s draft strategy, [10] to “give citizens the ability to see what research their data has informed, and who has had access to their data, as soon as the technology allows (ongoing)”, with the “Paperless 2020” vision of the previous administration – in particular the section on page 50, [11] about “Digital Services – the offering to the Citizen and his apps supplier (Phased 2015–2018)”.

Phil Booth, coordinator of medConfidential said:

“Boris Johnson’s Government says “Data Saves Lives”, but buried in the small print is a rather more dubious deal: “If you want better care from your data, you must let us sell it.”

“Once, the PM remembered it was nurses and doctors who saved his life – and the next time Matt Hancock pontificates that patients “own their own data”, he should remember that taking something someone “owns” without their permission isn’t ‘sharing’ or ‘innovation’, it’s just plain theft.”


Notes to Editors

1) The draft strategy is published here: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft – as was first announced in the Government’s wider ‘National Data Strategy’ last September: “For example, NHSX is developing a Data Strategy for Health and Social Care in Autumn 2020, which will aim to build on the permissive approach to data sharing for care seen during the coronavirus response…” https://web.archive.org/web/20200909080611/https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy 

2) CDEI’s (the Centre for Data Ethics and Innovation) advice is cited across the document; its outgoing Chair was in post at the time the advice was given: https://twitter.com/medConfidential/status/1357037423172141061

3) Links to news coverage of the 2021 GP data grab here: https://medconfidential.org/information/media-coverage/

4) In section 3, “Supporting local and national decision makers with data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#supporting-local-and-national-decision-makers-with-data

5) Even Dominic Cummings seems unaware that NHS patients’ personal data, collected by NHS England under the COPI powers, was being fed to Palantir so that Faculty could build dashboards: https://twitter.com/EinsteinsAttic/status/1403496331965050881 

6) Section 3, “Supporting local and national decision makers with data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#supporting-local-and-national-decision-makers-with-data 

7) In section 7, “Helping developers and innovators to improve health and care”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#helping-developers-and-innovators-to-improve-health-and-care 

8) https://medconfidential.org/wp-content/uploads/2019/10/business-models.pdf – see page 3 for links to examples of how the proposed ‘business models’ have already all failed in practice.

9) At the bottom of the “Next steps” section, immediately above Annex A: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#next-steps

10) Section 1, “Bringing people closer to their data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#bringing-people-closer-to-their-data

11) Tim Kelsey, the architect of the previous GP data grab (“care.data”) in 2013/14, was the ‘Director for Patients and Information’ at NHS England when he presented this strategy: https://www.kingsfund.org.uk/sites/default/files/media/NIB%20Master%20Slide%20Deck%20v6.pdf

__

medConfidential has tracked the development of both Data Strategies since they surfaced last year: https://medconfidential.org/2020/the-national-data-strategy/ and has followed the evolution of ‘Shared Care Records’ for years before that: https://medconfidential.org/2021/shared-care-records/ – graphics on how the data will be used are in the second half of the page.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every use of data in and around the NHS and wider care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on phil@medconfidential.org

– ends –

More DeepMind secrecy – What the lawyers didn’t look at

The Royal Free has been advised by ‘independent’ lawyers to terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims that patients’ ‘vital interests’ (i.e. remaining alive) are justification to protect against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not the vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long-running saga.

GDPR DAY BRIEFING: ‘Your data matters’ and NHS patients’ data, with the ‘new’ National Data Opt-Out

May 25th 2018 will see the start of an awareness-raising programme to show the public that their data matters – and what, if anything, changes as a result of patients’ increased rights under GDPR.

With regard to NHS patients’ health data:

  • A new NHS ‘National Data Opt-Out’ commences on GDPR day (May 25th);
  • NHS Digital continues to sell (i.e. disseminates under contract, on payment of a fee) patients’ data, as it has been doing for years;
  • GDPR expands the definition of ‘identifiable data’ (one reason why everyone’s privacy policies are changing);
  • Will NHS Digital ignore patient opt-outs on these newly-identifiable data releases, relying on definitions from the old Data Protection Act?
  • NHS Digital / DH refuse to ask data applicants why 98% of NHS patients’ data isn’t enough for them; while there may be legitimate reasons to override patient opt-outs, pretending new legislation does not apply to data releases (yet again) is not one of them.

Your Data Matters… but NHS Digital sells it anyway

NHS Digital still forgets about patients. Unfortunately, it sees them less as people and more as ‘lines in a database’.

NHS Digital continues to sell a product called ‘Hospital Episode Statistics’ (HES); a dataset that is not actually statistics, but rather consists of individual patients’ lifelong hospital histories, with every medical event dated and linked together by a unique identifier. As of May 24th, two-thirds of NHS Digital’s data disseminations do not respect patients’ right to object (‘opt out’) to their data being used for purposes beyond their direct care.

If you read NHS Digital’s own Data Release Registers, or view them at TheySoldItAnyway.com, [1] you can see for yourself the evidence of where data goes – and where patients’ express wishes are deemed not to matter.

After four years, and further breaches of health data, NHS Digital ignores the choices of the 1.4 million people who opted out and still sells their (and every other hospital patient’s) data for commercial reuse. Those who claim to need 100% of the data for some reason need merely explain to a competent data release body why 98% of people’s data isn’t enough – an explanation they’re currently not even asked to provide.

GDPR clarifies that the hospital data NHS Digital continues to sell is identifiable data – so claimed exemptions (item 5) to people’s opt outs don’t apply. Especially for those who remember the dates of particular medical events in hospital, such as the birth dates of their own children, or who can read about them online. [2]

‘Could do better’

Last week, the Department for Education called a halt to the sale of the data it collects on schoolchildren [3] for the very reason the NHS continues using to justify its sale of patients’ data.

NHS Digital now has a research environment [4] which allows far higher safety for patients’ data – but the companies that don’t want the NHS to be able to see what they’re doing with the data are special pleading. It is precisely these hidden uses to which patients are most likely to object.

NHS Digital’s customers, for example, still include for-profit companies such as Harvey Walsh, an “information intermediary” that – exactly as it did in and before 2014, and despite having breached the terms of its contract since then – continues to service commercial clients including pharmaceutical companies, which use the information to promote their products to doctors.

The digital service launching for GDPR Day in fact does less than the form that’s been available on medConfidential’s website since late 2013. [5] Our GP form works immediately – if you use the new digital service, your GP won’t know about it for months.

Discussing a damning ‘report’ in the House of Commons, the chair of the Health Select Committee censured NHS Digital for its “dimmest grasp of the principles underpinning confidentiality”. [6] The Government has agreed to raise the bar for breaching patients’ confidentiality when handing information to the Home Office; will NHS Digital now respect the choices of those patients who wish to keep the information in their medical records confidential too?

The solution to this is straightforward: DH can Direct NHS Digital to respect objections (opt-outs) in all releases of HES that CAG has not approved to have data released without patients’ objections honoured. There may be projects that require 100% of patient data; two-thirds of them do not.

The ICO has not yet updated its (non-statutory) Anonymisation Code of Practice to match GDPR, although its guidance on the GDPR definition of personal data and newer codes on subject access rights show the definitions in GDPR mean NHS Digital’s current practice does not deliver on its promises to patients.

The NHS has ignored legislative changes and harmed research projects before – see note 4 in this post. This effect is one of the main things that prompted the Wellcome Trust to create the Understanding Patient Data initiative.

But it is still (a bit) better than it was…

NHS Digital now sells less of your data than it used to; it only sends out hundreds of copies of the nation’s hospital records – ‘pseudonymised’, but containing information that GDPR recognises makes it identifiable, and therefore still personal data.

You now have the ability to make a choice for you and (after a fashion) your family that will work, in due course [7] – but NHS Digital needs to listen to Dr Wollaston, “take its responsibilities seriously, understand the ethical underpinnings and stand up for patients”, respect that patients’ data matters, and fully honour everyone’s choices.

Questions for interviewees:

  • What does the NHS’ online-only opt-out service not do on day one, that the GP-led process did last week?
  • How many steps does it take for a family to express their collective choice on how their data is used?
  • When this new digital dissent process was signed off under the Government’s Digital Service Standard, did Ministers spare a thought for their families at all?
  • Will patients’ opt-outs be honoured in the dissemination of HES under GDPR?
    • If not, will those patients who already opted out be told why not?
  • A mother with 2 children is over 99% likely to be identifiable from their children’s birth dates alone; given the enhanced GDPR ‘right of access’ to any recipient data to which opt-outs have not been applied, will NHS Digital keep selling what GDPR defines as the identifiable patient data of those who have opted out?
    • What is the burden on medical research of this choice by NHS Digital, made to placate its commercial customers?

If you or any patient would like to see how their data is used, and where their choices are being ignored, please visit TheySoldItAnyway.com

Notes for Editors

1) NHS Digital publishes its data release register as a spreadsheet, but fails to link to, e.g., its own audit reports – so medConfidential has created a more readable version that does.

2) All the information required to identify Ed Sheeran’s entire hospital history in HES appears in this BBC News article, published online on 19 May 2018: http://www.bbc.co.uk/news/uk-england-suffolk-44155784

3) ‘Sharing of school pupils’ data put on hold’, BBC News, 15 May 2018: http://www.bbc.co.uk/news/technology-44109978

4)  A ‘safe setting’, as medConfidential and others recommended in evidence to Parliament back in 2014: https://digital.nhs.uk/services/research-advisory-group/rag-news-and-case-studies/remote-data-access-environment-to-speed-up-access-to-data

5) We have updated our form to reflect the name change. See https://medconfidential.org/how-to-opt-out/

6)  https://www.theyworkforyou.com/debates/?id=2018-05-09a.746.6#g771.0

7) The National Data Opt-out should be respected by bodies across the health and care system “by 2020”.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

– ends –

Response to the House of Lords AI Select Committee Report

The AI Select Committee of the House of Lords published their report this morning.

In respect of the NHS, it suggests nothing the NHS wasn’t already doing anyway.

The suggestion that ‘data trusts’ be created for public sector datasets – such as tax data – will likely cause fundamental distrust in AI amongst the public (paragraphs 82 & 84). The NHS has shown how that model ends badly when the prime drivers are commercial, not ‘human flourishing’.

Sam Smith, a coordinator at medConfidential, said (referring to paragraphs 99, 129, 317-318, 386, 419-420):

“A week after Facebook were criticised by the US Congress, the only reference to the Rule of Law in this report is about exempting companies from liability for breaking it.

“Public bodies are required to follow the rule of law, and any tools sold to them must meet those legal obligations. This standard for the public sector will drive the creation of tools which can be reused by all.”

 

-ends-

medConfidential are speaking at the APPG Rule of Law in Parliament from 11 – 12:30, and more details are now available.

medConfidential comment on the Government’s response to the Caldicott 3 Review

medConfidential’s comment on the Written Ministerial Statement responding to the Caldicott 3 Review

While more details will emerge over the next several weeks, and given this is only a response to Dame Fiona Caldicott’s Review (and not any of the work by NHS England which depends upon it), medConfidential is in the first instance cautiously positive.

Original statement: http://www.parliament.uk/business/publications/written-questions-answers-statements/written-statement/Lords/2017-07-12/HLWS41/

In summary, the Statement says a number of things:

  • Patients will be offered a digital service through NHS.uk that will enable them to see how their medical records are used: both for direct care, and secondary uses beyond direct care.
  • Existing opt-outs preventing patients’ data being extracted from GP practices are protected until at least 2020.
  • There will be further consultations on the details of any changes.
  • Patients who have opted out will be written to about the Caldicott consent model when implementation is finalised (but before changes take effect).
  • NHS Improvement will begin to take cyber security into account, as CQC now does.

Reflecting the very strong response from front-line clinicians and technical staff to the WannaCry ransomware outbreak, the Statement is notably strong on cyber-security. Whether the analogue administrators that caused so much unnecessary hassle during that event have learnt lessons will become clear next time…

With the newly-digital DCMS about to launch the Data Protection Bill, will the Government actually deliver on its commitment to a Statutory National Data Guardian?

Phil Booth, Coordinator of medConfidential said:

“We welcome the clear commitment that patients will know how their medical records have been used, both for direct care and beyond. This commitment means that patients will have an evidence base to reassure them that their wishes have been honoured.

“Some of the details remain to be worked out, but there is a clear commitment from the Secretary of State. The focus on digital tools shows the benefit to the whole NHS of the work towards NHS.uk. It is now up to NHS Digital and NHS England to deliver.

“The wait for consensual, safe, and transparent data flows in the NHS is hopefully almost over, and then new data projects can move forwards to deliver benefits for patients and vital research. Today’s announcement is about fixing what NHS England had already broken. The perils of a National Data Lake may lie ahead, but we hope lessons have been learnt, so we don’t end up back here in another 4 years.”

Google now tries to blame doctors and Snapchat for its unlawful behaviour

Responding to Google’s claims that doctors “use” Snapchat to send photos for a second opinion, coordinator of medConfidential Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about this. The Report blames doctors for hygiene, and the hospital for its IT systems – everyone but Google. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship – something Google clearly fails to understand.”

If the assertions are based on evidence acquired in the Review, that should have been reported to CQC – unless there was a ‘see no wrong, hear no wrong’ policy in place. Google provided no evidence that doctors actually do this, just that they could install an app. They could also use any Google messaging tool (except no one uses any of them). We fully expect DeepMind will “surprisingly” come out with a messaging app for doctors, which will be no better than email, and so solve none of the widely understood problems that mean fax machines are still useful.

Doctors are responsible for safely caring for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.

We’re mostly surprised that Google didn’t use this to kick Facebook; but perhaps they didn’t want to criticise another member of the Partnership on AI…

Original press release here: https://medconfidential.org/2017/medconfidential-initial-comment-on-the-google-deepmind-independent-reviewers-report/

medConfidential initial comment on the Google DeepMind Independent Reviewers’ report

UPDATE 2pm: responding to Google’s claims that doctors use secure messaging to send photos, Phil Booth said: “Had Google managed to buy Snapchat, they wouldn’t have said anything about it. The report blames doctors for hygiene, and the hospital for its IT systems. Now they’re blaming doctors for their choice of secure messaging apps to care for patients with whom they have a direct care relationship.”

Doctors care for their patients, and it’s up to them which safe and lawful tool to use. The only reason DeepMind care is that they have a tool to sell; and they’re still in denial that the way they built it was unlawful.


The report answers none of the obvious questions that a supposedly independent Review of unlawful data copying should have answered.  

The ICO confirmed on Monday that DeepMind Health’s deal with the Royal Free had broken the Data Protection Act in at least 4 ways [1], and they have been given weeks to fix it. There is now a formal undertaking in place for correction of their project’s ongoing breaches of the Data Protection Act [2]. As of this week, DeepMind remains in clear breach of UK privacy laws. (page 7)

The National Data Guardian’s letter, referred to by the Review, shows clearly that DeepMind were aware of the unlawful nature of their processing last December, [3] and the Review suggests they chose to do nothing about it.

In addressing “law, regulation and data governance”, the Reviewers say “We believe that there must be a mechanism that allows effective testing without compromising confidential patient information” (page 9, right column). Many would agree – such processes already exist; DeepMind just didn’t use any of them. It is unclear why the “Independent Reviewers” feel this is anyone’s problem but Google’s. (Here’s the sandbox for Cerner – which the Royal Free uses.)

If, as Prof John Naughton analogises, the Royal Free’s response to the ICO decision was “like a burglar claiming credit for cooperating with the cops and expressing gratitude for their advice on how to break-and-enter legally”, this report is DeepMind saying “It wasn’t me! Ask my mum…” thinking that’s an alibi.

DeepMind accepts no responsibility [4], and its Reviewers seem happy with that. Which, given DeepMind’s broad AI ambitions, should frankly be terrifying…

Responding to the Review, medConfidential Coordinator Phil Booth said:

“If Page 7 (right column) is accurate in its description of record handling at the Royal Free, then CQC must conduct an urgent inspection of data hygiene at the hospital; or was this just “independent” hyperbole to make Google look good?”

“The Reviewers’ way of not criticising DeepMind is to avoid looking at anything DeepMind did wrong. The Reviewers may think “this is fine”, but anyone outside the Google bunker can see that something has gone catastrophically wrong with this project.”

“Google DeepMind continues to receive excessive amounts of data in breach of four principles of the Data Protection Act, and the Independent Reviewers didn’t think this worth a mention. DeepMind did something solely because they thought it might be a good idea, ignorant of the law, and are now incapable of admitting that this project has unresolvable flaws. The ICO has forced both parties to fix them within weeks, having ignored them for approaching two years.

“DeepMind Health needs real senior management with experience of caring for patients, i.e. a Regulated Medical Professional, as a Chief Medical Officer. The second paragraph on the inside front cover (which isn’t even a numbered page in the printed document, but page 2 in the PDF) shows how badly they have failed from the start.”

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on 07974 230 839 or coordinator@medconfidential.org

 

Notes to editors:

  1. Information Commissioner’s Office summary of their finding https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2017/07/royal-free-google-deepmind-trial-failed-to-comply-with-data-protection-law/
  2. The ICO requires that the Royal Free and DeepMind take actions within a month of the undertaking’s issuance – page 7. https://ico.org.uk/media/action-weve-taken/undertakings/2014352/royal-free-undertaking-03072017.pdf Many of these issues were highlighted to DeepMind by medConfidential last year, and have been repeatedly and systematically ignored.
  3. Sky News reported in May that the unlawful nature of the DeepMind data processing was first formally brought to the Royal Free & DeepMind’s attention in December 2016 by the National Data Guardian. http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142 Paragraph 4 of the letter from the National Data Guardian to the Hospital clearly shows that they were first formally notified of their legal failings in December.
  4. Details of medConfidential’s complaint are available here:
  5. This complaint has now been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

    a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

    i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]
    ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]
    iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]
    iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

    b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

    c) Professor Hugh Montgomery, (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

    i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]
    ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”
    iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”
    iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

    d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

    …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

    e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

    i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”
    ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

     

  6. DeepMind’s response to the ICO finding has been to blame everyone but themselves. As they begin to regularly refresh part of their Review board, perhaps Sean Spicer will be available to help.

 

-ends-

[PRESS RELEASE] Google DeepMind deal with the Royal Free Hospital broke the law

The Information Commissioner’s Office has today ruled that the deals which gave Google DeepMind copies of 1.6 million patients’ hospital records are unlawful:

https://ico.org.uk/action-weve-taken/enforcement/royal-free-london-nhs-foundation-trust/

The ICO’s ruling determines that the deals breached four of the Data Protection principles:

https://ico.org.uk/media/action-weve-taken/undertakings/2014353/royal-free-undertaking-cover-letter-03072017.pdf

medConfidential first complained to the National Data Guardian and ICO in June 2016. [1]

In February 2017, the National Data Guardian said that copying of patients’ data to develop the Streams app was on an “inappropriate legal basis”:

http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

Google DeepMind – the AI company developing the app – has given various contradictory quotes about its intent over time, repeatedly asserting that what it was doing was lawful. [2]

Apparently entirely coincidentally, the “Independent Reviewers” of Google DeepMind Health have a report due out, via the Science Media Centre, at 00:01 this Wednesday. The timing may be a coincidence – just as it was apparently a complete coincidence that the Royal Free issued a press release about how wonderful the project was, without mentioning the word Google once, 72 hours after receiving the letter from the National Data Guardian saying the data use was unlawful. [3]

On seeing the ICO’s ruling, Phil Booth, coordinator of medConfidential said:

“We look forward to Google DeepMind’s Independent Reviewers’ report on Wednesday.”

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on 07974 230 839 or coordinator@medconfidential.org

Notes to editors

1) Details of medConfidential’s complaint are available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

2) This complaint has now been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) On how this all came about, starting with Dr Chris Laing of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery (a consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

…“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

3) https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/

 

[PRESS RELEASE] Google DeepMind unlawfully copied the medical records of 1.6 million NHS patients

“A core part of Google” has been told it has no lawful basis to process five years of patient data from the Royal Free Hospital in London. [1] With no legal basis, the data must be deleted.

In May 2016, the New Scientist reported [2] that Google DeepMind had access to a huge haul of patient data, seemingly without appropriate approvals. In July 2016, the MHRA confirmed [3] that DeepMind had not received any approvals for a trial involving patients, using patient data. In November 2016, DeepMind signed a replacement contract covering exactly the same data. [5d]

The National Data Guardian has provided a view on this matter (all emphasis added): [1]

The Royal Free “…confirmed to us [NDG] that 1.6 million identifiable patient records were transferred to Google DeepMind and that implied consent for direct care was the legal basis for the data processing.

“…Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.

It is unclear whether Google DeepMind has complied with the finding that it had no legal basis for processing this data; nor is it clear what it was that first attracted DeepMind executives to unlawfully copy 1.6 million people’s medical records, repeatedly insisting on direct care as the sole legal basis. [8]

medConfidential agrees with the Information Commissioner, when she said in a speech to technology companies: “I do not believe data protection law is standing in the way of your success.” She reminded her audience: “It’s not privacy or innovation – it’s privacy and innovation.” [4]

In this case, this DeepMind project turned out to be neither of those things. [9]

The National Data Guardian’s investigation has made clear – despite their claims to the contrary – that DeepMind had no legal basis for their actions in this project.

medConfidential coordinator, Phil Booth, said:

“This letter shows that Google DeepMind must know it had to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.

“Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.

“While the NHS sent doctors to a meeting, DeepMind sent lawyers and trained negotiators. What this boils down to is whether Google’s AI division followed the law and told the truth; it now appears they may have done neither.

“As events this weekend have shown, it’s the number of copies of patient data that matter – encryption locks won’t reassure anyone, if the wrong people have been given the keys.”

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on coordinator@medconfidential.org

Notes to editors

1) “The NDG has provided a view on this matter to assist the ICO’s investigation” was the National Data Guardian’s comment on the publication of the University of Cambridge paper, ‘Google DeepMind and healthcare in an age of algorithms’: https://link.springer.com/article/10.1007%2Fs12553-017-0179-1 and http://www.cam.ac.uk/research/news/deepmind-royal-free-deal-is-cautionary-tale-for-healthcare-in-the-algorithmic-age

Sky News published a copy of the letter from the National Data Guardian on 15 May 2017: http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

2) medConfidential raised a complaint [4] to the ICO following reports in the New Scientist, and follow-ups elsewhere, about secretive data use by Google DeepMind:

a) New Scientist, 29/4/16: https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/

b) New Scientist, 13/5/16: https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/

c) Daily Mail, 4/5/16: http://www.dailymail.co.uk/news/article-3573286/NHS-trust-handed-private-patient-details-Google-says-implied-permission-emerges-hospital-talks-internet-giant.html

d) BBC, 19/7/16: http://www.bbc.co.uk/news/technology-36783521

e) Guardian, 6/5/16 (note 9 May & 25 July updates at the bottom of the article): https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder

3) “DeepMind is currently working with the MHRA to ensure that the device complies with all relevant medical device legislation before it is placed on the market” – TechCrunch, 20/7/16: https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

4) Information Commissioner’s speech, ‘Transparency, trust and progressive data protection’, 29 September 2016: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2016/09/transparency-trust-and-progressive-data-protection/

5) medConfidential’s complaint is available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

d) The end of the first ‘Note to editors’ in a press release from the Royal Free Hospital on 22 November 2016 clearly states: “The new agreement does not change the number of patients whose data will be processed by Streams”: https://www.royalfree.nhs.uk/news-media/news/nhs-and-technology-leaders-agree-groundbreaking-partnership-to-improve-safe/

6) Claims by the New Scientist have been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) On how this all came about, starting with Dr Chris Laing of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery (a consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

i) …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

7) The claim to reach “HSCIC level 3” was a self-assessment by DeepMind, which was revoked upon examination. [See the 25 July update to this Guardian article].

8) In a controversial press release by the hospital on 24 February 2017, the word “Google” did not appear once, despite point 6 (a)(i) above: https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/
and a subsequent Guardian article on 9 March 2017, from a press release by Google DeepMind, which explicitly attributes actions to Google DeepMind: https://www.theguardian.com/technology/2017/mar/09/google-deepmind-health-records-tracking-blockchain-nhs-hospitals

9) ““With health data, and government acquired health data, we need to be sure we aren’t, in effect, giving oxygen away for free to a private company that will start to sell it back to us,” says Azeem Azhar, who writes the popular Exponential View newsletter…” – Quartz, 17/3/17: https://qz.com/934137/googles-goog-deepmind-got-too-much-health-data-from-britains-nhs-research-paper-says/

– ends –

medConfidential comment on Google DeepMind briefing on an academic paper

We read many academic papers about data projects. It is rare that they result in anything at all, let alone in anonymous briefings against academic inquiry.

We were therefore intrigued by two points in this Wired article, written with access to Google DeepMind executives:

  1. It reuses a quote from medConfidential that is 9 months old, as if nothing has changed in the last 9 months. If that were true, why did Wired write about it again?
  2. The quote from the Google DeepMind executive suggests that the academic paper to which the article refers contains errors.

If, as DeepMind says, “It makes a series of significant factual and analytical errors”, we look forward to DeepMind publishing evidence of any errors, as a scientifically rigorous organisation would, rather than hiding behind anonymous briefings from their press office and a hospital. Google claims “we’re completely at the mercy and direction” of the Royal Free, but from the last two paragraphs of the same article, that’s obviously not completely true…

medConfidential has confidence in the scientific inquiry process – and we are aware that DeepMind does too, given their own authorship of academic articles about their work.

While it is highly unusual, it is not a factual or analytical error to write an academic paper that is readable by all.

We expect that DeepMind was aware of the substance of the paper prior to publication, and didn’t say anything about any of those problems then. This behaviour is entirely consistent with DeepMind’s duplicity regarding our timeline of public facts about their original deal – they claim errors in public, but will say nothing about them when asked.

Colleagues at the Wellcome Trust are right – mistakes were made.

This is how AI will go wrong: good people with good intentions making a mistake, and being institutionally incapable of admitting that most human of characteristics, imperfection.