Author Archives: Phil

[PRESS RELEASE] Government’s new (draft) ‘data strategy’ for health boils down to “Data saves lives, so we’re going to sell it”

The Government’s new draft strategy, “Data Saves Lives: Reshaping Health and Social Care with Data” is published today. [1] 

The draft document – which took guidance on data and innovation from the same team as the OfQual A-levels mess last summer [2] – was developed by Matt Hancock’s ‘tech vision’ unit, NHSX (notably not an NHS body), which is best known for its first draft of the COVID-19 app, and for thinking that no-one cares about the privacy of their GP records any more.

The draft manages just a single, passing reference to last month’s catastrophically mishandled and miscommunicated GP data grab, [3] listed in “Progress so far” [4] as to be implemented “later in 2021”. (Is this the announcement of the delay beyond September?)

Aside from the stalled GP data grab, GPDPR, that same “Progress so far” list also mentions the highly secretive NHS England / NHSX COVID-19 Data Store, which feeds data to Palantir for Government’s favourite AI mercenaries, Faculty Science, to build dashboards. [5]

For over a year now, NHSE/X have refused to publish the list of projects and organisations to which they release data. The Government claims benefits, but shows none at all – if this is “progress”, then evidence, transparency and good governance are clearly out of the window.

One line that really matters is buried in section 3, “Supporting local and national decision makers with data”, [6] which says:

…we will use secondary legislation in due course to enable the proportionate sharing of data including, where appropriate, personal information for the purposes of supporting the health and care system without breaching the common law duty of confidentiality (ongoing)

[Further down the same section, it shows how they’ll give patients’ data to DWP…]

Does this Government really believe it can use “secondary legislation” to overturn the millennia-old, trusted principle of doctor–patient confidentiality that lies at the very heart of healthcare?

The strategy regurgitates, almost unchanged, a set of “principles for data partnerships” [7] that were a Pharma-Bro-friendly Goat Rodeo [8] when they first surfaced two years ago, and which haven’t improved since. Its plaintive references to “transparency” and “citizen engagement” [9] don’t really hold water if, as medConfidential did, you participated in these efforts, gave expert testimony to them, or sat on their advisory panels.

We’ve also heard quite a lot of this “strategy” before. Compare, for example, the “commitment” at the bottom of the first section of 2021’s draft strategy [10] to “give citizens the ability to see what research their data has informed, and who has had access to their data, as soon as the technology allows (ongoing)” with the “Paperless 2020” vision of the previous administration – in particular the section on page 50 [11] about “Digital Services – the offering to the Citizen and his apps supplier (Phased 2015–2018)”.

Phil Booth, coordinator of medConfidential, said:

“Boris Johnson’s Government says “Data Saves Lives”, but buried in the small print is a rather more dubious deal: “If you want better care from your data, you must let us sell it.”

“Once, the PM remembered it was nurses and doctors who saved his life – and the next time Matt Hancock pontificates that patients “own their own data”, he should remember that taking something someone “owns” without their permission isn’t ‘sharing’ or ‘innovation’, it’s just plain theft.”


Notes to Editors

1) The draft strategy is published here: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft – as was first announced in the Government’s wider ‘National Data Strategy’ last September: “For example, NHSX is developing a Data Strategy for Health and Social Care in Autumn 2020, which will aim to build on the permissive approach to data sharing for care seen during the coronavirus response…” https://web.archive.org/web/20200909080611/https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy 

2) CDEI’s (the Centre for Data Ethics and Innovation) advice is cited across the document; its outgoing Chair was in post at the time the advice was given: https://twitter.com/medConfidential/status/1357037423172141061

3) Links to news coverage of the 2021 GP data grab here: https://medconfidential.org/information/media-coverage/

4) In section 3, “Supporting local and national decision makers with data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#supporting-local-and-national-decision-makers-with-data

5) Even Dominic Cummings seems unaware that NHS patients’ personal data, collected by NHS England under the COPI powers, was being fed to Palantir so that Faculty could build dashboards: https://twitter.com/EinsteinsAttic/status/1403496331965050881 

6) Section 3, “Supporting local and national decision makers with data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#supporting-local-and-national-decision-makers-with-data 

7) In section 7, “Helping developers and innovators to improve health and care”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#helping-developers-and-innovators-to-improve-health-and-care 

8) https://medconfidential.org/wp-content/uploads/2019/10/business-models.pdf – see page 3 for links to examples of how the proposed ‘business models’ have already all failed in practice.

9) At the bottom of the “Next steps” section, immediately above Annex A: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#next-steps

10) Section 1, “Bringing people closer to their data”: https://www.gov.uk/government/publications/data-saves-lives-reshaping-health-and-social-care-with-data-draft/data-saves-lives-reshaping-health-and-social-care-with-data-draft#bringing-people-closer-to-their-data

11) Tim Kelsey, the architect of the previous GP data grab (“care.data”) in 2013/14, was the ‘Director for Patients and Information’ at NHS England when he presented this strategy: https://www.kingsfund.org.uk/sites/default/files/media/NIB%20Master%20Slide%20Deck%20v6.pdf

__

medConfidential has tracked the development of both Data Strategies since they surfaced last year: https://medconfidential.org/2020/the-national-data-strategy/ and has followed the evolution of ‘Shared Care Records’ for years before that: https://medconfidential.org/2021/shared-care-records/ – graphics on how the data will be used are in the second half of the page.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every use of data in and around the NHS and wider care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on phil@medconfidential.org

– ends –

Your medical records and the 2019 Election manifestos

At the last Election in 2017, medConfidential had a single, simple request:

Will patients know how their medical records are used?

While the Government seemed to make noises in that direction, it has delivered nothing. For this Election, we reiterated that question and highlighted a number of key issues and commitments that the next Government would have to make around how NHS patients’ data is used and how it buys technology.

So what did the parties all say in their manifestos?

Continue reading

Alternatives to the DWP tender for a “medical records broker”

The existing ‘Atos’ process* for Employment and Support Allowance (ESA), some components of Universal Credit (UC) and Personal Independence Payments (PIP) is recognised to be brutal and inhumane. The Department for Work and Pensions’ desire to replace it is fundamentally to be welcomed.

Where DWP ends up with regard to health information will likely be something modelled on the British Medical Association’s agreement with the Association of British Insurers on GP reports. This agreement provides a published definition of what data is medically relevant, with clear standards and a template form in which a report must be produced, the uses to which the data can and cannot be put, and a commitment on the part of the data recipient (in this case, the insurance company) to respect the medical view.

It is this last element with which DWP is most likely to struggle – but without it, all the rest is undermined.

The information a patient gives to their doctor(s) must be untainted by external priorities, otherwise people will come to harm – whether from excess or under-reporting. As Professor Helen Stokes-Lampard, chair of the Royal College of GPs has said, “We are doctors, whose first interest is the care of our patient: we are not border guards, and we are not benefits assessors.”

Were data sharing with DWP to be perceived in this way, the provision of care and medical research – and public confidence and trust in both – would be undermined to a far greater extent than care.data ever managed.

Such data sharing would arguably be more dangerous and harmful overall than the Atos assessment process, as GPs would have to consider competing incentives around the information provided by their patients – not knowing in advance which Departments would get to see it, nor when DWP might send letters demanding they change their medical judgment.

DWP cannot address these issues alone and in secret – a tender is never a good place to start, and the current one could catastrophically undermine any improvements.

There aren’t many ‘fully automated’ decisions

Were a third party being contracted to make automated decisions, DWP could (and should) expose the ‘business logic’ upon which those decisions are to be made. But that is not what DWP is doing.

Some of the decisions being made are really, really simple – being pregnant, for example, is a binary state indicated by a medical test, from which processes can result. And a ‘terminal’ state is a medical decision, made for medical reasons.

While the actions of DWP and its contractors may suggest that some terminally-ill patients are ‘fit for work’, we do not imagine this to be an explicit policy intent – more a result of the systematic process neglect that the current Secretary of State has expressed a desire to resolve. Will DWP accept a ‘terminal’ definition from the NHS in future? If not – as now – nothing that DWP does will matter.

Most areas of controversy are not fully automated, nor even fully automatable. While a recorded status of ‘terminal’ may (trivially) be the result of a human pressing a button, when the doctor presses that button has to be based on a carefully considered discussion with the patient – which should only be about how they wish to die, without any implications or insinuations from target-obsessed bureaucracies of the state.

Medical decisions are human decisions

The Atos processes exist because DWP does not trust the data it could get from the NHS. As has become evident, there are processes where NHS doctors offer ‘fit notes’ that the DWP requires them to provide, but then pressures them not to issue. This is DWP gaming the very system it set up in an attempt to make it ungameable by anyone else. And this remains the context in which the tender has been issued.

Patients must believe that the information they give to their doctor will not be used against them. No data protection law, old or new, allows the DWP to rifle through GP records without an explicit legal case. However the Atos process is replaced, whatever replaces it is going to require new legislation – informed by all of the stakeholders, in a public debate.

DWP brings ‘business logic’ to a political problem

No data access or data sharing can be done under Data Protection law alone. It is not part of the NHS’s or GPs’ public task to assess benefits for DWP, and DWP cannot ‘ask’ for consent when that ‘consent’ is a condition of the social safety net that makes it possible for a person to buy food. (DWP may try; it will fail.) So any replacement is going to require primary legislation – and that legislation cannot be initiated by DWP alone, and certainly not by issuing a tender.

The tender document is quite clear on the policy intent, and how DWP sees the world. But DWP cannot fix this alone. It may try because it believes it has no other levers to pull, given wider distractions. So this current approach – trying to fix a complex political issue with more technology – will likely go no better than the Home Office’s rollout of the Settled Status digital process, and could go a lot worse…

Technical barriers imposed by others

DWP has no way of knowing that some of these barriers even exist.

For example, the new ‘NHS Login’ service will be necessary for any digital service that interacts with the NHS. And, in an explicit decision by NHS Digital (we argued against it; they ignored us, in writing), the NHS Login service passes the patient’s NHS number to every digital service.

While this may be good for direct care, it is terrible for anything else. As a result, any DWP ‘data broker’ or ‘trusted third party’ using the NHS Login will have a copy of the patient’s NHS number that the NHS would argue it is prohibited by law from having.

On an even more fundamental matter, the DWP is going to have to work with the NHS and medical professional bodies if only for the reason that it has little – if any – experience with coded health data on which the health service runs. Interpreting a person’s condition into (or from) dated medical events is a highly-valued clinical skill and, on the evidence of the outsourced work capability assessors, not one that will prove easy to duplicate.
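Why coded, dated medical events resist naive interpretation can be sketched in miniature. Everything below – the codes, the records, the rules – is invented for illustration; it bears no relation to real clinical coding systems or DWP criteria:

```python
from datetime import date

# Hypothetical coded events in one patient's record.
# "BACK_PAIN" is an invented code, not a real clinical term.
events = [
    {"code": "BACK_PAIN", "date": date(2014, 3, 1)},   # historical episode
    {"code": "BACK_PAIN", "date": date(2018, 5, 20)},  # recent recurrence
]

def naive_has_condition(events, code):
    """A naive, non-clinical reading: any matching code ever recorded
    counts as 'having the condition' - ignoring recency, resolution,
    and the context a clinician would weigh."""
    return any(e["code"] == code for e in events)

def recent_only(events, code, since):
    """Slightly less naive: only count events after a cutoff date -
    still a crude proxy for clinical judgement."""
    return any(e["code"] == code and e["date"] >= since for e in events)

# Both rules fire here, but neither says anything about severity,
# capability, or prognosis - which is the assessor's actual question.
assert naive_has_condition(events, "BACK_PAIN") is True
assert recent_only(events, "BACK_PAIN", date(2017, 1, 1)) is True
```

The gap between “a code appears in the record” and “this person’s condition today” is exactly the clinical skill the outsourced assessors have struggled to duplicate.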

We also have a further, modest, proposal.


* By ‘Atos process’, we mean the Work Capability Assessment for ESA or UC – run by Centre for Health and Disability Assessments Ltd, a subsidiary firm of Maximus – and the assessments for PIP – run by Capita and an arm of Atos, trading as ‘Independent Assessment Services’.

Local Health and Care Records: exemplary, or just more of the same old?

The Australian ‘care.data’ isn’t going well. Taking its lead from the English plan, the Australian ‘My Health Record’ programme claims to be about direct care and creates a new repository of patients’ data hiding behind a badly-run opt-out process, with minimal publicity that’s all about direct care; those who support it don’t actually understand it, and in the small print, the data can be copied for purposes that aren’t direct care at all. None of the lessons of care.data have been learnt there at all.

Meanwhile, in the UK, NHS England has announced a set of pilots for ‘Local Health and Care Record’ schemes – 3 in May, 2 more in June – which claim to be about direct care, which may or may not create a new repository of data without an opt-out process, and about which all the publicity is only about direct care; those who support it seem not to articulate fully what it is, and in the small print, the data can be copied for purposes that aren’t direct care at all.

‘Just in time’ or ‘Eventually in case’?

There is clear value in the principal public goal of Local Health and Care Records (LHCR): when a patient arrives at a new care setting, doctors should be able to see care provided only hours before – especially when it may be what put the patient there. An obvious example would be a hospital doctor being able to see new medicines prescribed that morning. This has obvious clinical value, assuming the data is current enough – as current as the patient’s arrival, not yesterday.

In practice, this would mean looking up records across the system as needed, rather than building yet another pile of outdated data – most of which would never be touched for care purposes, and which could be dangerously out of date if it were. The ‘glue’ required for interoperability is to simplify ‘just in time’ interoperability, rather than to copy data from everywhere ‘just in case’ it’s needed somewhere else.

The driving principle must be the provision of relevant, necessary, up-to-date information. Which, as any serious technologist will tell you, means using APIs – not making and passing around endless copies of data. (‘Paperless’ shouldn’t mean promiscuous…) Building yet another set of out-of-date databases only works for the people who sell and run databases.
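The contrast between ‘just in time’ lookup and ‘just in case’ copying can be sketched in a few lines. The systems, NHS number and data below are all invented for illustration – this is not any real NHS interface:

```python
from datetime import date

# Stand-in for the authoritative source system (e.g. a GP system).
# A medicine was prescribed this morning.
gp_system = {
    "9990001234": {"medications": ["amoxicillin"], "updated": date(2018, 6, 26)},
}

# 'Just in case': a bulk copy taken in last night's batch extract,
# already stale by the time the patient arrives.
copied_store = {
    "9990001234": {"medications": [], "updated": date(2018, 6, 25)},
}

def lookup_just_in_time(nhs_number):
    """Query the source system at the moment of care - the data is as
    current as the patient's arrival."""
    return gp_system[nhs_number]["medications"]

def lookup_from_copy(nhs_number):
    """Read the local copy - misses anything recorded since the last
    extract, which is precisely the data that matters most."""
    return copied_store[nhs_number]["medications"]

# The morning's prescription is visible via the API...
assert lookup_just_in_time("9990001234") == ["amoxicillin"]
# ...but invisible in the overnight copy.
assert lookup_from_copy("9990001234") == []
```

The design point is not the plumbing but the failure mode: the copy is wrong exactly when the record has just changed – which, for a patient arriving in hospital, is the common case.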

The local areas offered as pilots apparently get to choose their own technology for their own needs. But before the ink on the LHCR announcements was dry, there were other announcements about projects that want to copy the data that the ‘pilots’ apparently haven’t yet decided upon. The money is already flowing as if they will.

Clearly they all know something that NHS England isn’t telling the public. Where data is made available to ‘research’, NHS England (as data controller) will also want to use the GP data it copies for its own purposes – clinical performance management and ‘developments’ that skirt the line between research and micromanaging. The National Data Opt Out should apply to these uses – whether it will or not remains to be seen – but even so, the creation of another copy of patients’ medical records, under the control of NHS England rather than doctors, has apparently been mandated without public debate, discussion, or basic honesty.

Will patients be sold the figleaf of ‘direct care’, with other uses hidden behind, in a very Australian way?

However a local care record system is implemented, every patient needs to be able to have clear and specific answers to their questions: who is looking at my data? Why? And what are my choices?

The advocates of dangerous data copying will continue to push for it – while the ‘spawn of care.data’ resurfaces in Australia, the toxic causes of care.data are now reappearing across this country, in the form of ‘data trusts’. Until there is a binding commitment to transparency over all data access, those who wish to copy patients’ information for secret reasons will continue to publicly claim patient ‘benefits’ for activities that are far more sordid.

 

We have also done a deep dive into the systems, which is possibly far more than you ever wanted to know about local health and care record systems.

150,000 patients’ opt-outs not honoured; their confidential information sold for 3 years

A serious error affecting 150,000 NHS patients has been reported in the media in recent days, after it was uncovered last week. We understand the error affects patients who set an opt-out between March 2015 and June 2018 and whose GP practices use TPP’s SystmOne software – their opt-out codes were not sent to NHS Digital until last week.

As a consequence of this error, from April 2016 until 26 June this year, those patients’ confidential, identifiable data was sold to a range of organisations, including private companies. This will obviously be of concern to a great many people.

Both TPP and NHS Digital are taking remedial action; the coding error has been corrected to ensure opt-outs will be uploaded properly from now on, affected GP practices were written to on Monday 2 July, and the individual patients affected should be written to by the end of the month.

Until then, based on current information, this is what you can do:

If you have recently received a letter from NHS Digital about the conversion of your Type-2 opt-out to the National Data Opt-out then you weren’t affected by this incident. (These letters were sent out during June.)

If however you haven’t received a letter, and you are over 16, and you remember opting out any time from March 2015 onwards, then either:

  a) you are affected by the TPP incident, or
  b) separately, your opt-out was never applied to your GP record.

Anyone over the age of 13 should be able to check their current opt-out status by using NHS Digital’s new online National Data Opt-out process:

If the light blue status box does not appear when you check, and you do not wish your confidential, identifiable medical information to be used for any purposes beyond your own direct care, then you need to set the option on this screen to “No”.

This new online process only works, however, for individuals over 13 years old – and not for families with children or adult dependants. medConfidential’s (now improved!) GP opt-out form continues to work, as it has done since late 2013. It also lets you prevent confidential, identifiable information leaving your GP record, which the National Data Opt-out does not cover.

But – given this incident, and every previous breach of public trust – why can’t every patient see their data, so they can know what has happened?

Everyone agrees just how bad the situation created by TPP’s error – with patients’ data used against their express wishes – really is:

Professor Helen Stokes-Lampard, Chair of the Royal College of GPs, said:

Patient data held by the National Health Service should only ever be used morally, safely and responsibly, and we must all work together to ensure mistakes of this nature are never repeated. We need to be able to reassure patients that their wishes concerning their data are being respected.

Understanding Patient Data said in response (their emphasis):

This incident highlights the critical need for transparency – to ensure that it is clear where data is going and how choices are honoured. It also demonstrates that a trustworthy system must not just say the right things but also do the right things in practice as well: if opt-outs are claimed to be honoured, they absolutely must be. If these standards are not upheld, there has to be clear accountability in the system, with sanctions if necessary to demonstrate that these issues are taken seriously, or public confidence will again suffer.

Dr Natalie Banner, who now leads the ‘Understanding Patient Data’ project, tweeted:

Astonishing and appalling failure to uphold patient objections: but what sanctions to ensure providers uphold the standards we expect of them? New opt-out, which is patient-registered rather than GP-registered, *should* be less liable to such errors though.

Mr Harry Evans, from the King’s Fund policy team, said:

We are all agreed on the importance of the public not being surprised by how NHS uses data, so this is just remarkable.

These are fine words, but when will they speak out about the people NHS Digital disregarded in its new ‘digital’ process – a process that Ministers signed off – which separates processing for parents and children? (Not every American policy approach should be replicated in the NHS…)

In a recent explanation for OurNHS, we showed the ‘proxy’ form itself says:

…if your family has children under the age of 13, or if you look after a dependent older relative, then things are even more complicated. Rather than giving a simple instruction to your doctor, those who would prefer their children’s data wasn’t sold to third parties for unknown purposes, will be required to send to NHS Digital, by post, four pieces of ID documentation along with a seven-page form. So much for Jeremy Hunt’s much-vaunted commitment to a ‘paperless’ NHS.

Given the significant effect this will have on people far wider than the 150,000 currently affected, you might want to ask (a) Understanding Patient Data, or (b) your MP, what they are doing to ensure the broken process for families making a decision together is fixed.

As the dust settles from GDPR Day…

…we’ve updated our scorecard.

One of the existing patient opt-outs has been renamed as the new National Data Opt-out, but a whole swathe of issues that have festered, unaddressed, for years still remain.

We consider these issues below, framed by questions of – and glaring omissions to – the ‘Your NHS Data Matters’ communications campaign, launched on GDPR Day.

Overview

“Your health and adult social care information supports your individual care. It also helps us to research, plan and improve health and care services in England.”

The word “us” appears to be doing a lot of work here; would patients expect “us” to include, for example, commercial information intermediaries such as Harvey Walsh Ltd?

“You can choose whether or not your confidential patient information is used for research and planning.”

All well and good, if true – but what about all other ongoing (and future) uses of patient information besides “research and planning”? Why does the new National Data Opt-out not use the far clearer, more accurate, and comprehensive Caldicott 3 definition of “purposes beyond direct care”?

If the new National Data Opt-out does cover the use of patients’ information for all purposes beyond their direct or ‘individual’ care, then why not say so? If it does not, then how does the ‘new’ opt-out meet the requirements of Caldicott 3, the recommendations of which the Government accepted in full?

medConfidential recommends: Be accurate! All public communications must in the first instance use the proper Caldicott 3 formulation, “purposes beyond direct care”.

These purposes may be further explained in terms of “research” and “planning”, but public explanations must be clear that uses are not limited to only these two purposes. To do otherwise would both mislead patients and undermine the effectiveness of the opt-out, and could lead to further collapses in public trust when people become aware of uses that do not clearly fall into either category.

“Information that only identifies you like your name and address is not confidential patient information and may still be used.”

This is utterly muddle-headed, and goes against what many (if not most) people reasonably understand is meant by the word “confidential”. While the example given is relatively benign, it is precisely this loophole that, not coincidentally, led to the scandal of the Home Office MoU.

“Your confidential patient information is used in two different ways:”

This is not even close to true! We consider other uses, such as commissioning and commercial re-use, in more detail below, but this statement is demonstrably untrue: take, for example, the HRA Confidentiality Advisory Group’s Non-Research Register, which contains ongoing uses such as invoice reconciliation, risk stratification, commissioning and projects that explicitly mix direct care and secondary uses.

medConfidential recommends: Don’t mislead patients! Be clearer and more explicit about the range of uses to which patients’ information is put.

While public communications must be as clear and as understandable as possible, they must also be accurate – and true. The split between “your individual care” and “research and planning” (a description that we note above is itself too narrow and misleading) is far too simplistic, especially when patients are being asked to make an informed consent choice.

“Most of the time, we use anonymised data for research and planning. So your confidential patient information isn’t always needed.”

No definition of “anonymised” is provided. Using this word without explaining what it means is misleading; the natural (i.e. many patients’) assumption is that “anonymised data” is anonymous, which is not the case. GDPR and guidance from the ICO now make it clear that what NHS Digital has been disseminating “most of the time” is identifiable data.

That only some identifiers are being removed, or pseudonyms substituted, must be explained – and linking to a third party, non-NHS information source to do this will hardly be reassuring to many patients. Hiding behind narrow legalistic reasoning and jargon never has solved major long-standing issues, and (especially post-GDPR) never will.

medConfidential recommends: Follow the law! Stop implying that ‘anonymised’ is the same as anonymous, and respect people’s opt-outs for all identifiable data – don’t keep trying to find loopholes and excuses.

Benefits of data sharing

We are not aware that this is, or ever has been, in dispute. Clearly there are benefits to lawful, ethical, consensual, safe, and transparent data sharing.

Problems come when, as with the care.data programme and previous attempts at public communication, the NHS tries exclusively ‘selling the benefits’ without even mentioning any of the risks. Addressing the risks directly helps people make sense of the mitigations used – which then read as concrete measures rather than arbitrary claims or assertions – and enables a more informed choice on that basis.

Who uses your data

As we note above, the narrow definition “research and planning” does not even come close to defining the full range of uses, for purposes beyond their direct care, to which patients’ information is put.

These omissions aside – and while ‘Your NHS Data Matters’ lists some types of organisations that may do “research”, and acknowledges the use of patients’ information “to support the delivery of health and social care” (conflating both direct care and “planning”) – it makes no mention of the types of organisations that may be involved in “planning”, or of all the activities that term is supposed to encompass.

Given that who may have access to their information is precisely what matters most to patients with concerns, this is another serious omission. Without that information, how are patients supposed to make an informed choice?

medConfidential recommends: Be honest and upfront about who will have access to patients’ information; patients should not be assumed (or required) to understand the inner workings of the health and care system in order to make choices.

It may be argued that NHS Digital’s Data Release Register performs this function. However, linking to a massive Excel spreadsheet, containing literally thousands of individual entries, puts too much of a burden on any normal individual and – given the disparity between this and the level of detail provided elsewhere – begins to look like obfuscation.

We understand NHS Digital is working on a more clearly-formatted HTML publication of its Data Release Register but, in its absence, medConfidential has provided a more readable version that – unlike the current Register – contains links to NHS Digital’s audit reports, for those organisations that have been audited.
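As a sketch of how little is needed to turn a register spreadsheet into a readable, linked HTML table – the column names and audit-report link below are invented for illustration, not NHS Digital’s actual register format:

```python
import csv
import html
import io

# A stand-in for one row of a register exported as CSV.
# Field names are hypothetical, not the real register's columns.
register_csv = io.StringIO(
    "organisation,purpose,audit_report\n"
    "Example Analytics Ltd,commissioning,https://example.org/audit-1\n"
)

rows = list(csv.DictReader(register_csv))

# Render each release as a table row, escaping values and linking
# the audit report - the transparency step the raw spreadsheet lacks.
cells = "".join(
    "<tr><td>{}</td><td>{}</td><td><a href='{}'>audit</a></td></tr>".format(
        html.escape(r["organisation"]),
        html.escape(r["purpose"]),
        html.escape(r["audit_report"]),
    )
    for r in rows
)
page = "<table>{}</table>".format(cells)
```

The point is that a browsable, per-release view with links to audits is a small engineering task, not a reason to keep publishing a thousand-row spreadsheet.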

medConfidential recommends: Continue improving the transparency of data use.

For example, besides audits (and actions taken), future Registers should link to the relevant DARS and/or IGARD documentation; showing that there is a process, and that the process is being applied competently and consistently, is an important way to build and maintain trust.

It is unfortunate that the “NIC numbers” given in current Registers are entirely self-referential to anyone performing, e.g., a Google search; concealing or obscuring relevant parts of the process raises doubts, and lets them persist.

“Research bodies and organisations include:
– university researchers
– hospital researchers
– medical royal colleges
– pharmaceutical companies researching new treatments”

Why are “pharmaceutical companies” the only ones on this list whose use of patients’ information is qualified? Is the claim that pharmaceutical companies only receive patients’ data for the specific purpose of researching new treatments? This is patently untrue, and leads on to the further spurious claim that patients’ information will not be sold for “marketing or insurance” purposes.

While this claim may be narrowly true, at least for “commercial insurance purposes”, it omits to mention that at least some information intermediaries (i.e. commercial re-users, now sometimes referred to as “health and care analysis companies”) which regularly receive data from NHS Digital still service pharmaceutical marketers.

NHS Digital cannot state definitively who does or does not reuse patients’ medical records, as it specifically chooses not to know.

medConfidential recommends: Stop misleading patients as to the ultimate uses of their data, and stop sending out copies of everyone’s hospital histories to companies which (also) profit from servicing non-NHS commercial interests.

How data is protected

‘Your NHS Data Matters’ makes quite a few assertions about what will and will not be done with your data – though, and especially given the tendency to use jargon and narrow legalistic terms, it would be good to provide evidence and to clearly define key phrases. For example, we presume “confidential patient information” and “data” are two different things.

In addition, as noted above, linking to a third party, non-NHS information source to achieve some of this – however good the explanation – will hardly be reassuring to patients with concerns.

Another glaring omission, given the significant number of organisations that do not provide health and care services but that do use patients’ information for purposes beyond their direct care, is the list of steps those organisations are supposed (required?) to take to protect patients’ data.

The list of steps for such organisations clearly cannot be the same as those for NHS bodies, in that some of these organisations do not “make it clear why and how data is being used”, and others hide behind the information intermediaries’ commercial reuse contracts to, e.g. measure the efficacy of pharma sales, and to market their products to NHS doctors and commissioners.

medConfidential recommends: Make it a requirement for everyone to report to NHS Digital (and thence to patients) how data is used; stop relying on commercial reuse contracts and the “promotion of health” loophole in the Care Act 2014 to perpetuate what are self-evidently marketing uses.

Manage your choice

The new ‘digital’ National Data Opt-out process cannot cope with dependant children, and assumes that all 13 year olds have mobile phones or e-mail accounts which can be accessed safely without duress. It appears as if, when the process was signed off under the Government’s Digital Service Standard, Ministers did not spare a thought for their families at all…

The entire process is overly bureaucratic and intimidating, especially when compared with the existing ‘Type-2’ opt-out process: rather than simply instructing your own GP, who already knows you, you must identify yourself to officials at a remote call centre – and may even have to send in up to four pieces of ID and documentation with a form. (Check pages 2-3 of NHS Digital’s 7-page ‘Non-Digital Proxy Opt-Out Form’ for a list of requirements.)

This feels more like an inquisition than a ‘choice’.

Given how fundamentally broken NHS Digital’s new ‘Beta’ opt-out process is, medConfidential recommends patients who have concerns use the opt-out form we’ve provided since late 2013.

We updated our form in line with recent changes and it still works for you, your dependant children and others for whom you are responsible – it also protects your GP data from uses beyond your direct care, not just your data supplied to NHS Digital.

With all that is and will be changing, medConfidential also strongly recommends you get yourself a Patient Online account, if you don’t already have one.

We provide more information about this on the ‘For patients’ section of our website.

Though it will still be some time until you can see how all of your data has been used, by which organisations and for what purposes, a Patient Online login to your GP’s online system should already allow you to see how your GP data is being used.

Where an opt out doesn’t apply

One critical question is whether patients’ opt-outs will now be honoured in the dissemination of ‘Hospital Episode Statistics’. HES comprises two-thirds of data releases from NHS Digital, most of those to commercial companies – including all of the commercial reusers. Until now, over 1.4 million patients’ wishes in this regard have been ignored.

Apparently officials believe IGARD, a body of NHS Digital’s own creation, can decide to override patients’ dissent when, in fact, the only body with a statutory basis to approve such exceptions is the Confidentiality Advisory Group (CAG) at the HRA.

Both GDPR and the UK’s new Data Protection Act clarify and extend the definition of identifiable data such that – the day before GDPR came into effect – staff at NHS Digital were ordered not to disseminate any “anonymised” patient data. Data releases were resumed the following day, but NHS Digital is still in discussions with the Information Commissioner’s Office as to what patient information can now be considered “anonymous”.

Under GDPR, this is unlikely to include HES in its current form: individual-level, lifelong hospital medical histories, where every event is dated and linked by a pseudonym.

Given a mother with two children is over 99% likely to be identifiable from her children’s birth dates alone, and given the enhanced GDPR ‘right of access’ to any data received by any customer of NHS Digital to which opt-outs have not been applied, it would seem not only unlawful but highly irresponsible for NHS Digital to keep selling what GDPR now defines as the identifiable patient data of those who have opted out.

If you – or any patient – would like to see how your data is used, and where your choices are being ignored, medConfidential recommends you visit TheySoldItAnyway.com

GDPR DAY BRIEFING: ‘Your data matters’ and NHS patients’ data, with the ‘new’ National Data Opt-Out

May 25th 2018 marks the start of an awareness-raising programme to show the public that their data matters – and what, if anything, changes as a result of patients’ increased rights under GDPR.

With regard to NHS patients’ health data:

  • A new NHS ‘National Data Opt-Out’ commences on GDPR day (May 25th);
  • NHS Digital continues to sell (i.e. disseminates under contract, on payment of a fee) patients’ data, as it has been doing for years;
  • GDPR expands the definition of ‘identifiable data’ (one reason why everyone’s privacy policies are changing);
  • Will NHS Digital ignore patient opt-outs on these newly-identifiable data releases, relying on definitions from the old Data Protection Act?
  • NHS Digital / DH refuse to ask data applicants why 98% of NHS patients’ data isn’t enough for them; while there may be legitimate reasons to override patient opt-outs, pretending new legislation does not apply to data releases (yet again) is not one of them.

Your Data Matters… but NHS Digital sells it anyway

NHS Digital still forgets about patients. Unfortunately, it sees them less as people and more as ‘lines in a database’.

NHS Digital continues to sell a product called ‘Hospital Episode Statistics’ (HES); a dataset that is not actually statistics but that rather consists of individual patients’ lifelong hospital histories, with every medical event dated and linked together by a unique identifier. As of May 24th, two-thirds of NHS Digital’s data disseminations do not respect patients’ right to object (‘opt out’) to their data being used for purposes beyond their direct care.

If you read NHS Digital’s own Data Release Registers, or view them at TheySoldItAnyway.com, [1] you can see for yourself the evidence of where data goes – and where patients’ express wishes are deemed not to matter.

After four years, and further breaches of health data, NHS Digital ignores the choices of the 1.4 million people who opted out and still sells their (and every other hospital patient’s) data for commercial reuse. Those who claim to need 100% of the data for some reason need merely explain to a competent data release body why 98% of people’s data isn’t enough – an explanation they’re currently not even asked to provide.

GDPR clarifies that the hospital data NHS Digital continues to sell is identifiable data – so claimed exemptions (item 5) to people’s opt-outs don’t apply, especially for those who remember the dates of particular medical events in hospital, such as the birth dates of their own children, or who can read about them online. [2]

‘Could do better’

Last week, the Department for Education called a halt to the sale of the data it collects on schoolchildren [3] for the very reason the NHS continues to use to justify its sale of patients’ data.

NHS Digital now has a research environment [4] which offers far greater safety for patients’ data – but the companies that don’t want the NHS to be able to see what they’re doing with the data are engaged in special pleading. It is precisely these hidden uses to which patients are most likely to object.

NHS Digital’s customers, for example, still include for-profit companies such as Harvey Walsh, an “information intermediary” that – exactly as it did in and before 2014, and despite having breached the terms of its contract since then – continues to service commercial clients including pharmaceutical companies, which use the information to promote their products to doctors.

The digital service launching for GDPR Day in fact does less than the form that’s been available on medConfidential’s website since late 2013. [5] Our GP form works immediately – if you use the new digital service, your GP won’t know about it for months.

Discussing a damning ‘report’ in the House of Commons, the chair of the Health Select Committee censured NHS Digital for its “dimmest grasp of the principles underpinning confidentiality”. [6] The Government has agreed to raise the bar for breaching patients’ confidentiality when handing information to the Home Office; will NHS Digital now respect the choices of those patients who wish to keep the information in their medical records confidential too?

The solution to this is straightforward: DH can Direct NHS Digital to respect objections (opt-outs) in all releases of HES, except those for which CAG has approved the release of data without patients’ objections being honoured. There may be projects that require 100% of patient data; two-thirds of them do not.

The ICO has not yet updated its (non-statutory) Anonymisation Code of Practice to match GDPR, although its guidance on the GDPR definition of personal data and newer codes on subject access rights show the definitions in GDPR mean NHS Digital’s current practice does not deliver on its promises to patients.

The NHS has ignored legislative changes and harmed research projects before – see note 4 in this post. This effect is one of the main things that prompted the Wellcome Trust to create the Understanding Patient Data initiative.

But it is still (a bit) better than it was…

NHS Digital now sells less of your data than it used to; it only sends out hundreds of copies of the nation’s hospital records – ‘pseudonymised’, but containing information that GDPR recognises makes it identifiable, and therefore still personal data.

You now have the ability to make a choice for you and (after a fashion) your family that will work, in due course [7] – but NHS Digital needs to listen to Dr Wollaston, “take its responsibilities seriously, understand the ethical underpinnings and stand up for patients”, respect that patients’ data matters, and fully honour everyone’s choices.

Questions for interviewees:

  • What does the NHS’ online-only opt-out service not do on day one, that the GP-led process did last week?
  • How many steps does it take for a family to express their collective choice on how their data is used?
  • When this new digital dissent process was signed off under the Government’s Digital Service Standard, did Ministers spare a thought for their families at all?
  • Will patients’ opt-outs be honoured in the dissemination of HES under GDPR?
    • If not, will those patients who already opted out be told why not?
  • A mother with 2 children is over 99% likely to be identifiable from her children’s birth dates alone; given the enhanced GDPR ‘right of access’ to any recipient data to which opt-outs have not been applied, will NHS Digital keep selling what GDPR defines as the identifiable patient data of those who have opted out?
    • What is the burden on medical research of this choice by NHS Digital, made to placate its commercial customers?

If you or any patient would like to see how their data is used, and where their choices are being ignored, please visit TheySoldItAnyway.com

Notes for Editors

1) NHS Digital publishes its data release register as a spreadsheet but it fails to link to, e.g. its own audit reports – so medConfidential has created a more readable version that does.

2) All the information required to identify Ed Sheeran’s entire hospital history in HES appears in this BBC News article, published online on 19 May 2018: http://www.bbc.co.uk/news/uk-england-suffolk-44155784

3) ‘Sharing of school pupils’ data put on hold’, BBC News, 15 May 2018: http://www.bbc.co.uk/news/technology-44109978

4) A ‘safe setting’, as medConfidential and others recommended in evidence to Parliament back in 2014: https://digital.nhs.uk/services/research-advisory-group/rag-news-and-case-studies/remote-data-access-environment-to-speed-up-access-to-data

5) We have updated our form to reflect the name change. See https://medconfidential.org/how-to-opt-out/

6) https://www.theyworkforyou.com/debates/?id=2018-05-09a.746.6#g771.0

7) The National Data Opt-out should be respected by bodies across the health and care system “by 2020”.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

– ends –

[PRESS RELEASE] Google DeepMind deal with the Royal Free Hospital broke the law

The Information Commissioner’s Office has today ruled that the deals which gave Google DeepMind copies of 1.6 million patients’ hospital records are unlawful:

https://ico.org.uk/action-weve-taken/enforcement/royal-free-london-nhs-foundation-trust/

The ICO’s ruling determines that the deals breached four of the Data Protection principles:

https://ico.org.uk/media/action-weve-taken/undertakings/2014353/royal-free-undertaking-cover-letter-03072017.pdf

medConfidential first complained to the National Data Guardian and ICO in June 2016. [1]

In February 2017, the National Data Guardian said that copying of patients’ data to develop the Streams app was on an “inappropriate legal basis”:

http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

Google DeepMind – the AI company developing the app – has given various contradictory quotes about its intent over time, repeatedly asserting that what it was doing was lawful. [2]

Apparently entirely coincidentally, the “Independent Reviewers” of Google DeepMind Health have a report due out, via the Science Media Centre at 00:01 this Wednesday. The timing may be a coincidence – just as it was apparently a complete coincidence that the Royal Free released a press release about how wonderful the project was, without mentioning the word Google once, 72 hours after receiving the letter from the National Data Guardian saying the data use was unlawful. [3]

On seeing the ICO’s ruling, Phil Booth, coordinator of medConfidential said:

“We look forward to Google DeepMind’s Independent Reviewers’ report on Wednesday.”

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on 07974 230 839 or coordinator@medconfidential.org

Notes to editors

1) Details of medConfidential’s complaint are available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

2) This complaint has now been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

…“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

3) https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/


medConfidential rapid responses to DeepMind’s statements about their “legally inappropriate” data copying

We shall update this page as more information becomes available (newer items at the top).


Tuesday 11am:

Yet more questions raised about the usage of the Streams app

The Sky News footage shows that the Streams app is still in use, displaying information from patients – their name, their date of birth, gender, and other demographic information.

Where does that information come from? How does the ‘calendar gap’ affect patient care?

There are 3 choices:

  1. It comes via the first contract that has been found to be unlawful (with the calendar gap)
  2. The second contract is being breached (which also contains the calendar gap)
  3. There is a third secret contract hidden from scrutiny

Or Google’s AI team has come up with something else legally dubious to continue to unlawfully copy 1.6 million patient records… this suggests an Uber-esque approach to the law, and to safety.

What is the ‘calendar gap’?

The data Google DeepMind unlawfully copies runs up until “last month”. It is currently 16 May 2017 and, at best, the data they copy will run up until 30 April 2017. On 29 May, they will still only have data until the end of April. When a new month starts, they get an updated dataset covering the new “last month”. (It possibly takes a few days to actually happen, but you get the idea.)

Streams will help you if you were in the RFH last month. If you were there last week, however, the effect of the contract is that Streams could cause you harm – as Google’s app may prioritise longer-term patients it knows more about, over newer ones it knows less about.
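The ‘calendar gap’ arithmetic can be sketched in a few lines of Python – purely an illustration of the date logic described above, not anything drawn from the contracts themselves:

```python
from datetime import date, timedelta

def data_cutoff(today: date) -> date:
    """For a feed refreshed with 'last month's' data, the freshest
    record available is from the final day of the previous month."""
    first_of_this_month = today.replace(day=1)
    return first_of_this_month - timedelta(days=1)

# On 16 May 2017 the freshest record is from 30 April 2017...
print(data_cutoff(date(2017, 5, 16)))  # 2017-04-30
# ...and on 29 May it is *still* 30 April: a gap of nearly a month.
print(data_cutoff(date(2017, 5, 29)))  # 2017-04-30
```

In other words, the gap between “today” and the newest data grows through the month, reaching almost five weeks just before each refresh.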

Such problems are why MHRA approvals and a proper testing regimen are so important. To be absolutely clear, this failure is not endemic to Streams – the DeepMind deal with Imperial does not contain it, for example – but it is a dangerous symptom of the DeepMind deal that has been found to be unlawful.

We’ll ask the National Data Guardian for clarity later today.


Tuesday 10am:

We’ve seen this piece being discussed: the article is correct about patients who were receiving direct care – but out of the 1.6 million patients’ data it copied, DeepMind in February 2017 said it had assisted in the direct care of just “more than 26”.

So while 27 records may have had a lawful basis, 1,599,973 didn’t.

It is the 1,599,973 records that are of concern here. Similarly, while there is not necessarily any problem with testing an app, testing an app isn’t the same as providing direct care. It is a separate process that DeepMind didn’t go through, as their interviews at the time made very clear (Note 6).


Tuesday 10am:

If Google DeepMind didn’t receive the letter containing the NDG’s finding, as they have said to medConfidential (after the date on the letter), they should have a chat with the Gmail team about such a convenient problem that no one else sees…

Even if that excuse was valid in the past, there are now lots of copies of the letter on the internet, evidencing their unlawful behaviour. Although Dodgy Donald from DeepMind might be in denial about even that.


Monday night:


Under the heading, ‘What we’ve learned so far’, a newly updated page on DeepMind’s website states:

There’s very low public awareness of NHS technology, and the way patient data is routinely used in the provision of care. For example, many people assume that patient records are normally stored on NHS-owned computers, when they are in fact routinely processed by third party IT providers. This confusion is compounded by the potential future use of AI technologies in the NHS.

medConfidential comment:

This response by Google shows that DeepMind has learnt nothing. There may well be lawful reasons for third party IT providers to process data for direct care for 1.6 million patients – unfortunately for Google’s AI division, developing an app is not one of them.

Google told the public as little as they thought they could get away with – and being duplicitous, they still are. And, in so doing, they are trying to force the NHS into taking the blame for their mistakes.


Regarding the investigation by Sky News into the sharing of patients’ records, which begins:

Google’s artificial intelligence arm received the personally identifying medical records of 1.6 million patients on an “inappropriate legal basis”, according to the most senior data protection adviser to the NHS.

medConfidential comment:

Google’s lawyers are expensive, but “inappropriate legal basis” is still a euphemism for unlawful.

Buried in the interview footage is a statement from a nurse that the app is still in use with patients today. Also:

“The testing for the Streams app has now concluded and it is being used at the Royal Free Hospital, Prof Powis told Sky News, under a second agreement which is not being investigated.” (Sky News article)

Unfortunately for Google, their own press release from last November states that the same data is shared under both agreements.



[PRESS RELEASE] Google DeepMind unlawfully copied the medical records of 1.6 million NHS patients

“A core part of Google” has been told it has no lawful basis to process 5 years’ of patient data from the Royal Free Hospital in London. [1] With no legal basis, the data must be deleted.

In May 2016, the New Scientist reported [2] that Google DeepMind had access to a huge haul of patient data, seemingly without appropriate approvals. In July 2016, the MHRA confirmed [3] that DeepMind had not received any approvals for a trial involving patients, using patient data. In November 2016, DeepMind signed a replacement contract covering exactly the same data. [5d]

The National Data Guardian has provided a view on this matter (all emphasis added): [1]

The Royal Free “…confirmed to us [NDG] that 1.6 million identifiable patient records were transferred to Google DeepMind and that implied consent for direct care was the legal basis for the data processing.

“…Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.”

It is unclear whether Google DeepMind has complied with the finding that it had no legal basis for processing this data; nor is it clear what it was that first attracted DeepMind executives to unlawfully copy 1.6 million people’s medical records, repeatedly insisting on direct care as the sole legal basis. [8]

medConfidential agrees with the Information Commissioner, when she said in a speech to technology companies: “I do not believe data protection law is standing in the way of your success.” She reminded her audience: “It’s not privacy or innovation – it’s privacy and innovation.” [4]

In this case, this DeepMind project turned out to be neither of those things. [9]

The National Data Guardian’s investigation has made clear – despite their claims to the contrary – that DeepMind had no legal basis for their actions in this project.

medConfidential coordinator, Phil Booth, said:

“This letter shows that Google DeepMind must know it had to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.

“Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.

“While the NHS sent doctors to a meeting, DeepMind sent lawyers and trained negotiators. What this boils down to is whether Google’s AI division followed the law and told the truth; it now appears they may have done neither.

“As events this weekend have shown, it’s the number of copies of patient data that matter – encryption locks won’t reassure anyone, if the wrong people have been given the keys.”

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on coordinator@medconfidential.org

Notes to editors

1) “The NDG has provided a view on this matter to assist the ICO’s investigation” was the National Data Guardian’s comment on the publication of the University of Cambridge paper, ‘Google DeepMind and healthcare in an age of algorithms’: https://link.springer.com/article/10.1007%2Fs12553-017-0179-1 and http://www.cam.ac.uk/research/news/deepmind-royal-free-deal-is-cautionary-tale-for-healthcare-in-the-algorithmic-age

Sky News published a copy of the letter from the National Data Guardian on 15 May 2017: http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

2) medConfidential raised a complaint [4] to the ICO following reports in the New Scientist, and follow-ups elsewhere, about secretive data use by Google DeepMind:

a) New Scientist, 29/4/16: https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/

b) New Scientist, 13/5/16: https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/

c) Daily Mail, 4/5/16: http://www.dailymail.co.uk/news/article-3573286/NHS-trust-handed-private-patient-details-Google-says-implied-permission-emerges-hospital-talks-internet-giant.html

d) BBC, 19/7/16: http://www.bbc.co.uk/news/technology-36783521

e) Guardian, 6/5/16 (note 9 May & 25 July updates at the bottom of the article): https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder

3) “DeepMind is currently working with the MHRA to ensure that the device complies with all relevant medical device legislation before it is placed on the market” – TechCrunch, 20/7/16: https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

4) Information Commissioner’s speech, ‘Transparency, trust and progressive data protection’, 29 September 2016: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2016/09/transparency-trust-and-progressive-data-protection/

5) medConfidential’s complaint is available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

d) The end of the first ‘Note to editors’ in a press release from the Royal Free Hospital on 22 November 2016 clearly states: “The new agreement does not change the number of patients whose data will be processed by Streams”: https://www.royalfree.nhs.uk/news-media/news/nhs-and-technology-leaders-agree-groundbreaking-partnership-to-improve-safe/

6) Claims by the New Scientist have been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospital trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

i) …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair, writing “on behalf of the Panel of Independent Reviewers for Google DeepMind Health”, said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

7) The claim to reach “HSCIC level 3” was a self-assessment by DeepMind, which was revoked upon examination. [See the 25 July update to the Guardian article linked in note 2(e) above.]

8) In a controversial press release from the hospital on 24 February 2017, the word “Google” did not appear once, despite point 6(a)(i) above: https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/

By contrast, a subsequent Guardian article on 9 March 2017, based on a press release by Google DeepMind itself, explicitly attributes actions to Google DeepMind: https://www.theguardian.com/technology/2017/mar/09/google-deepmind-health-records-tracking-blockchain-nhs-hospitals

9) “With health data, and government acquired health data, we need to be sure we aren’t, in effect, giving oxygen away for free to a private company that will start to sell it back to us,” says Azeem Azhar, who writes the popular Exponential View newsletter – Quartz, 17/3/17: https://qz.com/934137/googles-goog-deepmind-got-too-much-health-data-from-britains-nhs-research-paper-says/

– ends –