medConfidential rapid responses to DeepMind’s statements about their “legally inappropriate” data copying

We shall update this page as more information becomes available (newer items at the top).


Tuesday 11am:

Yet more questions raised about the usage of the Streams app

The Sky News footage shows that the Streams app is still in use, displaying patients' information – their name, date of birth, gender, and other demographic details.

Where does that information come from? How does the ‘calendar gap’ affect patient care?

There are three possibilities:

  1. It comes via the first contract that has been found to be unlawful (with the calendar gap)
  2. The second contract is being breached (which also contains the calendar gap)
  3. There is a third secret contract hidden from scrutiny

Or Google’s AI team has come up with some other legally dubious means of continuing to copy 1.6 million patient records unlawfully… which suggests an Uber-esque approach to the law, and to safety.

What is the ‘calendar gap’?

The data Google DeepMind unlawfully copy runs up until “last month”. It is currently 16 May 2017, so at best, the data they copy will run up until 30 April 2017. On 29 May, they will still only have data until the end of April. When there’s a new month, they get an updated dataset covering the new “last month”. (It possibly takes a few days to actually happen, but you get the idea.)
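As an illustrative sketch only – the function names and the assumption that the feed covers exactly “up to the end of last month” are ours, not anything taken from the contract – the staleness this arrangement produces can be worked out with simple date arithmetic:

```python
from datetime import date, timedelta

def latest_covered(today: date) -> date:
    """Most recent date covered by a dataset that only runs
    'up to the end of last month' (assumption, for illustration)."""
    # The day before the first of the current month = the last day of last month.
    return today.replace(day=1) - timedelta(days=1)

def gap_days(today: date) -> int:
    """How stale the freshest record in the dataset is, in days."""
    return (today - latest_covered(today)).days

# On 16 May 2017 the freshest record is from 30 April 2017...
print(latest_covered(date(2017, 5, 16)))  # 2017-04-30
print(gap_days(date(2017, 5, 16)))        # 16

# ...and by 29 May the very same dataset is 29 days stale.
print(gap_days(date(2017, 5, 29)))        # 29
```

The point of the sketch is that the gap grows through the month: a patient admitted any time after the last monthly refresh is invisible to the dataset until the next one arrives.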

Streams will help you if you were in the Royal Free Hospital (RFH) last month. If you were there last week, however, the effect of the contract is that Streams could cause you harm – as Google’s app may prioritise longer-term patients it knows more about over newer ones it knows less about.

Such problems are why MHRA approvals and a proper testing regimen are so important. To be absolutely clear, this failure is not endemic to Streams – the DeepMind deal with Imperial does not contain it, for example – but it is a dangerous symptom of the DeepMind deal that has been found to be unlawful.

We’ll ask the National Data Guardian for clarity later today.


Tuesday 10am:

We’ve seen this piece being discussed: the article is correct about patients who were receiving direct care – but of the 1.6 million patients whose data it copied, DeepMind said in February 2017 that it had assisted in the direct care of just “more than 26”.

So while 27 records may have had a lawful basis, the other 1,599,973 didn’t.

It is the 1,599,973 records that are of concern here. Similarly, while there is not necessarily any problem with testing an app, testing an app isn’t the same as providing direct care. It is a separate process that DeepMind didn’t go through, as their interviews at the time made very clear (Note 6).


Tuesday 10am:

If Google DeepMind didn’t receive the letter containing the NDG’s finding, as they have said to medConfidential (after the date on the letter), they should have a chat with the Gmail team about such a convenient problem that no one else sees…

Even if that excuse was valid in the past, there are now plenty of copies of the letter on the internet, evidencing their unlawful behaviour – although Dodgy Donald from DeepMind might be in denial about even that.


Monday night:


Under the heading, ‘What we’ve learned so far’, a newly updated page on DeepMind’s website states:

There’s very low public awareness of NHS technology, and the way patient data is routinely used in the provision of care. For example, many people assume that patient records are normally stored on NHS-owned computers, when they are in fact routinely processed by third party IT providers. This confusion is compounded by the potential future use of AI technologies in the NHS.

medConfidential comment:

This response by Google shows that DeepMind has learnt nothing. There may well be lawful reasons for third party IT providers to process data for direct care for 1.6 million patients – unfortunately for Google’s AI division, developing an app is not one of them.

Google told the public as little as they thought they could get away with – and, still being duplicitous, they are trying to force the NHS into taking the blame for their mistakes.


Regarding the investigation by Sky News into the sharing of patients’ records, which begins:

Google’s artificial intelligence arm received the personally identifying medical records of 1.6 million patients on an “inappropriate legal basis”, according to the most senior data protection adviser to the NHS.

medConfidential comment:

Google’s lawyers are expensive, but “inappropriate legal basis” is still a euphemism for unlawful.

Buried in the interview footage is a statement from a nurse that the app is still in use with patients today. Also:

“The testing for the Streams app has now concluded and it is being used at the Royal Free Hospital, Prof Powis told Sky News, under a second agreement which is not being investigated.” (Sky News article)

Unfortunately for Google, their own press release from last November states that the same data is shared under both agreements.