
medConfidential rapid responses to DeepMind’s statements about their “legally inappropriate” data copying

We shall update this page as more information becomes available (newer items at the top).


Tuesday 11am:

Yet more questions raised about the usage of the Streams app

The Sky News footage shows that the Streams app is still in use, displaying patients’ details – their name, date of birth, gender, and other demographic information.

Where does that information come from? How does the ‘calendar gap’ affect patient care?

There are three possibilities:

  1. It comes via the first contract that has been found to be unlawful (with the calendar gap)
  2. The second contract is being breached (which also contains the calendar gap)
  3. There is a third secret contract hidden from scrutiny

Or Google’s AI team has come up with something else legally dubious to continue unlawfully copying 1.6 million patient records… which would suggest an Uber-esque approach to the law, and to safety.

What is the ‘calendar gap’?

The data Google DeepMind unlawfully copies runs up until “last month”. It is currently 16 May 2017, so at best the data they copy will run up until 30 April 2017. On 29 May, they will still only have data until the end of April. When a new month starts, they get an updated dataset covering the new “last month”. (It possibly takes a few days to actually happen, but you get the idea.)
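The mechanics of that gap can be sketched with a little date arithmetic (a minimal illustration only; the function names are ours, and it assumes the dataset refreshes exactly at month boundaries):

```python
from datetime import date, timedelta

def end_of_last_month(today: date) -> date:
    # The copied dataset runs only to the last day of the previous month.
    return today.replace(day=1) - timedelta(days=1)

def gap_in_days(today: date) -> int:
    # Days of recent patient history the app cannot see.
    return (today - end_of_last_month(today)).days

# 16 May 2017: at best, the copied data runs to 30 April 2017
print(end_of_last_month(date(2017, 5, 16)))  # 2017-04-30
# By 29 May the data still only reaches 30 April – a 29-day gap
print(gap_in_days(date(2017, 5, 29)))  # 29
```

In other words, on the day before a monthly refresh the app is working with patient data that is nearly a month out of date.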

Streams will help you if you were in the RFH last month. If you were there last week, however, the effect of the contract is that Streams could cause you harm – as Google’s app may prioritise longer-term patients it knows more about, over newer ones it knows less about.

Such problems are why MHRA approvals and a proper testing regimen are so important. To be absolutely clear, this failure is not endemic to Streams – the DeepMind deal with Imperial does not contain it, for example – but it appears to be a dangerous symptom of the DeepMind deal that has been found to be unlawful.

We’ll ask the National Data Guardian for clarity later today.


Tuesday 10am:

We’ve seen this piece being discussed: the article is correct about patients who were receiving direct care – but out of the 1.6 million patients’ data it copied, DeepMind in February 2017 said it had assisted in the direct care of just “more than 26”.

So while 27 records may have had a lawful basis, 1,599,973 didn’t.

It is the 1,599,973 records that are of concern here. Similarly, while there is not necessarily any problem with testing an app, testing an app isn’t the same as providing direct care. It is a separate process that DeepMind didn’t go through, as their interviews at the time made very clear (Note 6).


Tuesday 10am:

If Google DeepMind didn’t receive the letter containing the NDG’s finding – as they told medConfidential, after the date on the letter – they should have a chat with the Gmail team about such a convenient problem that no one else sees…

Even if that excuse was valid in the past, there are now many copies of the letter on the internet, evidencing their unlawful behaviour – although Dodgy Donald from DeepMind might be in denial about even that.


Monday night:


Under the heading, ‘What we’ve learned so far’, a newly updated page on DeepMind’s website states:

There’s very low public awareness of NHS technology, and the way patient data is routinely used in the provision of care. For example, many people assume that patient records are normally stored on NHS-owned computers, when they are in fact routinely processed by third party IT providers. This confusion is compounded by the potential future use of AI technologies in the NHS.

medConfidential comment:

This response by Google shows that DeepMind has learnt nothing. There may well be lawful reasons for third party IT providers to process data for direct care for 1.6 million patients – unfortunately for Google’s AI division, developing an app is not one of them.

Google told the public as little as they thought they could get away with – and being duplicitous, they still are. And, in so doing, they are trying to force the NHS into taking the blame for their mistakes.


Regarding the investigation by Sky News into the sharing of patients’ records, which begins:

Google’s artificial intelligence arm received the personally identifying medical records of 1.6 million patients on an “inappropriate legal basis”, according to the most senior data protection adviser to the NHS.

medConfidential comment:

Google’s lawyers are expensive, but “inappropriate legal basis” is still a euphemism for unlawful.

Buried in the interview footage is a statement from a nurse that the app is still in use with patients today. Also:

“The testing for the Streams app has now concluded and it is being used at the Royal Free Hospital, Prof Powis told Sky News, under a second agreement which is not being investigated.” (Sky News article)

Unfortunately for Google, their own press release from last November states that the same data is shared under both agreements.


 

[PRESS RELEASE] Google DeepMind unlawfully copied the medical records of 1.6 million NHS patients

“A core part of Google” has been told it has no lawful basis to process five years of patient data from the Royal Free Hospital in London. [1] With no legal basis, the data must be deleted.

In May 2016, the New Scientist reported [2] that Google DeepMind had access to a huge haul of patient data, seemingly without appropriate approvals. In July 2016, the MHRA confirmed [3] that DeepMind had not received any approvals for a trial involving patients, using patient data. In November 2016, DeepMind signed a replacement contract covering exactly the same data. [5d]

The National Data Guardian has provided a view on this matter (all emphasis added): [1]

The Royal Free “…confirmed to us [NDG] that 1.6 million identifiable patient records were transferred to Google DeepMind and that implied consent for direct care was the legal basis for the data processing.

“…Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.”

It is unclear whether Google DeepMind has complied with the finding that it had no legal basis for processing this data; nor is it clear what it was that first attracted DeepMind executives to unlawfully copy 1.6 million people’s medical records, repeatedly insisting on direct care as the sole legal basis. [8]

medConfidential agrees with the Information Commissioner, when she said in a speech to technology companies: “I do not believe data protection law is standing in the way of your success.” She reminded her audience: “It’s not privacy or innovation – it’s privacy and innovation.” [4]

In this case, this DeepMind project turned out to be neither of those things. [9]

The National Data Guardian’s investigation has made clear – despite their claims to the contrary – that DeepMind had no legal basis for their actions in this project.

medConfidential coordinator, Phil Booth, said:

“This letter shows that Google DeepMind must know it had to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.

“Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.

“While the NHS sent doctors to a meeting, DeepMind sent lawyers and trained negotiators. What this boils down to is whether Google’s AI division followed the law and told the truth; it now appears they may have done neither.

“As events this weekend have shown, it’s the number of copies of patient data that matter – encryption locks won’t reassure anyone, if the wrong people have been given the keys.”

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on coordinator@medconfidential.org

Notes to editors

1) “The NDG has provided a view on this matter to assist the ICO’s investigation” was the National Data Guardian’s comment on the publication of the University of Cambridge paper, ‘Google DeepMind and healthcare in an age of algorithms’: https://link.springer.com/article/10.1007%2Fs12553-017-0179-1 and http://www.cam.ac.uk/research/news/deepmind-royal-free-deal-is-cautionary-tale-for-healthcare-in-the-algorithmic-age

Sky News published a copy of the letter from the National Data Guardian on 15 May 2017: http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

2) medConfidential raised a complaint [4] to the ICO following reports in the New Scientist, and follow-ups elsewhere, about secretive data use by Google DeepMind:

a) New Scientist, 29/4/16: https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/

b) New Scientist, 13/5/16: https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/

c) Daily Mail, 4/5/16: http://www.dailymail.co.uk/news/article-3573286/NHS-trust-handed-private-patient-details-Google-says-implied-permission-emerges-hospital-talks-internet-giant.html

d) BBC, 19/7/16: http://www.bbc.co.uk/news/technology-36783521

e) Guardian, 6/5/16 (note 9 May & 25 July updates at the bottom of the article): https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder

3) “DeepMind is currently working with the MHRA to ensure that the device complies with all relevant medical device legislation before it is placed on the market” – TechCrunch, 20/7/16: https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

4) Information Commissioner’s speech, ‘Transparency, trust and progressive data protection’, 29 September 2016: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2016/09/transparency-trust-and-progressive-data-protection/

5) medConfidential’s complaint is available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

d) The end of the first ‘Note to editors’ in a press release from the Royal Free Hospital on 22 November 2016 clearly states: “The new agreement does not change the number of patients whose data will be processed by Streams”: https://www.royalfree.nhs.uk/news-media/news/nhs-and-technology-leaders-agree-groundbreaking-partnership-to-improve-safe/

6) Claims by the New Scientist have been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery, (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

i) …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

7) The claim to reach “HSCIC level 3” was a self-assessment by DeepMind, which was revoked upon examination. [See the 25 July update to this Guardian article].

8) In a controversial press release by the hospital on 24 February 2017, the word “Google” did not appear once, despite point 6 (a)(i) above: https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/ – while a subsequent Guardian article on 9 March 2017, based on a press release by Google DeepMind, explicitly attributes actions to Google DeepMind: https://www.theguardian.com/technology/2017/mar/09/google-deepmind-health-records-tracking-blockchain-nhs-hospitals

9) “ “With health data, and government acquired health data, we need to be sure we aren’t, in effect, giving oxygen away for free to a private company that will start to sell it back to us,” says Azeem Azhar, who writes the popular Exponential View newsletter…” – Quartz, 17/3/17: https://qz.com/934137/googles-goog-deepmind-got-too-much-health-data-from-britains-nhs-research-paper-says/

– ends –

medConfidential comment on “NHS” cyberincident

Regarding the ongoing [time of writing: 18:30, 12/5/17] international cybersecurity incident, currently affecting – amongst many others – a number of NHS hospitals and GP practices.

Phil Booth, coordinator of medConfidential said:

“medConfidential has confidence in clinicians continuing to treat their patients, and in GCHQ’s incident response – as has been demonstrated in previous similar incidents. Unfortunately, we also fully expect NHS England’s analogue administrators to continue their tailspin, learning as little from this event as from any other in the real world.”

Notes to editors

1) Dame Fiona Caldicott’s ‘Review of Data Security, Consent and Opt-Outs’, published in June 2016, made important points about NHS cybersecurity. As of the snap General Election, the Government had yet to publish its response to the Review: https://www.gov.uk/government/publications/review-of-data-security-consent-and-opt-outs

2) NHS Digital has run a programme called CareCERT since September 2015, partnering with agencies including CERT-UK, CESG and CPNI. One of CareCERT’s core functions is “national cyber security incident management”: http://content.digital.nhs.uk/carecert

3) At the time of writing, this does not appear to be a ‘NHS-only’ incident; there is evidence of similar issues arising in Telefonica, in Spain: e.g. https://www.bleepingcomputer.com/news/security/telefonica-tells-employees-to-shut-down-computers-amid-massive-ransomware-outbreak/ & https://www.thestreet.com/story/14132953/1/britain-s-national-health-service-suffers-cyber-attack-spain-s-telefonica-hit-in-similar-incident.html

4) At the time of writing, the vulnerability appears to be one exploited by leaked intelligence-agency tooling made public earlier in 2017. Microsoft shipped a patch for it in March 2017: https://technet.microsoft.com/en-us/library/security/ms17-010.aspx Serious questions need to be asked of those responsible for maintaining Windows-based IT systems, who failed to patch their servers for two months.

Patients’ Data in the Party Manifestos

What medConfidential will be looking for in every party’s Manifesto is rather simple:

    Will patients know how their medical records have been used?

A straightforward “Yes, they will” or “No, they will not” will suffice.

Every flow of data into, across and out of the NHS and care system should be consensual, safe, and transparent – there need be no conflict between good research, good ethics and good medical care.

We shall provide more detail on how this relates to current issues like Genomics and AI in due course – but the question to which there must be a clear answer, for whatever the future brings is: Will you know how data about you is used?

We would like the Government to honour its commitment to a statutory National Data Guardian; to closing the commercial re-use loophole, putting promised safeguards and sanctions in place, and implementing transparency on all data flows across the health and social care systems – but we understand these are details that may not make the cut for a short manifesto.

Given all that has happened, would you trust any party that fails to commit to you knowing how your medical records have been used?



On what principles will data be used in the Single Government Department?

Whitehall proceeds step-wise, and ever more rapidly, towards an end state of a “Single Government Department”. This is Sir Humphrey’s decades-old vision of “Joined-up Government”, predicated upon Government doing whatever the hell it likes with your data, wherever and however it gets it, in flagrant disregard of Data Protection and Human Rights, Articles 8 and 14 (at least). User needs or departmental needs?

In a world where there’s a single Government (and government) data controller – the Data Controller in Chief; the Prime Minister, the final arbiter – will a single Department’s policies, practices and prejudices determine policy across the whole of Government?

We don’t see how it doesn’t.

It may be useful to begin with an NHS analogy. It’s a gross simplification, but it carries the necessary meaning.

There are multiple hospitals in Manchester – Royal Manchester Children’s Hospital, Manchester Royal Infirmary, and St Mary’s – all on the same site, with interconnected modern buildings, all built at the same time. Why are there three hospitals? Because when the new buildings were constructed, and everything was consolidated on one site, though treating them as a single hospital would seem most sensible, that would (to many people) effectively be “closing two hospitals”. Hence, there are three.

What about Government departments?

In a Britain with a single government department, what is currently the Home Office – with its particular approach (covered elsewhere) – will go on the rampage across all areas of everything.

For how will weaker policy goals be defended against stronger ones? “It’s a matter of national security, don’t you know…”

Clause 38 of the Digital Economy Bill “solves” this problem by simply ignoring it – those with the highest bureaucratic power will win the fight; we’ve seen this already with the Home Office demanding the names and addresses (p16) of patients – and it’s quite clear they’d have grabbed everything if they’d wanted to.

In this context, with the Digital Strategy of DCMS, and Cabinet Office’s warmed-over Government Transformation Strategy in play, what should happen to make the world they’re trying to build safe?

The greatest concerns must be with the Transformation Strategy; the current “ethics framework” suggested by GDS (the part of Cabinet Office responsible for writing the Strategy) is so flawed, for example, that it suggests a Privacy Impact Assessment can fit on a single sheet of A4 – the self-same strategy used to justify care.data, relying on NHS England’s public statements. Thus far, the country has been saved from a systemic collapse in trust by the fact that this “ethics framework” isn’t actually used by departments.

So what’s the alternative? A citizen view of Government.

Government insists it should be able to copy our data – whatever it wants, wherever and whenever it likes – including to its commercial partners, e.g. Google (or rather, Alphabet) DeepMind and Palantir, for whatever policy whim catches the interest of any particular official. Proportionality and public acceptance are irrelevant; these are not what the civil service is set up to do.

As we saw with DeepMind at the Royal Free Hospital, one person with power can torpedo years of careful and diligent work in order to meet their own short-term, narrow perspective, self-interested goals.

The single Government department makes this worse, if left unaddressed. What should replace it is a citizen view of Government.

This conversation has never been had. The discussions that have been facilitated were designed to get to the pre-conceived end state of the Cabinet Office. As such, the answer was given and civil society time was wasted on a ‘debate’ that was entirely pointless; any wider opportunity to improve the use of data in Government through the Digital Economy Bill was lost.

As an example, well-defined APIs might work for departments – but if departmental silos weaken (as is the explicit goal: “to remove barriers to data sharing”) then things begin to fail. Citizens should not have to rely on how Government talks to itself.

The start of the conversation has to be with complete transparency to citizens – with the likes of Verify and public bodies being accountable to the citizens they work for. Citizens can now be shown what data is required for their transactions, where it will be accessed from, and why. Operational decisions should inform democratic debate, both by policy makers and by citizens who wish to engage with the services that affect them.

Civil servants all work for the Crown and not the public – whatever ‘flavour’ of Government is in power – and this may be a tension that needs consideration. What happens when the political will meets the public won’t? How is trust in institutions maintained?

Because without action, continued secrecy and the drip drip of cockup will undermine all trust.

This works in practice

Fortunately, some NHS GPSoC IT Providers (the data processors who provide IT systems to your GP) have taken the lead in fixing the systems from within the system. How many decades will it take Whitehall to catch up?

We have already demonstrated what this looks like – with Verify and other tools.

Rather than a “single government department”, the principle should be a “Citizen View of Government” – where every service a citizen has touched can be seen, with accountability for how they used data and why. This would make Government accountable to the citizen, as it should be – without the citizen having to understand the intricacies of how Government works.

In a “Citizen View” world, whether Government is one Department or many doesn’t matter as much. If civil servants want to justify access to data, they can – but they must be aware that citizens will be told what data and why, and might become unhappy about it if the reasons aren’t just.

Any Government that fails to tell its citizens what it is doing and why, or which doesn’t really want them to know, will not be wanted in return – as the EU discovered with Brexit. This is what the open policy making process should have prepared the groundwork for; the price of that failure keeps going up as digital continues its march.

Unless we wish to treat data about human beings with less care than we treat the data about carcasses in our food supply chain, ‘Globalisation 2.0’ will be based on registers and code – determining risk and eligibility for consumers and for regulators. This simply does not square with a world of copying data; it can only work in a world of APIs to data where there is a lawful, published case for each access, grounded in fundamental accountability to citizens about their data.

It is obvious that data about the food we eat should not be locked in a filing cabinet in Whitehall. It should be equally obvious that “taking back control” shouldn’t mean giving every civil servant a copy of all the data on every citizen.



The Home Office: Secretive, Invasive, and Nasty

In various guises, those who coordinate medConfidential have been dealing with the effects of Home Office missteps for what now in total amounts to decades.

Liberty Human Rights Awards 2010

Here is some of what we have learnt:

The Home Office is the part of Government that must confront and ‘deal with’ the absolute worst in the world: murder, rape, terrorism, paedophilia – the stuff no-one really wants to have to know about; things from which civilised people prefer to turn their eyes. There are obvious – and legitimate – reasons why some of what the Home Office does must be confidential or classified.

The people who we task with dealing with these terrible issues deserve to work in a culture of compassion and competence, with solid foundations in Justice – the current Home Office has none of these.

Secret: Hiding errors in a file marked Secret harms the public good.

As can happen with bureaucracies more generally, the hint of secrecy at the Home Office has spread into an all-encompassing security blanket around any information that might be helpful to an informed debate in a democracy.

Treating information about every offence and misdemeanour as if they were the worst, keeping arbitrary secrets, and hiding your actions while telling others they must simply trust that “It’s for your own good” are the actions of someone who has lost perspective. Lost a sense of proportion. And lost the ability to discriminate, except in the prejudicial sense.

The examples of this are countless – from Ministers’ refrain of “trust us” about the ID scheme to “We know but we can’t tell you” about the Communications Data Bill; from petty refusals to extreme resistance to simply ignoring requests for information; and as evidenced by the secret ‘National Back Office’ in the NHS, only exposed in 2014, when Sir Nick Partridge reviewed what happened in the building where the previous-but-one Home Office-administered ID card scheme ended up.

Worse than that, on getting information via a backdoor into people’s medical records, the Home Office wrote in secret to people’s doctors, telling them to deny treatment.

Invasive: No consideration of innocence, or the consequences of action

The political culture pervading the Home Office has led to an organisation which cannot consider side-effects.

It sent round “Go home” vans because they might contribute to a “hostile environment” for illegal immigrants, without any regard to the effects of that hostile environment on innocent parties.

And it’s lost the ability to discriminate: to Home Office, everything it looks at is a crime – or a potential crime – so it is prejudicial towards everyone.

In being unable to discriminate between ‘crimes’ – including thoughtcrime, and perfectly normal behaviour, such as trying to keep your personal communications private – Home Office discriminates wildly and inappropriately against whole classes of people, and individuals who have in fact done nothing (or very little) wrong.

And, in pursuit of its obsessions, it considers nowhere, and nothing, sacred (Q78).

If it will not respect the boundary of the confidential relationship between you and your doctor, where is it that you believe the Home Office will not go?

In this world view, the entire country gets treated the way the Home Office treats illegal immigrants (which it claims is “respectful”!) and – after many attempts, including a RIP Act that for years emboldened nasty, technocratic petty-mindedness down to the local council level – it has finally got its Investigatory Powers Act, so it can snoop on all our communications data.

Nasty: Fear breeds paranoia, and suspicion is contagious

Bullies are fearful. They don’t always appear to be – especially when they get themselves a gang. But you can tell bullies by the way they pick on people, and who they pick on; the weak, the odd, the vulnerable. People who can’t put up a fight.

The Home Office delivers little itself; it cannot act directly in many of the areas for which it is responsible. For these areas of concern, it develops policy, dispenses budgets for various programmes, commissions systems, lobbies for legislation, and other things – but it assumes everything will fail, which leads to suggestions like a 15 foot high concrete wall around Parliament: “Operation Fortress Commons”.

But the few things it can do corrupt everything. It tries to turn everyone it leans on in every part of the public services into a border guard, or a snitch. Demanding the Met hand over details of those who witness crimes makes everyone less safe – if you are the victim of a crime, you want those who know something to share what they know with the police, without fear that it may be used against them. In this case, the hostile environment is hostile against innocent victims of street crime, because the Home Office has harmful priorities.

There are countless examples of each of these, which will appear over the course of the campaign. Some of them will even come from us…

The Home Office has been responsible for a string of high profile, national embarrassments in recent years. Flawed decisions by Home Office led to national humiliation at the opening of Terminal 5 – ever wondered why the baggage handlers couldn’t get to work? The shameful disarray of the G4S contract for the Olympic Games, from which Home Office had to be rescued by the military. The collapse of many criminal trials because policy at SOCA and NCA was simply unlawful. The harm to the UK’s economy, and international reputation, from the wrongful deportation of 48,000 students – because the Home Office panicked after watching a TV programme. And the harm to public safety and public confidence.

Shorn of Justice, the Home Office has lost touch with humanity, proportion, and the fundamentally positive spirit of Britain. Human Rights are pretty much all that protects you from excesses or mistakes by the Home Office.

How does this relate to the NHS and privacy?

The greatest hazard in this election comes not from Brexit, but from the deeper, more insidious threat to the autonomy of every citizen from the State. The Home Office worldview forgets the one that created the NHS: that no matter how dark the world, there will always be people there helping.

In a Brexit world, the Home Office worldview offers the NHS just three choices: be nasty to ‘brown people’; be nasty to everyone; or ID cards. These are the only choices its worldview can see, while the perspective of the NHS is quite simple; healthcare, free at the point of use, for all those in need. Without discrimination.


medConfidential Bulletin, 21st April 2017

Though the political focus is on the General Election, the ‘STP shuffle’ remains highly significant. Whatever the result in June, both funding and decision-making for health and care services will be increasingly devolved to local areas.

What’s happened? General Election!

What medConfidential will be looking for in every party’s Manifesto is rather simple:

    Will patients know how their medical records have been used?

A straightforward “Yes, they will” or “No, they will not” will suffice.

Every flow of data into, across and out of the NHS and care system should be consensual, safe, and transparent – there need be no conflict between good research, good ethics and good medical care.

We shall provide more detail on how this relates to current issues like Genomics and AI in due course – but the question to which there must be a clear answer, for whatever the future brings is: Will you know how data about you is used?

Update on DPA Section 10 notices

Last December, NHS Digital and Public Health England (PHE) were sent hundreds of Section 10 Data Protection Act notices by patients who had opted out, insisting that their data should not be sold – even through a loophole.

Though there were some ‘boilerplate’ responses, both bodies effectively ignored every single one of those notices. Patients’ data continues to be sold for commercial re-use, and further problems have emerged:

  • PHE considers itself exempt from existing opt-outs; will it make you opt out again?
  • What about the NHS? Will the Government’s response to Caldicott 3 force yet another opt-out?

It is understood the Caldicott Consent Model should include overrides – and some exceptions, where required by law – but this should not be at the whim of Public Health England, which still copies patient data to companies in secret. PHE said it was becoming transparent, but its own actions give the lie to this – and still it demands more data.

If you want to know public health information about your area, PHE thinks you should use a site called “fingertips” – which gives you a mountain of statistics, a trowel, and suggests you start digging. If you want to see the biggest public health issues in your area, you may want to try this list instead.

Speaking of digging…

Questions for the elections; what is your lived experience of the NHS?

With STPs and financial devolution on the way, it’s the candidates who are elected in your area who’ll be making decisions that will impact directly on your, your family’s and your community’s health and care services – and the exploitation (or not) of your medical records.

In the run-up to the elections, all you need do is ask the people who canvass you some straightforward questions, share some of what you know from your own experience, and put up a poster to encourage your neighbours to do the same. Here are our suggestions:

  • Does [the candidate] agree that everyone should be told how the council and NHS use their data?
  • Given the political choices that are changing the NHS in your area, how would your own or your family’s past experience of the NHS have been different?
  • What are [the candidate]’s priorities for reducing problems that put a strain on your community’s NHS and care services?

If you get answers, please do post them on Facebook and in other appropriate forums, so others can see them too.

Phil Booth & Sam Smith
21st April 2017

medConfidential Bulletin, 9th April 2017

Where does your data go? And do you know? These are questions to which we’ve been getting you answers for three years or so, but now you have an opportunity to ask these questions too… Local elections are coming up, and political parties want your vote…

But first:

What just happened?

In a 280-page PDF from NHS Digital, there is one item worth noting: “Programme 12: General Practice Data for Secondary Uses” (item C4 on page 56), with a deadline of this Christmas. This is – as far as medConfidential is aware – the first public sighting of… the return of care.data

So, although the Government has yet to issue the necessary CAG Regulations; or ‘one strike and you’re out’ sanctions for data misuse or abuse; has failed to close the “promotion of health” (i.e. Pharma marketing) and commercial re-use loophole; still hasn’t put the National Data Guardian on a proper statutory footing, let alone responded to the Caldicott 3 review; is mute on whether you will have to opt out again, and whether cancer patients will have their data copied anyway; and wants to copy data to any Government department under the Digital Economy Bill; it seems someone is eager to flood the “National Data Lake” we mentioned in our last bulletin.

What’s happening next?

Unless you pay close attention to NHS internal meetings, you could be forgiven for knowing little about how the NHS talks to itself, but the 44 Sustainability and Transformation Plans (STPs) are the jargon for a new NHS reorganisation that really matters. To you.

The NHS England website describes them as follows:

NHS organisations and local councils are developing shared proposals to improve health and care. Working in 44 geographical areas covering all of England (called ‘footprints’), the plans are led by senior figures from different parts of the local health and care system.

It is this top-down-mandated, bottom-up-driven restructuring into STP “footprints” that has led to the mega-CCG mergers in Manchester, Lancashire, and Liverpool, with more mergers planned in other cities of the North, and across the rest of England (e.g. in Buckinghamshire).

Why you should care is that this ‘STP shuffle’ will put your local council in partial control of where your medical records get copied – including how much of your personal data will end up being dumped into a “national data lake”.

In ducking responsibility, as they have since care.data started, NHS England claim all decisions will get made “locally”, but they can choose to send more cash for more data…

What can you do?

If you have elections in May, some of the candidates will end up choosing who sits on your local Health and Wellbeing Board. That will be the body that chooses how your area’s health budget gets spent – what gets funded, what gets cut, and what medical records they copy to the Data Lake in return for more resources…

Given this, we suggest you ask your council candidates a few questions that might help them focus on the issues and evidence, and then help you and your community decide who’s paying proper attention to the impacts on your health and care, and medical confidentiality:

  • Community: Do they agree that you should be told how the council and NHS use your data?
  • Contribution: For the political choices that are changing the NHS in your area, how would your own or your family’s past experience of the NHS have been different?
  • Autonomy: What are their local priorities for reducing problems that put a strain on your local NHS?

If you get answers, please post them on Facebook and other appropriate forums, so your neighbours can see them too; here are some ‘localised’ posters you can print out to help you.

If you’d like us to send you some, we’re offering five A3 posters for a £5 donation – when sending us the money, just add a comment with your address and we’ll send you posters for that postcode. (N.B. If you don’t add the comment, we won’t see your address.)

We’re glad to see a number of you are quite happy with our new badges (with text | no text) and are immensely grateful for the £20 donation medConfidential gets every time someone buys one. Thank you.

More next time on who wants to go fishing in the National Data Lake…

Phil Booth & Sam Smith
9th April 2017

What are the principles that should underlie a login infrastructure of a digital NHS?

DH / NHS Digital’s name for the work they are doing on patients identifying themselves digitally is the “citizen identity” programme – a name which demonstrates the fundamental misunderstanding of the problem that needs to be solved. They expect to launch in September (item A1, page 54).

Designed after the Home Office ID cards scheme was abolished, the Government’s generalised login solution is an implementation of the ID Assurance principles, usually called GOV.UK Verify. It allows a range of different “identity assurance” providers to let patients log in to a wide range of different services, without creating an overarching “database” of anything. There are lots of constituent parts to Verify, all working together, underpinned by a set of principles that are accountable to an independent advisory group and Ministers. The principles are separate from the infrastructure design, which is separate from the deployment for GOV.UK.
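The federated pattern described above can be sketched in a few lines. This is a deliberately simplified illustration, not GOV.UK Verify’s actual protocol (which is SAML-based, with public-key signatures); the function names, the shared HMAC key, and the payload shapes are all hypothetical. The point it demonstrates is the architectural one: the identity provider attests, the relying service verifies, and the hub in the middle routes messages without keeping any record of them.

```python
import hashlib
import hmac
import json

# Hypothetical key, known to the identity provider and trusted verifiers.
# A real federation would use asymmetric signatures instead.
IDP_KEY = b"identity-provider-secret"

def issue_assertion(user_ref: str) -> dict:
    """Identity provider: attest that a user has been verified."""
    payload = json.dumps({"user": user_ref, "verified": True}, sort_keys=True)
    sig = hmac.new(IDP_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def route(assertion: dict) -> dict:
    """Hub: pass the assertion through. Crucially, it stores nothing."""
    return assertion

def accept(assertion: dict) -> bool:
    """Relying service: check the signature before trusting the payload."""
    expected = hmac.new(IDP_KEY, assertion["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["sig"])
```

Because the hub holds no state, there is no central list of who used which service – which is the property the Identity Assurance Principles are designed to protect.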

The NHS should agree to follow the Principles, and make its own deployment of the infrastructure, basing an ‘identity’ assertion in a pre-existing legitimate clinical relationship.

As a model, this would be closely aligned to the current NHS model for patients logging into services, known as Patient Online, for which GPs distribute logins – as they know who patients are, and can manage the exception handling (lost passwords, verification, edge cases, etc.).

Meanwhile, over in the database state corner, there are still projects looking to build a centralised login infrastructure for all digital health services, derived from a legal document – such as a passport, driving licence, or tax payments.

Identity Assurance Principles

The PCAG Identity Assurance Principles should apply to the NHS login infrastructure, and be overseen in a similar way. A patient has a choice over which GP they wish to use, which provides for the choice of identity provider. Due to the range of conditions handled by the NHS, it may in practice be clinically necessary to deny a patient choices they would otherwise have in principle – but only for clear clinical reasons.

It is initially convenient that the Principles and the current mechanism for handing out usernames and passwords to patients across the NHS (i.e. GPs) align extremely well. There will need to be work on the infrastructure middleware layers of the system, but the Patient Online programme – giving details to already identified patients – has already begun, and begun at scale.

Whatever system is used must accommodate and enable patients who wish to keep some aspects of their treatment entirely disconnected from other aspects. Whether this is via one login for all NHS services, or for particular areas carved off, should be entirely under the patient’s control, and not be restricted by NHS technical decisions.

Technically, this is not difficult. The infrastructure has already been designed and built by GOV.UK, and that code can be reused. Whether NHS Digital reuses the Cabinet Office servers and operations team is primarily an operational question.

As a political and Governance framework, the principles may be hard – and digital identity governance doesn’t currently exist in the NHS – but it does exist in PCAG. PCAG should therefore be asked by DH to assess whatever the NHS implementation is, against the PCAG principles. This will require some complex conversations, and learning on all sides.

The standards and code are copied; the principles are accepted; the identity providers and service-acceptance standards are NHS-specific.

Absent leadership from DH, this could be almost impossible. It is absolutely vital that this delivers, and delivers fast, in order to realise significant savings in the NHS Budget. Those who control the budget are not necessarily the people who are capable of delivering quickly, nor are their interests necessarily served by a solution with strong governance.

The NHS expects to find £1bn a year in savings from reducing missed appointments via a better digital Choose and Book. The service already exists – there is simply no way to log into it easily. Let us say that again: £1 billion in short-term savings, simply from the NHS having a proper digital infrastructure.

Patient Online works for assigning patients with usernames and passwords, based on a clinical relationship, and Verify’s infrastructure has been shown to scale. Ad hoc identity approaches have been shown to fail.

Should passwords get stolen, the Patient Online system can include an additional factor: needing to know the GP for which a stolen username/password is valid. It is likely that username/GP lists are rarely stored (other than by the GPs themselves) which gives the NHS regulatory assistance unavailable elsewhere.
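That ‘GP as an extra factor’ idea can be sketched as follows. This is a minimal, hypothetical illustration – the names and data structures are ours, not any real NHS system’s – but it shows the property described above: because each practice holds only its own credential list, a stolen username/password pair is useless without also knowing which practice issued it.

```python
import hashlib
import secrets

def _hash(password: str, salt: bytes) -> bytes:
    # Standard salted password hashing (PBKDF2 from the stdlib).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Each practice holds only its own credentials: {practice: {username: (salt, hash)}}
practice_credentials: dict = {}

def register(practice: str, username: str, password: str) -> None:
    salt = secrets.token_bytes(16)
    practice_credentials.setdefault(practice, {})[username] = (salt, _hash(password, salt))

def login(practice: str, username: str, password: str) -> bool:
    # The practice is part of the lookup key: wrong practice, no match.
    record = practice_credentials.get(practice, {}).get(username)
    if record is None:
        return False
    salt, stored = record
    return secrets.compare_digest(stored, _hash(password, salt))
```

In this sketch there is no national username table at all, which is exactly why, as the paragraph above notes, an attacker with a leaked password list gains little.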

Here is our demonstrator: if you have a login for your GP, feel free to try the blue buttons.

There is no reason for any part of the NHS to have a big list of all of the services a patient has used.

If the current world view persists, initiatives like the excellent SH24 projects, and a digital Dean Street Clinic, are going to remain services that cannot function at scale – because there will be no national infrastructure for them to reuse. A Verify-based governance model can provide that infrastructure, and such services would also be able to issue their own usernames/passwords, since they deal directly with patients, as GPs do.

Our demonstrator is purely a proof-of-concept. NHS England could have published, in machine-readable form, the login page for each GP, but for some reason didn’t see the need. #NHSAlpha, who could have got them to do it, instead wanted to own the database – so started work on the impersonation problem. Badly. These are both things that other parts of the NHS handle every day, and DH can only do worse at greater expense from afar. It is concerning that the NHS ‘technical silos’ have not recognised that this is a system which can be encouraged, and instead see it as a technical problem with a technical solution.

There must be better governance around logins and how digital health information accesses are run. The PCAG principles are the beginning of that discussion, not the end. GOV.UK Verify relies heavily on passport verification, and the issuance of passports relies heavily on NHS-derived data. It would be perverse to go round that entire loop in order to issue a GP login, when the GP is also someone relied on to prescribe mind and body altering substances. But along the corridors of DH and NHS England, there are a handful of people muttering “My Precioussss”, while trying to forge a database state for medical information – worse than that, none of the projects are actually talking to each other.

Sound familiar?

care.data lessons, unlearned

The HSCIC (the statutory body otherwise known as NHS Digital) has a form you can fill in to opt yourself out of the various HSCIC datasets; the form is 12 pages long. The equivalent form, ready to be handed to your GP, is one side of A4 and contains just 2 tick boxes – plus space for your name, address, etc.

The HSCIC 12-page form has those same tick boxes, but the other eleven and a half pages are all about verifying identity, so that a remote institution that very, very rarely deals directly with patients knows that the right person filled in the form.

A process, done at the wrong level, can generate that much extra paperwork.


Summary:

Extend Patient Online using GOV.UK Verify’s infrastructure design, augmented with the following features:

  • Only those organisations that have a clinician-patient relationship should be responsible for issuing identity credentials to individuals;
  • NHS Patient Identity should follow the PCAG principles;

The identity requirement for the NHS is not a citizen identity, but a patient identity – even if the patient is entirely healthy.

Here are some available next steps.

medConfidential Bulletin, 24th March 2017

It has been a while since we last sent a newsletter. Our apologies for that, but we have been kept busy!

We are entering a period where a lot of things are happening – and are likely to happen – in quick succession, so we wanted to provide a perspective and some context that we hope will help explain at least some of what is going on.

For patients whose practices use TPP SystmOne

You may have seen the note on our website last week about TPP SystmOne. TPP has now updated its system with the capacity to allow your GP to tell you how your GP-held data has been accessed. However, busy GPs won’t yet know how to turn that function on, as the documentation has not yet appeared (and we’ve not been told either).

If your practice uses TPP SystmOne, also branded SystmOnline, and you are able to log into your GP practice online (i.e. if you have a username/password for online access) then you may be able to see this option – to review the organisations which have accessed your GP data – right now. If not, check back in a week or two. It is coming.

This ability to see who has accessed your GP data matters, as the hard part of informed consent is actually being informed about how your medical records are used. As the NHS evolves over time, and while you have a range of consent choices, you need accurate information to be able to make those choices for yourself and your family; in your situation, according to your concerns.

Problems tend to arise when people other than those directly affected take decisions that do not – indeed, cannot – account for many millions of people’s individual circumstances.

Google Artificial Intelligence (AI) subsidiary DeepMind

When in a hole, it seems some AIs will keep digging.

medConfidential’s complaint against Google DeepMind’s use of 1.2 million patients’ hospital data continues to be investigated. The National Data Guardian appears to have come to a view some time ago – which suggests the question currently under consideration is how badly Google broke the rules.

A long analysis from the University of Cambridge was published last week, which goes through the entire sorry story in a great deal of detail.

We do not know when the Information Commissioner and National Data Guardian will publish their findings, but fully expect Google DeepMind to leak some parts of those findings to sycophantic outlets the day before…

We shall respond, as we always do.

What’s next? An NHS reorganisation that really matters

Has your area announced the reorganisation of your NHS yet? For several big cities of the North, and some other parts of the country, the picture is getting clearer. The ‘STP shuffle’ will put your local council in partial control of where your medical records get copied – including whether they end up being dumped into a “data lake”.

In hidden meetings, proposals for a “national data lake” continue to be discussed. While NHS England denies it is their current plan, they continue to write regular drafts of an updated document, which they’re sharing with no-one beyond those people who thought a ‘National Data Lake’ was a good idea in the first place…

In our next Bulletin, we hope to have something for you to do to help your community, and may also give an update on the continuing failures around data at Public Health England.

As ever, we are grateful for your donations. Especially as, right now, we’re being legally threatened (we’re in the ‘letters before action’ stage of an attempt to sue us for defamation) for expressing our concerns about a data breach reported as affecting 26 million patients – that’s a lot of new badges.

(We’re aware that, as badges, our button badges in two new designs are ridiculously overpriced. The price point is deliberately chosen so that a donation of £20 to us gets you one, automatically. Or set up a regular subscription for any amount – and we’ll post it to you.)

Thank you.

Phil Booth & Sam Smith
24th March 2017