Category Archives: News

Decision making by the Information Commissioner

The Information Commissioner’s Office operates on legal realities, i.e. “What is currently the case?”. This explains why the ICO may enforce at one minute past midnight on the day a programme comes into force, but not before. It can be infuriating, but that is what a regulator is empowered to do.

“Being legal” is a binary state – something is either legal or it isn’t.

If there is a way in which a situation, scheme or system is legal, and no way in which it is illegal – even if there are many ways in which it is really creepy – then it is still legal. This is often infuriating in the private sector, but the public sector operates in a very different environment: most of the time, public sector bodies don’t get to operate in ‘stealth mode’. In the private sector, the ICO by and large regulates against dishonesty rather than for good data hygiene. The public sector is held to a higher standard.

Either way, before 00:01 on the first day of operation, the ICO operates only on scenarios, or possibilities.

You can in fact put a scenario to the ICO and, while its officials don’t necessarily like hypotheticals, they will offer an opinion based on what you have said.

What most people fail to understand is that ICO decisions are based exclusively upon the scenario (or evidence) as presented to it.

If you tell the ICO that you will do X, and its officials suggest that X is most likely legal, then that opinion will simply not apply if at 9:12 am on the second Thursday of the following month it turns out you instead do X plus Y; that is a different scenario.

Clearly, if you miss out critical information from the scenarios you present, then the ICO’s opinion cannot and does not reflect what you are actually doing; it only reflects what you say you are doing. Remember, the ICO operates on reality – which is why it can only enforce at 00:01 on the first day of operation.

Where the ICO issues “contradictory advice”, it is almost always because the information it was presented with changed.

In a hypothetical scenario, when the scenario changes, the ICO reserves the right to change its mind. What else would it do?

If ICO officials “change their minds” when presented with what is ostensibly “the same” information, it likely demonstrates that – in the ICO’s opinion – material information was omitted the first time.

For example, care.data’s communications programme collapsed because what NHS England told the ICO turned out to be incomplete – when other information was added, and checked against reality, what NHS England said it would do, and what it actually did, were shown to be materially different.

If you want to understand why the ICO changes its mind, the best place to start is with what you didn’t tell its officials, that someone else did.

A first look at the Manifestos

For the party manifestos, medConfidential had a single request:

Will patients know how their medical records are used?

How did the parties respond? (Remembering that the Conservatives are in Government, so should have more detail than the opposition parties.)


Conservative Manifesto

Quite a bit of good news, if the currently most likely next Government remembers what it said:

“We will put the National Data Guardian for Health and Social Care on a statutory footing to ensure data security standards are properly enforced.” (p80)

The NDG’s statutory footing should be based on Jo Churchill’s Bill (our view) which was published before the election. While the Government didn’t enable the Bill to go to Committee, putting it as a Part in the forthcoming Data Protection Bill (mentioned elsewhere in the manifesto) should not be controversial. Allowing the Data Guardian to ‘follow the data’ means that public health copies of NHS data are also covered, and therefore can be properly consented.

 

“We will give people new rights to ensure they are in control of their own data” (p79)

It’s impossible to control what you don’t see – so a citizen’s view of Government data use (or a patient’s view of the uses of their medical records, or a customer’s view of commercial data use) is a prerequisite for control.

Whether “control” means taking back control from those who would copy data “for the greater good” in secret, e.g. for “decommissioning”, or whether there will simply be transparency and accountability over where data is copied, it will be hard for anyone to argue that this line does not commit the Government to a single overarching opt-out from secondary uses of medical records – in line with Caldicott 3.

 

“to ensure the very best standards for the safe, flexible and dynamic use of data and enshrining our global leadership in the ethical and proportionate regulation of data” (p80)

While this isn’t quite consensual, safe, and transparent, it is a beginning. However, with the Data Controller in Chief believing there is no data use that could not be ‘proportionate’ – on the tautological basis that if it is being used, then it must be proportionate – this will likely lead to controversy. The scale of the problems will be determined by the level of secrecy referred to in the previous paragraph: will there be secrets?

We acknowledge that this is, however, an improvement over the current state of affairs – having the conversation is far better than not having it at all.

 

“To create a sound ethical framework for how data is used, we will institute an expert Data Use and Ethics Commission to advise regulators and parliament on the nature of data use and how best to prevent its abuse.” (p79)

While this may sound good in theory, in practice – as we’ve seen with Google DeepMind – such advisors often end up acting as a rubber stamp for deniable practices. That is, when they’re not ignored entirely. Whether this Commission will have teeth, or will have failings similar to those of the various other bodies created recently, will depend on the details.

We look forward to the consultations…

 

“…we shall roll out Verify, so that people can identify themselves on all government online services by 2020, using their own secure data that is not held by government. We will also make this platform more widely available, so that people can safely verify their identity to access non-government services such as banking. We will set out a strategy to rationalise the use of personal data within government, reducing data duplication across all systems, so that we automatically comply with the ‘Once-Only’ principle in central government services by 2022 and wider public services by 2025.”

Good. The Verify infrastructure and principles can be used to deliver consensual, safe, and transparent digital services – whether in the NHS, across Government, or beyond.

Alongside the commitment to safety, this suggests that the privacy protections of Verify can be used to solve the design failures of the pornography rules in the Digital Economy Act – although we don’t expect Verify to be renamed ‘PornID’ any time soon!

If the controversial proposal for showing ID at a polling station is shown to be necessary, Verify offers a digital mechanism for a non-centralised form of validatable ID, including full “same-day” voter registration, using only a mobile phone (including a pre-paid mobile phone, which can be used to create a Verify account, and then the credential to vote), for free, for everyone. This would be an improvement over the status quo.

The explicit rejection of “sweeping, authoritarian measures” such as the failed Home Office ID scheme is missing, but a wider rollout of Verify – along with services offered in G-Cloud 9, resulting from a privacy discussion with the DG of HMPO – should make any return to ID cards unnecessary, and would show any such return to be motivated by other desires. (There’s also no reference to the 53 million genomes project – but, given the delays in the 100,000 genomes project, and the problems with that approach in the delivery of health care, that shouldn’t be a surprise.)

Especially around Verify, but also given the response to wider events, recent weeks have shown the failures of the current digital leadership in Government. Whether digital transformation will cease to come from Government, and instead again come to Government, remains unclear.

 

Will citizens, will patients, will customers, will users know what these changes mean for them in practice? Will they know how their data is used?

It’s all too easy to forget the human details when you’re working on “great challenges” – and that goes for everyone, at every level, however they claim to represent others. This manifesto, like the others, contains many fine words, but aspirations aren’t actions. Promises must be delivered, and be seen to be delivered. And those who make decisions based on our data, and about our lives, must and will be held to account – by the people affected by those decisions.


Labour manifesto

Without access to the civil service, it’s hard for opposition parties to have details on unannounced Government policy – much of the Conservative manifesto quoted above is a delivery of existing work.

“Labour is committed to growing the digital economy and ensuring that trade agreements do not impede cross-border data flows, whilst maintaining strong data protection rules to protect personal privacy.”

That statement leaves very little space between Labour and the Conservatives on this topic.


Liberal Democrats

Despite lots of detail on many things, there is no clear policy from the Lib Dems on consent and data privacy, although in a section entitled “Defend Rights, Promote Justice and Equalities”, it says:

“As liberals, we must have an effective security policy which is also accountable, community and evidence-based, and does not unduly restrict personal liberty.”

This is the closest that we get to data. However, if this applies even in the secret parts of Government, it must surely also apply in the non-secret parts.


The Green Party & UKIP manifestos haven’t been published as of the time of writing.

medConfidential rapid responses to DeepMind’s statements about their “legally inappropriate” data copying

We shall update this page as more information becomes available (newer items at the top).


Tuesday 11am:

Yet more questions raised about the usage of the Streams app

The Sky News footage shows that the Streams app is still in use, displaying information about patients – their names, dates of birth, gender, and other demographic information.

Where does that information come from? How does the ‘calendar gap’ affect patient care?

There are three possibilities:

  1. It comes via the first contract that has been found to be unlawful (with the calendar gap)
  2. The second contract is being breached (which also contains the calendar gap)
  3. There is a third secret contract hidden from scrutiny

Or Google’s AI team has come up with something else legally dubious in order to continue unlawfully copying 1.6 million patient records… Either way, this suggests an Uber-esque approach to the law, and to safety.

What is the ‘calendar gap’?

The data Google DeepMind unlawfully copies runs up until “last month”. It is currently 16 May 2017 and, at best, the data they copy will run up until 30 April 2017. On 29 May, they will still only have data until the end of April. When there’s a new month, they get an updated dataset covering the new “last month”. (It possibly takes a few days to actually happen, but you get the idea.)

Streams will help you if you were in the RFH last month. If you were there last week, however, the effect of the contract is that Streams could cause you harm – as Google’s app may prioritise longer-term patients it knows more about, over newer ones it knows less about.
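As a minimal sketch of the date arithmetic described above – assuming the extract always runs to the last day of the previous calendar month, and ignoring the few days’ delay before a refresh actually arrives – the ‘calendar gap’ works out like this (illustrative Python only, not anything from the contract):

    from datetime import date, timedelta

    def data_horizon(today: date) -> date:
        """Latest date covered by the monthly extract: the last day of the previous month."""
        return today.replace(day=1) - timedelta(days=1)

    def calendar_gap_days(today: date) -> int:
        """How many days out of date the most recent available data is."""
        return (today - data_horizon(today)).days

    # Illustrative dates from the post (in practice the refresh may lag a few days):
    for d in (date(2017, 5, 16), date(2017, 5, 29), date(2017, 6, 1)):
        print(f"{d}: data up to {data_horizon(d)}, gap = {calendar_gap_days(d)} day(s)")

    # 2017-05-16: data up to 2017-04-30, gap = 16 day(s)
    # 2017-05-29: data up to 2017-04-30, gap = 29 day(s)
    # 2017-06-01: data up to 2017-05-31, gap = 1 day(s)

By the end of any month, a patient admitted that week is effectively invisible to the extract – which is the safety concern set out above.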

Such problems are why MHRA approvals and a proper testing regimen are so important. To be absolutely clear, this failure is not endemic to Streams – the DeepMind deal with Imperial does not contain it, for example – but it appears to be a dangerous symptom of the DeepMind deal that has been found to be unlawful.

We’ll ask the National Data Guardian for clarity later today.


Tuesday 10am:

We’ve seen this piece being discussed: the article is correct about patients who were receiving direct care – but out of the 1.6 million patients’ data it copied, DeepMind in February 2017 said it had assisted in the direct care of just “more than 26”.

So while 27 records may have had a lawful basis, 1,599,973 didn’t.

It is the 1,599,973 records that are of concern here. Similarly, while there is not necessarily any problem with testing an app, testing an app isn’t the same as providing direct care. It is a separate process that DeepMind didn’t go through, as their interviews at the time made very clear (Note 6).


Tuesday 10am:

If Google DeepMind didn’t receive the letter containing the NDG’s finding, as they have said to medConfidential (after the date on the letter), they should have a chat with the Gmail team about such a convenient problem that no one else sees…

Even if that excuse was valid in the past, there are now lots of copies of the letter on the internet, evidencing their unlawful behaviour. Although Dodgy Donald from DeepMind might be in denial about even that.


Monday night:


Under the heading, ‘What we’ve learned so far’, a newly updated page on DeepMind’s website states:

There’s very low public awareness of NHS technology, and the way patient data is routinely used in the provision of care. For example, many people assume that patient records are normally stored on NHS-owned computers, when they are in fact routinely processed by third party IT providers. This confusion is compounded by the potential future use of AI technologies in the NHS.

medConfidential comment:

This response by Google shows that DeepMind has learnt nothing. There may well be lawful reasons for third party IT providers to process data for direct care for 1.6 million patients – unfortunately for Google’s AI division, developing an app is not one of them.

Google told the public as little as they thought they could get away with – and, duplicitously, they still are. And, in so doing, they are trying to force the NHS into taking the blame for their mistakes.


Regarding the investigation by Sky News into the sharing of patients’ records, which begins:

Google’s artificial intelligence arm received the personally identifying medical records of 1.6 million patients on an “inappropriate legal basis”, according to the most senior data protection adviser to the NHS.

medConfidential comment:

Google’s lawyers are expensive, but “inappropriate legal basis” is still a euphemism for unlawful.

Buried in the interview footage is a statement from a nurse that the app is still in use with patients today. Also:

“The testing for the Streams app has now concluded and it is being used at the Royal Free Hospital, Prof Powis told Sky News, under a second agreement which is not being investigated.” (Sky News article)

Unfortunately for Google, their own press release from last November states that the same data is shared under both agreements.


 

[PRESS RELEASE] Google DeepMind unlawfully copied the medical records of 1.6 million NHS patients

“A core part of Google” has been told it has no lawful basis to process five years of patient data from the Royal Free Hospital in London. [1] With no legal basis, the data must be deleted.

In May 2016, the New Scientist reported [2] that Google DeepMind had access to a huge haul of patient data, seemingly without appropriate approvals. In July 2016, the MHRA confirmed [3] that DeepMind had not received any approvals for a trial involving patients, using patient data. In November 2016, DeepMind signed a replacement contract covering exactly the same data. [5d]

The National Data Guardian has provided a view on this matter (all emphasis added): [1]

The Royal Free “…confirmed to us [NDG] that 1.6 million identifiable patient records were transferred to Google DeepMind and that implied consent for direct care was the legal basis for the data processing.

“…Streams was going through testing and therefore could not be relied upon for patient care, any role the application might have played in supporting the provision of direct care would have been limited and secondary to the purpose of the data transfer. My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose.”

It is unclear whether Google DeepMind has complied with the finding that it had no legal basis for processing this data; nor is it clear what it was that first attracted DeepMind executives to unlawfully copy 1.6 million people’s medical records, repeatedly insisting on direct care as the sole legal basis. [8]

medConfidential agrees with the Information Commissioner, when she said in a speech to technology companies: “I do not believe data protection law is standing in the way of your success.” She reminded her audience: “It’s not privacy or innovation – it’s privacy and innovation.” [4]

In this case, this DeepMind project turned out to be neither of those things. [9]

The National Data Guardian’s investigation has made clear – despite their claims to the contrary – that DeepMind had no legal basis for their actions in this project.

medConfidential coordinator, Phil Booth, said:

“This letter shows that Google DeepMind must know it had to delete the 1.6 million patient medical records it should never have had in the first place. There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it.

“Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.

“While the NHS sent doctors to a meeting, DeepMind sent lawyers and trained negotiators. What this boils down to is whether Google’s AI division followed the law and told the truth; it now appears they may have done neither.

“As events this weekend have shown, it’s the number of copies of patient data that matter – encryption locks won’t reassure anyone, if the wrong people have been given the keys.”

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

For further information or for immediate or future interview, please contact Phil Booth, coordinator of medConfidential, on coordinator@medconfidential.org

Notes to editors

1) “The NDG has provided a view on this matter to assist the ICO’s investigation” was the National Data Guardian’s comment on the publication of the University of Cambridge paper, ‘Google DeepMind and healthcare in an age of algorithms’: https://link.springer.com/article/10.1007%2Fs12553-017-0179-1 and http://www.cam.ac.uk/research/news/deepmind-royal-free-deal-is-cautionary-tale-for-healthcare-in-the-algorithmic-age

Sky News published a copy of the letter from the National Data Guardian on 15 May 2017: http://news.sky.com/story/google-received-16-million-nhs-patients-data-on-an-inappropriate-legal-basis-10879142

2) medConfidential raised a complaint [4] to the ICO following reports in the New Scientist, and follow-ups elsewhere, about secretive data use by Google DeepMind:

a) New Scientist, 29/4/16: https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/

b) New Scientist, 13/5/16: https://www.newscientist.com/article/2088056-did-googles-nhs-patient-data-deal-need-ethical-approval/

c) Daily Mail, 4/5/16: http://www.dailymail.co.uk/news/article-3573286/NHS-trust-handed-private-patient-details-Google-says-implied-permission-emerges-hospital-talks-internet-giant.html

d) BBC, 19/7/16: http://www.bbc.co.uk/news/technology-36783521

e) Guardian, 6/5/16 (note 9 May & 25 July updates at the bottom of the article): https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder

3) “DeepMind is currently working with the MHRA to ensure that the device complies with all relevant medical device legislation before it is placed on the market” – TechCrunch, 20/7/16: https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

4) Information Commissioner’s speech, ‘Transparency, trust and progressive data protection’, 29 September 2016: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2016/09/transparency-trust-and-progressive-data-protection/

5) medConfidential’s complaint is available here:

a) Timeline of events, as of 31/5/16: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-deepmind-timeline.pdf

b) Complaint to Regulators: https://medconfidential.org/wp-content/uploads/2016/06/medconfidential-to-regulators.pdf

c) Shortly after submission, the MHRA found that the project should have been registered with them (and wasn’t): https://techcrunch.com/2016/07/20/deepminds-first-nhs-health-app-faces-more-regulatory-bumps/

d) The end of the first ‘Note to editors’ in a press release from the Royal Free Hospital on 22 November 2016 clearly states: “The new agreement does not change the number of patients whose data will be processed by Streams”: https://www.royalfree.nhs.uk/news-media/news/nhs-and-technology-leaders-agree-groundbreaking-partnership-to-improve-safe/

6) Claims by the New Scientist have been vindicated by the investigation, despite an extremely strong PR response from Google. Contemporary quotes from project advocates, which now ring hollow, include: [all emphasis added]

a) Mustafa Suleyman, Co-Founder at DeepMind, has said:

i) “As Googlers, we have the very best privacy and secure infrastructure for managing the most sensitive data in the world. That’s something we’re able to draw upon as we’re such a core part of Google.” [Guardian, 6/5/16]

ii) “We have, and will always, hold ourselves to the highest possible standards of patient data protection.” [Daily Mail, 4/5/16]

iii) How this came about all started with Dr Chris Laing, of the Royal Free Hospital: “We went for coffee and ended up chatting for four hours.” [BBC News Online, 19/7/16]

iv) More recently, in an interview with Mr Suleyman published on 20/3/17: “When pushed on how the public would be assured that its sensitive data was safe, Suleyman replied, “first there is the law”.” [Digital Health, 20/3/17]

b) George Freeman MP, at the time a Minister in the Department of Health: “NHS patients need to know their data will be secure and not be sold or used inappropriately, which is why we have introduced tough new measures to ensure patient confidentiality.” [Daily Mail, 4/5/16]

c) Professor Hugh Montgomery, (consultant for Google’s DeepMind project) said, on Radio 4’s PM programme on 4 May 2016:

i) “So this is standard business as usual. In this case, it was a standard information data sharing agreement with another supplier, which meets all of those levels of governance. In fact, the agreement there, or the standards of management of those data, meets the very very highest levels. It meets something called HSCIC level 3, which most hospitals trusts don’t even reach.” [Recording of audio available, see link below]

ii) “So firstly, this isn’t research. Research is governed by an entirely separate process that would require anonymisation of data and all sorts. This is data processing.”

iii) “It’s fair to say again that not only is this data at the very highest standards, and beats every standard, and more in the United Kingdom. But the data is encrypted end-to-end, and they have to, like everyone else in the health service, stick to the law.”

iv) Recording of audio available at: https://www.dropbox.com/s/cfimojgec24rlrj/20160504-deepmind-radio4-pm.mp3?dl=1

d) Will Cavendish, now Strategy Lead for DeepMind Applied, formerly Informatics Accountable Officer at the Department of Health, said (when IAO):

i) …“The vital importance of trust, security, and cyber security.” … “To be honest, it used to be that not a week goes by, now it’s not a day goes by, without stories of hacking, data leaks, inadvertent data sharing. This absolutely erodes the trust that underpins the work that we do.” https://www.youtube.com/watch?v=5Ej3PRF1jUw&t=2h15m5s

e) Dr Julian Huppert, Chair and “on behalf of the Panel of Independent Reviewers for Google DeepMind Health” said in an e-mail to medConfidential on 6/7/16:

i) “one of our roles is to look in detail at how DeepMind Health uses patient data, and to confirm that it complies with the highest ethical and regulatory standards.”

ii) “We believe from what we have seen so far that DeepMind has a clear commitment to the Caldicott Principles, and that they have to date been honest in their public and private comments. We also believe they are willing to work constructively with regulators, and remain within the law.”

7) The claim to reach “HSCIC level 3” was a self-assessment by DeepMind, which was revoked upon examination. [See the 25 July update to this Guardian article].

8) In a controversial press release by the hospital on 24 February 2017, the word “Google” did not appear once, despite point 6(a)(i) above: https://www.royalfree.nhs.uk/news-media/news/new-app-helping-to-improve-patient-care/ – whereas a subsequent Guardian article on 9 March 2017, based on a press release by Google DeepMind, explicitly attributes actions to Google DeepMind: https://www.theguardian.com/technology/2017/mar/09/google-deepmind-health-records-tracking-blockchain-nhs-hospitals

9) ““With health data, and government acquired health data, we need to be sure we aren’t, in effect, giving oxygen away for free to a private company that will start to sell it back to us,” says Azeem Azhar, who writes the popular Exponential View newsletter…” – Quartz, 17/3/17: https://qz.com/934137/googles-goog-deepmind-got-too-much-health-data-from-britains-nhs-research-paper-says/

– ends –

medConfidential comment on “NHS” cyberincident

Regarding the ongoing [time of writing: 18:30, 12/5/17] international cybersecurity incident, currently affecting – amongst many others – a number of NHS hospitals and GP practices.

Phil Booth, coordinator of medConfidential, said:

“medConfidential has confidence in clinicians continuing to treat their patients, and in GCHQ’s incident response – as has been demonstrated in previous similar incidents.  Unfortunately, we also fully expect NHS England’s analogue administrators’ tailspin to continue to learn as little from this event as from any other in the real world.”

Notes to editors

1) Dame Fiona Caldicott’s ‘Review of Data Security, Consent and Opt-Outs’, published in June 2016, made important points about NHS cybersecurity. As of the snap General Election, the Government had yet to publish its response to the Review: https://www.gov.uk/government/publications/review-of-data-security-consent-and-opt-outs

2) NHS Digital has run a programme called CareCERT since September 2015, partnering with agencies including CERT-UK, CESG and CPNI. One of CareCERT’s core functions is “national cyber security incident management”: http://content.digital.nhs.uk/carecert

3) At the time of writing, this does not appear to be a ‘NHS-only’ incident; there is evidence of similar issues arising in Telefonica, in Spain: e.g. https://www.bleepingcomputer.com/news/security/telefonica-tells-employees-to-shut-down-computers-amid-massive-ransomware-outbreak/ & https://www.thestreet.com/story/14132953/1/britain-s-national-health-service-suffers-cyber-attack-spain-s-telefonica-hit-in-similar-incident.html

4) At the time of writing, the vulnerability appears to be one targeted by leaked NSA exploit tools published online in April 2017; Microsoft had already shipped a patch for it in March 2017: https://technet.microsoft.com/en-us/library/security/ms17-010.aspx Serious questions need to be asked of those responsible for maintaining Windows-based IT systems, who failed to patch their servers for two months.

Patients’ Data in the Party Manifestos

What medConfidential will be looking for in every party’s Manifesto is rather simple:

    Will patients know how their medical records have been used?

A straightforward “Yes, they will” or “No, they will not” will suffice.

Every flow of data into, across and out of the NHS and care system should be consensual, safe, and transparent – there need be no conflict between good research, good ethics and good medical care.

We shall provide more detail on how this relates to current issues like Genomics and AI in due course – but the question to which there must be a clear answer, whatever the future brings, is: Will you know how data about you is used?

We would like the Government to honour its commitment to a statutory National Data Guardian; to closing the commercial re-use loophole, putting promised safeguards and sanctions in place, and implementing transparency on all data flows across the health and social care systems – but we understand these are details that may not make the cut for a short manifesto.

Given all that has happened, would you trust any party that fails to commit to you knowing how your medical records have been used?



On what principles will data be used in the Single Government Department?

Whitehall proceeds step-wise, and ever more rapidly, towards an end state of a “Single Government Department”. This is Sir Humphrey’s decades-old vision of “Joined-up Government”, predicated upon Government doing whatever the hell it likes with your data, wherever and however it gets it, in flagrant disregard of data protection law and human rights (Articles 8 and 14 of the ECHR, at least). User needs, or departmental needs?

In a world where there’s a single Government (and government) data controller – the Data Controller in Chief; the Prime Minister, the final arbiter – will a single Department’s policies, practices and prejudices determine the list of Government policies?

We don’t see how it doesn’t.

It may be useful to begin with an NHS analogy. It’s a gross simplification, but it carries the necessary meaning.

There are multiple hospitals in Manchester – Royal Manchester Children’s Hospital, Manchester Royal Infirmary, and St Mary’s – all on the same site, in interconnected modern buildings constructed at the same time. Why are there three hospitals? Because when the new buildings went up and everything was consolidated onto one site, treating them as a single hospital – though that would seem most sensible – would (to many people) have effectively meant “closing two hospitals”. Hence, there are three.

What about Government departments?

In a Britain with a single government department, what is currently the Home Office – with its particular approach (covered elsewhere) – will go on the rampage across all areas of everything.

For how will weaker policy goals be defended against stronger ones? “It’s a matter of national security, don’t you know…”

Clause 38 of the Digital Economy Bill “solves” this problem by simply ignoring it – those with the highest bureaucratic power will win the fight; we’ve seen this already with the Home Office demanding the names and addresses (p16) of patients – and it’s quite clear they’d have grabbed everything if they’d wanted to.

In this context, with the Digital Strategy of DCMS, and Cabinet Office’s warmed-over Government Transformation Strategy in play, what should happen to make the world they’re trying to build safe?

The greatest concerns must be with the Transformation Strategy; the current “ethics framework” suggested by GDS (the part of Cabinet Office responsible for writing the Strategy) is so flawed, for example, that it suggests a Privacy Impact Assessment can fit on a single sheet of A4 – the self-same approach used to justify care.data, relying on NHS England’s public statements. Thus far, the country has been saved from a systemic collapse in trust by the fact that this “ethics framework” isn’t actually used by departments.

So what’s the alternative? A citizen view of Government.

Government insists it should be able to copy our data – whatever it wants, wherever and whenever it likes – including to its commercial partners, e.g. Google (or rather, Alphabet) DeepMind and Palantir, for whatever policy whim catches the interest of any particular official. Proportionality and public acceptance are irrelevant; these are not what the civil service is set up to do.

As we saw with DeepMind at the Royal Free Hospital, one person with power can torpedo years of careful and diligent work in order to meet their own short-term, narrow perspective, self-interested goals.

The single Government department makes this worse, if left unaddressed. What should replace it is a citizen view of Government.

This conversation has never been had. The discussions that have been facilitated were designed to get to the pre-conceived end state of the Cabinet Office. As such, the answer was given and civil society time was wasted on a ‘debate’ that was entirely pointless; any wider opportunity to improve the use of data in Government through the Digital Economy Bill was lost.

As an example, well-defined APIs might work for departments – but if departmental silos weaken (as is the explicit goal: “to remove barriers to data sharing”) then things begin to fail. Citizens should not have to rely on how Government talks to itself.

The start of the conversation has to be complete transparency to citizens – with the likes of Verify, and public bodies, being accountable to the citizens they work for. Citizens can now be shown what data is required for their transactions, from where it will be accessed, and why. Operational decisions should inform democratic debate, both by policy makers and by citizens who wish to engage with decisions about the services that affect them.
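Purely as an illustration of the principle – nothing here describes an actual Verify or Government system, and every name in it is hypothetical – a ‘citizen view’ could be little more than an append-only log of access records, each stating which service touched which data, from where, and under what published lawful basis. A minimal sketch:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import List

    @dataclass(frozen=True)
    class AccessRecord:
        """One entry in a hypothetical citizen view: who accessed what, from where, and why."""
        service: str           # the public service acting for (or on) the citizen
        data_items: List[str]  # which data items the transaction required
        source: str            # where the data was accessed from (the holding body or register)
        lawful_basis: str      # the published lawful basis for this access
        accessed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class CitizenView:
        """Append-only record shown to the citizen: accesses are logged, not silently copied."""
        records: List[AccessRecord] = field(default_factory=list)

        def log(self, record: AccessRecord) -> None:
            self.records.append(record)

        def summary(self) -> List[str]:
            return [
                f"{r.accessed_at:%Y-%m-%d} {r.service}: {', '.join(r.data_items)} "
                f"from {r.source} ({r.lawful_basis})"
                for r in self.records
            ]

    # Hypothetical example of what a citizen might be shown:
    view = CitizenView()
    view.log(AccessRecord(
        service="Renew driving licence",
        data_items=["name", "address"],
        source="DVLA driver register",
        lawful_basis="published licensing function",
    ))
    print("\n".join(view.summary()))

The point is not the data structure but the direction of accountability: the record exists for the citizen, not for the department.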

Civil servants all work for the Crown and not the public – whatever ‘flavour’ of Government is in power – and this may be a tension that needs consideration. What happens when the political will meets the public won’t? How is trust in institutions maintained?

Because without action, continued secrecy and the drip drip of cockup will undermine all trust.

This works in practice

Fortunately, some NHS GPSoC IT Providers (the data processors who provide IT systems to your GP) have taken the lead in fixing the systems from within the system. How many decades will it take Whitehall to catch up?

We have already demonstrated what this looks like – with Verify and other tools.

Rather than a “single government department”, the principle should be a “Citizen View of Government” – where every service a citizen has touched can be seen, with accountability for how they used data and why. This would make Government accountable to the citizen, as it should be – without the citizen having to understand the intricacies of how Government works.

In a “Citizen View” world, whether Government is one Department or many doesn’t matter as much. If civil servants want to justify access to data, they can – but they must be aware that citizens will be told what data and why, and might become unhappy about it if the reasons aren’t just.

Any Government that fails to tell its citizens what it is doing and why, or which doesn’t really want them to know, will not be wanted in return – as the EU discovered with Brexit. This is what the open policy making process should have prepared the groundwork for; the price of that failure keeps going up as digital continues its march.

Unless we wish to treat data about human beings with less care than we treat the data about carcasses in our food supply chain, ‘Globalisation 2.0’ will be based on registers and code – determining risk and eligibility for consumers and for regulators. This simply does not square with a world of copying data; it can only work in a world of APIs to data where there is a lawful, published case for each access, grounded in fundamental accountability to citizens about their data.

It is obvious that data about the food we eat should not be locked in a filing cabinet in Whitehall. It should be equally obvious that “taking back control” shouldn’t mean giving every civil servant a copy of all the data on every citizen.



The Home Office: Secretive, Invasive, and Nasty

In various guises, those who coordinate medConfidential have been dealing with the effects of Home Office missteps for what now in total amounts to decades.

Liberty Human Rights Awards 2010

Here is some of what we have learnt:

Home Office is the part of Government that must confront and ‘deal with’ the absolute worst in the world: murder, rape, terrorism, paedophilia – the stuff no-one really wants to have to know about; things from which civilised people prefer to turn their eyes. There are obvious – and legitimate – reasons that some of what the Home Office does must be confidential or classified.

The people who we task with dealing with these terrible issues deserve to work in a culture of compassion and competence, with solid foundations in Justice – the current Home Office has none of these.

Secret: Hiding errors in a file marked Secret harms the public good.

As can happen with bureaucracies more generally, the hint of secrecy at the Home Office has spread into an all-encompassing security blanket around any information that might be helpful to an informed debate in a democracy.

Treating information about every offence and misdemeanour as if they were the worst, keeping arbitrary secrets, and hiding your actions while telling others they must simply trust that “It’s for your own good” are the actions of someone who has lost perspective. Lost a sense of proportion. And lost the ability to discriminate, except in the prejudicial sense.

The examples of this are countless – from Ministers’ refrain of “trust us” about the ID scheme to “We know but we can’t tell you” about the Communications Data Bill; from petty refusals to extreme resistance to simply ignoring requests for information; and as evidenced by the secret ‘National Back Office’ in the NHS, only exposed in 2014, when Sir Nick Partridge reviewed what happened in the building where the previous-but-one Home Office-administered ID card scheme ended up.

Worse than that, on getting information via a backdoor into people’s medical records, the Home Office wrote in secret to people’s doctors, telling them to deny treatment.

Invasive: No consideration of innocence, or the consequences of action

The political culture pervading the Home Office has led to an organisation which cannot consider side-effects.

It sent round “Go home” vans because they might contribute to a “hostile environment” for illegal immigrants, without any regard to the effects of that hostile environment on innocent parties.

And it’s lost the ability to discriminate: to Home Office, everything it looks at is a crime – or a potential crime – so it is prejudicial towards everyone.

In being unable to discriminate between ‘crimes’ – including thoughtcrime, and perfectly normal behaviour, such as trying to keep your personal communications private – Home Office discriminates wildly and inappropriately against whole classes of people, and individuals who have in fact done nothing (or very little) wrong.

And, in pursuit of its obsessions, it considers nowhere, and nothing, sacred (Q78).

If it will not respect the boundary of the confidential relationship between you and your doctor, where is it that you believe the Home Office will not go?

In this world view, the entire country gets treated the way the Home Office treats illegal immigrants (which it claims is “respectful”!) and – after many attempts, including a RIP Act that for years emboldened nasty, technocratic petty-mindedness down to the local council level – it has finally got its Investigatory Powers Act, so it can snoop on all our communications data.

Nasty: Fear breeds paranoia, and suspicion is contagious

Bullies are fearful. They don’t always appear to be – especially when they get themselves a gang. But you can tell bullies by the way they pick on people, and who they pick on; the weak, the odd, the vulnerable. People who can’t put up a fight.

The Home Office delivers little itself; it cannot act directly in many of the areas for which it is responsible. For these areas of concern, it develops policy, dispenses budgets for various programmes, commissions systems, lobbies for legislation, and so on – but it assumes everything will fail, which leads to suggestions like a 15-foot-high concrete wall around Parliament: “Operation Fortress Commons”.

But the few things it can do corrupt everything. It tries to turn everyone it leans on in every part of the public services into a border guard, or a snitch. Demanding the Met hand over details of those who witness crimes makes everyone less safe – if you are the victim of a crime, you want those who know something to share what they know with the police, without fear that it may be used against them. In this case, the hostile environment is hostile against innocent victims of street crime, because the Home Office has harmful priorities.

There are countless examples of each of these, which will appear over the course of the campaign. Some of them will even come from us…

The Home Office has been responsible for a string of high profile, national embarrassments in recent years. Flawed decisions by Home Office led to national humiliation at the opening of Terminal 5 – ever wondered why the baggage handlers couldn’t get to work? The shameful disarray of the G4S contract for the Olympic Games, from which Home Office had to be rescued by the military. The collapse of many criminal trials because policy at SOCA and NCA was simply unlawful. The harm to the UK’s economy, and international reputation, from the wrongful deportation of 48,000 students – because the Home Office panicked after watching a TV programme. And the harm to public safety and public confidence.

Shorn of Justice, the Home Office has lost touch with humanity, proportion, and the fundamentally positive spirit of Britain. Human Rights are pretty much all that protects you from excesses or mistakes by the Home Office.

How does this relate to the NHS and privacy?

The greatest hazard in this election comes not from Brexit, but from the deeper, more insidious threat to the autonomy of every citizen from the State. That threat forgets the worldview that created the NHS: that no matter how dark the world, there will always be people there helping.

In a Brexit world, the Home Office worldview offers the NHS just three choices: be nasty to ‘brown people’; be nasty to everyone; or impose ID cards. These are the only choices its worldview can see, while the perspective of the NHS is quite simple: healthcare, free at the point of use, for all those in need. Without discrimination.



medConfidential Bulletin, 21st April 2017

Though the political focus is on the General Election, the ‘STP shuffle’ remains highly significant. Whatever the result in June, both funding and decision-making for health and care services will be increasingly devolved to local areas.

What’s happened? General Election!

What medConfidential will be looking for in every party’s Manifesto is rather simple:

    Will patients know how their medical records have been used?

A straightforward “Yes, they will” or “No, they will not” will suffice.

Every flow of data into, across and out of the NHS and care system should be consensual, safe, and transparent – there need be no conflict between good research, good ethics and good medical care.

We shall provide more detail on how this relates to current issues like Genomics and AI in due course – but the question to which there must be a clear answer, whatever the future brings, is: Will you know how data about you is used?

Update on DPA Section 10 notices

Last December, NHS Digital and Public Health England (PHE) were sent hundreds of Section 10 Data Protection Act notices by patients who had opted out, insisting that their data should not be sold – even through a loophole.

Though there were some ‘boilerplate’ responses, both bodies effectively ignored every single one of those notices. Patients’ data continues to be sold for commercial re-use, and further problems have emerged:

  • PHE considers itself exempt from existing opt-outs; will it make you opt out again?
  • What about the NHS? Will the Government’s response to Caldicott 3 force yet another opt-out?

It is understood the Caldicott Consent Model should include overrides – and some exceptions, where required by law – but these should not be at the whim of Public Health England, which still copies patient data to companies in secret. PHE said it was becoming transparent, but its own actions give the lie to this, and still it demands more data.

If you want to know public health information about your area, PHE thinks you should use a site called “fingertips” – which gives you a mountain of statistics, a trowel, and suggests you start digging. If you want to see the biggest public health issues in your area, you may want to try this list instead.

Speaking of digging…

Questions for the elections; what is your lived experience of the NHS?

With STPs and financial devolution on the way, it’s the candidates who are elected in your area who’ll be making decisions that will impact directly on your, your family’s and your community’s health and care services – and the exploitation (or not) of your medical records.

In the run-up to the elections, all you need do is ask the people who canvas you some straightforward questions, share some of what you know from your own experience, and put up a poster to encourage your neighbours to do the same. Here are our suggestions:

  • Does [the candidate] agree that everyone should be told how the council and NHS use their data?
  • Given the political choices that are changing the NHS in your area, how would your own or your family’s past experience of the NHS have been different?
  • What are [the candidate]’s priorities for reducing problems that put a strain on your community’s NHS and care services?

If you get answers, please do post them on Facebook and in other appropriate forums, so others can see them too.

Phil Booth & Sam Smith
21st April 2017

medConfidential Bulletin, 9th April 2017

Where does your data go? And do you know? These are questions to which we’ve been getting you answers for three years or so, but now you have an opportunity to ask these questions too… Local elections are coming up, and political parties want your vote…

But first:

What just happened?

A 280-page PDF from NHS Digital contains one item worth noting: “Programme 12: General Practice Data for Secondary Uses” (item C4 on page 56), with a deadline of this Christmas, is – as far as medConfidential is aware – the first public sighting of… the return of care.data

So, although the Government has yet to issue the necessary CAG Regulations; or ‘one strike and you’re out’ sanctions for data misuse or abuse; has failed to close the “promotion of health” (i.e. Pharma marketing) and commercial re-use loophole; still hasn’t put the National Data Guardian on a proper statutory footing, let alone responded to the Caldicott 3 review; is mute on whether you will have to opt out again, and whether cancer patients will have their data copied anyway; and wants to copy data to any Government department under the Digital Economy Bill; it seems someone is eager to flood the “National Data Lake” we mentioned in our last bulletin.

What’s happening next?

Unless you pay close attention to NHS internal meetings, you could be forgiven for knowing little about how the NHS talks to itself, but ‘Sustainability and Transformation Plans’ (STPs) – 44 of them – are the jargon for a new NHS reorganisation that really matters. To you.

The NHS England website describes them as follows:

NHS organisations and local councils are developing shared proposals to improve health and care. Working in 44 geographical areas covering all of England (called ‘footprints’), the plans are led by senior figures from different parts of the local health and care system.

It is this top-down-mandated, bottom-up-driven restructuring into STP “footprints” that has led to the mega-CCG mergers in Manchester, Lancashire, and Liverpool, with more mergers planned in other cities of the North, and across the rest of England (e.g. in Buckinghamshire).

Why you should care is that this ‘STP shuffle’ will put your local council in partial control of where your medical records get copied – including how much of your personal data will end up being dumped into a “national data lake”.

In ducking responsibility, as they have since care.data started, NHS England claim all decisions will get made “locally”, but they can choose to send more cash for more data…

What can you do?

If you have elections in May, some of the candidates will end up choosing who sits on your local Health and Wellbeing Board. That will be the body that chooses how your area’s health budget gets spent – what gets funded, what gets cut, and what medical records they copy to the Data Lake in return for more resources…

Given this, we suggest you ask your council candidates a few questions that might help them focus on the issues and evidence, and then help you and your community decide who’s paying proper attention to the impacts on your health and care, and medical confidentiality:

  • Community: Do they agree that you should be told how the council and NHS use your data?
  • Contribution: For the political choices that are changing the NHS in your area, how would your own or your family’s past experience of the NHS have been different?
  • Autonomy: What are their local priorities for reducing problems that put a strain on your local NHS?

If you get answers, please post them on Facebook and other appropriate forums, so your neighbours can see them too; here are some ‘localised’ posters you can print out to help you.

If you’d like us to send you some, we’re offering five A3 posters for a £5 donation – when sending us the money, just add a comment with your address and we’ll send you posters for that postcode. (N.B. If you don’t add the comment, we won’t see your address.)

We’re glad to see a number of you are quite happy with our new badges (with text | no text) and are immensely grateful for the £20 donation medConfidential gets every time someone buys one. Thank you.

More next time on who wants to go fishing in the National Data Lake…

Phil Booth & Sam Smith
9th April 2017