
GCHQ and NHSX’s contact tracing app

The GCHQ-informed NHSX app requires a central authority which can read (i.e. decrypt) everything the app shares. In NHS language and the language of the law, the app is an ‘information processing system’.

Given NHSX has chosen to build an unnecessarily massive pool of sensitive data, it must ensure that the data is well protected. With combined effort, GCHQ and NHS Digital will likely be good at defending the big pool of sensitive data.

But there is no need to have that data. The best way to make sure data doesn’t leak is to have chosen the method that never collected it in the first place.

Google and Apple’s ‘Exposure Notification’ model has no central data authority, so it does not require the infrastructure that GCHQ suggested the NHS build – infrastructure which GCHQ must then defend.

And GCHQ needs extensive new powers to detect abuse of the system it designed – abuse that Google and Apple’s system simply makes impossible. (Their approach minimises the amount of identifiable data in the system to the extent that it is effectively publishable.)
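To illustrate why there is so little to defend, here is a minimal, hypothetical sketch of the decentralised approach – our own illustration, with assumed names, not the actual Google/Apple API: each phone broadcasts short-lived random identifiers, remembers the identifiers it hears, and, when diagnosis keys are published, does the matching locally, so no central authority ever learns who met whom.

```python
# Hypothetical sketch of the decentralised model - illustrative only,
# not the Google/Apple Exposure Notification API itself.
import secrets

def new_rolling_identifier() -> bytes:
    """Generate a short-lived random identifier to broadcast over Bluetooth.

    Because it is random and rotated frequently, observers cannot link
    broadcasts back to a person or a device.
    """
    return secrets.token_bytes(16)

def exposed(observed_nearby: set, published_diagnosis_ids: set) -> bool:
    """Matching happens on the phone itself.

    `observed_nearby` is the set of identifiers this phone has heard;
    `published_diagnosis_ids` are identifiers voluntarily published by
    people who tested positive. No central server ever sees the contents
    of `observed_nearby`, so there is no central pool of contact data
    to defend.
    """
    return not observed_nearby.isdisjoint(published_diagnosis_ids)
```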

Those building the NHSX app made a fundamental mistake, and are now trying to cover it up with more mistakes. It emerged at the Sci/Tech select committee that it would be ‘very useful epidemiologically’ to keep the locations where you see other devices, to share where you got infected several days ago, and to “see the contact graph”.

We expect there will be an app for a country in the United Kingdom which uses the Google/Apple API; we are inclined to suggest everyone waits for that one. You can install GCHQ’s code on your phone if you wish – but their job isn’t to protect you or your family.

Coronavirus and NHS data – 17 April 2020 update

[our update for the week after – 24th April – fitted in a tweet]

NHS England is keeping its dashboards hidden away, but the contractors building them left their contracts “accessible via an unrestricted portal” – which goes some way towards explaining why things are still hidden.

Despite promises to be transparent, and to publish the Data Protection and other Impact Assessments of what they are doing – as well as the contracts and agreements they claim followed standard ‘G-Cloud’ procurement processes – NHS England and DHSC are staying true to form: demanding visibility of our data, but showing nothing in return.

This lack of transparency only fuels suspicion and mistrust – especially when we hear the Secretary of State, after melting down in two interviews back-to-back, try to blame the tech companies for his own ‘app-happy’ mistakes. And when we learn the CEO of NHSX has to admonish his staff not to exploit their positions “for personal or corporate gain”.

If this continues much longer, such behaviour – and even more blatant attempts to rewrite history – will not only be seen as a serious transparency deficit, but will raise serious questions about the accountability of those who demand we trust what they do with our data.

“It’s for your own good” is no reassurance when those saying it won’t show how, and for what, and by whom.

So where are we now?

Death statistics: Extrapolating using a rough rule of thumb, the current figures we are being given for COVID deaths represent only around 40% of those who are actually dying – many of whom are in care homes. Meanwhile, the continuing failure to supply sufficient PPE for both clinicians and carers is an ongoing scandal. Matt Hancock believes a single “Herculean” effort is enough; but PPE gets used up quickly. In reality, the task is more Sisyphean.

DWP: While each week drags by for those keen to leave the house, the clock ticks even slower for those who’ve been forced onto Universal Credit. For another two weeks, they’re still part of a 1.4 million-person queue somewhere inside DWP. Support services like Citizens Advice always have insight into the size of the peaks as more and more people claim UC, and sight also of how UC breaks. Such insights will only increase as DWP’s business processes do their business-as-usual things, and comparisons will become clearer over time.

Google and Apple announced their new shared API. Both their API, and the way they have approached it, are the right things to do in this situation. We want to take this opportunity to thank both companies for their positive and proactive outreach to responsible members of the international privacy community. Despite whining from those who made bad early choices, the NHSX tracing app will either be like all the other apps with an NHS logo, or people will install a generic one built by someone who believes in technology assisting access to health everywhere around the world.


Contact tracing: We await news on whether the NHSX app (and DP3T) will be rewritten to use the new APIs. If not, the app will only work while your phone screen is turned on and you’re using the app – which also eats your battery. The concept of everyone on the tube staring at a phone screen showing them the number of people they’ve ‘been in contact with’ today is not one likely to reduce public anxiety.

Tracing beyond the border of England: Given that NHSX’s and PHE’s remits end at the boundary of England, when (or if) the NHSX app launches, it is not at all clear what will happen for those who live close to Wales or Scotland. It’s likely many people will not be best served by installing an app on their phone that is based on a political and bureaucratic boundary which is more limited than they are…

‘Immunity certificates’: With little more than the sound of a starting gun from Matt Hancock to go on, it is still far from clear why or how these will be useful. But harsh lessons from history tell us how such “immunoprivilege” can be actively harmful, both personally and economically; even the editor-in-chief of the Lancet has pointed out they’re not helpful. We must reserve judgement until more information is forthcoming, but for now, we have questions (to which you are welcome to add).

Perverse incentives: When bars and restaurants reopen, will the old ‘smoking areas’ be transformed into sections for those with compromised immune systems, or for those with COVID immunity? Either way, HM Government will need to avoid creating perverse incentives around self-reporting of antibody tests. NHS incentives are all for people to be honest, and to get the best care – but HM Treasury (which knows the price of everything, but perhaps the value of much much less) still won’t reassure your racist uncle that the people wearing their ‘certificates’ who ‘look a bit foreign’ have actually met the criteria. Wrong information in an already toxic culture just makes things ten times worse (or maybe half that, e.g. 5G).

As much of the magical thinking around contact tracing without mass testing dissipates, and as reality – both technical and biological – bites, we sincerely hope the next magic roundabout ride on apps for immunity measuring will itself be more… measured.

NHS Data responses to Coronavirus – 9th April 2020

[For background, please see our earlier posts, “The Coronavirus” and “Apps for the next pandemic”.]

Matt Hancock’s ‘tech vision’ from February now seems to be from another world (our response, drafted pre-Corona, is here). The best parts have been implemented already, in the NHS at least – while other parts now look more like digital ideology than things that would have happened if they were a good idea. The tech ‘shortcut’ – that people should adapt to the technology before it improves – has been upended; the virus has made the tech companies satisfy the requirements of doctors.

If DHSC had not deferred the decision to tell every patient how data about them is used, public concerns about Palantir et al. could have largely been mitigated by normal NHS processes. Instead, all of the consequences of commentators and the general public not understanding how the NHS uses data are causing work for the Department (and parts of the NHS) at a time when they have little free time.

medConfidential had already drafted a net assessment, which remains all too relevant – as a list of things undone by DHSC, which the NHS would have been able to build on today.

Instead, we have what we have…

So where are we?

Don’t get caught: Many of the companies offering their services to the NHS would previously have lobbied hard to weaken the standards they now seem perfectly willing to meet. (It’s almost as if their previous actions were driven by money, not substance…) Unfortunately for Palantir, DeepMind, Google, Amazon, and others, their previous missteps around data and public trust undermine their claims to be working in the public interest now.

Notices to all care providers: Hal Hodson of the Economist published a scoop of the Notices under reg 3 of the COPI Regulations, which require care providers to do with data what is appropriate to fight COVID-19. (Noting that “appropriate” still includes restrictions and controls that are sensible, practical and necessary.) Those who go beyond this, indulging in unenlightened self-interest, will be examined afterwards – and the public will not be kind to those who exploit others, even if the regulators are slow.

AI Lab: Handing the NHSX ‘AI Lab’ to Mustafa Suleyman of Google DeepMind is not necessarily the worst idea, given the Lab by itself wasn’t due to start for another year – but with the cloud under which he left the company he founded, we hope this move will be productive, and result in fewer gagging clauses and pay-offs to junior staff. DeepMind has previously produced an AI which can tell the difference between viral and bacterial pneumonia; adding SARS-CoV-2 to that seems like a good use of resources. 

Intellectual Property: Following the approach of the Gates Foundation, the healthcare response should commit to building multiple diagnostic support AIs, on different datasets, and with different approaches – and make them all free to everyone around the world. If DeepMind’s past contracts (now taken over by Google) are anything to go by, how much is the NHS being charged for that model and expertise, and how long will that cheap deal last? The COVID response must deliver results the NHS and world can use in perpetuity, at no additional cost.

Deaths: Many people are dying who are not included in the headline figures. While the NHS is receiving a great deal of the political focus, the effects of the lack of protective equipment, staffing shortages, and long term chronic underfunding in social care are just as severe. And we will see the effects. We still lack current overall death figures – i.e. “all cause mortality” – which cover not just those who had COVID-19, but deaths for all related reasons (so HMG cannot fiddle the figures by, e.g. not testing the dying). Testing only when it has clinical relevance is the right thing to do right now – but it does undermine the current death statistics. (These also exclude inquests, which should cover health care workers, deaths of young people, and deaths where treatment was delayed or which were due to the economic consequences of COVID.)

Planning: When pandemic planning was the remit of PHE and professionals, it seemed to be going relatively well. Now they’ve let CDEI and the ‘Tech Bros’ in, things are going about as well as you might expect from an outfit led by someone whose previous venture helped cause Mid Staffs. These issues will most likely come to the fore with the ‘immunity certificate’ app in the next week or two…

Contact tracing: medConfidential understands NCSC has had input into the contact tracing app, but we have not seen written confirmation of whether the ‘random identifier’ broadcast by the app will be generated by the app itself, or read from the phone operating system’s Bluetooth MAC address (and so be available to others). We believe the app is less broken by design than it was a week ago, but highly controversial implementation decisions seem to have been made for reasons that may provide short-term benefits to NHSX – while dumping longer-term burdens onto the public, without any clear justification. Getting the 50-60% take-up required for such an app will be extremely difficult, especially if those building it don’t invite knowledgeable civil society experts to briefings containing complete answers to substantive questions.

‘Monster factories’: Details on DWP’s blunders are always five weeks behind the headlines, while the Home Office is a monstrosity (mostly) in public view. The NHS is working flat-out to save as many lives as possible, and most of the healthcare workers who have died are from overseas, yet the Home Office changes nothing and continues to increase the burden on the NHS in all aspects of its operations.

‘Immunity certificates’: While Matt Hancock might want his get-out-of-quarantine-free card, the NHSX (for which read, NHS England and DHSC) approach to ‘immunity certificates’ needs to be of a standard higher than anything else they have delivered so far. While the contact tracing app has clear health functions and can be NHS branded, it is unlikely the NHS and public health infrastructure will lead on an immunity app that would actively undermine consistent public health messaging. As a result, it seems likely this will be something the unreformed ‘institutionally ignorant’ Home Office may seek to take on, as ‘immunity passports’. The Home Office approach to NHS data entirely aside, its and its Ministers’ and officials’ regard for life and law make the ‘herd immunity’ debate look positively affectionate towards Grandma… [Edited to add: Initial thoughts for comment]

GP data for care: TPP/SystmOne previously took it upon itself to act as a data controller for its customers’ patients’ data, and apparently misled the Information Commissioner about its actions. With an opportunism that would not be unprecedented, the company is believed to want to re-enable that ‘design flaw’ for an unknown period of time. We’ve written to them with questions.

GP data for research: EMIS and Oxford are doing a study for which GPs can opt their entire practice into sharing information on, or relating to, COVID. (They won’t be the only ones.) It is unclear at this point what, if anything, this study tells patients about how data about them is used. A bit of text on a website, which no one knows to look at, is always insufficient.

Transparency: Extraordinary times may require extraordinary measures, but throwing due process out of the window creates even worse problems. Talking about transparency but failing to deliver it is no longer an option, especially if those asking the public to do extraordinary things want to maintain trust and public confidence.

NHS England’s ‘all-seeing dashboard’: We have been promised transparency, and that “G-Cloud procedures” were followed – so where are the Data Protection and other necessary Impact Assessments, and the Data Sharing Agreements (surely they have them…), and what about the contracts? At the time of writing, no previews or proper information have been given to the medical or tech press about what NHS England has asked Palantir et al. to build. Does the system even work?

Happy Easter to you all; our continued thanks and admiration to each and every person working in the NHS and across social care for all your efforts in the current pandemic, and our thoughts and good wishes to all those affected.

Apps for the next pandemic

It may be too late for this pandemic, but some of the apps under development could be useful in the early stages of the next large outbreak. There should be no rush to launch any new ‘shiny thing’ that undermines or conflicts with HM Government’s current advice to the public on their behaviour.

Currently proposed apps tend to fall into one of three overlapping categories, plus egregious random ridiculousness:

  1. Open Standards and survey apps
  2. Contact tracing
  3. “Immunity certificates” and testing apps

Plus what happens afterwards…

1) Palantir, Open Standards, and survey apps

The Palantir dashboards could be entirely public. For the same reasons that NHS England hasn’t said the Palantir dashboards will be public, not everything about real-time health should flow without friction.

Open Standards in healthcare are a good thing; Open Standards in public health in the time of a pandemic rely on every actor moving with understanding, responsibility, and the gravitas appropriate to the situation. Then someone invites Facebook…

If asking people to fill in a daily COVID-19 survey is good, for example, and more people filling in interoperable surveys is good, then surely Facebook promoting a survey daily to everyone on their homepage is even better? Especially when Facebook can see exactly who clicks what, and people can be tracked all across the web (fbclid)… or maybe Facebook could just do this all itself (and be trusted not to use it for its advertising algorithms)?

Just as Huawei are politically toxic in the UK right now – but are keeping the mobile networks working anyway – and Palantir are completely aware that they’re creepy by design – their logistics platform is world class, though it’s more often used to move people closer to death than further away – so Facebook have the same naive arrogance in 2020 that they had in 2015, without any appreciation of what happened in the interim.

All standards get abused.

In a public health context, it isn’t enough to merely claim that you won’t shit in the water supply; you must have everyone else believe that you don’t – in addition to not actually doing it. Your track record matters.

Newbie missteps in implementing an Open Standard for flu tracking will undermine the good work of all of those entities which have been doing this for some time.

2) Contact Tracing – good apps or bad apps?

A contact tracing app can be encouraged for social care and NHS staff to use to help them protect themselves, their clients/patients and families – and there is no way such an app can be launched and not be usable by everyone else, since people will install it anyway – so it has to work. That does not mean it is necessarily a good idea.

While the Government’s communications strategy has improved in the last week or so, its ability to launch an app that doesn’t undermine the ‘stay at home’ narrative is, in practice, likely to be low – even if well intentioned. And this DHSC has an unfortunate record of promoting digital mediocrity and clinical irrelevance. Even in the current crisis, Matt Hancock shows little sign of changing his spots or improving his discretion.

At best, a rewrite of the Singapore app so that it instead stores a list of random Bluetooth LE beacons on-device would be a beginning – only allowing the sharing of beacons ‘seen’ in a particular time frame after the user presses an “I have symptoms” button. While it looks like the Government has gone with Oxford, we don’t yet know who actually wrote the NHS tracing app, how badly they’ve screwed up the inevitable Facebook / Grindr / TikTok integration, or whether they’ve taken shortcuts in their implementation at the expense of the people they want to use it.
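As a rough illustration of that ‘store on-device, share only on symptoms’ idea – our own sketch, with assumed names and structure, not the NHSX or Singapore code – beacons observed nearby stay on the phone, and only those seen in a recent window would ever be shared, and only after the user presses the button:

```python
# Illustrative sketch only - assumed structure, not any real tracing app.
import time

RECENT_WINDOW_SECONDS = 14 * 24 * 60 * 60  # e.g. share only the last 14 days

class OnDeviceContactLog:
    """Keeps observed Bluetooth LE beacons on the device only."""

    def __init__(self):
        self._seen = []  # list of (timestamp, beacon_id), never uploaded by default

    def record_beacon(self, beacon_id: bytes) -> None:
        """Called whenever a nearby beacon is observed; nothing leaves the phone."""
        self._seen.append((time.time(), beacon_id))

    def beacons_to_share_on_symptoms(self) -> list:
        """Only invoked when the user presses 'I have symptoms'.

        Returns just the beacons seen within the recent window - the only
        data that would ever be shared, and only at the user's request.
        """
        cutoff = time.time() - RECENT_WINDOW_SECONDS
        return [beacon for (seen_at, beacon) in self._seen if seen_at >= cutoff]
```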

Any contact tracing app will only be installed and working on the devices of those who choose to use it. (In the same way that Boris Johnson said he’d still shake hands…) People may have learned a great deal in the past few weeks, but those who choose to use the app will likely already be following the rules, and those who don’t, well, don’t…

And, in the bigger picture, we must ask: would we do this for HIV? For whatever we do for COVID will be copied by others – first for COVID, and then by others for other conditions.

Given the volumes of users and devices required for contact tracing to be even minimally effective, there is non-trivial scope for ‘tourists’ to stand outside Number 10’s gates near the journalists for a few hours, and then press the “I’m Infected” button for giggles and chaos.

3) “Immunity certificates” and testing apps

The tests will come. They are coming. And there will be a time when such tests are necessary to affect what a citizen should do next; that time is not now, but it is approaching.

  • Antibody tests can show that you have had the virus, and are effectively immune (for now).
  • Antigen tests can tell you that you’re not currently infected with the virus.
  • In terms of results, what you want is a negative antigen test, and/or a positive antibody test – a distinction that scientific illiterates who’ve repeatedly been told to disregard experts may find difficult to make.

There will be some who (lawfully or otherwise) choose to limit access to those with the ‘right’ result from either test – whether by employers, social groups, or at the border. Do we want to become one of those countries that takes blood samples as people try to pass through customs? China may choose to do so, but who do we want to be?

While minimal central infrastructure is required for a contact tracing app that one hopes is used by a high enough percentage of the population for it to be meaningful, testing apps have very different requirements.  

When there are perverse economic incentives around testing, one person with known immunity might take the test in place of others – and others may feel compelled to expose themselves for the chance to feed their kids, or return to a ‘normal life’. Any mass testing infrastructure will rely not only upon the accuracy of the test itself, but upon there not being harm(s) for one type of outcome compared with another. 

And any centralised list of confirmed test results will, by definition, be a list of the entire population and their digital devices. A National Identity Register in all but name. The choices and actions of the unreformed ‘institutionally ignorant’ Home Office with regard to such datasets in the past now creates harm for everyone, should this be attempted.

Incentivising people to lie about their status will cause harm. And forcing people to disclose they have had (or not had) any temporarily notifiable disease has ugly precedent, with practices steeped in prejudice and racism. 

The costs of ongoing institutional intransigence, blind spots, and/or delivery failure are all coming due in a period where agility is most needed and where the results are most visible. This is clearest in the NHS supply chain, but it applies to institutions (of all types) which, when they are most needed to deliver something new, end up just doing the same thing they did before.

People will follow Government’s lead. A million people in a week responded to Treasury’s announcements by first understanding their circumstances, and deciding they needed UC. GDS and NHS digital teams have deployed hundreds of structural changes to their services, in addition to thousands of content changes, but others – most notably DWP – seem to have done little more than shuffle people around.

When the vaccine comes, we want people to have survived, and our society to have survived too – not be degraded into the sort of fear and bigotry that embody Marsham Street’s default perspective. 

After ‘stay home’, five strains of flu are normal

Last year, there were four strains of flu; now there are five. Viewed over a span of years, that difference is minimal.

The country will go back to (a new) normal. Do we really want abusive employers or others to be checking antigen and antibody status for employment in Wetherspoons, or at an Amazon warehouse – or so you can pick your own kids up from the school gate? 

If not that, then how do we treat someone who gets off a plane from anywhere – whether they came from China (with their choices), Trumpistan (with their inaction), NYC (with their resources), or Africa (with their resources)? What is the goal of “immunity certificates”? And how will that work at Heathrow?

The Home Office will, of course, default to racism and prejudice; fingerprinting arrivals because its Ministers and officials have long wanted to (it went badly). Actions and cultures predicated on secrecy rarely prove effective.

For any category of app to be effective, you need enough users using it for users to be able to expect others to have it. Announcing an app at the PM’s daily press conference might (hopefully) achieve that. And the app might even do what it’s supposed to. But making that announcement in a way which doesn’t undermine even more vital public health messaging would need a degree of demonstrated competence that NHS England / NHSX have thus far failed to deliver at any point since their inception.


Substantive details on particular points:

The Coronavirus

[This was written and published on the 17th March 2020. Our briefing for the Commons stages of the Coronavirus Bill was published on the 20th]

The notion of a public health emergency has always been within the scope of discussions around data and confidentiality in the NHS and health data. Overall, the responses of the NHS across the UK, Public Health England and its counterparts in the Devolved Administrations, and Her Majesty’s Government have been within and along the lines of pre-considered contingency plans they were able to take off the shelf, which matched those discussions. 

In that respect – and while medConfidential will maintain a watching brief, from a discreet distance – we don’t expect there should be any major concerns regarding the use of patients’ data as long as HMG and its agencies follow scientific advice. Should that change, or as new information becomes public, we will take a view at that time (for example, the legislation text has not yet been published).

We expect there to be a full Public Inquiry into lessons learnt after the current crisis is over – even in the best case scenarios, the effects are already too large for there not to be such an inquiry.

medConfidential will keep non-urgent suggestions until our submission to that Inquiry, by which time the Scientific Advisory Group for Emergencies (SAGE) advice and models will all have been published (as per HMG contingency planning). It may be helpful for the Government to indicate at what point in the pandemic they anticipate the selection of the Inquiry chair to commence.

Even in the best case this situation is going to last months, and many possible actions could be taken. However, the human rights and civil liberties of the entire nation should not be abandoned in favour of fear or xenophobia against anyone whose virus threshold is unknown – nor to introduce a state of permanent data monitoring that will, even at its most intrusive, fail to suppress people’s fears. Decades of discrimination around HIV have taught us the dangers that arise when mob mentality demands proof of a negative from whomever they choose to ask. 

In closing, we want to express our gratitude and admiration for each person working in the NHS for their efforts in the current pandemic. Thank you all.

Matt Hancock Still Doesn’t Tell You How Your Data Is Used – December 2019

There should be a price for misleading public statements about the NHS, whether about data or anything else.

Since our last update – and our first, in 2014 – consensus has been reached on what a Data Usage Report should contain. The NHS has made some progress – not that you as patients will have seen any of it – and while NHS Digital calls it a Statement rather than a Report, that is probably a better word.

If you e-mail NHS Digital and ask them, they will now tell you how data about you has been used, to whom it has been sent, and hopefully the places where your Summary Care Record and the new ‘National Record Locator’ have been accessed for you. NHS Digital still doesn’t know everything, but it is progress. Whether this will roll out to appear in the NHS App and on NHS.UK is a different question – mainly a political one for NHSX, as NHS Digital would prefer to avoid the transparency.

Now NHS Digital can do this, we turn to the question of the quality of the content. Anything at all is clearly better than the status quo of nothing, and “at will” is far better than “by request”. medConfidential also has an example of what this would look like across the rest of Government (i.e. behind a GOV.UK login) – as the approach we’ve taken applies to data used across Government, not just health data.

How to talk about data projects

Over the last year, our friends at Understanding Patient Data and Use My Data – and others – have done quite a bit of work on how you explain data usage to people.


While application forms may contain ‘lay person summaries’, these are of variable quality. That’s not to say that the current ones aren’t a good start – they prove the process can work, and they will get better over time. Conversations about improvement will continue to be hypothetical until patients can actually see how their data is used, at which point quality will go up. 

The work of Understanding Patient Data (and friends) shows that these explanations can be good. Whilst various agencies can take their own view of ‘good’ communications, once the feedback loops exist and are running, the incentives to get better quickly kick in. With its limited number of projects and outcomes, the cancer registry has shown that assisting some projects as needed can be done without extra resources.


For projects that make it into the press, the existing NHS comms team who write the ‘NHS health news explainers’ can also assist, showing how to tie the legitimate uses of data more tightly into the benefits of research.

There’s plenty of expertise to help make the explanations for patients really good, but they must be good enough for now – if they aren’t, then the data use should not have been approved! – and therefore the biggest block on improvement is simply not having started to show people how data about them is used on NHS.UK.

Of course, medConfidential still shows everything it can on TheySoldItAnyway.com… and commercial entities are still getting data. As NHSX ramps up for 2020, does it really want the only place patients can readily see how their opt out was respected (or not) to be at TheySoldItAnyway.com?

care.data returns

Another attempt to collect your GP data is coming. While none of the details are finalised, NHS England is quoting the BMA as saying it’s “care.data done right”. (It is unclear at this point whether that quote is from “care data day” or elsewhere.)

Will the 97% of people who haven’t said no to the desire to “use my data” for purposes beyond their direct care be able to see how their data is used? How confident are NHSX, DHSC and NHS England that what they tell the public this time will be matched by what NHS Digital is told to do, and what it actually does?

Information will be provided to the public about care.data 2; the question is whether NHS England and DHSC tell people, or we do. As before, medConfidential will tell people the truth, with evidence – whereas Matt Hancock’s choices in the recent election suggest he may choose a lower standard.

There are of course legitimate reasons to use patient data, especially the data of people who wish it to be used. If the programme is consensual, safe, and transparent, then it will be scrutinised and the outcome can be positive – will you know how your data is used, and how your choices are implemented? Do you have the facts you need in order to make an informed decision on how your and your family’s health data is used?

The opt-out model of Organ Donation

We are approaching half-way through the communications period for organ donation in England to become an opt-out rather than an opt-in process.

medConfidential has not yet seen any figures published for the effectiveness of this national communications campaign – nor how many people have taken action as a result, both opt-outs and explicit opt-ins – but those ads we have seen so far have all been vague and non-specific.

The memories of Alder Hey haven’t faded, and we sincerely hope DH / NHSBT ‘step up a gear’ so people really can make an informed choice before death, and avoid unnecessary stress and suffering for their loved ones.

How the communication of the organ donation opt-out programme succeeds or fails will likely demonstrate whether the care.data2 process will succeed or fail too. With luck, the Secretary of State won’t be in so much of a rush to grab your medical records that, in haste, he undermines organ donation too.

As has always been the case, NHS England and DHSC could have a data system that is consensual, safe, and transparent. The question is whether they will duck the hard choices and make you pick up the pieces they wanted to avoid.

The Home Office

With the Government proposing to move Immigration and Borders responsibilities out of the Home Office, a decision will be required on what happens to all of the toxic soup of data agreements between the Secretary of State and the Home Department for those purposes. 

Even if it still met the obligation of transparency, simply cloning and rubber-stamping each data sharing agreement for such purposes would be a terrible outcome. We may yet see improvements on the toxic legacy of the last nine years under the previous Prime Minister’s worldview – but this will require significant changes, notwithstanding cancellation.

If the data flows do continue, then this Prime Minister is clearly not interested in solving the problem that sees his Government threatening to deport scientists due to a Government typo. Number 10 will either decide to fix this as part of its Machinery of Government change, or it will decide to keep things as they are. The message will be clear either way.


This post is the latest in an irregular series of updates posted in 2014, 2015 and 2016.

Digital Government report from the House of Commons Sci/Tech committee

The House of Commons Sci/Tech Committee’s report on Digital Government sets out a direction of travel, something lacking from Government in recent times, but some of the details are disturbing.

The committee diverge from the Information Commissioner and the 2018 Data Protection Act, which are quite clear that consent is not a legal basis generally available for the routine delivery of most public services.

In paragraph 29, the committee justify unique identifiers for people on the basis of evidence about unique identifiers for objects or company numbers. Did they not notice the difference? While the Home Office may treat UK residents like cattle, numbered and tracked, that’s not what is usually expected by Parliament. The principle is a good one, but the Committee’s suggestion is internally contradictory.

Paragraph 23 implies the Lib Dem-led committee believe citizens should have no right to approve or object to sharing any data that isn’t “sensitive personal data” such as “ethnicity, State of residence, and sexuality”. Such a framework would significantly weaken the data rules protecting citizens, and would be a radical change in the law to suggest by accident – giving even more power to a future data controller in chief.


In more positive news, the top line – that “The Government should facilitate a national debate on single unique identifiers for citizens to use for accessing public services along with the right of the citizen to know exactly what the Government is doing with their data” – calls for a public debate that is necessary. The single unique identifier is a bad idea, but there are better ideas that should replace it in a genuine public debate. Consider the current Home Office and the next Prime Minister: how many Windrush-style mistakes will they make with your single identifier? And what happens when they decide to take it away from someone you care about?

Given the committee’s top line, it is increasingly untenable that the NHS continues to withhold from patients how data about them was used – especially as it is a place where the infrastructure is in place and identity is already known.

Public bodies, GDPR and consent

TL;DR – just because they ask (nicely) doesn’t mean it’s GDPR consent.

First, clinical consent is not GDPR/data consent

Clinical consent is informed consent for a clinical course of action, such as “Yes, you can amputate my arm”. If doctors don’t get clinical consent from a conscious patient, it’s GBH.

Sharing the medical records required for direct care is implicit from the clinically-consented decision, but that isn’t a GDPR consent – though it is part of the public task of the NHS body providing that surgery.

Both of these situations use the ‘consent’ word, but they actually mean very different things. (We agree that’s not entirely helpful.)

Consent, GDPR, and public bodies

GDPR provides six different legal bases for data use. Consent is the one most often used in the private sector – you technically consent to Facebook’s abuse as part of Facebook’s terms of service.

But GDPR requires consent to be “freely given”. And, with government bodies providing public services, the power imbalance between a citizen in need and the state is so great that those bodies cannot get meaningful consent – a problem amply demonstrated by #metoo in different arenas.

(Given what many experience as a social obligation to be on Facebook, whether or not the consent there is meaningful and freely given is an interesting question for others, but outside the scope of this consideration.)

The Information Commissioner’s guidance is clear: except in a few highly specific circumstances, public bodies shouldn’t use consent as the legal basis for their public task. Indeed, doing so is probably invalid.

A public body can always ask if you wish to go through a data sharing process that will make your life easier – that’s politeness, not GDPR – but that is not the same as asking you to consent to it processing your data in order to receive a service or benefit. In the most benign of cases, the two may be virtually indistinguishable – but while the first is a meaningful choice that has no effect on the outcome, the other is a requirement of accessing the service and so consent cannot be freely given.

“Give us your data or you don’t get your benefit” isn’t consent. It’s coercion.

Under GDPR, both the process itself and the legal basis for how your data will be used must be clear – even if the organisation processing your data doesn’t ask you whether you want it to or not. Most public bodies will have a lawful basis for processing your data under what is called their ‘public task’ (the private sector version of public task is ‘legitimate interest’). Critically, the process cannot offer you a different outcome if you hand over or allow it to access more personal data – though it is allowed for this to make that same process faster.

It is worth being aware that a public body can ask about your consent, and then ignore your answer and process your data anyway – so long as it has documented that that is what it was going to do, and made sure that the information wasn’t (too) opaque. As some have discovered, there are, however, large political and process burdens to doing that…

Why is there confusion?

The same word being used in two different contexts doesn’t help, but another cause is the fundamental lack of clarity on how data is used, and the inaccuracy that comes from the process.

Every bit of data processed by digital government ultimately comes down to someone typing something in via a keyboard and, as anyone who reads Twitter will know, such input may not always make complete sense. Officials are trained to believe that the data is perfect and ignore reality – and it’s the citizen who pays the price, and the most digitally excluded pay first and pay the most.

Every use of personal data by a public body must have a lawful basis, which can be known by the data subject. Those uses can be listed, and should be listed (including, around data sharing, in registers of data sharing agreements). A UK resident should be able to know how their data will be used in advance of dealing with a public service – though in practice, in the public services, existing law, process, and safeguards mean that it is not often necessary to do so, as one person’s data should be treated the same as another’s.

No citizen is expected to sign ‘terms and conditions’ or an ‘acceptable use policy’ when dealing with Government – nor should they be. The private sector uses such mechanisms because their acts are not based in public policy and law (as Facebook has recently shown).

When things go wrong, many of the frustrations representatives and support services feel in handling ‘casework’ are simply hurdles within smokescreens thrown up by those who do not wish informed scrutiny of decisions. We go into such issues further in our recent evidence to the Science and Technology Committee Inquiry on Digital Government.

A public body may, as a matter of policy and in politeness to the people it serves, ask whether a citizen is willing for data to be shared with another body; this is not ‘consent’ in Data Protection terminology – and, if asked and declined, the data sharing must not then occur anyway (otherwise the ‘consent’ choice would be unfair). Asking the citizen is an offer, and can also be a mandate for the policy, but it is not a basis for legal consent.

The Long Term Plan – more of a medium-term plaster

NHS England and the Department of Health have launched the NHS Long Term Plan. It includes a range of topics of interest. It doesn’t say anything about the National Data Opt-out, and not much about data – the LTP repeatedly promises jam tomorrow, as things that work will be replaced with things that might (something the recent NHS Digital Board discussed). There is still no guarantee of a competitive NHS market for decision support tools, despite the interest of the Secretary of State in some companies…

The Summary Care Record is a (mostly) consented system run by NHS Digital that works (after a decade), and will be replaced with NHS England’s LCHR programme, which is neither consented nor working (and will take at least a decade).

The fundamental point of the Summary Care Record (SCR) is that it is accessible nationwide. If you live in Cornwall and visit A&E in Carlisle, a doctor there can see the medications you are prescribed – unless you have chosen that they shouldn’t.

When the ‘NHS app’ launches, you will (should!) be able to see where your SCR has been accessed – although, as with other NHS.UK services in the queue, you need to be able to log in to NHS.UK before that can work. NHS Digital is still consulting on tweaks to the SCR for carers – a use case that the LCHR programme is far from being in a position to consider.

The LCHR programme, which we’ve covered before, often chooses mass data copying – so if you live in Newcastle and fall off a horse in Newmarket, it is unclear what happens, and whether or how those accesses or copies are consented. The LCHR programme breaks at every boundary you cross, because it is designed and run by NHS England for NHS institutions not patients. There is, as yet, no Information Governance model, there is no patient (or care provider?) accessible audit trail, and the change in name from ‘Local’ to Longitudinal Health and Care Records (page 99) also implies that the data held within them will be eternally expanded, rather than kept to the initial tightly defined dataset – care.data2?

The plan also refers to “The use of de-personalised data extracted from local records” (page 97), which suggests that LCHRs are intended to be used for both direct care and secondary uses – perpetuating the festering wound which is NHS Digital’s continuing disregard of the GDPR around the extraction and dissemination of data that GDPR considers identifiable. The use of “de-personalised” in this context (para 5.27) is a grasp at the figleaf of obfuscation that continues to be defended by the Wellcome Trust.

Lacking an appropriate IG model, there is no consent for the secondary uses of the records that LCHRs are sucking up – meaning the devolved LCHR teams are not only lying to patients but have also, in contrast to the publicly owned SCR infrastructure, generally outsourced the data handling to the private sector. While replacing SCR is not necessarily a problem, replacing one data copy with a different, inferior data copy is not progress to anyone other than the IT companies that will get new contracts for what was previously a publicly-run service. NHS England delegating blame but not control is not a new phenomenon, nor is issuing plans that undermine and will remove what another part of the NHS is proposing to add to help patients.

The hint in paragraph 1.38 of online NHS services to help the mental health crisis is welcome, but the detail – barely two sentences of vague hand-waving in chapter 5 – does not meet what is claimed, let alone what is needed.


Choices of the Secretary of State

The standards the Secretary of State sets for the NHS are not merely that its services are ‘safe’. He can have views on the user experience of an app, but user experience is not safety. There is a fundamental difference, and patients assume the NHS will never be unsafe – which is why there is such concern when it turns out not to be safe. The Secretary of State can say what he thinks ‘good’ looks like, but ‘safe’ must remain within the remit of qualified professionals.

Criteria for ‘safe’ are absolutely necessary, if not sufficient – it is entirely appropriate for there to be separate criteria for what is ‘good’. ‘Good’ apps may be what the public choose, but ‘safe’ is what they expect – apps can be both.

As an app example, Matt has chosen that the money for his GP registration should go to Babylon in London rather than the surgery in his constituency. He feels this works well for him, as someone who likely rarely needs a doctor, and disregards any wider harm that comes from taking funds away from the doctors in his constituency. The choice of what is ‘good’ is partially subjective, and different patients will make different decisions. The critique from the profession that the app is unsafe, however, is met with a response that someone thinks it is good. These are entirely different criteria, and both groups are talking past each other – the criteria of ‘safe’ and ‘good’ should be separated; as noted above, the former is necessary but not necessarily sufficient. The Secretary of State can add standards for what ‘good enough’ looks like, without reducing safety, if he so chooses.

At the insistence of DH, NHS Digital has pulled its consultation on the ‘Clinical Data Architecture’ Principles “so that we can ensure consistency with wider emerging strategies”. The Secretary of State continues to laud the first draft of his Vision, while the “Code of Conduct” update is delayed to let the AI companies lobby more, but how visionary is it?

NHS Digital’s recent Board papers have included their assessment of what they need to do to deliver on the Vision. While there are some things they wish to do faster, there are only two new things they need to start doing. One is new to NHS Digital only because it was certification work in which they weren’t previously involved (and it remains unclear why they are now), and the second is so substantive that we’ll quote it in its entirety from page 71:

“We will identify frontline staff whose skills and competence are evident to us and make them honorary colleagues (badges and certificates and all that jazz)”

The Vision the Secretary of State proclaims doesn’t seem any more visionary (or meaningful) than a late night monologue from an NHS manager in a hospital corridor about how GPs should work. The Plan suggests there will be new legislation, in which we will look for the National Data Opt-out to be placed on a statutory basis – rather than remaining a gift of a Secretary of State. We note that the National Data Guardian Act 2018 has now received Royal Assent, which is welcome progress.

Its recent Board papers state that NHS Digital is also going to “run all our public services in the public cloud with no more locally managed servers” (page 64) – presumably moving all services to Azure and AWS. Hopefully they will let tech-savvy journalists in to do long form pieces on what they’re doing, and provide reassurance, as this has the potential to go spectacularly wrong – with limited abilities for entities in the UK to clean up the mess, while doctors in A&E and GPs deal with the consequences of the national governance bodies screwing up yet again.

There is one final annual tradition DH has maintained – the new plan delays the full digitalisation of hospitals by yet another year; this target has been slipping by one year per annum ever since it was announced…

medConfidential comments ahead of the Spending Review ( / Manifestos)

Ahead of a Comprehensive Spending Review, the Government has decided that a large percentage of discretionary government spending will go on health and social care by the end of the next spending period. That is probably a better choice than that of the US, which spends about the same proportion on its military.

In light of that decision being made, there are rational consequences which require thought to avoid perverse incentives:

  • Data available to life sciences and research: For there to be public confidence in data use, every patient should be able to know how the NHS and others use data about them, and how their wishes are respected. The NHS has established clear processes for the use of data for legitimate research – these do not need to be changed. However, the implementation of the National Data Opt-out remains hamstrung by legacy data disseminations. This, the first spending review since the 2018 Data Protection Act, allows for a clearer formulation when communicating with the public: “If you want your data to be used for research and for other purposes beyond your care, it will be; if you don’t, it won’t.” (Any exceptions being solely decided by the explicit approval of the Confidentiality Advisory Group – which was placed on a statutory footing in 2014, yet still has no Regulations governing its work.) Past and current heavy reliance on (DPA98) ‘anonymous’ data as the basis for dissemination both undermines public confidence and limits the data available to research. The spending review offers an opportunity to reconsider that failed approach, improving public confidence and making more high-quality data available to researchers and the life sciences – both underpinned by a commitment that whatever a patient wishes, they will be able to see how their wishes were respected. Any suggestion of ‘data trusts’ for NHS patients’ data requires as a prerequisite the admission that the NHS itself will never get data dissemination right in patients’ interests. Public confidence in data for life sciences and research would be higher if the message was clear, simple, and accurate: If you want us to use your data in legitimate projects, we will; if you don’t, we won’t.
  • Technology in the NHS: Clinicians will use technology when it helps them with patients; when it doesn’t, they don’t – no matter how hard NHS England may push it. The FHIR (Fast Healthcare Interoperability Resources) standard is now internationally recognised as the standard for interoperability between health systems – yet the first version was only published after the last spending round. (A minimal sketch of reading a FHIR resource follows this list.) Treasury / DH / NHSE should ensure that companies cannot use contracts to limit or prohibit interoperability, or to require bulk data copying from core hospital systems into commercial companies. Where new national programmes are proposed, chopped up into parts, what happens at the boundaries between those parts?
  • Prevention is cheaper than cure: In advance of the spending review, HMT should commission an independent assessment of the ‘DH vision’ on prevention to answer two critical questions: will it do what it claims to do? And if not, how and where does it fall short? (Page 14 of the vision shows the disconnects.) The assessment should be published alongside the DH green paper, and show what questions must be considered across Whitehall to avoid any other department causing what DH seek to prevent. 
  • New forms of Transport: Will DfT allow self-driving cars to operate in a way where their stopping distance is greater than their effective sensor range? Will equivalent assessments be made for other technologies; and if not, what will the consequent effects be on the health of the nation? 
  • Procurement incentives for competitive markets: Where an NHS body wishes to procure an AI to assist in diagnosis, it should be required to procure three – effectively requiring three diverse analyses rather than one, replicating the medical norm of a ‘second opinion’ from a human doctor. That may be extensible to other public bodies.
  • AI and algorithms in the public sector: For all bodies subject to judicial review, any AI or algorithm providing input to the decision under review must satisfy the explainability requirements of judicial review. Should there be a clear public sector mandate that algorithms will only be used if they satisfy existing legal obligations, and that technology tools will need to be procured to satisfy those obligations, that will create a market in which the UK is possibly uniquely placed to lead.
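As referenced in the ‘Technology in the NHS’ point above, FHIR exposes health records as plain web resources over a standard REST API. A minimal, hypothetical sketch of what reading one looks like – the server URL, patient ID, and function name here are placeholders for illustration, not a real NHS endpoint:

```python
# Hypothetical example of reading a FHIR resource over its standard REST API.
# The base URL and patient ID are placeholders for illustration only.
import requests

FHIR_BASE = "https://example-fhir-server.test/fhir"  # placeholder, not a real endpoint

def fetch_patient(patient_id: str) -> dict:
    """Fetch a Patient resource as standardised FHIR JSON.

    Any system speaking FHIR can read this record over the standard
    interface, without bulk-copying the underlying database.
    """
    response = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```

Because both the format and the interface are standardised, conforming systems can interoperate directly – which is why contracts should not be allowed to close off that interface or demand bulk copies instead.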

The first two points have strong equivalents across all departments.

 

Tests for the spending review: Balancing mental health, parity of esteem, and Public Health

The spending review is the primary administrative mechanism for cross-government prioritisation. 

The largest public health concerns are different in different local areas – will the spending review (and MHCLG priorities) reduce or exacerbate those differences?

Will (tech) companies assessed to be causing mental health issues be required to take steps to reduce the harms they cause in future, and mitigate harms already caused? If they are not, these costs will have to come from the NHS budget, and are effectively a commercial subsidy paid by the public purse. By comparison, for each of alcohol, tobacco, other substances with consequences for human health, and digital companies – does each contribute in tax revenue what they create in direct and indirect costs?


Some areas of this post were elaborated in our submission to the Digital Competition Expert Panel.