
Digital Government report from the House of Commons Sci/Tech committee

The House of Commons Sci/Tech Committee’s report on Digital Government sets out a direction of travel, something lacking from Government in recent times, but some of the details are disturbing.

The committee diverge from the Information Commissioner and the 2018 Data Protection Act, which are quite clear that consent is not a legal basis generally available for the routine delivery of most public services.

In paragraph 29, the committee justify unique identifiers for people on the basis of evidence about unique identifiers for objects or company numbers. Did they not notice the difference? While the Home Office may treat UK residents like cattle, numbered and tracked, that’s not what is usually expected by Parliament. The principle is a good one, but the Committee’s suggestion is internally contradictory.

Paragraph 23 implies the Lib Dem-led committee believe citizens should have no right to approve or object to sharing any data that isn’t “sensitive personal data” such as “ethnicity, State of residence, and sexuality”. Such a framework would significantly weaken the data rules protecting citizens, and would be a radical change in the law to suggest by accident, giving even more power to a future data controller in chief.


In more positive news, the top line – that “The Government should facilitate a national debate on single unique identifiers for citizens to use for accessing public services along with the right of the citizen to know exactly what the Government is doing with their data” – calls for a public debate that is necessary. The single unique identifier is a bad idea, but there are better ideas that should replace it in a genuine public debate. Consider the current Home Office and the next Prime Minister: how many Windrush-style mistakes will they make with your single identifier? And what happens when they decide to take it away from someone you care about?

Given the committee’s top line, it is increasingly untenable that the NHS continues to withhold from patients how data about them was used – an area where the infrastructure is already in place and identity is already known.

Public bodies, GDPR and consent

TL;DR – just because they ask (nicely) doesn’t mean it’s GDPR consent.

First, clinical consent is not GDPR/data consent

Clinical consent is informed consent for a clinical course of action, such as “Yes, you can amputate my arm”. If doctors don’t get clinical consent from a conscious patient, it’s GBH.

Sharing the medical records required for direct care is implicit from the clinically-consented decision, but that isn’t a GDPR consent – though it is part of the public task of the NHS body providing that surgery.

Both of these situations use the ‘consent’ word, but they actually mean very different things. (We agree that’s not entirely helpful.)

Consent, GDPR, and public bodies

GDPR provides six different legal bases for data use. Consent is the one most often used in the private sector – you technically consent to Facebook’s abuse as part of Facebook’s terms of service.

But GDPR requires consent to be “freely given”. And, with government bodies providing public services, the power imbalance between a citizen in need and the state is so great that those bodies cannot get meaningful consent – a problem amply demonstrated by #MeToo in different arenas.

(Given what many experience as a social obligation to be on Facebook, whether or not the consent there is meaningful and freely given is an interesting question for others, but outside the scope of this consideration.)

The Information Commissioner’s guidance is clear: except in a few highly specific circumstances, public bodies shouldn’t use consent as the legal basis for their public task. Indeed, doing so is probably invalid.

A public body can always ask if you wish to go through a data sharing process that will make your life easier – that’s politeness, not GDPR – but that is not the same as asking you to consent to it processing your data in order to receive a service or benefit. In the most benign of cases, the two may be virtually indistinguishable – but while the first is a meaningful choice that has no effect on the outcome, the other is a requirement of accessing the service and so consent cannot be freely given.

“Give us your data or you don’t get your benefit” isn’t consent. It’s coercion.

Under GDPR, both the process itself and the legal basis for how your data will be used must be clear – even if the organisation processing your data doesn’t ask you whether you want it to or not. Most public bodies will have a lawful basis for processing your data under what is called their ‘public task’ (the private sector equivalent of public task is ‘legitimate interest’). Critically, the process cannot offer you a different outcome if you hand over or allow it to access more personal data – though it is permissible for doing so to make that same process faster.

It is worth being aware that a public body can ask about your consent, and then ignore your answer and process your data anyway – so long as it has documented that that is what it was going to do, and made sure that the information wasn’t (too) opaque. As some have discovered, there are, however, large political and process burdens to doing that…

Why is there confusion?

The same word being used in two different contexts doesn’t help, but another cause is the fundamental lack of clarity on how data is used, and the inaccuracy that comes from the process.

Every bit of data processed by digital government ultimately comes down to someone typing something in via a keyboard and, as anyone who reads Twitter will know, such input may not always make complete sense. Officials are trained to believe that the data is perfect and ignore reality – and it’s the citizen who pays the price, and the most digitally excluded pay first and pay the most.

Every use of personal data by a public body must have a lawful basis, which can be known by the data subject. Those uses can be listed, and should be listed (including, around data sharing, in registers of data sharing agreements). A UK resident should be able to know how their data will be used in advance of dealing with that public service – in practice, in the public services, existing law, process, and safeguards mean that it is not often necessary to do so, as one person’s data should be treated the same as another’s.

No citizen is expected to sign ‘terms and conditions’ or an ‘acceptable use policy’ when dealing with Government – nor should they be. The private sector uses such mechanisms because their acts are not based in public policy and law (as Facebook has recently shown).

When things go wrong, many of the frustrations representatives and support services feel in handling ‘casework’ are simply hurdles within smokescreens thrown up by those who do not want informed scrutiny of decisions. We go into such issues further in our recent evidence to the Science and Technology Committee Inquiry on Digital Government.

A public body may, as a matter of policy and in politeness to the people it serves, ask whether a citizen is willing for data to be shared with another body; this is not ‘consent’ in Data Protection terminology – though, if asked and declined, the data sharing must not then occur anyway (if it did, the ‘consent’ choice would have been unfair). Asking the citizen is an offer, and can also be a mandate for the policy, but it is not a basis for legal consent.

The Long Term Plan – more of a medium-term plaster

NHS England and the Department of Health have launched the NHS Long Term Plan, which includes a range of topics of interest. It says nothing about the National Data Opt-out, and not much about data – the LTP repeatedly suggests jam tomorrow, as things that work will be replaced with things that might work (which the recent NHS Digital Board discussed). There is still no guarantee of a competitive NHS market for decision support tools, despite the interest of the Secretary of State in some companies…

The Summary Care Record is a (mostly) consented system run by NHS Digital that works (after a decade), and will be replaced with NHS England’s LCHR programme, which is neither consented nor working (and will take at least a decade).

The fundamental point of the Summary Care Record (SCR) is that it is accessible nationwide. If you live in Cornwall and visit A&E in Carlisle, a doctor there can see the medications you are prescribed – unless you have chosen that they shouldn’t.

When the ‘NHS app’ launches, you will (should!) be able to see where your SCR has been accessed – although, as with other NHS.UK services in the queue, you need to be able to log in to NHS.UK before that can work. NHS Digital is still consulting on tweaks to the SCR for carers – a use case that the LCHR programme is far from being in a position to consider.

The LCHR programme, which we’ve covered before, often chooses mass data copying – so if you live in Newcastle and fall off a horse in Newmarket, it is unclear what happens, and whether or how those accesses or copies are consented. The LCHR programme breaks at every boundary you cross, because it is designed and run by NHS England for NHS institutions, not patients. There is, as yet, no Information Governance model; there is no patient- (or care provider-) accessible audit trail; and the change in name from ‘Local’ to Longitudinal Health and Care Records (page 99) also implies that the data held within them will be eternally expanded, rather than kept to the initial tightly defined dataset – care.data2?

The plan also refers to “The use of de-personalised data extracted from local records” (page 97), which suggests that LCHRs are intended to be used for both direct care and secondary uses – perpetuating the festering wound which is NHS Digital’s continuing disregard of the GDPR around the extraction and dissemination of data that GDPR considers identifiable. The use of “de-personalised” in this context (para 5.27) is a grasp at the figleaf of obfuscation that continues to be defended by the Wellcome Trust.

Lacking an appropriate IG model, there is no consent for the secondary uses of the records that LCHRs are sucking up – meaning the devolved LCHR teams are not only lying to patients but, in contrast to the publicly owned SCR infrastructure, have also generally outsourced the data handling to the private sector. While replacing SCR is not necessarily a problem, replacing one data copy with a different, inferior data copy is not progress to anyone other than the IT companies that will get new contracts for what was previously a publicly-run service. NHS England delegating blame but not control is not a new phenomenon, nor is issuing plans that undermine and will remove what another part of the NHS is proposing to add to help patients.

The hint in paragraph 1.38 of online NHS services to help the mental health crisis is welcome, but the detail – barely two sentences of vague hand-waving in chapter 5 – does not meet what is claimed, let alone what is needed.


Choices of the Secretary of State

The standards the Secretary of State sets for the NHS are not merely that its services are ‘safe’. He can have views on the user experience of an app, but user experience is not safety. There is a fundamental difference, and patients assume the NHS will never be unsafe – which is why there is such concern when it turns out not to be safe. The Secretary of State can say what he thinks ‘good’ looks like, but ‘safe’ must remain within the remit of qualified professionals.

Criteria for ‘safe’ are absolutely necessary, if not sufficient – it is entirely appropriate for there to be separate criteria for what is ‘good’. ‘Good’ apps may be what the public choose, but ‘safe’ is what they expect – apps can be both.

As an app example, Matt has chosen that the money for his GP registration should go to Babylon in London rather than the surgery in his constituency. He feels this works well for him, as someone who likely rarely needs a doctor, and disregards any wider harm that comes from taking funds away from the doctors in his constituency. The choice of what is ‘good’ is partially subjective, and different patients will make different decisions. The critique from the profession that the app is unsafe, however, is met with a response that someone thinks it is good. These are entirely different criteria, and both groups are talking past each other – the criteria of ‘safe’ and ‘good’ should be separated; as noted above, the former is necessary but not necessarily sufficient. The Secretary of State can add standards for what ‘good enough’ looks like, without reducing safety, if he so chooses.

At the insistence of DH, NHS Digital has pulled its consultation on the ‘Clinical Data Architecture’ Principles “so that we can ensure consistency with wider emerging strategies”. The Secretary of State continues to laud the first draft of his Vision, while the “Code of Conduct” update is delayed to let the AI companies lobby more, but how visionary is it?

NHS Digital’s recent Board papers have included their assessment of what they need to do to deliver on the Vision. While there are some things they wish to do faster, there are only two new things they need to start doing. One is new to NHS Digital only because it was certification work in which they weren’t previously involved (and it remains unclear why they are now), and the second is so substantive that we’ll quote it in its entirety from page 71:

“We will identify frontline staff whose skills and competence are evident to us and make them honorary colleagues (badges and certificates and all that jazz)”

The Vision the Secretary of State proclaims doesn’t seem any more visionary (or meaningful) than a late night monologue from an NHS manager in a hospital corridor about how GPs should work. The Plan suggests there will be new legislation, in which we will look for the National Data Opt-out to be placed on a statutory basis – rather than remaining a gift of a Secretary of State. We note that the National Data Guardian Act 2018 has now received Royal Assent, which is welcome progress.

Its recent Board papers state that NHS Digital is also going to “run all our public services in the public cloud with no more locally managed servers” (page 64) – presumably moving all services to Azure and AWS. Hopefully they will let tech-savvy journalists in to do long form pieces on what they’re doing, and provide reassurance, as this has the potential to go spectacularly wrong – with limited abilities for entities in the UK to clean up the mess, while doctors in A&E and GPs deal with the consequences of the national governance bodies screwing up yet again.

There is one final annual tradition DH has maintained – the new plan delays the full digitalisation of hospitals by yet another year; this target has been slipping by one year per annum ever since it was announced…

medConfidential comments ahead of the Spending Review ( / Manifestos)

Ahead of a Comprehensive Spending Review, the Government has decided that a large percentage of discretionary government spending will go on health and social care by the end of the next spending period. That is probably a better choice than the US, which spends about that amount on its military.

In light of that decision being made, there are rational consequences which require thought to avoid perverse incentives:

  • Data available to life sciences and research: For there to be public confidence in data use, every patient should be able to know how the NHS and others use data about them, and how their wishes are respected. The NHS has established clear processes for the use of data for legitimate research – these do not need to be changed. However, the implementation of the National Data Opt-out remains hamstrung by legacy data disseminations. This, the first spending review since the 2018 Data Protection Act, allows for a clearer formulation when communicating with the public: “If you want your data to be used for research and for other purposes beyond your care, it will be; if you don’t, it won’t.” (Any exceptions being solely decided by the explicit approval of the Confidentiality Advisory Group – which was placed on a statutory footing in 2014, yet still has no Regulations governing its work.) Past and current heavy reliance on (DPA98) ‘anonymous’ data as the basis for dissemination both undermines public confidence and limits the data available to research. The spending review offers an opportunity to reconsider that failed approach, improving public confidence and making more high-quality data available to researchers and the life sciences – both underpinned by a commitment that whatever a patient wishes, they will be able to see how their wishes were respected. Any suggestion of ‘data trusts’ for NHS patients’ data requires as a prerequisite the admission that the NHS itself will never get data dissemination right in patients’ interests. Public confidence in data for life sciences and research would be higher if the message was clear, simple, and accurate: If you want us to use your data in legitimate projects, we will; if you don’t, we won’t.
  • Technology in the NHS: Clinicians will use technology when it helps them with patients; when it doesn’t, they don’t – no matter how hard NHS England may push it. The FHIR (Fast Healthcare Interoperability Resources) standard is now internationally recognised as the standard for interoperability between health systems (see the sketch after this list) – yet the first version was only published after the last spending round. Treasury / DH / NHSE should ensure that companies cannot use contracts to limit or prohibit interoperability, or to require bulk data copying from core hospital systems into commercial companies. Where new national programmes are proposed, chopped up into parts, what happens at the boundaries between those parts?
  • Prevention is cheaper than cure: In advance of the spending review, HMT should commission an independent assessment of the ‘DH vision’ on prevention to answer two critical questions: will it do what it claims to do? And if not, how and where does it fall short? (Page 14 of the vision shows the disconnects.) The assessment should be published alongside the DH green paper, and show what questions must be considered across Whitehall to avoid any other department causing what DH seek to prevent. 
  • New forms of Transport: Will DfT allow self-driving cars to operate in a way where their stopping distance is greater than their effective sensor range? Will equivalent assessments be made for other technologies; and if not, what will the consequent effects be on the health of the nation? 
  • Procurement incentives for competitive markets: Where an NHS body wishes to procure an AI to assist in diagnosis, it should be required to procure 3 – effectively requiring 3 diverse analyses rather than one, replicating the medical norm of a ‘second opinion’ from a human doctor. That may be extensible to other public bodies.
  • AI and algorithms in the public sector: For all bodies subject to judicial review, any AI or algorithm that provides input to such a decision must satisfy the explainability requirements of judicial review. Should there be a clear public sector mandate that algorithms will only be used if they satisfy existing legal obligations, and that technology tools will need to be procured to satisfy those obligations, that will create a market in which the UK is possibly uniquely placed to lead.
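To make the interoperability point above concrete, here is a minimal sketch of what a FHIR read looks like in practice: retrieving a single Patient resource over the standard FHIR REST interface. The server address and record id are invented for illustration; the point is that any FHIR-conformant system exposes the same shape of request and response, which is exactly what contracts should not be allowed to lock away.

```python
import requests

# Hypothetical FHIR server base URL and record id, for illustration only.
FHIR_BASE = "https://ehr.example.nhs.uk/fhir"

def fetch_patient(patient_id: str) -> dict:
    """Read one Patient resource via the standard FHIR interaction:
    GET [base]/Patient/[id], returned as FHIR JSON."""
    response = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    patient = fetch_patient("example-id")
    # Any conformant server returns the same resource shape, which is
    # the whole point of an interoperability standard.
    print(patient["resourceType"], patient.get("birthDate"))
```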

The first two points have strong equivalents across all departments.

 

Tests for the spending review: Balancing mental health, parity of esteem, and Public Health

The spending review is the primary administrative mechanism for cross-government prioritisation. 

The largest public health concerns are different in different local areas – will the spending review (and MHCLG priorities) reduce or exacerbate those differences?

Will (tech) companies assessed to be causing mental health issues be required to take steps to reduce the harms they cause in future, and mitigate harms already caused? If they are not, these costs will have to come from the NHS budget, and are effectively a commercial subsidy paid by the public purse. By comparison, for each of alcohol, tobacco, other substances with consequences for human health, and digital companies – does each contribute in tax revenue what it creates in direct and indirect costs?


Some areas of this post were elaborated in our submission to the Digital Competition Expert Panel.

The Opt Out process for those with children

This page has been superseded by a 2021 page.

The text below is left here for historical purposes.

medConfidential’s opt-out form for GP data continues to allow you to protect the information held by your family GP as it has always done – but the only way to protect your data collected by hospitals and in other care contexts is now via NHS Digital.

NHS Digital does not (yet) have direct access to your GP record, so its opt-out process already requires you to take several steps – and officials have decided that people who have children must take a third.

Additional steps forced on those who care for children

If you have children, or other dependents, and wish to express a choice about the use of their medical records for purposes beyond their direct care, the process is needlessly complex.

NHS Digital provides no online process for families with children – even those who are registered with the NHS at the same home address as their parent or carer – so you must use its online process for yourself alone, and then use a separate postal process for any children under 13.

(Children aged 13 or over can express their own choice about their medical records for themselves online.)

So, instead of the single form you could use to express your choice for yourself and your family back in 2014, you must now do THREE things:

1) For your own hospital and other non-GP data: opt out (or opt back in) online

2) For your dependents’ hospital and other non-GP data: post this form to NHS Digital

3) For your and your dependents’ GP data: give this form to your GP

If you don’t have access to a working printer, e-mail children@medConfidential.org with your postal address and we will post you copies of the paper forms, for free, no questions asked. If you don’t have e-mail, you can text your address to 07980 210 746.

If you can afford to make a small donation to support us in offering this service to others, we have a donation page.

(We are registered with the ICO to process personal data in this way.)

Why?

It appears the “most digital” Secretary of State the NHS has ever had would rather place the burden of understanding and action onto individual patients – especially those with families – than solve the problem his officials have created.

And the fix is simple: NHS Digital’s online process confirms who you are to the extent that you are able to express your choice online, yet it chooses not to ask if you have any children – who could then be confirmed in exactly the same way the process confirms that you are who you say you are. The move to ‘digitise’ the National Data Opt-out left YOU with more work to do…

This extra work for you is a deliberate choice; it didn’t have to be this way.

Once you have expressed your wishes about your and your family’s data, you may also wish to write to your MP and ask them to ask the Department of Health why you are being forced to jump through these hoops.

Because while healthy, young, and particularly male protagonists commonly have little understanding or insight into the types of sensitive information they will one day have to divulge to their doctor – and the consequences of confidentiality not being respected – others do not have the luxury of such ignorance.

Late October update

At the start of October, the Department of Health took away your ability to opt out via your GP from having information about you, collected by the rest of the NHS, being used for purposes beyond your direct care. (The option to prevent information from your GP record leaving your GP practice remains. For now.) The new process is so ‘hip and digital’ that you also have to use the Royal Mail if you wish to make a consent choice for your children, as well as visiting your GP practice to make a choice for your GP data that the online process tells you nothing about.

Is this Matt Hancock’s view of a digital NHS?

We are testing a new trifold to guide families through expressing their full opt-out choices – which is now a three-step process: online, post box, and at the GP. This may be simpler for NHS Digital, but it’s a lot harder for you – a choice with which Matt Hancock seems to be entirely happy.

NHS Digital was apparently very proud that more people opted in via the digital service than opted out in its first 2 months – though sending out 1.6 million letters could be said to have tipped the scales somewhat – but that represents at most a few hundred people a month, whereas 5,000-10,000 people a month were still opting out via their GP until the Secretary of State took that choice away from you.

We have previously given a commitment that there will be a functional digital opt-out process for patients, and that if NHS Digital wasn’t going to deliver one, then medConfidential would have to (though this will likely be very analogue on their side…).

Data rights and proper information can together empower every patient and citizen to have more confidence in those who use their data. NHS Digital seems to want to make it more complicated. Though official information is published in various forms, in various places, the only way a patient can currently read how their wishes were respected is to visit TheySoldItAnyway.com.

If you didn’t receive a letter from NHS Digital about the new ‘National Data Opt-out’, and since you’re reading this on our website, you should check the online process to see if your choice disappeared somewhere in the machine (and, if so, to set it to what you want). You’ll then need to set it for your children too by post – and at your GP, for your GP data, to ensure that too is set.

 

Consultation Responses, etc.

With the National Data Guardian Bill having its second reading in the Lords this week, medConfidential has published a letter of support for the Bill. Meanwhile, the Organ Donation Bill contains a supposed safeguard that is overly complex and will not provide reassurance to those who wish to see how their organs will be used after death. We have drafted an amendment for the Lords to fix the broken Bill, if the Commons does not.

As part of the next piece of NHS legislation, the National Data Opt-out should be placed on a statutory footing. The next legislation will likely be the result of NHS England’s consultation on “integrated care providers” (our response) and the “long term plan” (our response), which also referenced the need to reform invoice reconciliation.

Our friends at dotEveryone published their views on digital harms and responsible technology, suggesting that data and ethics in Government should be led by someone “relatable … charismatic and imaginative”. That would be better than the current person, whose company created the problems around commercial abuses of data in the NHS – problems that are still being felt 20 years later. The current ‘imagination’ at CDEI (the ‘Centre for Data Ethics and Innovation’) seems to be to repeat in Government the sort of data sale scandals they already caused in the NHS. The Information Commissioner also sought views on a ‘regulatory sandbox’, where companies can experiment with personal data – we had views.

Data use across the rest of Government has also been keeping us occupied. Our evidence to the House of Commons Science and Technology Committee contains some new thinking on the failures of agile in public bodies. Some of that thinking was also in our response to the call for evidence ahead of the UK visit of the UN Special Rapporteur on Extreme Poverty and Human Rights, who is looking at algorithms and digital effects around Universal Credit.

 

Data and the rule of law

The data sharing powers under the Digital Economy Act 2017 are still not fully in force. This did not prevent the Ministry of Housing, Communities and Local Government (MHCLG) demanding data on every homeless person in the country, such as in Camden. The secrecy of data use in such cases must be addressed by the UK Statistics Authority / Office for National Statistics – it is doubly disturbing that MHCLG used the research process to evade the scrutiny that would have applied via other routes.

Decisions by public bodies must, today, comply with the standards of the rule of law. As we move towards more automated decision-making, how will those standards be maintained?

The tech companies and their apologists want the approach to be one defined by ‘ethics’ – as if no tyrant ever failed to justify their crimes. “The computer says no” (or “DeepMind says no”) is wholly insufficient for suppliers of data processing functions to government making decisions about citizens.

All reputable companies will be entirely willing to explain how their “AI” systems arrive at the suggestions or decisions they make – including the (sources of) data on which they were trained. Disreputable companies will be evidenced by their failure or inability to do so.

Government Departments should deliver accountability to ‘their’ data subjects (currently they don’t). But beyond accountability to individuals on how data about them is used, there are standards that must be followed by institutions – especially those which govern.

The Venice Commission has produced a ‘Rule of Law checklist’, covering the context of decision-making. We’ll be taking a look at a couple of Government automated processing plans, and seeing how they conform – and how the checklist applies to digital projects, probably starting with Universal Credit and Settled Status, based on past work. We anticipate identifying holes in some of the frameworks currently used by Government, as compared with the standards required by the rule of law and judicial review. Comments are very welcome to sam@medConfidential.org.

Initial response to new ‘tech vision’ from DHSC

Update: longer response.

The current Westminster penchant for speeches with applause lines but without details has reached DH… 

Update: NHS Digital has now published some details – which are utterly underwhelming when compared with the Secretary of State’s hyperbole. “Use the NHS number” and “upgrade from ICD-10 to ICD-11” are not the radical changes the Secretary of State appeared to suggest. Although with the promise of registers, we might dust off the amendment we suggested to the Lefroy Bill (which mandated NHS numbers by law) in 2014. We will update this document when NHS Digital publishes the document that should have appeared at the same time.

Notes:

  • Data: “We are supportive of … Data Trusts” – does the SofS/DH have so little confidence in the NHS getting data right that he/it is supportive of stripping the NHS of that governance role?
  • DH blindspots: “We will know we have achieved our goals when”… does not mention patients other than suggesting they use apps for self-care…
  • Privacy: There is no reference to the duty of confidence every clinician owes to their patients (it instead points at the Data Protection Act).
  • Google: The obligation on all systems to use FHIR interoperability removes the figleaf behind which DeepMind insisted on data for all patients in the Royal Free for 5 years.
  • Google: It also proposes outlawing the monopoly line in Google’s standard contract that forces users of the Streams app to only connect to DeepMind’s servers. It is unclear whether that line will survive the lobbying it is about to receive.
  • Amazon: Case study 7 is as true as leaving a note on the fridge, but there are other effects of giving Amazon/Alexa such information. Facebook’s new Portal offers the same functionality, and will explicitly be used to target ads at users.

 

The quotes below can be attributed to Sam Smith, coordinator of medConfidential, which works to improve uses of data and technology in the NHS.

The NHS knows what good working technology looks like but, to get there, you can’t just turn A&E off and on again and see if it helps.

Mr Hancock says “we are supportive” of stripping the NHS of its role in oversight of commercial exploitation of data. That should be a cause for widespread concern. If Matt thinks the NHS will never get data right, what does he know that the public don’t?

The widely criticised National Programme for IT also started out with a similar lofty vision. This is yet another political piece saying what “good looks like”, but none of the success criteria are about patients getting better care from the NHS. For that, better technology has to be delivered on a ward, and in a GP surgery, and in the many other places that the NHS and social care touch. Reforming procurement and standards does matter, and will help, but it helps in the same way a good accountant helps – and that’s not by having a vision of better accounting.

There’s not much detail in here. It’s not so much ‘jam tomorrow’, as ‘jam… sometime’ – there’s no timeline, and jam gets pretty rancid after not very long. He says “these are standards”, but they’re just a vision for standards – all the hard work is left to be done.

-ends-

More DeepMind secrecy – What the lawyers didn’t look at

‘Independent’ lawyers have recommended that the Royal Free terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims the ‘vital interests’ (i.e. remaining alive) of patients is justification to protect against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long-running saga.

Where are the CAG regulations?

We talk a lot about NHS Digital, and its data releases that continue to ignore opt-outs. But 4 years ago today, Royal Assent of the Care Act 2014 gave NHS Digital a “general duty” to “respect and promote the privacy of recipients of health services and of adult social care in England” – which clearly hasn’t been honoured in some areas of its work. The Act also changed the law specifically so that the Confidentiality Advisory Group (CAG) of the Health Research Authority has the power to advise NHS Digital; advice to which NHS Digital must listen.

Caldicott 3 itself does not require dissent to be honoured when data is disseminated in line with the ICO’s Code of Practice on Anonymisation. (The National Data Guardian – who has been given no enforcement powers – is very careful not to ‘cross wires’ with the UK’s data regulator, who does have such powers.) And, despite well over a million patients clearly indicating their wishes to the contrary, NHS Digital continues to argue its dissemination of pseudonymised data is “anonymised” to the satisfaction of the 1998 Data Protection Act.

The UK is about to get a new Data Protection Act, aligned with and based on the EU General Data Protection Regulation, which says:

(26) … Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person.

The UK’s Information Commissioner will update the Code of Practice on Anonymisation in due course –  she’s just a little busy right now, and the new Data Protection Act is not yet on the statute book – but the Irish Commissioner has already said: (emphasis added)

“Although pseudonymisation has many uses, it should be distinguished from anonymisation, as it only provides a limited protection for the identity of data subjects in many cases as it still allows identification using indirect means. Where a pseudonym is used, it is often possible to identify the data subject by analysing the underlying or related data.”

Current practice will have to change.
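To illustrate the recital – this is our own minimal sketch, not a description of NHS Digital’s actual process – here is why a pseudonym is not anonymous: anyone who holds, or can reconstruct, the “additional information” can link the records straight back to people. The NHS numbers below are invented.

```python
import hashlib

def pseudonymise(nhs_number: str) -> str:
    # Replace the direct identifier with a derived token. This is
    # pseudonymisation, not anonymisation: the mapping is recoverable by
    # anyone who can hash the same candidate identifiers.
    return hashlib.sha256(nhs_number.encode()).hexdigest()

# A "de-personalised" extract: NHS numbers replaced with tokens (invented data).
extract = [
    {"id": pseudonymise("9434765919"), "diagnosis": "E11"},
    {"id": pseudonymise("9434765870"), "diagnosis": "F32"},
]

# Additional information held elsewhere, e.g. another dataset that still
# carries real NHS numbers alongside names.
other_dataset = {"9434765919": "A. Patient", "9434765870": "B. Patient"}

# Re-identification: recompute the same tokens and join the two datasets.
lookup = {n: name for n, name in
          ((pseudonymise(num), name) for num, name in other_dataset.items())}
for row in extract:
    print(lookup.get(row["id"]), row["diagnosis"])
```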

While IGARD may be the appropriate body to advise whether a request meets NHS Digital’s standards for dissemination, it is not an appropriate body to advise on releasing data which does not honour patients’ objections. The adjudication of the principles of those decisions, by statute, belongs to CAG.

There are legitimate instances where patients’ dissent may be overridden – but IGARD is not, and never should have been, the body to decide that.

The opt-out is there to protect patients who decide that the safeguard the NHS currently relies upon – pieces of paper, which include for example commercial re-use contracts with commercial companies that service other commercial companies, including pharmaceutical companies that use the data for promoting their products (i.e. marketing) to doctors – is not sufficient for their situation. As is their right.

Another example: in 2014, the Health Select Committee asked for a safe setting for researchers. Only in April 2018 did a remote safe setting begin to be piloted for researchers – that work not only needs to be completed, but it should become the standard means of access.

NHS Digital continues to insist that a piece of paper is sufficient safeguard under which to release copies of the entire nation’s lifelong, linked medical histories to hundreds of organisations. Its own published records show that two-thirds of NHS Digital’s data releases do not respect patient dissent.

It should be CAG which makes such decisions, whenever and wherever it is necessary. The CAG Regulations will make that clear, when they exist. Assurances to patients are less than meaningful when the Regulations to which they relate do not yet exist.

If someone applying for patients’ data cannot do what they need with only 98% of people’s data, they should simply explain to a responsible body why this is the case. Public Health England’s cancer registry already takes this approach with the choice of protections it offers for event dates. NHS Digital simply releases data on every patient, with the medical event dates completely unprotected.
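As an illustration of the kind of protection that can be offered for event dates – a sketch of two common techniques, not a description of Public Health England’s actual implementation – dates can be coarsened, or shifted by a consistent per-patient offset that preserves the intervals between a patient’s events while hiding the real dates:

```python
import hashlib
from datetime import date, timedelta

SECRET = b"per-extract secret"  # illustrative only

def truncate_to_month(event_date: date) -> str:
    # Coarsen the date: usually enough for trend analyses, far less revealing.
    return event_date.strftime("%Y-%m")

def shift_date(patient_id: str, event_date: date) -> date:
    # Shift all of one patient's dates by the same secret offset (here within
    # +/- 30 days), preserving intervals between events but hiding real dates.
    digest = hashlib.sha256(SECRET + patient_id.encode()).digest()
    offset = (digest[0] % 61) - 30
    return event_date + timedelta(days=offset)

admission = date(2018, 3, 14)
print(truncate_to_month(admission))           # "2018-03"
print(shift_date("patient-123", admission))   # same offset for all of patient-123's events
```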

The National Data Guardian was asked to determine a single choice by which patients could express their dissent from their data being used for purposes beyond their direct care. When that choice is disregarded, it must be on a basis clearly and specifically defined in statute, and approved by CAG.

As it is doing around the world, the introduction of the GDPR will force a change, and that change should protect patients’ data that under the new Data Protection Act will be considered identifiable. Those who still need everyone’s data will have to explain why to a competent body – which really isn’t too much to ask.

Given the clear promises given as a consequence of the care.data and HES data scandals – promises much repeated, but yet to be delivered – we’ve been waiting a long time for this to be fixed.

Instant Messaging in Clinical Settings

June 2023: NHS England recently made their guidance less comprehensible (again).

The 2023 wording is entirely compatible with the much clearer 2018 wording; paragraphs in blue below are omissions from that official guidance.


As a clinician or nurse, you should not have to keep up with the latest fluff of the apps you might use for work. But teams need to talk to each other (without using Microsoft Teams for everything!)

NHS England has put out several attempts at ‘guidance’ on using instant messaging apps. Its previous guidance said WhatsApp was not banned, but failed to provide helpful advice on what to actually use. It still hasn’t. There was a Do & Don’t list, which was better than nothing, but it isn’t in the latest version, and was almost impossible to turn into practice in the real world.

If asked, we would suggest something like this:

Summary

  1. If your employer offers an instant messaging solution, use that.
  2. If you are picking apps to use yourself, you are safest with Signal.
  3. If you are not picking the apps you use, you will probably have to use WhatsApp or Skype. But be aware that someone will be held responsible when Facebook or Skype change their rules – and it’s probably not going to be the person who picked the app…
  4. Don’t use Facebook Messenger, Instagram, or Telegram.

Whatever app you use for work, the vast majority of people should avoid having their phone going ding for work purposes while they are not at work. For most apps, a swipe left on the main list of ‘chats’ should show an option to “hide alerts” for some time period – this should ensure that if you do give your personal number to work colleagues, it doesn’t end up driving you to distraction outside work. If someone really wants to get in touch, they can always just call you normally.

The reasoning behind our suggestions

The important step in secure messaging is something called “end-to-end” encryption, which prevents anyone – a third party ‘listening in’, or even the service making the connection – from knowing what you said. It’s the equivalent of having a conversation in a private consultation room, rather than doing it standing next to the nurses’ station, or in a waiting room. But even with Signal, if you are messaging using your personal device, you should treat any conversation as if it were in a lift where another person might be listening.
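For the curious, here is a much-simplified sketch of the public-key idea underneath end-to-end encryption, using the PyNaCl library. Real messengers such as Signal add key verification and forward secrecy (“ratcheting”) on top, which this does not show; the point is only that the relaying service never holds a key that can read the message.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each person generates a key pair on their own device; only public keys
# are ever shared with the messaging service.
doctor_key = PrivateKey.generate()
nurse_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(doctor_key, nurse_key.public_key)
ciphertext = sending_box.encrypt(b"Patient in bay 4 needs hourly obs")

# The relay server only ever sees 'ciphertext' and cannot read the message.
receiving_box = Box(nurse_key, doctor_key.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```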

Signal allows you to decide for how long you will keep messages from any particular person or group, and will automatically delete the stored messages after that. But what happens with the stored message history in other apps? WhatsApp, for example, wants you to give it a full copy of all your messages and send them to its servers as a ‘backup’ (though at some point it will show you ads against them – it is part of Facebook after all).

You may also have set your phone itself to backup to somewhere. Do you know where the backup goes, and what’s in it?

Of course, it is best practice to back up everything on your phone, and most apps assume (probably correctly) that you don’t want to lose every message or photo you receive of your kids. This doesn’t necessarily translate neatly to a clinical setting – anything that must be kept should be recorded elsewhere, so that if you lose your phone, the only thing you won’t have kept is ward chit-chat. WhatsApp wants everything – it doesn’t offer clinical reassurance. And while Snapchat has deletion as a feature, it has other problems akin to Facebook and Skype.

The longer-term security of your messaging is dependent upon who makes the app – and when, and why, they will change the rules on you. We (also) recommend Signal because it is produced by a charitable foundation whose sole mission is to provide secure, usable, communications. One key reason why the NHS England guidance is so terrible is that WhatsApp has lobbyists telling NHS England that it should allow their product; Signal doesn’t.

Since Facebook (the owner of WhatsApp) lies to regulators about its intentions, you clearly cannot rely on the company not to do tomorrow what it denies it will do today.  As a consequence of this, any official guidance must in future be kept up to date by NHS Digital. And, as corporate policies change, so must the guidance – removing from the equation NHS England’s fear of the deluge of lobbying that created this mess in the first place.

Clinicians deserve better tools than those that NHS England chooses to recommend, where a national body prioritises its own interests over the needs of those delivering direct care. The NHS England guidance is the output of meetings and committees that, with every iteration, get progressively less useful for those who need something to help them and the people they work with practise medicine.

(This post will be kept under review as technologies change; it was last updated in June 2023)

June 2023:  The December 2022 guidance from NHS England is split over a page about messaging, with key parts on the page about devices.

March 2021: added link to common definition and tests for a secure app.

March 2020 Update: custom apps are now in the NHS Apps library, and so apps that your staff routinely use for other purposes shouldn’t be used.