Late October update

At the start of October, the Department of Health took away your ability to opt out via your GP from having information about you, collected by the rest of the NHS, used for purposes beyond your direct care. (The option to prevent information from your GP record leaving your GP practice remains. For now.) The new process is so ‘hip and digital’ that if you wish to make a consent choice for your children you must use the Royal Mail – and you must still visit your GP practice to make a choice about your GP data, which the online process tells you nothing about.

Is this Matt Hancock’s view of a digital NHS?

We are testing a new trifold to guide families through expressing their full opt-out choices – which is now a three-step process: online, post box, and at the GP. This may be simpler for NHS Digital, but it’s a lot harder for you – a choice with which Matt Hancock seems to be entirely happy.

NHS Digital was apparently very proud that more people opted in via the digital service than opted out in its first two months – though sending out 1.6 million letters could be said to have stacked the scales somewhat. That represents at most a few hundred people a month, whereas 5,000–10,000 people a month were still opting out via their GP until the Secretary of State took that choice away from you.

We have previously given a commitment that there will be a functional digital opt-out process for patients, and that if NHS Digital wasn’t going to deliver one, then medConfidential would have to (though this will likely be very analogue on their side…).

Data rights and proper information can together empower every patient and citizen to have more confidence in those who use their data. NHS Digital seems to want to make it more complicated. Though official information is published in various forms, in various places, the only way a patient can currently read how their wishes were respected is to visit TheySoldItAnyway.com

If you didn’t receive a letter from NHS Digital about the new ‘National Data Opt-out’ then – since you’re reading this on our website – you should check the online process to see if your choice disappeared somewhere in the machine (and, if so, set it to what you want). You’ll then need to set it for your children too, by post – and at your GP, for your GP data, to ensure that too is set.

 

Consultation Responses, etc.

With the National Data Guardian Bill having its second reading in the Lords this week, medConfidential has published a letter of support for the Bill. Meanwhile, the Organ Donation Bill contains a supposed safeguard that is overly complex and will not provide reassurance to those who wish to see how their organs will be used after death. We have drafted an amendment for the Lords to fix the broken Bill, if the Commons does not.

As part of the next piece of NHS legislation, the National Data Opt-out should be placed on a statutory footing. The next legislation will likely be the result of NHS England’s consultation on “integrated care providers” (our response) and the “long term plan” (our response), which also referenced the need to reform invoice reconciliation.

Our friends at dotEveryone published their views on digital harms and responsible technology, suggesting that data and ethics in Government should be led by someone “relatable … charismatic and imaginative”. That would be better than the current person, whose company created the problems around commercial abuses of data in the NHS – problems that are still with us 20 years later. The current ‘imagination’ at CDEI (the ‘Centre for Data Ethics and Innovation’) seems to be to repeat in Government the sort of data sale scandals it already caused in the NHS. The Information Commissioner also sought views on a ‘regulatory sandbox’, where companies can experiment with personal data – we had views.

Data use across the rest of Government has also been keeping us occupied. Our evidence to the House of Commons Science and Technology Committee contains some new thinking on the failures of agile in public bodies. Some of that thinking was also in our response to the call for evidence ahead of the UK visit of the UN Special Rapporteur on Extreme Poverty and Human Rights, who is looking at algorithms and digital effects around Universal Credit.

 

Data and the rule of law

The data sharing powers under the Digital Economy Act 2017 are still not fully in force. This did not prevent the Ministry of Housing, Communities and Local Government (MHCLG) demanding data on every homeless person in the country, such as in Camden. The secrecy of data use in such cases must be addressed by the UK Statistics Authority / Office for National Statistics – it is doubly disturbing that MHCLG used the research process to evade the scrutiny that would have applied via other routes.

Decisions by public bodies must, today, comply with the standards of the rule of law. As we move towards more automated decision-making, how will those standards be maintained?

The tech companies and their apologists want the approach to be one defined by ‘ethics’ – as if no tyrant ever failed to justify their crimes. “The computer says no” (or “DeepMind says no”) is wholly insufficient for suppliers of data processing functions to government making decisions about citizens.

All reputable companies will be entirely willing to explain how their “AI” systems arrive at the suggestions or decisions they make – including the (sources of) data on which they were trained. Disreputable companies will be evidenced by their failure or inability to do so.

Government Departments should deliver accountability to ‘their’ data subjects (currently they don’t). But beyond accountability to individuals on how data about them is used, there are standards that must be followed by institutions – especially those which govern.

The Venice Commission has produced a ‘Rule of Law checklist’, covering the context of decision-making. We’ll be taking a look at a couple of Government automated processing plans, and seeing how they conform – and how the checklist applies to digital projects, probably starting with Universal Credit and Settled Status, based on past work. We anticipate identifying holes in some of the frameworks currently used by Government, as compared with the standards required by the rule of law and judicial review. Comments are very welcome to sam@medConfidential.org.

Initial response to new ‘tech vision’ from DHSC

The current Westminster penchant for speeches with applause lines but without details has reached DH… 

Update: NHS Digital has now published some details – which are utterly underwhelming when compared with the Secretary of State’s hyperbole. “Use the NHS number” and “upgrade from ICD-10 to ICD-11” are not the radical changes the Secretary of State appeared to suggest. Although, with the promise of registers, we might dust off the amendment we suggested to the Lefroy Bill (which mandated NHS numbers by law) in 2014. We will update this document when NHS Digital publishes the document that should have appeared at the same time.

Notes:

  • Data: “We are supportive of … Data Trusts” – does the SofS/DH have so little confidence in the NHS getting data right that he/it is supportive of stripping the NHS of that governance role?
  • DH blindspots: “We will know we have achieved our goals when”… does not mention patients other than suggesting they use apps for self-care…
  • Privacy: There is no reference to the duty of confidence every clinician is under to their patients (it instead points at the Data Protection Act)
  • Google: The obligation on all systems to use FHIR interoperability removes the figleaf behind which DeepMind insisted on data for all patients in the Royal Free for 5 years.
  • Google: It also proposes outlawing the monopoly line in Google’s standard contract that forces users of the Streams app to only connect to DeepMind’s servers. It is unclear whether that line will survive the lobbying it is about to receive.
  • Amazon: Case study 7 is as true as leaving a note on the fridge, but there are other effects of giving Amazon/Alexa such information. Facebook’s new Portal offers the same functionality, and will explicitly be used to target ads at users.

 

The quotes below can be attributed to Sam Smith, coordinator of medConfidential, which works to improve uses of data and technology in the NHS.

The NHS knows what good working technology looks like, but to get there, you can’t just turn A&E off and on again and see if it helps.

Mr Hancock says “we are supportive” of stripping the NHS of its role in oversight of commercial exploitation of data. That should be a cause for widespread concern. If Matt thinks the NHS will never get data right, what does he know that the public don’t?

The widely criticised National Programme for IT also started out with a similar lofty vision. This is yet another political piece saying what “good looks like”, but none of the success criteria are about patients getting better care from the NHS. For that, better technology has to be delivered on a ward, and in a GP surgery, and in the many other places that the NHS and social care touch. Reforming procurement and standards does matter, and will help – but it helps in the same way a good accountant helps, and that’s not by having a vision of better accounting.

There’s not much detail in here. It’s not so much ‘jam tomorrow’, as ‘jam… sometime’ – there’s no timeline, and jam gets pretty rancid after not very long. He says “these are standards”, but they’re just a vision for standards – all the hard work is left to be done.

-ends-

Local Health and Care Records: exemplary, or just more of the same old?

The Australian ‘care.data’ isn’t going well. Taking its lead from the English plan, the Australian ‘My Health Record’ programme claims to be about direct care, creates a new repository of patients’ data hiding behind a badly-run opt-out process, and has had minimal publicity – all of it about direct care; those who support it don’t actually understand it, and in the small print, the data can be copied for purposes that aren’t direct care at all. None of the lessons of care.data have been learnt there at all.

Meanwhile, in the UK, NHS England has announced a set of pilots for ‘Local Health and Care Record’ schemes – three in May, two more in June – which claim to be about direct care, which may or may not create a new repository of data without an opt-out process, and for which all the publicity is only about direct care; those who support them seem not to articulate fully what they are, and in the small print, the data can be copied for purposes that aren’t direct care at all.

‘Just in time’ or ‘Eventually in case’?

There is clear value in the principal public goal of Local Health and Care Records (LHCR): when a patient arrives at a new care setting, doctors should be able to see care provided only hours before – especially when that care may be what put the patient there. An obvious example would be a hospital doctor being able to see new medicines prescribed that morning. This has obvious clinical value, assuming the data is up to date enough – showing data that is as current as the patient’s arrival, not yesterday.

In practice, this would mean looking up records across the system as needed, rather than building yet another pile of outdated data – most of which would never be touched for care purposes, and which could be dangerously out of date if it were. The ‘glue’ required for interoperability is to simplify ‘just in time’ interoperability, rather than to copy data from everywhere ‘just in case’ it’s needed somewhere else.

The driving principle must be the provision of relevant, necessary, up-to-date information. Which, as any serious technologist will tell you, means using APIs – not making and passing around endless copies of data. (‘Paperless’ shouldn’t mean promiscuous…) Building yet another set of out-of-date databases only works for the people who sell and run databases.
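The ‘just in time’ versus ‘just in case’ distinction can be sketched in a few lines of code. This is a toy illustration only – the data, names and functions (`GP_SYSTEM`, `fetch_current_medications`, `NIGHTLY_SNAPSHOT`) are invented for the example, and stand in for no real NHS system or FHIR interface.

```python
# Toy sketch: querying the source system at the moment of care ('just in time')
# versus relying on a periodically-copied repository ('just in case').
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class MedicationRecord:
    patient_id: str
    medication: str
    prescribed_at: datetime


# Stand-in for the GP system of record: it holds this morning's prescription.
GP_SYSTEM = {
    "patient-123": [
        MedicationRecord("patient-123", "warfarin",
                         datetime.now() - timedelta(hours=3)),
    ],
}


def fetch_current_medications(patient_id: str) -> list[MedicationRecord]:
    """'Just in time': ask the source system when the patient arrives.
    The answer is exactly as current as the source itself."""
    return GP_SYSTEM.get(patient_id, [])


# 'Just in case': last night's snapshot, copied before this morning's
# prescription existed -- so it is silently, and dangerously, out of date.
NIGHTLY_SNAPSHOT: dict[str, list[MedicationRecord]] = {}

live = fetch_current_medications("patient-123")
stale = NIGHTLY_SNAPSHOT.get("patient-123", [])
print(len(live), len(stale))  # the live lookup sees the morning's medicine; the copy does not
```

The design point is that the API call defers the question to whoever actually holds the answer, whereas every copy fixes the answer at the moment it was taken.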

The local areas offered as pilots apparently get to choose their own technology for their own needs. But before the ink on the LHCR announcements was dry, there were other announcements about projects that want to copy the data that the ‘pilots’ apparently haven’t yet decided upon. The money is already flowing as if they will.

Clearly they all know something that NHS England isn’t telling the public. Where data is made available to ‘research’, NHS England (as data controller) will also want to use the GP data it copies for its own purposes – clinical performance management and ‘developments’ that skirt the line between research and micromanaging. The National Data Opt Out should apply to these uses – whether it will or not remains to be seen – but even so, the creation of another copy of patients’ medical records, under the control of NHS England rather than doctors, has apparently been mandated without public debate, discussion, or basic honesty.

Will patients be sold the figleaf of ‘direct care’, with other uses hidden behind, in a very Australian way?

However a local care record system is implemented, every patient needs to be able to have clear and specific answers to their questions: who is looking at my data? Why? And what are my choices?

The advocates of dangerous data copying will continue to push for it – while the ‘spawn of care.data’ resurfaces in Australia, the toxic causes of care.data are now reappearing across this country, in the form of ‘data trusts’. Until there is a binding commitment to transparency over all data access, those who wish to copy patients’ information for secret reasons will continue to publicly claim patient ‘benefits’ for activities that are far more sordid.

 

We have also done a deep dive into the systems, which is possibly far more than you ever wanted to know about local health and care record systems.

150,000 patients’ opt-outs not honoured; their confidential information sold for 3 years

A serious error affecting 150,000 NHS patients has been reported in the media in recent days, after it was uncovered last week. We understand the error affects patients who set an opt-out between March 2015 and June 2018 and whose GP practices use TPP’s SystmOne software – their opt-out codes were not sent to NHS Digital until last week.

As a consequence of this error, from April 2016 until 26 June this year, those patients’ confidential, identifiable data was sold to a range of organisations, including private companies. This will obviously be of concern to a great many people.

Both TPP and NHS Digital are taking remedial action; the coding error has been corrected to ensure opt-outs will be uploaded properly from now on, affected GP practices were written to on Monday 2 July, and the individual patients affected should be written to by the end of the month.

Until then, based on current information, this is what you can do:

If you have recently received a letter from NHS Digital about the conversion of your Type-2 opt-out to the National Data Opt-out then you weren’t affected by this incident. (These letters were sent out during June.)

If however you haven’t received a letter, and you are over 16, and you remember opting out any time from March 2015 onwards, then either:

  a) you are affected by the TPP incident, or
  b) separately, your opt-out was never applied to your GP record.

Anyone over the age of 13 should be able to check their current opt-out status by using NHS Digital’s new online National Data Opt-out process:

If the light blue status box does not appear when you check and you do not wish your confidential, identifiable medical information to be used for any purposes beyond your own direct care, then you need to set the option on this screen to “No”.

This new online process only works, however, for individuals over 13 years old – and not for families with children or adult dependants. medConfidential’s (now improved!) GP opt-out form continues to work, as it has done since late 2013. It also lets you prevent confidential, identifiable information leaving your GP record, which the National Data Opt-out does not cover.

But – given this incident, and every previous breach of public trust – why can’t every patient see their data, so they can know what has happened?

Everyone agrees just how bad the situation created by TPP’s error really is, with consequences for patients whose data was used against their wishes:

Professor Helen Stokes-Lampard, Chair of the Royal College of GPs, said:

Patient data held by the National Health Service should only ever be used morally, safely and responsibly, and we must all work together to ensure mistakes of this nature are never repeated. We need to be able to reassure patients that their wishes concerning their data are being respected.

Understanding Patient Data said in response (their emphasis):

This incident highlights the critical need for transparency – to ensure that it is clear where data is going and how choices are honoured. It also demonstrates that a trustworthy system must not just say the right things but also do the right things in practice as well: if opt-outs are claimed to be honoured, they absolutely must be. If these standards are not upheld, there has to be clear accountability in the system, with sanctions if necessary to demonstrate that these issues are taken seriously, or public confidence will again suffer.

Dr Natalie Banner, who now leads the ‘Understanding Patient Data’ project, tweeted:

Astonishing and appalling failure to uphold patient objections: but what sanctions to ensure providers uphold the standards we expect of them? New opt-out, which is patient-registered rather than GP-registered, *should* be less liable to such errors though.

Mr Harry Evans, from the King’s Fund policy team, said:

We are all agreed on the importance of the public not being surprised by how NHS uses data, so this is just remarkable.

These are fine words, but when will they speak out about the people NHS Digital disregarded in its new ‘digital’ process – a process that Ministers signed off – which separates processing for parents and children? (Not every American policy approach should be replicated in the NHS…)

In a recent explanation for OurNHS, we showed the ‘proxy’ form itself says:

…if your family has children under the age of 13, or if you look after a dependent older relative, then things are even more complicated. Rather than giving a simple instruction to your doctor, those who would prefer their children’s data wasn’t sold to third parties for unknown purposes will be required to send to NHS Digital, by post, four pieces of ID documentation along with a seven-page form. So much for Jeremy Hunt’s much-vaunted commitment to a ‘paperless’ NHS.

Given the significant effect this will have on people far wider than the 150,000 currently affected, you might want to ask (a) Understanding Patient Data, or (b) your MP, what they are doing to ensure the broken process for families making a decision together is fixed.

As the dust settles from GDPR Day…

…we’ve updated our scorecard.

One of the existing patient opt-outs has been renamed as the new National Data Opt-out, but a whole swathe of issues that have festered, unaddressed, for years still remain.

We consider these issues below, framed by questions of – and glaring omissions to – the ‘Your NHS Data Matters’ communications campaign, launched on GDPR Day.

Overview

“Your health and adult social care information supports your individual care. It also helps us to research, plan and improve health and care services in England.”

The word “us” appears to be doing a lot of work here; would patients expect “us” to include, for example, commercial information intermediaries such as Harvey Walsh Ltd?

“You can choose whether or not your confidential patient information is used for research and planning.”

All well and good, if true – but what about all other ongoing (and future) uses of patient information besides “research and planning”? Why does the new National Data Opt-out not use the far clearer, more accurate, and comprehensive Caldicott 3 definition of “purposes beyond direct care”?

If the new National Data Opt-out does cover the use of patients’ information for all purposes beyond their direct or ‘individual’ care, then why not say so? If it does not, then how does the ‘new’ opt-out meet the requirements of Caldicott 3, the recommendations of which the Government accepted in full?

medConfidential recommends: Be accurate! All public communications must in the first instance use the proper Caldicott 3 formulation, “purposes beyond direct care”.

These purposes may be further explained in terms of “research” and “planning”, but public explanations must be clear that uses are not limited to only these two purposes. To do otherwise would both mislead patients and undermine the effectiveness of the opt-out, and could lead to further collapses in public trust when people become aware of uses that do not clearly fall into either category.

“Information that only identifies you like your name and address is not confidential patient information and may still be used.”

This is utterly muddle-headed, and goes against what many (if not most) people reasonably understand by the word “confidential”. While the example given is relatively benign, it is precisely this loophole that, not coincidentally, led to the scandal of the Home Office MoU.

“Your confidential patient information is used in two different ways:”

This is not even close to true! We consider other uses, such as commissioning and commercial re-use, in more detail below – but this statement is demonstrably untrue: take, for example, the HRA Confidentiality Advisory Group’s Non-Research Register, which contains ongoing uses such as invoice reconciliation, risk stratification, commissioning, and projects that explicitly mix direct care and secondary uses.

medConfidential recommends: Don’t mislead patients! Be clearer and more explicit about the range of uses to which patients’ information is put.

While public communications must be as clear and as understandable as possible, they must also be accurate – and true. The split between “your individual care” and “research and planning” (a description that we note above is itself too narrow and misleading) is far too simplistic, especially when patients are being asked to make an informed consent choice.

“Most of the time, we use anonymised data for research and planning. So your confidential patient information isn’t always needed.”

No definition of “anonymised” is provided. Using this word without explaining what it means is misleading; the natural (i.e. many patients’) assumption is that “anonymised data” is anonymous, which is not the case. GDPR and guidance from the ICO now make it clear that what NHS Digital has been disseminating “most of the time” is identifiable data.

That only some identifiers are being removed, or pseudonyms substituted, must be explained – and linking to a third party, non-NHS information source to do this will hardly be reassuring to many patients. Hiding behind narrow legalistic reasoning and jargon never has solved, and (especially post-GDPR) never will solve, major long-standing issues.

medConfidential recommends: Follow the law! Stop implying that ‘anonymised’ is the same as anonymous, and respect people’s opt-outs for all identifiable data – don’t keep trying to find loopholes and excuses.

Benefits of data sharing

We are not aware that this is, or ever has been, in dispute. Clearly there are benefits to lawful, ethical, consensual, safe, and transparent data sharing.

Problems come when, as with the care.data programme and previous attempts at public communication, the NHS tries exclusively ‘selling the benefits’ without even mentioning any of the risks. Addressing the risks directly helps people make sense of the mitigations used – so that such measures are no longer just arbitrary claims or assertions – and enables a more informed choice on that basis.

Who uses your data

As we note above, the narrow definition “research and planning” does not even come close to defining the full range of uses, for purposes beyond their direct care, to which patients’ information is put.

These omissions aside, while ‘Your NHS Data Matters’ lists some types of organisations that may do “research” and acknowledges the use of patients’ information “to support the delivery of health and social care” (conflating both direct care and “planning”), it makes no mention of the types of organisations that may be involved in “planning”, or of all the activities that term is supposed to encompass.

Given that what matters most to patients with concerns is precisely which people and organisations may have access to their information, this is another serious omission. Without it, how are patients supposed to make an informed choice?

medConfidential recommends: Be honest and upfront about who will have access to patients’ information; patients should not be assumed (or required) to understand the inner workings of the health and care system in order to make choices.

It may be argued that NHS Digital’s Data Release Register performs this function. However, linking to a massive Excel spreadsheet, containing literally thousands of individual entries, puts too much of a burden on any normal individual and – given the disparity between this and the level of detail provided elsewhere – begins to look like obfuscation.

We understand NHS Digital is working on a more clearly-formatted HTML publication of its Data Release Register but, in its absence, medConfidential has provided a more readable version that – unlike the current Register – contains links to NHS Digital’s audit reports, for those organisations that have been audited.

medConfidential recommends: Continue improving the transparency of data use.

For example, besides audits (and actions taken), future Registers should link to the relevant DARS and/or IGARD documentation; showing that there is a process, and that it is being applied competently and consistently, is an important way to build and maintain trust.

It is unfortunate that the “NIC numbers” given in current Registers are entirely self-referential to anyone performing, for example, a Google search; concealing or obscuring relevant parts of the process raises doubts and lets them persist.

“Research bodies and organisations include:
– university researchers
– hospital researchers
– medical royal colleges
– pharmaceutical companies researching new treatments”

Why are “pharmaceutical companies” the only ones on this list whose use of patients’ information is qualified? Is the claim that pharmaceutical companies only receive patients’ data for the specific purpose of researching new treatments? This is patently untrue, and leads onto the further spurious claim that patients’ information will not be sold for “marketing or insurance” purposes.

While this claim may be narrowly true, at least for “commercial insurance purposes”, it omits to mention that at least some information intermediaries (i.e. commercial re-users, now sometimes referred to as “health and care analysis companies”) which regularly receive data from NHS Digital still service pharmaceutical marketers.

NHS Digital cannot state definitively who does or does not reuse patients’ medical records, as it specifically chooses not to know.

medConfidential recommends: Stop misleading patients as to the ultimate uses of their data, and stop sending out copies of everyone’s hospital histories to companies which (also) profit from servicing non-NHS commercial interests.

How data is protected

‘Your NHS Data Matters’ makes quite a few assertions about what will and will not be done with your data – though, and especially given the tendency to use jargon and narrow legalistic terms, it would be good to provide evidence and to clearly define key phrases. For example, we presume “confidential patient information” and “data” are two different things.

In addition, as noted above, linking to a third party, non-NHS information source to achieve some of this – however good the explanation – will hardly be reassuring to patients with concerns.

Another glaring omission, given the significant number of organisations that do not provide health and care services but do use patients’ information for purposes beyond their direct care, is the list of steps those organisations are supposed (required?) to take to protect patients’ data.

The list of steps for such organisations clearly cannot be the same as those for NHS bodies, in that some of these organisations do not “make it clear why and how data is being used”, and others hide behind the information intermediaries’ commercial reuse contracts to, e.g. measure the efficacy of pharma sales, and to market their products to NHS doctors and commissioners.

medConfidential recommends: Make it a requirement to report to NHS Digital (and thence to patients) how data is used by everyone; stop relying on commercial reuse contracts and the “promotion of health” loophole in the Care Act 2014 to perpetuate what are self-evidently marketing uses.

Manage your choice

The new ‘digital’ National Data Opt-out process cannot cope with dependent children, and assumes that all 13-year-olds have mobile phones or e-mail accounts which can be accessed safely without duress. It appears as if, when the process was signed off under the Government’s Digital Service Standard, Ministers did not spare a thought for their families at all…

The entire process is overly bureaucratic and intimidating, especially when compared with the existing ‘Type-2’ opt-out process: rather than simply instructing your own GP, who already knows you, you must identify yourself to officials at a remote call centre – and may even have to send in up to four pieces of ID and documentation with a form. (Check pages 2-3 of NHS Digital’s 7-page ‘Non-Digital Proxy Opt-Out Form’ for a list of requirements.)

This feels more like an inquisition than a ‘choice’.

Given how fundamentally broken NHS Digital’s new ‘Beta’ opt-out process is, medConfidential recommends patients who have concerns use the opt-out form we’ve provided since late 2013.

We updated our form in line with recent changes and it still works for you, your dependent children and others for whom you are responsible – it also protects your GP data from uses beyond your direct care, not just your data supplied to NHS Digital.

With all that is and will be changing, medConfidential also strongly recommends you get yourself a Patient Online account, if you don’t already have one.

We provide more information about this on the ‘For patients’ section of our website.

Though it will still be some time until you can see how all of your data has been used, by which organisations and for what purposes, a Patient Online login to your GP’s online system should already allow you to see how your GP data is being used.

Where an opt out doesn’t apply

One critical question is whether patients’ opt-outs will now be honoured in the dissemination of ‘Hospital Episode Statistics’. HES comprises two-thirds of data releases from NHS Digital, most of those to commercial companies – including all of the commercial reusers. Until now, over 1.4 million patients’ wishes in this regard have been ignored.

Apparently officials believe IGARD, a body of NHS Digital’s own creation, can decide to override patients’ dissent when, in fact, the only body with a statutory basis to approve such exceptions is the Confidentiality Advisory Group (CAG) at the HRA.

Both GDPR and the UK’s new Data Protection Act clarify and extend the definition of identifiable data such that – the day before GDPR came into effect – staff at NHS Digital were ordered not to disseminate any “anonymised” patient data. Data releases were resumed the following day, but NHS Digital is still in discussions with the Information Commissioner’s Office as to what patient information can now be considered “anonymous”.

Under GDPR, this is unlikely to include HES in its current form: individual-level, lifelong hospital medical histories, where every event is dated and linked by a pseudonym.

Given that a mother with two children is over 99% likely to be identifiable from her children’s birth dates alone, and given the enhanced GDPR ‘right of access’ to any data received by any customer of NHS Digital to which opt-outs have not been applied, it would seem not only unlawful but highly irresponsible for NHS Digital to keep selling what GDPR now defines as the identifiable patient data of those who have opted out.
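The linkage attack described above can be sketched in a few lines. This is an illustrative toy, not the real HES schema – every field name and record below is invented – but it shows the mechanism: a stable pseudonym plus exact event dates lets anyone who already knows a couple of dates about a person pull out that person’s entire linked history.

```python
# Toy demonstration: a stable pseudonym plus dated events allows
# re-identification by anyone who already knows a couple of dates.
# All identifiers and records here are invented for illustration.

from collections import defaultdict

# "Pseudonymised" event-level data: no names, but every event is
# dated and linked by the same opaque identifier.
events = [
    ("a3f9", "2011-03-07", "maternity admission"),
    ("a3f9", "2014-09-21", "maternity admission"),
    ("a3f9", "2019-05-02", "oncology referral"),
    ("b17c", "2012-06-15", "fracture clinic"),
    ("b17c", "2016-01-30", "day surgery"),
]

# Suppose an attacker knows the birth dates of one mother's two
# children (e.g. from a birthday post online).
known_dates = {"2011-03-07", "2014-09-21"}

# Group event dates by pseudonym, then find pseudonyms whose dates
# contain every known date.
by_pseudonym = defaultdict(set)
for pid, date, _ in events:
    by_pseudonym[pid].add(date)

matches = [pid for pid, dates in by_pseudonym.items()
           if known_dates <= dates]

# A unique match means the pseudonym is broken: the attacker can now
# read that person's entire linked history, including later events.
for pid in matches:
    history = [e for e in events if e[0] == pid]
    print(pid, history)
```

With real data the attacker would match against millions of records rather than five, but the logic is the same: the more events a pseudonym links, the fewer dates are needed to single it out.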

If you – or any patient – would like to see how your data is used, and where your choices are being ignored, medConfidential recommends you visit TheySoldItAnyway.com

More DeepMind secrecy – What the lawyers didn’t look at

‘Independent’ lawyers have recommended that the Royal Free terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims that the ‘vital interests’ (i.e. remaining alive) of patients justify protecting against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not the vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long-running saga.

GDPR DAY BRIEFING: ‘Your data matters’ and NHS patients’ data, with the ‘new’ National Data Opt-Out

May 25th 2018 marks the start of an awareness-raising programme to show the public that their data matters – and what, if anything, changes as a result of patients’ increased rights under GDPR.

With regard to NHS patients’ health data:

  • A new NHS ‘National Data Opt-Out’ commences on GDPR day (May 25th);
  • NHS Digital continues to sell (i.e. disseminates under contract, on payment of a fee) patients’ data, as it has been doing for years;
  • GDPR expands the definition of ‘identifiable data’ (one reason why everyone’s privacy policies are changing);
  • Will NHS Digital ignore patient opt-outs on these newly-identifiable data releases, relying on definitions from the old Data Protection Act?
  • NHS Digital / DH refuse to ask data applicants why 98% of NHS patients’ data isn’t enough for them; while there may be legitimate reasons to override patient opt-outs, pretending new legislation does not apply to data releases (yet again) is not one of them.

Your Data Matters… but NHS Digital sells it anyway

NHS Digital still forgets about patients. Unfortunately, it sees them less as people and more as ‘lines in a database’.

NHS Digital continues to sell a product called ‘Hospital Episode Statistics’ (HES): a dataset that is not actually statistics, but rather consists of individual patients’ lifelong hospital histories, with every medical event dated and linked together by a unique identifier. As of May 24th, two-thirds of NHS Digital’s data disseminations do not respect patients’ right to object (‘opt out’) to their data being used for purposes beyond their direct care.

If you read NHS Digital’s own Data Release Registers, or view them at TheySoldItAnyway.com, [1] you can see for yourself the evidence of where data goes – and where patients’ express wishes are deemed not to matter.

After four years, and further breaches of health data, NHS Digital ignores the choices of the 1.4 million people who opted out and still sells their (and every other hospital patient’s) data for commercial reuse. Those who claim to need 100% of the data for some reason need merely explain to a competent data release body why 98% of people’s data isn’t enough – an explanation they’re currently not even asked to provide.

GDPR clarifies that the hospital data NHS Digital continues to sell is identifiable data – so claimed exemptions (item 5) from people’s opt-outs don’t apply. Especially for those who remember the dates of particular medical events in hospital, such as the birth dates of their own children, or who can read about them online. [2]

‘Could do better’

Last week, the Department for Education called a halt to the sale of the data it collects on schoolchildren [3] for the very reason the NHS continues using to justify its sale of patients’ data.

NHS Digital now has a research environment [4] which allows far higher safety for patients’ data – but the companies that don’t want the NHS to be able to see what they’re doing with the data are engaged in special pleading. It is precisely these hidden uses to which patients are most likely to object.

NHS Digital’s customers, for example, still include for-profit companies such as Harvey Walsh, an “information intermediary” that – exactly as it did in and before 2014, and despite having breached the terms of its contract since then – continues to service commercial clients including pharmaceutical companies, which use the information to promote their products to doctors.

The digital service launching for GDPR Day in fact does less than the form that’s been available on medConfidential’s website since late 2013. [5] Our GP form works immediately – if you use the new digital service, your GP won’t know about it for months.

Discussing a damning ‘report’ in the House of Commons, the chair of the Health Select Committee censured NHS Digital for its “dimmest grasp of the principles underpinning confidentiality”. [6] The Government has agreed to raise the bar for breaching patients’ confidentiality when handing information to the Home Office; will NHS Digital now respect the choices of those patients who wish to keep the information in their medical records confidential too?

The solution to this is straightforward: DH can Direct NHS Digital to respect objections (opt-outs) in all releases of HES, except those that CAG has approved to proceed without patients’ objections being honoured. There may be projects that require 100% of patient data; two-thirds of them do not.

The ICO has not yet updated its (non-statutory) Anonymisation Code of Practice to match GDPR, although its guidance on the GDPR definition of personal data and newer codes on subject access rights show the definitions in GDPR mean NHS Digital’s current practice does not deliver on its promises to patients.

The NHS has ignored legislative changes and harmed research projects before – see note 4 in this post. This effect is one of the main things that prompted the Wellcome Trust to create the Understanding Patient Data initiative.

But it is still (a bit) better than it was…

NHS Digital now sells less of your data than it used to; it only sends out hundreds of copies of the nation’s hospital records – ‘pseudonymised’, but containing information that GDPR recognises makes it identifiable, and therefore still personal data.

You now have the ability to make a choice for you and (after a fashion) your family that will work, in due course [7] – but NHS Digital needs to listen to Dr Wollaston, “take its responsibilities seriously, understand the ethical underpinnings and stand up for patients”, respect that patients’ data matters, and fully honour everyone’s choices.

Questions for interviewees:

  • What does the NHS’ online-only opt-out service not do on day one, that the GP-led process did last week?
  • How many steps does it take for a family to express their collective choice on how their data is used?
  • When this new digital dissent process was signed off under the Government’s Digital Service Standard, did Ministers spare a thought for their families at all?
  • Will patients’ opt-outs be honoured in the dissemination of HES under GDPR?
    • If not, will those patients who already opted out be told why not?
  • A mother with 2 children is over 99% likely to be identifiable from her children’s birth dates alone; given the enhanced GDPR ‘right of access’ to any data received by NHS Digital’s customers to which opt-outs have not been applied, will NHS Digital keep selling what GDPR defines as the identifiable patient data of those who have opted out?
    • What is the burden on medical research of this choice by NHS Digital, made to placate its commercial customers?

If you or any patient would like to see how their data is used, and where their choices are being ignored, please visit TheySoldItAnyway.com

Notes for Editors

1) NHS Digital publishes its data release register as a spreadsheet, but it fails to link to, for example, its own audit reports – so medConfidential has created a more readable version that does.

2) All the information required to identify Ed Sheeran’s entire hospital history in HES appears in this BBC News article, published online on 19 May 2018: http://www.bbc.co.uk/news/uk-england-suffolk-44155784

3) ‘Sharing of school pupils’ data put on hold’, BBC News, 15 May 2018: http://www.bbc.co.uk/news/technology-44109978

4)  A ‘safe setting’, as medConfidential and others recommended in evidence to Parliament back in 2014: https://digital.nhs.uk/services/research-advisory-group/rag-news-and-case-studies/remote-data-access-environment-to-speed-up-access-to-data

5) We have updated our form to reflect the name change. See https://medconfidential.org/how-to-opt-out/

6)  https://www.theyworkforyou.com/debates/?id=2018-05-09a.746.6#g771.0

7) The National Data Opt-out should be respected by bodies across the health and care system “by 2020”.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

– ends –

Where are the CAG regulations?

We talk a lot about NHS Digital, and its data releases that continue to ignore opt-outs. But 4 years ago today, Royal Assent of the Care Act 2014 gave NHS Digital a “general duty” to “respect and promote the privacy of recipients of health services and of adult social care in England” – which clearly hasn’t been honoured in some areas of its work. The Act also changed the law specifically so that the Confidentiality Advisory Group (CAG) of the Health Research Authority has the power to advise NHS Digital; advice to which NHS Digital must listen.

Caldicott 3 itself does not require dissent to be honoured when data is disseminated in line with the ICO’s Code of Practice on Anonymisation. (The National Data Guardian – who has been given no enforcement powers – is very careful not to ‘cross wires’ with the UK’s data regulator, who does have such powers.) And, despite well over a million patients clearly indicating their wishes to the contrary, NHS Digital continues to argue its dissemination of pseudonymised data is “anonymised” to the satisfaction of the 1998 Data Protection Act.

The UK is about to get a new Data Protection Act, aligned with and based on the EU General Data Protection Regulation, which says:

(26) … Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information, should be considered to be information on an identifiable natural person.

The UK’s Information Commissioner will update the Code of Practice on Anonymisation in due course – she’s just a little busy right now, and the new Data Protection Act is not yet on the statute book – but the Irish Commissioner has already said: (emphasis added)

“Although pseudonymisation has many uses, it should be distinguished from anonymisation, as it only provides a limited protection for the identity of data subjects in many cases as it still allows identification using indirect means. Where a pseudonym is used, it is often possible to identify the data subject by analysing the underlying or related data.”

Current practice will have to change.
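The Commissioner’s point can be made concrete in a few lines. A minimal sketch, with invented NHS numbers: when records are keyed by a hash of a known identifier (a common pseudonymisation shortcut, not a claim about NHS Digital’s actual method), anyone who holds the identifier can simply recompute the pseudonym and pick the subject’s records straight back out.

```python
# Why pseudonymisation is not anonymisation: a hash of a known
# identifier can simply be recomputed. All identifiers are invented.

import hashlib

def pseudonym(nhs_number: str) -> str:
    # A common (and weak) pseudonymisation shortcut: hash the identifier.
    return hashlib.sha256(nhs_number.encode()).hexdigest()[:12]

# A "pseudonymised" release: no NHS numbers appear anywhere in it.
released = {
    pseudonym("9434765919"): ["2017-02-11 cardiology"],
    pseudonym("9434765870"): ["2018-07-03 psychiatry"],
}

# Anyone who knows a target's NHS number recovers their records directly,
# by regenerating the pseudonym from the identifier they already hold.
target = "9434765870"
records = released.get(pseudonym(target))
print(records)
```

Even without the identifier itself, the small space of valid NHS numbers makes an unsalted or weakly keyed hash enumerable – which is exactly the “identification using indirect means” the quote describes.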

While IGARD may be the appropriate body to advise whether a request meets NHS Digital’s standards for dissemination, it is not an appropriate body to advise on releasing data which does not honour patients’ objections. The adjudication of the principles of those decisions, by statute, belongs to CAG.

There are legitimate instances where patients’ dissent may be overridden – but IGARD is not, and never should have been, the body to decide that.

The opt-out exists to protect patients who decide that the safeguard the NHS currently relies upon – pieces of paper, including, for example, commercial re-use contracts with companies that service other commercial clients, among them pharmaceutical companies that use the data to promote (i.e. market) their products to doctors – is not sufficient for their situation. As is their right.

Another example: in 2014, the Health Select Committee asked for a safe setting for researchers. Only in April 2018 did a remote safe setting begin to be piloted for researchers – that work not only needs to be completed, but it should become the standard means of access.

NHS Digital continues to insist that a piece of paper is sufficient safeguard under which to release copies of the entire nation’s lifelong, linked medical histories to hundreds of organisations. Its own published records show that two-thirds of NHS Digital’s data releases do not respect patient dissent.

It should be CAG which makes such decisions, whenever and wherever it is necessary. The CAG Regulations will make that clear, when they exist. Assurances to patients are less than meaningful when the Regulations to which they relate do not yet exist.

If someone applying for patients’ data cannot do what they need with only 98% of people’s data, they should simply explain to a responsible body why this is the case. Public Health England’s cancer registry already takes this approach with the choice of protections it offers for event dates. NHS Digital simply releases data on every patient, with the medical event dates completely unprotected.

The National Data Guardian was asked to determine a single choice by which patients could express their dissent from their data being used for purposes beyond their direct care. When that choice is disregarded, it must be on a basis clearly and specifically defined in statute, and approved by CAG.

As is happening around the world, the introduction of GDPR will force a change, and that change should protect patients’ data that, under the new Data Protection Act, will be considered identifiable. Those who still need everyone’s data will have to explain why to a competent body – which really isn’t too much to ask.

Given the clear promises given as a consequence of the care.data and HES data scandals – promises much repeated, but yet to be delivered – we’ve been waiting a long time for this to be fixed.

Instant Messaging in Clinical Settings

November 2018 Update: The below is consistent with the new NHS Digital guidance, but paragraphs in blue below are omissions from that official guidance.


As a clinician or nurse, you should not have to keep up with the latest fluff of the apps you might use for work. 

NHS England has put out another attempt at ‘guidance’ on using instant messaging apps. Last year it said WhatsApp was not banned, but failed to provide helpful guidance on what to actually use. It still hasn’t. There is a Do & Don’t list, which is better than nothing, but almost impossible to turn into practice in the real world.  If asked, we would suggest something like this:

Summary

  1. If your employer offers an instant messaging solution, use that.
  2. If you are picking apps to use yourself, you are safest with Signal.
  3. If you are not picking the apps you use, you will probably have to use WhatsApp or Skype. But be aware that someone will be held responsible when Facebook or Skype change their rules – and it’s probably not going to be the person who picked the app…
  4. Don’t use Facebook Messenger, Instagram, or Telegram.

Whatever app you use for work, the vast majority of people should avoid having their phone going ding for work purposes while they are not at work. For most apps, a swipe left on the main list of ‘chats’ should show an option to “hide alerts” for some time period – this should ensure that if you do give your personal number to work colleagues, it doesn’t end up driving you to distraction outside work. If someone really wants to get in touch, they can always just call you normally.

 

The reasoning behind our suggestions

The important step in secure messaging is something called “end-to-end” encryption, which prevents anyone – a third party ‘listening in’, or even the service making the connection – knowing what you said. It’s the equivalent of having a conversation in a private consultation room, rather than standing next to the nurses’ station, or in a waiting room. But even with Signal, if you are messaging using your personal device, you should treat any conversation as if it were in a lift where another person might be listening.

Signal allows you to decide for how long you will keep messages from any particular person or group, and will automatically delete the stored messages after that. But what happens with the stored message history in other apps? WhatsApp, for example, wants you to give it a full copy of all your messages and send them to its servers as a ‘backup’ (though at some point it will show you ads against them – it is part of Facebook after all).

You may also have set your phone itself to back up to somewhere. Do you know where the backup goes, and what’s in it?

Of course, it is best practice to back up everything on your phone, and most apps assume (probably correctly) that you don’t want to lose every message or photo you receive of your kids. This doesn’t necessarily translate neatly to a clinical setting – anything that must be kept should be recorded elsewhere, so that if you lose your phone, the only thing you won’t have kept was ward chit-chat. WhatsApp wants everything – it doesn’t offer clinical reassurance. And while Snapchat has deletion as a feature, it has other problems akin to Facebook and Skype.

The longer-term security of your messaging is dependent upon who makes the app – and when, and why, they will change the rules on you. We (also) recommend Signal because it is produced by a charitable foundation whose sole mission is to provide secure, usable, communications. One key reason why the NHS England guidance is so terrible is that WhatsApp has lobbyists telling NHS England that it should allow their product; Signal doesn’t.

Since Facebook (the owner of WhatsApp) lies to regulators about its intentions, you clearly cannot rely on the company not to do tomorrow what it denies it will do today. As a consequence of this, any official guidance must in future be kept up to date by NHS Digital. And, as corporate policies change, so must the guidance – removing from the equation NHS England’s fear of the deluge of lobbying that created this mess in the first place.

Clinicians deserve better tools than those that NHS England chooses to recommend, where a national body prioritises its own interests over the needs of those delivering direct care.

(This post will be kept under review as technologies change; it was last updated in November 2018)

Data and AI in the Rest of Government: the Rule of Law

medConfidential spoke about the Framework for Data Processing by Government at the All Party Parliamentary Group on the Rule of Law. The topic of the APPG provides a useful perspective for much work on data in the public sector, and the wider use of AI by anyone. The meeting was on the same day as the launch of the AI Select Committee Report, which addresses similar key issues of ‘data ethics’.

The ‘Rule of Law’ is defined by 8 principles identified by Lord Bingham. The principles are not themselves law, but rather describe the process that must be followed for the Rule of Law to be respected.

Public bodies must already follow that process, and also be able to show how that process has been followed. As a result, those developing AIs (and data processing tools) for use by public bodies must also show how these processes have been followed. This is necessary to satisfy the lawful obligations of the bodies to which they are trying to sell services.

The principles identified by Lord Bingham are a model for testing whether an explanation of an AI and its output, or a data model, is sufficient for use by a public body.

While debates on ethics and society, and on politics and policy, focus on whether a technology should be used, the Rule of Law is about the evidence for and integrity of that debate. As Departments implement the Framework for data processing to deliver on their obligations under the Rule of Law, it must be compliant with the Principles identified by Lord Bingham – not just the ethics and policies of the Minister in charge that day.

Public bodies are already bound by these rules – unless Parliament legislates to escape them. The principles are widely understood, they are testable, and they are implementable in a meaningful way by all necessary parties, with significant expertise available to aid understanding.

 

Companies and other non-public bodies

Companies (i.e. non-public bodies) are not subject to the same legal framework as public bodies. A Public Body must be able to cite in law the powers it uses; a Private Body may do (almost) anything that is not prohibited by law. This is why Facebook’s terms and conditions are so vague and let it get away with almost anything – such a data model does not apply to the tax office.

Some of those looking to make money – to “move fast and break things” – would like the standard to be ethics, and ethics alone. There are currently many groups and centres having money poured into them, with names involving ‘data and society’, ‘ethics and society’, and DCMS’s own ‘Centre for Data Ethics’. The latter is led by a Minister in a Government that will always have political priorities, and – given recent revelations about Facebook – the consequences of incentives to lower standards should be very clear.

Ethics may contribute to whether something should be done – but they are not binding on how it is done, and they offer no actual accountability. After all, no tyrant ever failed to justify their actions; it is the rule of law that ultimately holds them accountable, and leads to justice for those harmed. Ethics alone do not suffice, as Facebook and others have recently shown.

There is a great deal more work to do in this area. But unlike other AI ‘ethics’ standards which seek to create something so weak no-one opposes it, the existing standards and conventions of the Rule of Law are well known and well understood, and provide real and meaningful scrutiny of decisions – assuming an entity believes in the Rule of Law.

The question to companies and public bodies alike is therefore simple: Do you believe in the Rule of Law?

[notes from APPG talk]
[medConfidential (updated) portion of the APPG briefing]