Category Archives: News

150,000 patients’ opt-outs not honoured; their confidential information sold for 3 years

A serious error affecting 150,000 NHS patients has been reported in the media in recent days, after it was uncovered last week. We understand the error affects patients who set an opt-out between March 2015 and June 2018 and whose GP practices use TPP’s SystmOne software – their opt-out codes were not sent to NHS Digital until last week.

As a consequence of this error, from April 2016 until 26 June this year, those patients’ confidential, identifiable data was sold to a range of organisations, including private companies. This will obviously be of concern to a great many people.

Both TPP and NHS Digital are taking remedial action; the coding error has been corrected to ensure opt-outs will be uploaded properly from now on, affected GP practices were written to on Monday 2 July, and the individual patients affected should be written to by the end of the month.

Until then, based on current information, this is what you can do:

If you have recently received a letter from NHS Digital about the conversion of your Type-2 opt-out to the National Data Opt-out then you weren’t affected by this incident. (These letters were sent out during June.)

If however you haven’t received a letter, and you are over 16, and you remember opting out any time from March 2015 onwards, then either:

  a) you are affected by the TPP incident, or
  b) separately, your opt-out was never applied to your GP record.

Anyone over the age of 13 should be able to check their current opt-out status by using NHS Digital’s new online National Data Opt-out process:

If the light blue status box does not appear when you check and you do not wish your confidential, identifiable medical information to be used for any purposes beyond your own direct care, then you need to set the option on this screen to “No”.

This new online process only works, however, for individuals over 13 years old – and not for families with children or adult dependants. medConfidential’s (now improved!) GP opt-out form continues to work, as it has done since late 2013. It also lets you prevent confidential, identifiable information leaving your GP record, which the National Data Opt-out does not cover.

But – given this incident, and every previous breach of public trust – why can’t every patient see their data, so they can know what has happened?

Everyone agrees just how bad the situation created by TPP’s error really is, with real consequences for patients whose data has been used against their wishes:

Professor Helen Stokes-Lampard, Chair of the Royal College of GPs, said:

Patient data held by the National Health Service should only ever be used morally, safely and responsibly, and we must all work together to ensure mistakes of this nature are never repeated. We need to be able to reassure patients that their wishes concerning their data are being respected.

Understanding Patient Data said in response (their emphasis):

This incident highlights the critical need for transparency – to ensure that it is clear where data is going and how choices are honoured. It also demonstrates that a trustworthy system must not just say the right things but also do the right things in practice as well: if opt-outs are claimed to be honoured, they absolutely must be. If these standards are not upheld, there has to be clear accountability in the system, with sanctions if necessary to demonstrate that these issues are taken seriously, or public confidence will again suffer.

Dr Natalie Banner, who now leads the ‘Understanding Patient Data’ project, tweeted:

Astonishing and appalling failure to uphold patient objections: but what sanctions to ensure providers uphold the standards we expect of them? New opt-out, which is patient-registered rather than GP-registered, *should* be less liable to such errors though.

Mr Harry Evans, from the King’s Fund policy team, said:

We are all agreed on the importance of the public not being surprised by how the NHS uses data, so this is just remarkable.

These are fine words, but when will they speak out about the people NHS Digital disregarded in its new ‘digital’ process – a process that Ministers signed off – which separates processing for parents and children? (Not every American policy approach should be replicated in the NHS…)

In a recent explanation for OurNHS, we showed the ‘proxy’ form itself says:

…if your family has children under the age of 13, or if you look after a dependent older relative, then things are even more complicated. Rather than giving a simple instruction to your doctor, those who would prefer their children’s data wasn’t sold to third parties for unknown purposes will be required to send to NHS Digital, by post, four pieces of ID documentation along with a seven-page form. So much for Jeremy Hunt’s much-vaunted commitment to a ‘paperless’ NHS.

Given the significant effect this will have on people far wider than the 150,000 currently affected, you might want to ask (a) Understanding Patient Data, or (b) your MP, what they are doing to ensure the broken process for families making a decision together is fixed.

As the dust settles from GDPR Day…

…we’ve updated our scorecard.

One of the existing patient opt-outs has been renamed as the new National Data Opt-out, but a whole swathe of issues that have festered, unaddressed, for years still remain.

We consider these issues below, framed by questions of – and glaring omissions to – the ‘Your NHS Data Matters’ communications campaign, launched on GDPR Day.

Overview

“Your health and adult social care information supports your individual care. It also helps us to research, plan and improve health and care services in England.”

The word “us” appears to be doing a lot of work here; would patients expect “us” to include, for example, commercial information intermediaries such as Harvey Walsh Ltd?

“You can choose whether or not your confidential patient information is used for research and planning.”

All well and good, if true – but what about all other ongoing (and future) uses of patient information besides “research and planning”? Why does the new National Data Opt-out not use the far clearer, more accurate, and comprehensive Caldicott 3 definition of “purposes beyond direct care”?

If the new National Data Opt-out does cover the use of patients’ information for all purposes beyond their direct or ‘individual’ care, then why not say so? If it does not, then how does the ‘new’ opt-out meet the requirements of Caldicott 3, the recommendations of which the Government accepted in full?

medConfidential recommends: Be accurate! All public communications must in the first instance use the proper Caldicott 3 formulation, “purposes beyond direct care”.

These purposes may be further explained in terms of “research” and “planning”, but public explanations must be clear that uses are not limited to only these two purposes. To do otherwise would both mislead patients and undermine the effectiveness of the opt-out, and could lead to further collapses in public trust when people become aware of uses that do not clearly fall into either category.

“Information that only identifies you like your name and address is not confidential patient information and may still be used.”

This is utterly muddle-headed, and goes against what many (if not most) people reasonably understand the word “confidential” to mean. While the example given is relatively benign, it is precisely this loophole that, not coincidentally, led to the scandal of the Home Office MoU.

“Your confidential patient information is used in two different ways:”

This is not even close to true! We consider other uses, such as commissioning and commercial re-use, in more detail below, but this statement is demonstrably untrue: take, for example, the HRA Confidentiality Advisory Group’s Non-Research Register, which contains ongoing uses such as invoice reconciliation, risk stratification, commissioning, and projects that explicitly mix direct care and secondary uses.

medConfidential recommends: Don’t mislead patients! Be more clear and explicit about the range of uses to which patients’ information is put.

While public communications must be as clear and as understandable as possible, they must also be accurate – and true. The split between “your individual care” and “research and planning” (a description that we note above is itself too narrow and misleading) is far too simplistic, especially when patients are being asked to make an informed consent choice.

“Most of the time, we use anonymised data for research and planning. So your confidential patient information isn’t always needed.”

No definition of “anonymised” is provided. Using this word without explaining what it means is misleading; the natural (i.e. many patients’) assumption is that “anonymised data” is anonymous, which is not the case. GDPR, and guidance from the ICO, now make it clear that what NHS Digital has been disseminating “most of the time” is identifiable data.

That only some identifiers are being removed, or pseudonyms substituted, must be explained – and linking to a third-party, non-NHS information source to do this will hardly reassure many patients. Hiding behind narrow legalistic reasoning and jargon has never solved major long-standing issues, and (especially post-GDPR) never will.
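The distinction matters enough to spell out. A few lines of code show what pseudonymisation actually does (a purely illustrative sketch – the record fields, salt and hashing scheme are our own invention, not NHS Digital’s actual process):

```python
import hashlib

def pseudonymise(record, secret="illustrative-salt"):
    """Replace the direct identifier with a stable pseudonym.

    Everything else -- notably the dated events -- is left intact,
    which is why the output is pseudonymised, not anonymous.
    """
    out = dict(record)
    token = hashlib.sha256((secret + record["nhs_number"]).encode()).hexdigest()[:16]
    out["nhs_number"] = token
    return out

record = {
    "nhs_number": "943 476 5919",                 # direct identifier (removed)
    "admissions": ["2015-03-02", "2017-11-19"],   # dated events (kept)
}

pseud = pseudonymise(record)
print(pseud["nhs_number"])   # a stable token, not the NHS number
print(pseud["admissions"])   # the identifying detail is still all there
```

The pseudonym stops a casual reader seeing who a record is about, but every dated event survives intact – and it is those dates that make the record re-identifiable.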

medConfidential recommends: Follow the law! Stop implying that ‘anonymised’ is the same as anonymous, and respect people’s opt-outs for all identifiable data – don’t keep trying to find loopholes and excuses.

Benefits of data sharing

We are not aware that this is, or ever has been, in dispute. Clearly there are benefits to lawful, ethical, consensual, safe, and transparent data sharing.

Problems come when, as with the care.data programme and previous attempts at public communication, the NHS tries exclusively to ‘sell the benefits’ without even mentioning any of the risks. Addressing the risks directly helps people make sense of the mitigations used – which then no longer read as arbitrary claims or assertions – and enables a more informed choice on that basis.

Who uses your data

As we note above, the narrow definition “research and planning” does not even come close to defining the full range of uses, for purposes beyond their direct care, to which patients’ information is put.

These omissions aside, while ‘Your NHS Data Matters’ lists some types of organisations that may do “research”, and acknowledges the use of patients’ information “to support the delivery of health and social care” (conflating both direct care and “planning”), it makes no mention of the types of organisations that may be involved in “planning”, or of all the activities that term is supposed to encompass.

Given that what matters most to patients with concerns is precisely which people and organisations may have access to their information, this is another serious omission. Without that information, how are patients supposed to make an informed choice?

medConfidential recommends: Be honest and upfront about who will have access to patients’ information; patients should not be assumed (or required) to understand the inner workings of the health and care system in order to make choices.

It may be argued that NHS Digital’s Data Release Register performs this function. However, linking to a massive Excel spreadsheet, containing literally thousands of individual entries, puts too much of a burden on any normal individual and – given the disparity between this and the level of detail provided elsewhere – begins to look like obfuscation.

We understand NHS Digital is working on a more clearly-formatted HTML publication of its Data Release Register but, in its absence, medConfidential has provided a more readable version that – unlike the current Register – contains links to NHS Digital’s audit reports, for those organisations that have been audited.

medConfidential recommends: Continue improving the transparency of data use.

For example, besides audits (and actions taken), future Registers should link to the relevant DARS and/or IGARD documentation; showing there is a process, and that the process is being applied competently and consistently, is an important way to build and maintain trust.

It is unfortunate that the “NIC numbers” given in current Registers are entirely self-referential for anyone performing, e.g., a Google search; concealing or obscuring relevant parts of the process raises doubts, and lets them persist.

“Research bodies and organisations include:
– university researchers
– hospital researchers
– medical royal colleges
– pharmaceutical companies researching new treatments”

Why are “pharmaceutical companies” the only ones on this list whose use of patients’ information is qualified? Is the claim that pharmaceutical companies only receive patients’ data for the specific purpose of researching new treatments? This is patently untrue, and leads onto the further spurious claim that patients’ information will not be sold for “marketing or insurance” purposes.

While this claim may be narrowly true, at least for “commercial insurance purposes”, it omits to mention that at least some information intermediaries (i.e. commercial re-users, now sometimes referred to as “health and care analysis companies”) which regularly receive data from NHS Digital still service pharmaceutical marketers.

NHS Digital cannot state definitively who does or does not reuse patients’ medical records, as it specifically chooses not to know.

medConfidential recommends: Stop misleading patients as to the ultimate uses of their data, and stop sending out copies of everyone’s hospital histories to companies which (also) profit from servicing non-NHS commercial interests.

How data is protected

‘Your NHS Data Matters’ makes quite a few assertions about what will and will not be done with your data – though, and especially given the tendency to use jargon and narrow legalistic terms, it would be good to provide evidence and to clearly define key phrases. For example, we presume “confidential patient information” and “data” are two different things.

In addition, as noted above, linking to a third party, non-NHS information source to achieve some of this – however good the explanation – will hardly be reassuring to patients with concerns.

Another glaring omission, given the significant number of organisations that do not provide health and care services but that do use patients’ information for purposes beyond their direct care, is the list of steps those organisations are supposed (required?) to take to protect patients’ data.

The list of steps for such organisations clearly cannot be the same as those for NHS bodies, in that some of these organisations do not “make it clear why and how data is being used”, and others hide behind the information intermediaries’ commercial reuse contracts to, e.g. measure the efficacy of pharma sales, and to market their products to NHS doctors and commissioners.

medConfidential recommends: Make it a requirement to report to NHS Digital (and thence to patients) how data is used by everyone; stop relying on commercial reuse contracts and the “promotion of health” loophole in the Care Act 2014 to perpetuate what are self-evidently marketing uses.

Manage your choice

The new ‘digital’ National Data Opt-out process cannot cope with dependent children, and assumes that all 13 year olds have mobile phones or e-mail accounts which can be accessed safely without duress. It appears as if, when the process was signed off under the Government’s Digital Service Standard, Ministers did not spare a thought for their families at all…

The entire process is overly bureaucratic and intimidating, especially when compared with the existing ‘Type-2’ opt-out process: rather than simply instructing your own GP, who already knows you, you must identify yourself to officials at a remote call centre – and may even have to send in up to four pieces of ID and documentation with a form. (Check pages 2-3 of NHS Digital’s 7-page ‘Non-Digital Proxy Opt-Out Form’ for a list of requirements.)

This feels more like an inquisition than a ‘choice’.

Given how fundamentally broken NHS Digital’s new ‘Beta’ opt-out process is, medConfidential recommends patients who have concerns use the opt-out form we’ve provided since late 2013.

We updated our form in line with recent changes and it still works for you, your dependent children and others for whom you are responsible – it also protects your GP data from uses beyond your direct care, not just your data supplied to NHS Digital.

With all that is and will be changing, medConfidential also strongly recommends you get yourself a Patient Online account, if you don’t already have one.

We provide more information about this on the ‘For patients’ section of our website.

Though it will still be some time until you can see how all of your data has been used, by which organisations and for what purposes, a Patient Online login to your GP’s online system should already allow you to see how your GP data is being used.

Where an opt out doesn’t apply

One critical question is whether patients’ opt-outs will now be honoured in the dissemination of ‘Hospital Episode Statistics’. HES comprises two-thirds of data releases from NHS Digital, most of those to commercial companies – including all of the commercial reusers. Until now, over 1.4 million patients’ wishes in this regard have been ignored.

Apparently officials believe IGARD, a body of NHS Digital’s own creation, can decide to override patients’ dissent when, in fact, the only body with a statutory basis to approve such exceptions is the Confidentiality Advisory Group (CAG) at the HRA.

Both GDPR and the UK’s new Data Protection Act clarify and extend the definition of identifiable data such that – the day before GDPR came into effect – staff at NHS Digital were ordered not to disseminate any “anonymised” patient data. Data releases were resumed the following day, but NHS Digital is still in discussions with the Information Commissioner’s Office as to what patient information can now be considered “anonymous”.

Under GDPR, this is unlikely to include HES in its current form: individual-level, lifelong hospital medical histories, where every event is dated and linked by a pseudonym.

Given a mother with two children is over 99% likely to be identifiable from her children’s birth dates alone, and given the enhanced GDPR ‘right of access’ to any data received by any customer of NHS Digital to which opt-outs have not been applied, it would seem not only unlawful but highly irresponsible for NHS Digital to keep selling what GDPR now defines as the identifiable patient data of those who have opted out.
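The ‘99%’ figure is easy to sanity-check with a back-of-envelope model (the population and date-window figures below are our own illustrative round numbers, not official statistics; real birth dates are far from uniformly distributed, which only makes identification easier):

```python
import math

def p_no_other_match(population, cell_probability):
    """Poisson approximation to the chance that no OTHER individual in the
    dataset shares the same combination of attributes -- i.e. the chance
    that the combination identifies you uniquely."""
    return math.exp(-(population - 1) * cell_probability)

# Assumed, illustrative figures (not official statistics):
families = 2_000_000    # two-child families in a national dataset
window_days = 3_000     # plausible span of days each birth date falls within

one_date = 1 / window_days       # one exact date matches
two_dates = one_date ** 2        # BOTH exact dates must match

print(p_no_other_match(families, one_date))   # ~0: one date almost never unique
print(p_no_other_match(families, two_dates))  # most of the time, unique already
```

Even this crude uniform model shows how sharply a second exact date narrows things down; add the real clustering of birth dates, family spacing and any further attribute (a hospital, a year of birth) and uniqueness climbs towards certainty.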

If you – or any patient – would like to see how your data is used, and where your choices are being ignored, medConfidential recommends you visit TheySoldItAnyway.com

More DeepMind secrecy – What the lawyers didn’t look at

The Royal Free has been advised by its ‘independent’ lawyers to terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims the ‘vital interests’ (i.e. remaining alive) of patients justify protecting against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long-running saga.

GDPR DAY BRIEFING: ‘Your data matters’ and NHS patients’ data, with the ‘new’ National Data Opt-Out

May 25th 2018 marks the start of an awareness-raising programme to tell the public that their data matters – and to explain what, if anything, changes as a result of patients’ increased rights under GDPR.

With regard to NHS patients’ health data:

  • A new NHS ‘National Data Opt-Out’ commences on GDPR day (May 25th);
  • NHS Digital continues to sell (i.e. disseminates under contract, on payment of a fee) patients’ data, as it has been doing for years;
  • GDPR expands the definition of ‘identifiable data’ (one reason why everyone’s privacy policies are changing);
  • Will NHS Digital ignore patient opt-outs on these newly-identifiable data releases, relying on definitions from the old Data Protection Act?
  • NHS Digital / DH refuse to ask data applicants why 98% of NHS patients’ data isn’t enough for them; while there may be legitimate reasons to override patient opt-outs, pretending new legislation does not apply to data releases (yet again) is not one of them.

Your Data Matters… but NHS Digital sells it anyway

NHS Digital still forgets about patients. Unfortunately, it sees them less as people and more as ‘lines in a database’.

NHS Digital continues to sell a product called ‘Hospital Episode Statistics’ (HES); a dataset that is not actually statistics but that rather consists of individual patients’ lifelong hospital histories, with every medical event dated and linked together by a unique identifier. As of May 24th, two-thirds of NHS Digital’s data disseminations do not respect patients’ right to object (‘opt out’) to their data being used for purposes beyond their direct care.

If you read NHS Digital’s own Data Release Registers, or view them at TheySoldItAnyway.com, [1] you can see for yourself the evidence of where data goes – and where patients’ express wishes are deemed not to matter.

After four years, and further breaches of health data, NHS Digital ignores the choices of the 1.4 million people who opted out and still sells their (and every other hospital patient’s) data for commercial reuse. Those who claim to need 100% of the data for some reason need merely explain to a competent data release body why 98% of people’s data isn’t enough – an explanation they’re currently not even asked to provide.

GDPR clarifies that the hospital data NHS Digital continues to sell is identifiable data – so claimed exemptions (item 5) to people’s opt outs don’t apply. Especially for those who remember the dates of particular medical events in hospital, such as the birth dates of their own children, or who can read about them online. [2]
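How little such re-identification requires can be sketched in a few lines (the extract and its field names are hypothetical, for illustration only – not the actual HES schema):

```python
# Hypothetical pseudonymised extract: no names, but every event dated
# and linked together by a pseudonym.
extract = [
    {"pseudonym": "a91f", "event_dates": ["2010-02-14", "2015-03-02", "2017-11-19"]},
    {"pseudonym": "c07b", "event_dates": ["2012-06-30", "2015-03-02"]},
    {"pseudonym": "f3d2", "event_dates": ["2011-09-01", "2016-01-05"]},
]

def match_known_dates(extract, known_dates):
    """Return every pseudonymised history containing ALL the known dates."""
    return [r for r in extract
            if set(known_dates) <= set(r["event_dates"])]

# Someone who remembers (or read about) just two dates:
hits = match_known_dates(extract, ["2015-03-02", "2017-11-19"])
print(len(hits))               # 1 -- a single candidate remains...
print(hits[0]["event_dates"])  # ...and their entire linked history with it
```

Once the pseudonym is pinned to a person, everything linked to that pseudonym – the lifelong history – comes with it.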

‘Could do better’

Last week, the Department for Education called a halt to the sale of the data it collects on schoolchildren [3] for the very reason the NHS continues using to justify its sale of patients’ data.

NHS Digital now has a research environment [4] which allows far higher safety for patients’ data – but the companies that don’t want the NHS to be able to see what they’re doing with the data are engaging in special pleading. It is precisely these hidden uses to which patients are most likely to object.

NHS Digital’s customers, for example, still include for-profit companies such as Harvey Walsh, an “information intermediary” that – exactly as it did in and before 2014, and despite having breached the terms of its contract since then – continues to service commercial clients including pharmaceutical companies, which use the information to promote their products to doctors.

The digital service launching for GDPR Day in fact does less than the form that’s been available on medConfidential’s website since late 2013. [5] Our GP form works immediately – if you use the new digital service, your GP won’t know about it for months.

Discussing a damning ‘report’ in the House of Commons, the chair of the Health Select Committee censured NHS Digital for its “dimmest grasp of the principles underpinning confidentiality”. [6] The Government has agreed to raise the bar for breaching patients’ confidentiality when handing information to the Home Office; will NHS Digital now respect the choices of those patients who wish to keep the information in their medical records confidential too?

The solution to this is straightforward: DH can Direct NHS Digital to honour objections (opt-outs) in every release of HES, except those for which CAG has approved release without patients’ objections being honoured. There may be projects that require 100% of patient data; two-thirds of them do not.

The ICO has not yet updated its (non-statutory) Anonymisation Code of Practice to match GDPR, although its guidance on the GDPR definition of personal data and newer codes on subject access rights show the definitions in GDPR mean NHS Digital’s current practice does not deliver on its promises to patients.

The NHS has ignored legislative changes and harmed research projects before – see note 4 in this post. This effect is one of the main things that prompted the Wellcome Trust to create the Understanding Patient Data initiative.

But it is still (a bit) better than it was…

NHS Digital now sells less of your data than it used to; it only sends out hundreds of copies of the nation’s hospital records – ‘pseudonymised’, but containing information that GDPR recognises makes it identifiable, and therefore still personal data.

You now have the ability to make a choice for you and (after a fashion) your family that will work, in due course [7] – but NHS Digital needs to listen to Dr Wollaston, “take its responsibilities seriously, understand the ethical underpinnings and stand up for patients”, respect that patients’ data matters, and fully honour everyone’s choices.

Questions for interviewees:

  • What does the NHS’ online-only opt-out service not do on day one, that the GP-led process did last week?
  • How many steps does it take for a family to express their collective choice on how their data is used?
  • When this new digital dissent process was signed off under the Government’s Digital Service Standard, did Ministers spare a thought for their families at all?
  • Will patients’ opt-outs be honoured in the dissemination of HES under GDPR?
    • If not, will those patients who already opted out be told why not?
  • A mother with 2 children is over 99% likely to be identifiable from their children’s birth dates alone; given the enhanced GDPR ‘right of access’ to any recipient data to which opt-outs have not been applied, will NHS Digital keep selling what GDPR defines as the identifiable patient data of those who have opted out?
    • What is the burden on medical research of this choice by NHS Digital, made to placate its commercial customers?

If you or any patient would like to see how their data is used, and where their choices are being ignored, please visit TheySoldItAnyway.com

Notes for Editors

1) NHS Digital publishes its data release register as a spreadsheet but it fails to link to, e.g. its own audit reports – so medConfidential has created a more readable version that does.

2) All the information required to identify Ed Sheeran’s entire hospital history in HES appears in this BBC News article, published online on 19 May 2018: http://www.bbc.co.uk/news/uk-england-suffolk-44155784

3) ‘Sharing of school pupils’ data put on hold’, BBC News, 15 May 2018: http://www.bbc.co.uk/news/technology-44109978

4)  A ‘safe setting’, as medConfidential and others recommended in evidence to Parliament back in 2014: https://digital.nhs.uk/services/research-advisory-group/rag-news-and-case-studies/remote-data-access-environment-to-speed-up-access-to-data

5) We have updated our form to reflect the name change. See https://medconfidential.org/how-to-opt-out/

6)  https://www.theyworkforyou.com/debates/?id=2018-05-09a.746.6#g771.0

7) The National Data Opt-out should be respected by bodies across the health and care system “by 2020”.

medConfidential campaigns for confidentiality and consent in health and social care, seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Founded in January 2013, medConfidential is an independent, non-partisan organisation working with patients and medics, service users and care professionals.

– ends –

Where are the CAG regulations?

We talk a lot about NHS Digital, and its data releases that continue to ignore opt-outs. But 4 years ago today, Royal Assent of the Care Act 2014 gave NHS Digital a “general duty” to “respect and promote the privacy of recipients of health services and of adult social care in England” – which clearly hasn’t been honoured in some areas of its work. The Act also changed the law specifically so that the Confidentiality Advisory Group (CAG) of the Health Research Authority has the power to advise NHS Digital; advice to which NHS Digital must listen.

Caldicott 3 itself does not require dissent to be honoured when data is disseminated in line with the ICO’s Code of Practice on Anonymisation. (The National Data Guardian – who has been given no enforcement powers – is very careful not to ‘cross wires’ with the UK’s data regulator, who does have such powers.) And, despite well over a million patients clearly indicating their wishes to the contrary, NHS Digital continues to argue its dissemination of pseudonymised data is “anonymised” to the satisfaction of the 1998 Data Protection Act.

The UK is about to get a new Data Protection Act, aligned with and based on the EU General Data Protection Regulation, which says:

(26) … Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person.

The UK’s Information Commissioner will update the Code of Practice on Anonymisation in due course – she’s just a little busy right now, and the new Data Protection Act is not yet on the statute book – but the Irish Commissioner has already said: (emphasis added)

“Although pseudonymisation has many uses, it should be distinguished from anonymisation, as it only provides a limited protection for the identity of data subjects in many cases as it still allows identification using indirect means. Where a pseudonym is used, it is often possible to identify the data subject by analysing the underlying or related data.”

Current practice will have to change.

While IGARD may be the appropriate body to advise whether a request meets NHS Digital’s standards for dissemination, it is not an appropriate body to advise on releasing data which does not honour patients’ objections. The adjudication of the principles of those decisions, by statute, belongs to CAG.

There are legitimate instances where patients’ dissent may be overridden – but IGARD is not, and never should have been, the body to decide that.

The opt-out is there to protect patients who decide that the safeguard the NHS currently relies upon – pieces of paper, including for example commercial re-use contracts with companies that service other companies, among them pharmaceutical firms using the data to promote their products to doctors (i.e. marketing) – is not sufficient for their situation. As is their right.

Another example: in 2014, the Health Select Committee asked for a safe setting for researchers. Only in April 2018 did a remote safe setting begin to be piloted for researchers – that work not only needs to be completed, but it should become the standard means of access.

NHS Digital continues to insist that a piece of paper is sufficient safeguard under which to release copies of the entire nation’s lifelong, linked medical histories to hundreds of organisations. Its own published records show that two-thirds of NHS Digital’s data releases do not respect patient dissent.

It should be CAG which makes such decisions, whenever and wherever it is necessary. The CAG Regulations will make that clear, when they exist. Assurances to patients are less than meaningful when the Regulations to which they relate do not yet exist.

If someone applying for patients’ data cannot do what they need with only 98% of people’s data, they should simply explain to a responsible body why this is the case. Public Health England’s cancer registry already takes this approach with the choice of protections it offers for event dates. NHS Digital simply releases data on every patient, with medical event dates completely unprotected.

The National Data Guardian was asked to determine a single choice by which patients could express their dissent from their data being used for purposes beyond their direct care. When that choice is disregarded, it must be on a basis clearly and specifically defined in statute, and approved by CAG.

As it is doing around the world, the GDPR will force a change here too – and that change should protect patients’ data which, under the new Data Protection Act, will be considered identifiable. Those who still need everyone’s data will have to explain why to a competent body – which really isn’t too much to ask.

Given the clear promises given as a consequence of the care.data and HES data scandals – promises much repeated, but yet to be delivered – we’ve been waiting a long time for this to be fixed.

Instant Messaging in Clinical Settings

As a clinician or nurse, you should not have to keep up with the latest features of the apps you use for work.

NHS England has put out another attempt at ‘guidance’ on using instant messaging apps. Last year it said WhatsApp was not banned, but failed to offer helpful guidance on what to use instead. It still hasn’t. There is a Do & Don’t list, which is better than nothing, but almost impossible to turn into practice in the real world. If asked, we would suggest something like this:

Summary

  1. If your employer offers an instant messaging solution, use that.
  2. If you are picking apps to use yourself, you are safest with Signal.
  3. If you are not picking the apps you use, you will probably have to use WhatsApp or Skype. But be aware that someone will be held responsible when Facebook or Microsoft (Skype’s owner) changes the rules – and it’s probably not going to be the person who picked the app…
  4. Don’t use Facebook Messenger, Instagram, or Telegram.

Whatever app you use for work, the vast majority of people should avoid having their phone go ding for work purposes while they are not at work. For most apps, a swipe left on the main list of ‘chats’ should show an option to “hide alerts” for some time period – this should ensure that if you do give your personal number to work colleagues, it doesn’t end up driving you to distraction outside work. If someone really wants to get in touch, they can always just call you normally.


The reasoning behind our suggestions

The important step in secure messaging is something called “end-to-end” encryption, which prevents anyone – a third party ‘listening in’, or even the service making the connection – knowing what you said. It’s the equivalent of having a conversation in a private consultation room, rather than standing next to the nurses’ station or in a waiting room. But even with Signal, if you are messaging using your personal device, you should treat any conversation as if it were in a lift where another person might be listening.

Signal allows you to decide for how long you will keep messages from any particular person or group, and will automatically delete the stored messages after that. But what happens with the stored message history in other apps? WhatsApp, for example, wants you to give it a full copy of all your messages and send them to its servers as a ‘backup’ (though at some point it will show you ads against them – it is part of Facebook after all).

You may also have set your phone itself to back up to somewhere. Do you know where the backup goes, and what’s in it?

Of course, it is best practice to back up everything on your phone, and most apps assume (probably correctly) that you don’t want to lose every message or photo you receive of your kids. This doesn’t necessarily translate neatly to a clinical setting – anything that must be kept should be recorded elsewhere, so that if you lose your phone, the only thing you won’t have kept is ward chit-chat. WhatsApp wants everything – it doesn’t offer clinical reassurance. And while Snapchat has deletion as a feature, it has other problems akin to those of Facebook and Skype.

The longer-term security of your messaging is dependent upon who makes the app – and when, and why, they will change the rules on you. We (also) recommend Signal because it is produced by a charitable foundation whose sole mission is to provide secure, usable communications. One key reason why the NHS England guidance is so terrible is that WhatsApp has lobbyists telling NHS England that it should allow their product; Signal doesn’t.

Since Facebook (the owner of WhatsApp) lies to regulators about its intentions, you clearly cannot rely on the company not to do tomorrow what it denies it will do today.  As a consequence of this, any official guidance must in future be kept up to date by NHS Digital. And, as corporate policies change, so must the guidance – removing from the equation NHS England’s fear of the deluge of lobbying that created this mess in the first place.

Clinicians deserve better tools than those that NHS England chooses to recommend, where a national body prioritises its own interests over the needs of those delivering direct care.

(This post will be kept under review as technologies change; it was last updated in April 2018)

Data and AI in the Rest of Government: the Rule of Law

medConfidential spoke about the Framework for Data Processing by Government at the All Party Parliamentary Group on the Rule of Law. The topic of the APPG provides a useful perspective for much work on data in the public sector, and the wider use of AI by anyone. The meeting was on the same day as the launch of the AI Select Committee Report, which addresses similar key issues of ‘data ethics’.

The ‘Rule of Law’ is defined in 8 principles as identified by Lord Bingham. The principles are not themselves law, but rather describe the process that must be followed for the Rule of Law to be respected.

Public bodies must already follow that process, and also be able to show how that process has been followed. As a result, those developing AIs (and data processing tools) for use by public bodies must also show how these processes have been followed. This is necessary to satisfy the lawful obligations of the bodies to which they are trying to sell services.

The principles identified by Lord Bingham are a model for testing whether an explanation of an AI and its output, or a data model, is sufficient for use by a public body.

While debates on ethics and society, and on politics and policy, focus on whether a technology should be used, the Rule of Law is about the evidence for and integrity of that debate. As Departments implement the Framework for data processing, to deliver on their obligations under the Rule of Law, it must be compliant with the principles identified by Lord Bingham – not just the ethics and policies of the Minister in charge that day.

Public bodies are already bound by these rules – unless Parliament legislates to escape them. The principles are widely understood, they are testable, and they are implementable in a meaningful way by all necessary parties, with significant expertise available to aid understanding.


Companies and other non-public bodies

Companies (i.e. non-public bodies) are not subject to the same legal framework as public bodies. A Public Body must be able to cite in law the powers it uses; a Private Body may do (almost) anything that is not prohibited by law. This is why Facebook’s terms and conditions are so vague and let it get away with almost anything – such a data model does not apply to the tax office.

Some of those looking to make money – to “move fast and break things” – would like the standard to be ethics, and ethics alone. There are currently many groups and centres having money poured into them, with names involving ‘data and society’, ‘ethics and society’, and DCMS’s own ‘Centre for Data Ethics’. The latter is led by a Minister in a Government that will always have political priorities, and – given recent revelations about Facebook – the consequences of incentives to lower standards should be very clear.

Ethics may contribute to whether something should be done – but they are not binding on how it is done, and they offer no actual accountability. After all, no tyrant ever failed to justify their actions; it is the rule of law that ultimately holds them accountable, and leads to justice for those harmed. Ethics alone do not suffice, as Facebook and others have recently shown.

There is a great deal more work to do in this area. But unlike other AI ‘ethics’ standards which seek to create something so weak no-one opposes it, the existing standards and conventions of the Rule of Law are well known and well understood, and provide real and meaningful scrutiny of decisions – assuming an entity believes in the Rule of Law.

The question to companies and public bodies alike is therefore simple: Do you believe in the Rule of Law?

[notes from APPG talk]
[medConfidential (updated) portion of the APPG briefing]

Response to the House of Lords AI Select Committee Report

The AI Select Committee of the House of Lords published their report this morning.

In respect of the NHS, it suggests nothing the NHS wasn’t already doing anyway.

The suggestion that ‘data trusts’ be created for public sector datasets – such as tax data – will likely cause fundamental distrust in AI amongst the public (paragraphs 82 & 84). The NHS has shown how that model ends badly when the prime drivers are commercial, not ‘human flourishing’.

Sam Smith, a coordinator at medConfidential, said (referring to paragraphs 99, 129, 317-318, 386, 419-420):

“A week after Facebook were criticised by the US Congress, the only reference to the Rule of Law in this report is about exempting companies from liability for breaking it.

“Public bodies are required to follow the rule of law, and any tools sold to them must meet those legal obligations. This standard for the public sector will drive the creation of tools which can be reused by all.”


-ends-

medConfidential are speaking at the APPG Rule of Law in Parliament from 11:00 to 12:30, and more details are now available.

NHS Digital failing to uphold patient interest


The Health Select Committee has published a report on data sharing which “raises serious concerns about NHS Digital’s ability to protect patient data” under the headline “NHS Digital failing to uphold patient interest”. The Home Office is “treating GP patient data like the Yellow Pages”, according to the RCGP.

The NHS has been trying to rebuild trustworthiness around data since the last big NHS data project collapsed in 2014. This report shows that all promises can be undermined by the narrow-minded view of one office in Whitehall.

The Health Select Committee is clear that NHS Digital has again failed in its statutory duties, and has put patients at risk by the processes it has adopted and refuses to change.

HSCIC was rebranded as NHS Digital in an attempt to escape the history of past failures, but this report shows its actions are unchanged…

We submitted written evidence to the inquiry.

medConfidential Bulletin, 9th March 2018

It has been a while since we last sent a newsletter. Our apologies for that – we have been kept busy on a number of fronts, but rather than spam you with speculations we believe it’s better to communicate when there are significant developments.


New national opt-out for medical records

An announcement has been delayed for some months and there’s still some time until action is taken, but to quote NHS Digital last week:

The Secretary of State has agreed that the national data opt-out will be introduced alongside the new data protection legislation on 25 May 2018. It has also been agreed to present the national data opt-out as a single question to cover both research and planning. Type 2 opt-outs (which currently prevent identifiable data from leaving NHS Digital) will be converted to the new national data opt-out when it is introduced in May. Patients with type 2 opt-out will be contacted directly about this change.

There are still a number of important questions to be answered, but we’re working on those for you. For example, at this point, the Government has not yet confirmed that every data release that would be covered by the Type 2 opt-out will be covered by the new opt-out.

medConfidential has yet to see the final wording of the question, but this announcement is clear confirmation that if you opted out in 2014 (or subsequently), you will be sent a letter about what happened. We also haven’t yet seen the wording of the letter, as we and the other members of CDAG (the care.data Advisory Group) would previously have done, but apparently we are to be consulted on that too. When we have the ability to cite formal statements on the new process, we will update our website – this is likely to be in May.

So, if you have already opted out, the NHS will write to you about the new opt-out model. Whether anyone will tell other people remains unclear. We do hope the Secretary of State won’t snatch defeat from the jaws of a victory which could improve patient confidentiality and everyone’s confidence in how the NHS uses data.


This week: Data Protection Bill

The Data Protection Bill was delayed by political squabbling, but must pass by early May, and is now on a very tight timescale.

medConfidential’s concerns with the Bill relate to two things. The first is the “Framework for Data Processing by Government”, which in effect creates a ‘Data Controller in Chief’ who can ignore the Information Commissioner. The second is the Government’s wish to deny you access to information on how your records are used, if that information might be used by someone else, at another time, in a way which may “prejudice… effective immigration control”.

Thanks to a great deal of work by many concerned groups and organisations, the Government no longer considers this framework above the law, just above enforcement of the law. The Rule of Law requires that justice both be done, and be seen to be done – requiring transparency that Governments and companies often prefer to avoid.


What you can do

Many parts of England have local elections in May. The ongoing stealth reorganisation of the NHS in England (into 44 “Sustainability and Transformation Partnerships” and “Integrated Care Systems”) will give your local council more responsibility for data re-use in your area. No details will be given until after the elections – of course! – but if anything does emerge before that, we’ll let you know.

The health and care issues that most burden the NHS differ from place to place, sometimes quite widely. So when local politicians ask for your vote in the next few weeks, you might ask them what their council would do about the biggest issues in your area.

You can see the top three issues most impacting health in your local authority, and those nearby, on this map: http://bit.ly/2FVYVE1

(Created thanks to current data from Public Health England, and with the help of tools provided by Democracy Club whose volunteers collate and share information on elections across the UK.)


What’s next?

medConfidential keeps working even when we’re not sending newsletters; we won’t spam you if there’s nothing important to say. As you can see from this Bulletin, we are approaching another critical time for patient confidentiality that we hope can be negotiated with far greater success than in 2014! If you appreciate our ongoing efforts, we accept donations. Thank you for your support.


Phil Booth & Sam Smith
9th March 2018