Category Archives: News

Matt Hancock Still Doesn’t Tell You How Your Data Is Used – December 2019

There should be a price for misleading public statements about the NHS, whether about data or anything else.

Since our last update – and our first, back in 2014 – a consensus has emerged on what a Data Usage Report should contain. The NHS has made some progress – not that you as patients will have seen any of it – and while NHS Digital calls it a Statement rather than a Report, that is probably the better word.

If you e-mail NHS Digital and ask, they will now tell you how data about you has been used, to whom it has been sent and, hopefully, the places where your Summary Care Record and the new ‘National Record Locator’ have been accessed for you. NHS Digital still doesn’t know everything, but it is progress. Whether this will roll out to the NHS App and NHS.UK is a different question – mainly a political one for NHSX, since NHS Digital would prefer to avoid the transparency.

Now that NHS Digital can do this, we turn to the question of the quality of the content. Anything at all is clearly better than the status quo of nothing, and ‘at will’ is far better than ‘by request’. medConfidential also has an example of what this would look like across the rest of Government (i.e. behind a GOV.UK Login) – as the approach we’ve taken applies to data used across Government, not just health data.
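
As a rough, hypothetical illustration of what one entry in such a Statement could contain – the field names below are ours, chosen for illustration, and are not NHS Digital’s actual format – consider:

```python
# Hypothetical sketch only: these field names are medConfidential's illustration
# of what one entry in a patient-facing Data Usage Statement could contain; they
# are not NHS Digital's actual format.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataUseEntry:
    recipient: str           # who the data was sent to, or who accessed it
    purpose: str             # plain-English reason for the use
    legal_basis: str         # the lawful basis relied upon
    date_of_use: date        # when the dissemination or access happened
    opt_out_respected: bool  # whether the National Data Opt-out was applied

example = DataUseEntry(
    recipient="A university research team (hypothetical)",
    purpose="Approved research into diabetes outcomes",
    legal_basis="As recorded in the published data release register",
    date_of_use=date(2019, 11, 1),
    opt_out_respected=True,
)
print(example)
```

A Statement is then simply a list of such entries, shown to the patient ‘at will’ rather than ‘by request’.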

How to talk about data projects

Over the last year, our friends at Understanding Patient Data and Use My Data – and others – have done quite a bit of work on how you explain data usage to people.


While application forms may contain ‘lay person summaries’, these are of variable quality. That’s not to say that the current ones aren’t a good start – they prove the process can work, and they will get better over time. Conversations about improvement will continue to be hypothetical until patients can actually see how their data is used, at which point quality will go up. 

The work of Understanding Patient Data (and friends) shows that these explanations can be good. Whilst various agencies can take their own view of ‘good’ communications, once the feedback loops exist and are running, the incentives to get better quickly kick in. With its limited number of projects and outcomes, the cancer registry has shown that assisting some projects as needed can be done without extra resources.


For projects that make it into the press, the existing NHS comms team who write the ‘NHS health news explainers’ can also assist, showing how to tie the legitimate uses of data more tightly into the benefits of research.

There’s plenty of expertise to help make the explanations for patients really good, but they must be good enough for now – if they aren’t, then the data use should not have been approved! – and therefore the biggest block on improvement is simply not having started to show people how data about them is used on NHS.UK.

Of course, medConfidential still shows everything it can on TheySoldItAnyway.com… and commercial entities are still getting data. As NHSX ramps up for 2020, does it really want the only place patients can readily see how their opt out was respected (or not) to be at TheySoldItAnyway.com?

care.data returns

Another attempt to collect your GP data is coming. While none of the details are finalised, NHS England is quoting the BMA as saying it’s “care.data done right”. (It is unclear at this point whether that quote is from “care data day” or elsewhere.)

Will the 97% of people who haven’t said no to the desire to “use my data” for purposes beyond their direct care be able to see how their data is used? How confident are NHSX, DHSC and NHS England that what they tell the public this time will be matched by what NHS Digital is told to do, and by what it actually does?

Information will be provided to the public about care.data 2; the question is whether NHS England and DHSC tell people, or we do. As before, medConfidential will tell people the truth, with evidence – whereas Matt Hancock’s choices in the recent election suggest he may choose a lower standard.

There are of course legitimate reasons to use patient data, especially the data of people who wish it to be used. If the programme is consensual, safe, and transparent, then it will be scrutinised and the outcome can be positive – will you know how your data is used, and how your choices are implemented? Do you have the facts you need in order to make an informed decision on how your and your family’s health data is used?

The opt-out model of Organ Donation

We are approaching the halfway point of the communications period before organ donation in England becomes an opt-out rather than an opt-in process.

medConfidential has not yet seen any published figures for the effectiveness of this national communications campaign – nor for how many people have taken action as a result, whether opt-outs or explicit opt-ins – but the ads we have seen so far have all been vague and non-specific.

The memories of Alder Hey haven’t faded, and we sincerely hope DH / NHSBT ‘step up a gear’ so people really can make an informed choice before death, and avoid unnecessary stress and suffering for their loved ones.

How the communication of the organ donation opt-out programme succeeds or fails will likely demonstrate whether the care.data2 process will succeed or fail too. With luck, the Secretary of State won’t be in so much of a rush to grab your medical records that, in haste, he undermines organ donation too.

As has always been the case, NHS England and DHSC could have a data system that is consensual, safe, and transparent. The question is whether they will duck the hard choices and make you pick up the pieces they wanted to avoid.

The Home Office

With the Government proposing to move Immigration and Borders responsibilities out of the Home Office, a decision will be required on what happens to the toxic soup of data agreements between the Secretary of State and the Home Department for those purposes.

Simply cloning and rubber-stamping each data sharing agreement for such purposes would be a terrible outcome, even if it still met the obligation of transparency. We may yet see improvements on the toxic legacy of the last nine years under the previous Prime Minister’s worldview – but this will require significant changes, regardless of any cancellations.

If the data flows do continue, then this Prime Minister is clearly not interested in solving the problem that sees his Government threatening to deport scientists due to a Government typo. Number 10 will either decide to fix this as part of its Machinery of Government change, or it will decide to keep things as they are. The message will be clear either way.


This post is the latest in an irregular series of updates posted in 2014, 2015 and 2016.

Your medical records and the 2019 Election manifestos

At the last Election in 2017, medConfidential had a single, simple request:

Will patients know how their medical records are used?

While the Government seemed to make noises in that direction, it has delivered nothing. For this Election, we reiterated that question and highlighted a number of key issues, and commitments the next Government would have to make, around how NHS patients’ data is used and how the NHS buys technology.

So what did the parties all say in their manifestos?


Digital Government report from the House of Commons Sci/Tech committee

The House of Commons Sci/Tech Committee’s report on Digital Government sets out a direction of travel, something lacking from Government in recent times, but some of the details are disturbing.

The committee diverge from the Information Commissioner and the 2018 Data Protection Act, which are quite clear that consent is not a legal basis generally available for the routine delivery of most public services.

In paragraph 29, the committee justify unique identifiers for people on the basis of evidence about unique identifiers for objects or company numbers. Did they not notice the difference? While the Home Office may treat UK residents like cattle, numbered and tracked, that is not what Parliament usually expects. The principle is a good one, but the Committee’s suggestion is internally contradictory.

Paragraph 23 implies the Lib Dem-led committee believe citizens should have no right to approve or object to the sharing of any data that isn’t “sensitive personal data” such as “ethnicity, State of residence, and sexuality”. Such a framework would significantly weaken the data rules protecting citizens, and would be a radical change in the law to suggest by accident – giving even more power to a future data controller in chief.


In more positive news, the committee’s top line – that “The Government should facilitate a national debate on single unique identifiers for citizens to use for accessing public services along with the right of the citizen to know exactly what the Government is doing with their data” – calls for a public debate that is necessary. The single unique identifier is a bad idea, but there are better ideas that should replace it in a genuine public debate. Consider the current Home Office and the next Prime Minister: how many Windrush-style mistakes will they make with your single identifier? And what happens when they decide to take it away from someone you care about?

Given that top line from the committee, it is increasingly untenable for the NHS to continue withholding from patients how data about them was used – a setting where the infrastructure is in place and identity is already known.

Alternatives to the DWP tender for a “medical records broker”

The existing ‘Atos’ process* for Employment and Support Allowance (ESA), some components of Universal Credit (UC) and Personal Independence Payments (PIP) is recognised to be brutal and inhumane. The Department for Work and Pensions’ desire to replace it is fundamentally to be welcomed.

Where DWP ends up with regard to health information will likely be something modelled on the British Medical Association’s agreement with the Association of British Insurers on GP reports. This agreement provides a published definition of what data is medically relevant, with clear standards and a template form in which a report must be produced, including the structures and uses to which the data can and cannot be put, and a commitment on the part of the data recipient (in this case, the insurance company) to respect the medical view.

It is this last element with which DWP is most likely to struggle – but without it, all the rest is undermined.

The information a patient gives to their doctor(s) must be untainted by external priorities, otherwise people will come to harm – whether from excess or under-reporting. As Professor Helen Stokes-Lampard, chair of the Royal College of GPs has said, “We are doctors, whose first interest is the care of our patient: we are not border guards, and we are not benefits assessors.”

Were data sharing with DWP to be perceived in this way, the provision of care and medical research – and public confidence and trust in both – would be undermined far more than care.data ever managed.

Such data sharing would arguably be more dangerous and harmful overall than the Atos assessment process, as GPs would have to consider competing incentives around the information provided by their patients – not knowing in advance which Departments would get to see it, nor when DWP might send letters demanding they change their medical judgment.

DWP cannot address these issues alone and in secret – a tender is never a good place to start, and the current one could catastrophically undermine any improvements.

There aren’t many ‘fully automated’ decisions

Were a third party being contracted to make automated decisions, DWP could (and should) expose the ‘business logic’ upon which those decisions are to be made. But that is not what DWP is doing.

Some of the decisions being made are really, really simple – being pregnant, for example, is a binary state indicated by a medical test, from which processes can result. And a ‘terminal’ state is a medical decision, made for medical reasons.

While the actions of DWP and its contractors may suggest that some terminally-ill patients are ‘fit for work’, we do not imagine this to be an explicit policy intent – more a result of the systematic process neglect that the current Secretary of State has expressed a desire to resolve. Will DWP accept a ‘terminal’ definition from the NHS in future? If not, as now, nothing that DWP does will matter.

Most areas of controversy are not fully automated, nor even fully automatable. While a recorded status of ‘terminal’ may (trivially) be the result of a human pressing a button, when the doctor presses that button has to be based on a carefully considered discussion with the patient – which should only be about how they wish to die, without any implications or insinuations from target-obsessed bureaucracies of the state.

Medical decisions are human decisions

The Atos processes exist because DWP does not trust the data it could get from the NHS. As has become evident, there are processes where NHS doctors offer ‘fit notes’ that the DWP requires them to provide, yet pressures them not to. This is DWP gaming the very system it set up in an attempt to make it ungameable by anyone else. And this remains the context in which the tender has been issued.

Patients must believe that the information they give to their doctor will not be used against them. No data protection law, old or new, allows the DWP to rifle through GP records without an explicit legal case. However the Atos process is replaced, whatever replaces it is going to require new legislation – informed by all of the stakeholders, in a public debate.

DWP brings ‘business logic’ to a political problem

Any data access or data sharing cannot be done under Data Protection law alone. It is not part of the NHS’ or GP’s public task to assess benefits for DWP, and DWP cannot ‘ask’ for consent when that ‘consent’ is a condition of the social safety net that makes it possible for a person to buy food. (DWP may try; it will fail.) So any replacement is going to require primary legislation – and that legislation cannot be initiated by DWP alone, and certainly not by issuing a tender.

The tender document is quite clear on the policy intent, and how DWP sees the world. But DWP cannot fix this alone. It may try because it believes it has no other levers to pull, given wider distractions. So this current approach – trying to fix a complex political issue with more technology – will likely go no better than the Home Office’s roll out of the Settled Status digital process, and could go a lot worse…

Technical barriers imposed by others

DWP has no way of knowing that some of these barriers even exist.

For example, the new ‘NHS Login’ service will be necessary for any digital service that interacts with the NHS. And, in an explicit decision by NHS Digital (we argued against it; they ignored us, in writing), the NHS Login service passes the patient’s NHS number to every digital service.

While this may be good for direct care, it is terrible for anything else. As a result, any DWP ‘data broker’ or ‘trusted third party’ using the NHS Login will have a copy of the patient’s NHS number – a copy the NHS would argue they are prohibited by law from having.
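
To make the mechanism concrete, here is a minimal sketch assuming an OpenID Connect-style login flow in which the identity service places the NHS number in the token it issues. The claim names and values are illustrative, not taken from the NHS Login specification – the point is simply that every connected service receiving such a token ends up holding the number.

```python
# Minimal sketch, assuming an OpenID Connect-style flow; claim names and values
# are illustrative, not taken from the NHS Login specification.
import base64
import json

def decode_token_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT-style identity token."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Fake token, built here purely for demonstration.
fake_payload = {"sub": "patient-123", "nhs_number": "9434765919"}  # illustrative values
fake_token = (
    "header."
    + base64.urlsafe_b64encode(json.dumps(fake_payload).encode()).rstrip(b"=").decode()
    + ".signature"
)

claims = decode_token_payload(fake_token)
# If the identifier is a claim in the token, every relying party can read it,
# whether or not it has any legal basis to hold an NHS number.
print(claims["nhs_number"])
```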

On an even more fundamental matter, the DWP is going to have to work with the NHS and medical professional bodies if only for the reason that it has little – if any – experience with coded health data on which the health service runs. Interpreting a person’s condition into (or from) dated medical events is a highly-valued clinical skill and, on the evidence of the outsourced work capability assessors, not one that will prove easy to duplicate.

We also have a further, modest, proposal.


* By ‘ATOS process’, we mean the Work Capability Assessment for ESA or UC – run by Centre for Health and Disability Assessments Ltd, a subsidiary firm of Maximus – and the assessments for PIP – run by Capita and an arm of Atos, trading as ‘Independent Assessment Services’.

Public bodies, GDPR and consent

TL;DR – just because they ask (nicely) doesn’t mean it’s GDPR consent.

First, clinical consent is not GDPR/data consent

Clinical consent is informed consent for a clinical course of action, such as “Yes, you can amputate my arm”. If doctors don’t get clinical consent from a conscious patient, it’s GBH.

Sharing the medical records required for direct care is implicit from the clinically-consented decision, but that isn’t a GDPR consent – though it is part of the public task of the NHS body providing that surgery.

Both of these situations use the ‘consent’ word, but they actually mean very different things. (We agree that’s not entirely helpful.)

Consent, GDPR, and public bodies

GDPR provides six different legal bases for data use. Consent is the one most often used in the private sector – you technically consent to Facebook’s abuse as part of Facebook’s terms of service.

But GDPR requires consent to be “freely given”. And, with government bodies providing public services, the power imbalance between a citizen in need and the state is so great that those bodies cannot get meaningful consent – a problem amply demonstrated, in different arenas, by #metoo.

(Given what many experience as a social obligation to be on Facebook, whether or not the consent there is meaningful and freely given is an interesting question for others, but outside the scope of this consideration.)

The Information Commissioner’s guidance is clear: except in a few highly specific circumstances, public bodies shouldn’t use consent as the legal basis for their public task. Indeed, doing so is probably invalid.

A public body can always ask if you wish to go through a data sharing process that will make your life easier – that’s politeness, not GDPR – but that is not the same as asking you to consent to it processing your data in order to receive a service or benefit. In the most benign of cases, the two may be virtually indistinguishable – but while the first is a meaningful choice that has no effect on the outcome, the other is a requirement of accessing the service and so consent cannot be freely given.

“Give us your data or you don’t get your benefit” isn’t consent. It’s coercion.

Under GDPR, both the process itself and the legal basis for how your data will be used must be clear – even if the organisation processing your data doesn’t ask you whether you want it to or not. Most public bodies will have a lawful basis for processing your data under what is called their ‘public task’ (the private sector equivalent of public task is ‘legitimate interest’). Critically, the process cannot offer you a different outcome if you hand over, or allow it to access, more personal data – though doing so is allowed to make that same process faster.
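
As a purely illustrative sketch of that rule – the threshold and field names below are invented, not any department’s real business logic – offering extra data may change how quickly a decision is processed, but not what the decision is:

```python
# Illustrative sketch only: the threshold and field names are invented, not any
# department's real business logic.
from typing import Optional

def assess_claim(claim: dict, extra_data: Optional[dict] = None) -> dict:
    """Outcome depends only on the core claim; extra data may only speed things up."""
    eligible = claim["income"] < 16_000                           # hypothetical rule
    route = "automated check" if extra_data else "manual check"   # faster vs slower
    return {"eligible": eligible, "route": route}

# Same outcome either way; only the processing route (and so the speed) differs.
print(assess_claim({"income": 12_000}))
print(assess_claim({"income": 12_000}, extra_data={"bank_statements": "..."}))
```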

It is worth being aware that a public body can ask about your consent, and then ignore your answer and process your data anyway – so long as it has documented that this is what it was going to do, and made sure that the information wasn’t (too) opaque. As some have discovered, there are, however, large political and process burdens to doing that…

Why is there confusion?

The same word being used in two different contexts doesn’t help, but another cause is the fundamental lack of clarity on how data is used, and the inaccuracy that comes from the process.

Every bit of data processed by digital government ultimately comes down to someone typing something in via a keyboard and, as anyone who reads Twitter will know, such input may not always make complete sense. Officials are trained to believe that the data is perfect and ignore reality – and it’s the citizen who pays the price, and the most digitally excluded pay first and pay the most.

Every use of personal data by a public body must have a lawful basis, which can be known by the data subject. Those uses can be listed, and should be listed (including, around data sharing, in registers of data sharing agreements). A UK resident should be able to know how their data will be used in advance of dealing with a public service – though in practice, in the public services, existing law, process and safeguards mean that it is not often necessary to do so, as one person’s data should be treated the same as another’s.

No citizen is expected to sign ‘terms and conditions’ or an ‘acceptable use policy’ when dealing with Government – nor should they be. The private sector uses such mechanisms because their acts are not based in public policy and law (as Facebook has recently shown).

When things go wrong, many of the frustrations representatives and support services feel in handling ‘casework’ are simply hurdles within smokescreens, thrown up by those who do not want informed scrutiny of their decisions. We go into such issues further in our recent evidence to the Science and Technology Committee Inquiry on Digital Government.

A public body may, as a matter of policy and in politeness to the people it serves, ask whether a citizen is willing for data to be shared with another body; this is not ‘consent’ in Data Protection terminology – and, if asked and the citizen declines, the data sharing must not occur anyway (if it did, the ‘consent’ choice would be unfair). Asking the citizen is an offer, and can also be a mandate for the policy, but it is not a basis for legal consent.

The Long Term Plan – more of a medium-term plaster

NHS England and the Department of Health have launched the NHS Long Term Plan. It covers a range of topics of interest, but it says nothing about the National Data Opt-out and not much about data – the LTP repeatedly promises jam tomorrow, as things that work will be replaced with things that might (something the recent NHS Digital Board discussed). There is still no guarantee of a competitive NHS market for decision support tools, despite the interest of the Secretary of State in some companies…

The Summary Care Record is a (mostly) consented system run by NHS Digital that works (after a decade), and will be replaced with NHS England’s LCHR programme, which is neither consented nor working (and will take at least a decade).

The fundamental point of the Summary Care Record (SCR) is that it is accessible nationwide. If you live in Cornwall and visit A&E in Carlisle, a doctor there can see the medications you are prescribed – unless you have chosen that they shouldn’t.

When the ‘NHS app’ launches, you will (should!) be able to see where your SCR has been accessed – although, as with other NHS.UK services in the queue, you need to be able to log in to NHS.UK before that can work. NHS Digital is still consulting on tweaks to the SCR for carers – a use case that the LCHR programme is far from being in a position to consider.

The LCHR programme, which we’ve covered before, often chooses mass data copying – so if you live in Newcastle and fall off a horse in Newmarket, it is unclear what happens, and whether or how those accesses or copies are consented. The LCHR programme breaks at every boundary you cross, because it is designed and run by NHS England for NHS institutions not patients. There is, as yet, no Information Governance model, there is no patient (or care provider?) accessible audit trail, and the change in name from ‘Local’ to Longitudinal Health and Care Records (page 99) also implies that the data held within them will be eternally expanded, rather than kept to the initial tightly defined dataset – care.data2?

The plan also refers to “The use of de-personalised data extracted from local records” (page 97), which suggests that LCHRs are intended to be used for both direct care and secondary uses – perpetuating the festering wound which is NHS Digital’s continuing disregard of the GDPR around the extraction and dissemination of data that GDPR considers identifiable. The use of “de-personalised” in this context (para 5.27) is a grasp at the figleaf of obfuscation that continues to be defended by the Wellcome Trust.

In the absence of an appropriate IG model, there is no consent for the secondary uses of the records that LCHRs are sucking up – meaning the devolved LCHR teams are not only lying to patients but, in contrast to the publicly owned SCR infrastructure, have also generally outsourced the data handling to the private sector. While replacing SCR is not necessarily a problem, replacing one data copy with a different, inferior data copy is not progress for anyone other than the IT companies that will get new contracts for what was previously a publicly-run service. NHS England delegating blame but not control is not a new phenomenon, nor is issuing plans that undermine and will remove what another part of the NHS is proposing to add to help patients.

The hint in paragraph 1.38 of online NHS services to help the mental health crisis is welcome, but the detail – barely two sentences of vague hand-waving in chapter 5 – does not meet what is claimed, let alone what is needed.


Choices of the Secretary of State

The standards the Secretary of State sets for the NHS are not merely that its services are ‘safe’. He can have views on the user experience of an app, but user experience is not safety. There is a fundamental difference, and patients assume the NHS will never be unsafe – which is why there is such concern when it turns out not to be safe. The Secretary of State can say what he thinks ‘good’ looks like, but ‘safe’ must remain within the remit of qualified professionals.

Criteria for ‘safe’ are absolutely necessary, if not sufficient – it is entirely appropriate for there to be separate criteria for what is ‘good’. ‘Good’ apps may be what the public choose, but ‘safe’ is what they expect – apps can be both.

As an app example, Matt has chosen that the money for his GP registration should go to Babylon in London rather than the surgery in his constituency. He feels this works well for him, as someone who likely rarely needs a doctor, and disregards any wider harm that comes from taking funds away from the doctors in his constituency. The choice of what is ‘good’ is partially subjective, and different patients will make different decisions. The critique from the profession that the app is unsafe, however, is met with a response that someone thinks it is good. These are entirely different criteria, and both groups are talking past each other – the criteria of ‘safe’ and ‘good’ should be separated; as noted above, the former is necessary but not necessarily sufficient. The Secretary of State can add standards for what ‘good enough’ looks like, without reducing safety, if he so chooses.

At the insistence of DH, NHS Digital has pulled its consultation on the ‘Clinical Data Architecture’ Principles “so that we can ensure consistency with wider emerging strategies”. The Secretary of State continues to laud the first draft of his Vision, while the “Code of Conduct” update is delayed to let the AI companies lobby more, but how visionary is it?

NHS Digital’s recent Board papers have included their assessment of what they need to do to deliver on the Vision. While there are some things they wish to do faster, there are only two new things they need to start doing. One is new to NHS Digital only because it was certification work in which they weren’t previously involved (and it remains unclear why they are now), and the second is so substantive that we’ll quote it in its entirety from page 71:

“We will identify frontline staff whose skills and competence are evident to us and make them honorary colleagues (badges and certificates and all that jazz)”

The Vision the Secretary of State proclaims doesn’t seem any more visionary (or meaningful) than a late night monologue from an NHS manager in a hospital corridor about how GPs should work. The Plan suggests there will be new legislation, in which we will look for the National Data Opt-out to be placed on a statutory basis – rather than remaining a gift of a Secretary of State. We note that the National Data Guardian Act 2018 has now received Royal Assent, which is welcome progress.

Its recent Board papers state that NHS Digital is also going to “run all our public services in the public cloud with no more locally managed servers” (page 64) – presumably moving all services to Azure and AWS. Hopefully they will let tech-savvy journalists in to do long form pieces on what they’re doing, and provide reassurance, as this has the potential to go spectacularly wrong – with limited abilities for entities in the UK to clean up the mess, while doctors in A&E and GPs deal with the consequences of the national governance bodies screwing up yet again.

There is one final annual tradition DH has maintained – the new plan delays the full digitalisation of hospitals by yet another year; this target has been slipping by one year per annum ever since it was announced…

medConfidential comments ahead of the Spending Review ( / Manifestos)

Ahead of a Comprehensive Spending Review, the Government has decided that a large percentage of discretionary government spending will go on health and social care by the end of the next spending period. That is probably a better choice than that of the US, which spends about the same proportion on its military.

In light of that decision being made, there are rational consequences which require thought to avoid perverse incentives:

  • Data available to life sciences and research: For there to be public confidence in data use, every patient should be able to know how the NHS and others use data about them, and how their wishes are respected. The NHS has established clear processes for the use of data for legitimate research – these do not need to be changed. However, the implementation of the National Data Opt-out remains hamstrung by legacy data disseminations. This, the first spending review since the 2018 Data Protection Act, allows for a clearer formulation when communicating with the public: “If you want your data to be used for research and for other purposes beyond your care, it will be; if you don’t, it won’t.” (Any exceptions being solely decided by the explicit approval of the Confidentiality Advisory Group – which was placed on a statutory footing in 2014, yet still has no Regulations governing its work.) Past and current heavy reliance on (DPA98) ‘anonymous’ data as the basis for dissemination both undermines public confidence and limits the data available to research. The spending review offers an opportunity to reconsider that failed approach, improving public confidence and making more high-quality data available to researchers and the life sciences – both underpinned by a commitment that, whatever a patient wishes, they will be able to see how their wishes were respected. Any suggestion of ‘data trusts’ for NHS patients’ data requires as a prerequisite the admission that the NHS itself will never get data dissemination right in patients’ interests. Public confidence in data for life sciences and research would be higher if the message were clear, simple, and accurate: if you want us to use your data in legitimate projects, we will; if you don’t, we won’t.
  • Technology in the NHS: Clinicians will use technology when it helps them with patients; when it doesn’t, they don’t – no matter how hard NHS England may push it. The FHIR (Fast Healthcare Interoperability Resources) standard is now internationally recognised as the standard for interoperability between health systems – yet the first version was only published after the last spending round (see the sketch after this list for what such interoperability looks like in practice). Treasury / DH / NHSE should ensure that companies cannot use contracts to limit or prohibit interoperability, or to require bulk data copying from core hospital systems into commercial companies. Where new national programmes are proposed, chopped up into parts, what happens at the boundaries between those parts?
  • Prevention is cheaper than cure: In advance of the spending review, HMT should commission an independent assessment of the ‘DH vision’ on prevention to answer two critical questions: will it do what it claims to do? And if not, how and where does it fall short? (Page 14 of the vision shows the disconnects.) The assessment should be published alongside the DH green paper, and show what questions must be considered across Whitehall to avoid any other department causing what DH seek to prevent. 
  • New forms of Transport: Will DfT allow self-driving cars to operate in a way where their stopping distance is greater than their effective sensor range? Will equivalent assessments be made for other technologies; and if not, what will the consequent effects be on the health of the nation? 
  • Procurement incentives for competitive markets: Where an NHS body wishes to procure an AI to assist in diagnosis, it should be required to procure three – effectively requiring three diverse analyses rather than one, replicating the medical norm of a ‘second opinion’ from a human doctor. That may be extensible to other public bodies.
  • AI and algorithms in the public sector: For all bodies subject to judicial review, any AI or algorithm that provides input to such a decision must satisfy the explainability requirements of judicial review. Should there be a clear public sector mandate that algorithms will only be used if they satisfy existing legal obligations, and that technology tools will need to be procured to satisfy those obligations, that will create a market in which the UK is possibly uniquely placed to lead.
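
To illustrate the interoperability point in the Technology bullet above: FHIR exposes clinical records as standard resources over plain HTTP, so any conformant system can read them without a vendor-specific integration. This is a minimal sketch against a hypothetical FHIR server; the base URL and patient id are invented, while the resource path and media type follow the published FHIR REST conventions.

```python
# Minimal sketch of fetching a Patient resource over FHIR's standard REST interface.
# The base URL and patient id are hypothetical; the path and media type follow the
# published FHIR conventions.
import requests

FHIR_BASE = "https://fhir.example-hospital.nhs.uk/R4"  # hypothetical endpoint

def get_patient(patient_id: str) -> dict:
    """Fetch a FHIR Patient resource as JSON from any conformant server."""
    response = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    patient = get_patient("example-id")    # identifier is illustrative
    print(patient.get("resourceType"))     # "Patient" on a conformant server
```

The contractual point is that suppliers should not be able to block this kind of standard access, nor demand bulk copies of data in place of it.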

The first two points have strong equivalents across all departments.

 

Tests for the spending review: Balancing mental health, parity of esteem, and Public Health

The spending review is the primary administrative mechanism for cross-government prioritisation. 

The largest public health concerns are different in different local areas – will the spending review (and MHCLG priorities) reduce or exacerbate those differences?

Will (tech) companies assessed to be causing mental health issues be required to take steps to reduce the harms they cause in future, and to mitigate the harms already caused? If they are not, these costs will have to come from the NHS budget, and are effectively a commercial subsidy paid by the public purse. By comparison, for each of alcohol, tobacco, other substances with consequences for human health, and digital companies – does each contribute in tax revenue what it creates in direct and indirect costs?


Some areas of this post were elaborated in our submission to the Digital Competition Expert Panel.

The Opt Out process for those with children

This page has been superseded by a 2021 page

Below is left here for historical purposes

medConfidential’s opt-out form for GP data continues to allow you to protect the information held by your family GP as it has always done – but the only way to protect your data collected by hospitals and in other care contexts is now via NHS Digital.

NHS Digital does not (yet) have direct access to your GP record, so its opt-out process requires you to take several steps – and officials have decided that people who have children must take a third.

Additional steps forced on those who care for children

If you have children, or other dependents, and wish to express a choice about the use of their medical records for purposes beyond their direct care, the process is needlessly complex.

NHS Digital provides no online process for families with children – even those who are registered with the NHS at the same home address as their parent or carer – so you must use its online process for yourself alone, and then use a separate postal process for any children under 13.

(Children aged 13 or over can express their own choice about their medical records for themselves online.)

So, instead of the single form you could use to express your choice for yourself and your family back in 2014, you must now do THREE things:

1) For your own hospital and other non-GP data: opt out (or opt back in) online

2) For your dependents’ hospital and other non-GP data: post this form to NHS Digital

3) For your and your dependents’ GP data: give this form to your GP

If you don’t have access to a working printer, e-mail children@medConfidential.org with your postal address and we will post you copies of the paper forms, for free, no questions asked. If you don’t have e-mail, you can text your address to 07980 210 746.

If you can afford to make a small donation to support us in offering this service to others, we have a donation page.

(We are registered with the ICO to process personal data in this way.)

Why?

It appears the “most digital” Secretary of State the NHS has ever had would rather place the burden of understanding and action onto individual patients – especially those with families – than solve the problem his officials have created.

And the fix is simple: NHS Digital’s online process confirms who you are to the extent that you are able to express your choice online, yet it chooses not to ask if you have any children – who could then be confirmed in exactly the same way the process confirms that you are who you say you are. The move to ‘digitise’ the National Data Opt-out left YOU with more work to do…

This extra work for you is a deliberate choice; it didn’t have to be this way.

Once you have expressed your wishes about your and your family’s data, you may also wish to write to your MP and ask them to ask the Department of Health why you are being forced to jump through these hoops.

Because while healthy, young, and particularly male protagonists commonly have little understanding or insight into the types of sensitive information they will one day have to divulge to their doctor – and the consequences of confidentiality not being respected – others do not have the luxury of such ignorance.

Late October update

At the start of October, the Department of Health took away your ability to opt out via your GP from having information about you, collected by the rest of the NHS, being used for purposes beyond your direct care. (The option to prevent information from your GP record leaving your GP practice remains. For now.) The new process is so ‘hip and digital’ that you also have to use the Royal Mail if you wish to make a consent choice for your children, as well as visiting your GP practice to make a choice for your GP data that the online process tells you nothing about.

Is this Matt Hancock’s view of a digital NHS?

We are testing a new trifold to guide families through expressing their full opt-out choices – which is now a three-step process: online, post box, and at the GP. This may be simpler for NHS Digital, but it’s a lot harder for you – a choice with which Matt Hancock seems to be entirely happy.

NHS Digital was apparently very proud that more people opted in via the digital service than opted out in its first two months – though sending out 1.6 million letters could be said to have tipped the scales somewhat – but that represents at most a few hundred people a month, whereas 5,000–10,000 people a month were still opting out via their GP until the Secretary of State took that choice away from you.

We have previously given a commitment that there will be a functional digital opt-out process for patients, and that if NHS Digital wasn’t going to deliver one, then medConfidential would have to (though this will likely be very analogue on their side…).

Data rights and proper information can together empower every patient and citizen to have more confidence in those who use their data. NHS Digital seems to want to make it more complicated. Though official information is published in various forms and in various places, the only way a patient can currently read how their wishes were respected is to visit TheySoldItAnyway.com.

If you didn’t receive a letter from NHS Digital about the new ‘National Data Opt-out’ then – since you’re reading this on our website – you should check the online process to see if your choice disappeared somewhere in the machine (and, if so, to set it to what you want). You’ll then need to set it for your children too, by post – and at your GP, for your GP data, to ensure that too is set.

 

Consultation Responses, etc.

With the National Data Guardian Bill having its second reading in the Lords this week, medConfidential has published a letter of support for the Bill. Meanwhile, the Organ Donation Bill contains a supposed safeguard that is overly complex and will not provide reassurance to those who wish to see how their organs will be used after death. We have drafted an amendment for the Lords to fix the broken Bill, if the Commons does not.

As part of the next piece of NHS legislation, the National Data Opt-out should be placed on a statutory footing. The next legislation will likely be the result of NHS England’s consultation on “integrated care providers” (our response) and the “long term plan” (our response), which also referenced the need to reform invoice reconciliation.

Our friends at dotEveryone published their views on digital harms and responsible technology, suggesting that data and ethics in Government should be led by someone “relatable … charismatic and imaginative”. Which would be better than the current person, whose company created the problems around commercial abuses of data in the NHS – problems that are still being caused 20 years later. The current ‘imagination’ at CDEI (the ‘Centre for Data Ethics and Innovation’) seems to be to repeat in Government the sort of data sale scandals they already caused in the NHS. The Information Commissioner also sought views on a ‘regulatory sandbox’, where companies can experiment with personal data – we had views.

Data use across the rest of Government has also been keeping us occupied. Our evidence to the House of Commons Science and Technology Committee contains some new thinking on the failures of agile in public bodies. Some of that thinking was also in our response to the call for evidence ahead of the UK visit of the UN Special Rapporteur on Extreme Poverty and Human Rights, who is looking at algorithms and digital effects around Universal Credit.

 

Data and the rule of law

The data sharing powers under the Digital Economy Act 2017 are still not fully in force. This did not prevent the Ministry of Housing, Communities and Local Government (MHCLG) demanding data on every homeless person in the country, such as in Camden. The secrecy of data use in such cases must be addressed by the UK Statistics Authority / Office for National Statistics – it is doubly disturbing that MHCLG used the research process to evade the scrutiny that would have applied via other routes.

Decisions by public bodies must, today, comply with the standards of the rule of law. As we move towards more automated decision-making, how will those standards be maintained?

The tech companies and their apologists want the approach to be one defined by ‘ethics’ – as if no tyrant ever failed to justify their crimes. “The computer says no” (or “DeepMind says no”) is wholly insufficient for suppliers of data processing functions to government making decisions about citizens.

All reputable companies will be entirely willing to explain how their “AI” systems arrive at the suggestions or decisions they make – including the (sources of) data on which they were trained. Disreputable companies will be evidenced by their failure or inability to do so.

Government Departments should deliver accountability to ‘their’ data subjects (currently they don’t). But beyond accountability to individuals on how data about them is used, there are standards that must be followed by institutions – especially those which govern.

The Venice Commission has produced a ‘Rule of Law checklist’, covering the context of decision-making. We’ll be taking a look at a couple of Government automated processing plans, and seeing how they conform – and how the checklist applies to digital projects, probably starting with Universal Credit and Settled Status, based on past work. We anticipate identifying holes in some of the frameworks currently used by Government, as compared with the standards required by the rule of law and judicial review. Comments are very welcome to sam@medConfidential.org.

Initial response to new ‘tech vision’ from DHSC

Update: longer response.

The current Westminster penchant for speeches with applause lines but without details has reached DH… 

Update: NHS Digital has now published some details – which are utterly underwhelming when compared with the Secretary of State’s hyperbole. “Use the NHS number” and “upgrade from ICD-10 to ICD-11” are not the radical changes the Secretary of State appeared to suggest. Although, with the promise of registers, we might dust off the amendment we suggested to the Lefroy Bill (which mandated NHS numbers by law) in 2014. We will update this document when NHS Digital publishes the document that should have appeared at the same time.

Notes:

  • Data: “We are supportive of … Data Trusts” – does the SofS/DH have so little confidence in the NHS getting data right that he/it is supportive of stripping the NHS of that governance role?
  • DH blindspots: “We will know we have achieved our goals when”… does not mention patients other than suggesting they use apps for self-care…
  • Privacy: There is no reference to the duty of confidence every clinician is under to their patients (it instead points at the Data Protection Act)
  • Google: The obligation on all systems to use FHIR interoperability removes the figleaf behind which DeepMind insisted on data for all patients in the Royal Free for 5 years.
  • Google: It also proposes outlawing the monopoly line in Google’s standard contract that forces users of the Streams app to only connect to DeepMind’s servers. It is unclear whether that line will survive the lobbying it is about to receive.
  • Amazon: Case study 7 is as true as leaving a note on the fridge, but there are other effects of giving Amazon/Alexa such information. Facebook’s new Portal offers the same functionality, and will explicitly be used to target ads at users.

 

The quotes below can be attributed to Sam Smith, coordinator of medConfidential, which works to improve uses of data and technology in the NHS.

The NHS knows what good working technology looks like but, to get there, you can’t just turn A&E off and on again and see if it helps.

Mr Hancock says “we are supportive” of stripping the NHS of its role in oversight of commercial exploitation of data. That should be a cause for widespread concern. If Matt thinks the NHS will never get data right, what does he know that the public don’t?

The widely criticised National Programme for IT also started out with a similarly lofty vision. This is yet another political piece saying what “good looks like”, but none of the success criteria are about patients getting better care from the NHS. For that, better technology has to be delivered on a ward, in a GP surgery, and in the many other places that the NHS and social care touch. Reforming procurement and standards does matter, and will help – but it helps in the same way a good accountant helps, and that’s not by having a vision of better accounting.

There’s not much detail in here. It’s not so much ‘jam tomorrow’, as ‘jam… sometime’ – there’s no timeline, and jam gets pretty rancid after not very long. He says “these are standards”, but they’re just a vision for standards – all the hard work is left to be done.

-ends-