
medConfidential Bulletin, 11th June 2021

Hello to all of our new newsletter readers – a lot of people have joined in the last week.

medConfidential only sends out a newsletter when there is something worth saying. There might be a few more of them over the next few months…

What just happened?

On 12 May, NHS Digital quietly announced there would be a new GP data collection, known as ‘GP Data for Planning and Research’ or ‘GPDPR’ – or, for clarity, the #GPdataGrab.

NHS Digital and the Secretary of State, who on 6th April had Directed NHS Digital to run the scheme, hoped no-one would notice.

Matt Green did a very good, and funny, explainer of what it was they were planning, which you can also watch (or share) on YouTube:

https://www.youtube.com/watch?v=QqZXH0CJYcM (the deadline date has since changed)

Because it was rushed out, all sorts of issues were missed. Just one example: if you are pregnant, there is no guidance on what to do about the GP data of babies born shortly after the deadline – and there is no digital process for unborn children…

And then, less than a month later – after a media firestorm, a bunch of contradictions and corrections, and huge public outcry – the programme got paused.

Here’s just a sample of some of the media coverage:

Just yesterday, NHS Digital confirmed that its Data Protection Impact Assessment (DPIA) for GPDPR is still not in a publishable state, suggesting that fundamental contradictions within the programme have not been resolved. The DPIA is the one document where everyone has to write down what the programme actually does, why, and the consequences – i.e. the ‘impacts’ – so, of course, any contradictions become obvious.

The GP data grab programme was clearly not ready, and is still not ready – and it looks like it cannot be ready by the 1st September. (At least…)

What’s the new deadline?

Originally, the GPDPR scheme had no official opt-out forms. medConfidential said we would publicise ours (including our logo), so they created one. As a result, the Government and the GP profession agreed that it could take up to a week for a GP practice to process their patients’ opt-out forms – they are rather busy at present! – and the 23rd June deadline was written into a document, one week before the 1st July start (i.e. data upload) date.

That ‘time lag’ applies equally to any new start date, which is now (no earlier than) 1st September. The September date was entirely up to the Government, and did not need to be agreed with anyone. So Ministers could announce the new start date.

But any deadline has to be agreed with the GPs.

And it is notable that the Government, hiding behind NHS Digital, “wasn’t able to specify” officially what the new deadline is. Ministers and civil servants have calendars like you and us, so they could work it out – but the Government can only announce those actions the GPs have agreed to.

That, at the time of writing, NHS Digital appears to be prohibited from saying exactly when the new deadline is suggests that far more substantive changes to the GP data programme are coming than the Government is currently willing to say.

Having said that, the deadline for opting out to your GP practice relates to the processing time it takes your GP – something that is not within the power of the Government to arbitrarily shorten. (Though it could be made longer, by extending the 1st September arbitrary date; an “artificial deadline” for protecting your GP data.) 

Of course, the correct sequence of actions and deadlines is that no GP agrees to any upload of their patients’ data until each patient has been notified; that patients have been given the opportunity to make a choice, and the information and forms they need, and that those choices have been processed. 

This may be why NHS Digital cannot say what the opt-out deadline is: it has more work to do on its communications and the opt-out process – especially for dependent children – a process which will likely take months, not days.

Since the Department of Health (DHSC) can’t even announce a deadline that is based simply on being able to read a calendar, medConfidential currently has little expectation that the GPDPR programme will start in 2021. In all likelihood, and as with the previous attempt in 2014, this new GP data scheme will drag on until it gets fully reset by the next Secretary of State for Health.

Of course, we can’t afford to be complacent; we do have a Secretary of State in office who believes in data over everything else. (Apart from start dates, apparently…)

What should be next?

The letter from research funders, “Patient data must be safeguarded”, should still have applied this week – but it seems some on the Euston Road have slid backwards in their approach.

One narrow idea from some within the research community is to try to win a “research boffins vs privacy people” argument. That framing is eternally unstable; whoever is winning that argument this month doesn’t matter, because someone else will be winning it next month. 

Any stable and sustainable patient data programme must take a “research boffins and privacy people” approach – with everyone in the same room, working towards a goal that everyone can stand behind. 


We see no sign of that happening.

The best way for uses of data to be sustainable and trustworthy is for patients and the public to be informed about what data is used and how, and what their choices are, and to have safeguards and governance that are both effective (with no loopholes) and seen to be effective – so individual patients and the public at large can have confidence in how the NHS uses data about them.

What can you do?

Spread the word, and please share this link to our ‘How to opt out’ page:

This battle is far from over.

There is still a lot of confusion – even medConfidential is being accused of ‘misinformation’! – though we do our best to always present a clear and accurate picture, and link to the evidence, about an unnecessarily overcomplicated process that is being hustled through by the Government while we are still in a pandemic.

Please do not panic; keep yourself informed. We will send further updates when we know something has changed. And be aware that this is going to run into ‘silly season’ in August, in a year when everyone really deserves a break – or, at the very least, a staycation.

Thank you to all those who have given us support. We really appreciate it, especially right now.

And you can be confident that we will be here when they try again! 

Let us tell you about the massive new GP data grab the Government would rather you didn’t hear about…

The countdown has already begun. The Government’s plan is to copy your entire GP medical history – including all the most sensitive parts – and make it available for third parties to apply for and buy access to. Even though Matt Hancock Directed it to happen, he’s not going to tell anyone about it. Neither will Boris.

Details are still a bit sketchy; critical documents like the programme’s Data Protection Impact Assessment aren’t written or published yet, and some of what’s being said to patients seems… contradictory. The promise that “you can opt out at any time”, for example, doesn’t fit with the fact that once your data has been copied, it will never be deleted.

We will provide more information as it emerges, but for now…

If you’re wondering why you haven’t heard about this, it’s because they haven’t told you! The Government are playing the odds that you, your family and your friends and colleagues won’t have noticed some information they buried on a website, or the handful of tweets they’ve made.

While you can opt out, they’ve made it deliberately confusing and difficult. Unlike the previous attempted GP data grab in 2014, when the single form medConfidential provided at the beginning was enough, you must now use several:

  • The most important one is the ‘Type 1’ opt-out form – this is the only opt-out that will prevent your GP data being copied to NHS Digital, and then onwards. If you haven’t done so already, you need to fill one in and send it to your GP practice within the next six weeks. (If you opt out after 30 June, your GP data could be copied from your practice at any point and, once copied, it will never be deleted.)
  • If you opted out from care.data in 2014, your opt-out will still be valid. Your own GP practice may still be using a form from that period, but it is completely different from the ‘National Data Opt-out’, which used to be called a ‘Type 2’ opt-out. Bottom line, if you opted out of care.data in 2014, you should be OK for now.
  • The National Data Opt-out, introduced in 2018, limits NHS Digital from selling access to some of your data in some circumstances. (They still sell it 85% of the time.) This opt-out process is supposedly ‘digital first’, but in 2021, for people and families with dependants, NHS Digital’s process still involves multiple PDF forms – which we’ve combined into one. And because the process is so overly complicated, we’ve created a page to help guide people through it: https://medconfidential.org/2021/children/

If you don’t opt out before 30 June 2021, the Government will take a copy of everything medical that ever made it into your GP record, throughout your whole life – apart from some limited aspects of information around gender recognition or IVF treatment (if you have received any).

The first upload will be of your entire GP medical history to date – which will happen as soon as your next GP appointment, possibly even before that – and then there will be daily updates thereafter, to copy every new thing that is recorded about you.

There will doubtless be much more to come, but the headline is this: 

The Government intends to take YOUR entire GP history, and isn’t even planning on TELLING you that you have a CHOICE, i.e. by writing you a letter.

This time they’re not even sending out a junk mail leaflet, but they might do some tweets and social media ads. (As we write this, their YouTube video appears to have fewer than 260 views.) 

To summarise, the process to dissent fully from the copying and then further use of information from your GP record for purposes beyond your direct care is as follows:

  • Give a Type-1 form to your GP, for your whole family’s GP records;
  • Do the online National Data Opt-out (NDOP) for yourself;
  • Download, fill in and e-mail the multi-page NDOP form for your children.

There should be better options, but this Government apparently doesn’t want to listen.

medConfidential has been fighting to ensure every use of patients’ data is consensual, safe, and transparent since 2013. We aren’t there yet, but there’s every reason to believe that if enough people take action, we will get the protections you and your family deserve.

This Government sees so little value in any form of protest that it is trying to ban it through legislation. And its Ministers’ (and senior officials’) view of profit seems to be that ‘any means are acceptable’. Indeed, in a global shortage in the throes of a pandemic, politicians picked their friends to profit from providing PPE for nurses to the NHS.

Why would anyone believe they wouldn’t seek to profit in exactly the same way from your health data?

If you believe they can be persuaded to change course, or if you simply want to be kept informed, please join our mailing list for more news. And don’t forget to forward this e-mail to your friends and loved ones, who may wish to make their own choice before the end of June. 

Our thanks to those who have donated, especially monthly, over the last couple of years – we kept your money in a pot for times like this. We’re currently using it to pay for sending out forms to those who don’t have printers, and for some other things that we might ask you for help with, next time.


The opt out process for children

In order to register dissent from your children’s medical records being used for purposes beyond their direct care (and why):

1) Protect your GP data: fill in and give this ‘Type 1’ form to your GP practice [PDF] [or MS Word version] – this form allows you to include details for your children and dependants as well. This is the most important step; the Type 1 opt-out is the only opt-out that will stop NHS Digital extracting your GP data.

2)  If you want to stop your non-GP data, such as data about hospital or clinic treatments, being used/sold for purposes other than your direct care (e.g. for “research and planning”), you must use this process:

    • If you have children under 13, you need to fill in this form [PDF] and e-mail or post it back to NHS Digital – this form works for both you and your children.
    • If you have an adult dependant for whom you have legal responsibility, you must use this form [PDF] and send it back to NHS Digital on their behalf.

If you don’t have a printer

If you don’t have a printer, and can’t fill in the electronic forms above, you can e-mail children@medConfidential.org with your postal address and how many people you need forms for, and we will post you copies of the GP paper forms, for free, no questions asked (also tell us if you have children under 13, or the online hospital data service hasn’t worked for you, and so you need the hospital data form as well).

We will, of course, only use your details to send you the forms you want and we will delete them as soon as we have done that. (medConfidential is registered with the ICO to process personal data in this way.) If you can afford to make a small donation to support us in offering this service to others, we have a donation page where others have donated so we can send the forms to you for free.


Why does DHSC put you (and other families) through all this hassle?

The pandemic changed many things, but until 23 April 2021 – and for much of the last year – the forms to express dissent for your children and other dependants were missing from the NHS Digital website. Instead, for up to a year it said:

Screen capture from 6 October 2020

The explicit decision back in 2018 that there would not be a digital route for families with children came back to bite those who had to implement it.

Postal processing being temporarily unavailable might be considered understandable at the height of a pandemic, though it was clearly someone’s decision to remove the links from the website to prevent new processing.

This may not have been a concern in and of itself, but buried in the papers for NHS Digital’s Board meeting on 31st March 2021 is a statement that the Direction and Data Provision Notice for “GP Data for Planning and Research” was due to be published “to enable collection to commence (March 2021)”. That the new children’s opt-out form did not appear until three weeks later suggests the discriminatory and inappropriate approach to dissent continues.

Before 2016, when the dissent process had been considered primarily from a patient’s perspective, the way to opt out was to give one piece of paper to your GP, which covered your entire family – and then the NHS would deal with any complexity. It did, and it worked. Over 1.2 million people used that process, which was possibly more than some would like…

In 2018, NHS England and the Department of Health and Social Care made a series of choices about how the National Data Opt-out Process (NDOP) would work; the effect of each of those decisions made it harder for someone to express their wishes: 

  • One of those choices – made by cherry-picking who was invited to meetings, and on a narrow reading of the Data Protection Act 2018 – was that if you are 13 or over, you must do it yourself online. (The people to whom NHS England chose to listen at that point were those who believed your GP shouldn’t be an interface between you and ‘the NHS’.)
  • Another such choice was that the databases used to validate that you are who you say you are online were not to be used to check if you had children who lived at home and who were registered with the same GP.
  • In fact, the decision was then made that there would be no digital process for children at all – parents’ and carers’ choices for any of their dependants would have to be made via a form sent in the post. First that form had 8 pages, then 7, now 4, and (finally!) there is the option to send forms by e-mail, after feedback about the punitive nature of the process.

As a result of all this complexity, the GP opt out form – which, prior to 2018, used to work for your entire family for all NHS records – still works for your entire family, but only for your GP records. And since NHS England and DHSC chose to create another process, you now have to do that too!

The process doesn’t have to be this complicated. Most families have children registered with the same GP as the parent, living in the same house, so the NHS identity checks for adults should cover their dependent children. There will always be exceptions, families with more complex situations – the current PM’s for example – which is why a paper form is a necessary backstop. But having no web process at all starts to look more like a procedural punishment for families.

The Secretary of State or NHS England could have said that the process for dependent children (and other dependents) should be the same as for adults. Instead they shifted the bureaucratic burden from the NHS onto patients and families, in the hope you would care less about your GP data than their cronies who wish to buy it.

To register dissent from your children’s medical records being used for purposes beyond their direct care (and why):

1) For your and your dependants’ GP data, give this Type 1 form to your GP.

2) For your children’s and your own hospital and other non-GP data, fill in and e-mail or post this ‘National Data Opt-out’ form to NHS Digital.

3) Not forgetting that for your own hospital and other non-GP data, individuals aged 13 or over can also opt out (or opt back in) online.

Shared Care Records

One thing the NHS bureaucracy likes more than anything is having the same acronym mean two different but similar things. In addition to Summary Care Records (SCR, SuCR?), which have existed since 2007, NHSX now adds ‘Shared Care Records’ from 2021.

As explained to the Public Accounts Committee on 17 September 2020 by Matthew Gould, CEO of NHSX, Shared Care Records (SCR, ShCR?) should:

“…allow patient data to flow safely and appropriately between different care providers, not just in health but also in social care.”  – Question 46

It is not that NHS England / NHSX have access to data today that is necessarily the problem; it is what they will do with it tomorrow – and whether they will keep their promises (cf. the Test & Trace DPIA, the transparency court case, ‘contracts for cronies’, etc., etc.).

What is needed for Shared Care Records to work?

Patients’ and service users’ data flowing along their care pathway for the purpose of their direct care is a worthy and worthwhile goal, and one that medConfidential has supported since its inception.

If Shared Care Records are to be successful in practice, and not repeat the dead-ends of LHCRs, they must do a number of things:

1)  The Shared Care Record is claimed to be for direct care only, and a good Shared Care Record will indeed be for direct care. A bad Shared Care Record will be used for lots of other things that it does not tell you about. NHS direct care services are too often seen as a ‘Christmas tree’ off which to hang things; ShCR cannot be a means by which sensitive health records leak out of the NHS via a back door.

  • What happens across the boundaries between administrative areas (ICS, ShCR, or other)? Do ShCRs facilitate care for those who live in one area but, e.g. visit A&E in another?
  • Do (creepy single) doctors get to look up the records of women they’ve met on dates – or anyone in the country – without disclosure or recourse? Are meaningful measures in place for those who are affected to know that their records were accessed?

2)  Shared Care Records’ existence and purpose(s) must be properly communicated to the public before they are used (unlike LHCRs), which means:

  • Clearly and publicly stating what they will and won’t do, and explaining the rights and choices people have;
  • Writing to all those who will have ShCRs created – including service users – not forgetting or otherwise disadvantaging families with children, and not just to those who may have previously opted out of LHCR sharing and/or SuCR.

3)  Ensure Shared Care Records can be seen as trustworthy, on an ongoing basis. From the point they are introduced, a record of every access of a ShCR must be made available to the patient or service user via their new NHS Login, cf. Data Usage Reports.

  • If a provider is capable of connecting to a person’s Shared Care Record, it must also be capable of recognising and respecting that person’s confidentiality and consent choices; if it cannot do both, it must not be permitted to do either. 
  • There is no excuse to ‘retrofit’ later; GDPR requires that all data processing is lawful, fair and transparent – and the data flowing through Shared Care records is, by definition, identifiable individual-level special category personal data.
  • Confidentiality and consent choices should be managed centrally, ensuring that system-wide rules and IG are applied consistently and effectively. If local areas / ICSs manage their own dissent processes and someone moves to another area, will they have to dissent again? How will anyone know?

4)  To the extent that any data contained within Shared Care Records is extracted, copied or otherwise processed (e.g. ‘anonymised’) for any secondary uses, this must be done either within the statutory Safe Haven (i.e. NHS Digital) or under its Information Governance processes, which confer (joint) data controllership. Anything less than this would be a retrograde step and, as the failings of consent and governance processes around individual LHCRs have demonstrated, will compromise public trust.

  • Notably for social care data, the regulator (i.e. the CQC) cannot also be the Safe Haven: the incentives would be completely perverse. We understand a reluctance to put a body named ‘NHS Digital’ in charge of (adult) social care data, and share concerns about the ‘medicalisation’ of social care. That challenge is, however, one DHSC and the NHS will have to address if health and social care are ever to be properly integrated.

If Shared Care Records cannot meet these conditions then they will be unfit for purpose, and will have been commissioned badly at huge cost to the taxpayer and to the reputation of the NHS.

LHCRs largely failed; will the lessons be learned?

From NHS Data Day, October 2019

Commercial re-use of data

If there are to be secondary uses of data within the Shared Care Record, then plainly the National Data Opt-out must be made statutory, it must work properly for everyone (including families), and must be made readily available to all before any secondary uses go live.

  • The deadline for system-wide implementation of NDOP has been extended repeatedly, from the original compliance deadline of 31 March 2020 to 30 September 2021;
  • Meanwhile, LHCRs and CCGs have struggled to interpret and in some cases properly apply correct and appropriate Information Governance for NDOP – a situation that cannot be permitted to continue beyond the pandemic.

Even if there were to be zero secondary uses of data flowing through the Shared Care Record (ShCR), there would still be the issue of Summary Care Record (SuCR) opt-outs. For if someone has objected to just a summary of their record being shared, how can it be assumed that they will accept the wider sharing of their ‘whole’ care record?

Where there is legislation, therefore, both the National Data Opt-out (NDOP) and a ShCR opt-out must be made statutory; and these must be made readily available to all, and must be respected by all across the whole health and social care system.

“Selling the benefits” is no longer enough if you are also selling the records.

Details documents

As this future emerges, the graphics below from various NHS presentations show the thinking that went into them, and the direction of travel:

‘Shared Care Records’ for secondary use?

Data for direct care doesn’t stay for direct care. A series of slides over years show the embedded view of NHS bodies which all want data to flow beyond direct care to commercial companies, and for planning, policy and commissioning decisions.

  • It all begins with a lifelong (“longitudinal”) record and “maximising the use of data”:

from: https://digital.nhs.uk/blog/transformation-blog/2019/so-what-is-a-local-health-and-care-record-anyway

  • For the purposes of policy-making, planning, commissioning and near ‘real-time’ surveillance of individual-level patient data, explicitly intended long before COVID:

From: https://hscic.kahootz.com/connect.ti/PubNHSDDTSF/view?objectId=10508916 

  • Of course, there is also research and commercial exploitation – discussed well before COVID:

From NHS Data Day, October 2019

  • …and during the pandemic as well; here’s the Minister discussing legislation to align with Big Pharma interests:

From Baroness Blackwood’s roundtables with Pharma, June-July 2020

ALL of these interests (and many more, e.g. Big Tech, Big DNA…) will seek the data once it has been created.

Details documents

Analysis and Inputs Reporting

[The 2020 update to our ongoing series on data usage reports (2014–2021)]

The need for, and consequences of, data usage reporting are something medConfidential has worked through for a long time.

You have the right to know how data about you is used, but what does that look like in practice? We’ve mocked up a data usage report for the NHS, and the equivalent for Government – but what about the analyses that are run on any data? What should responsible data analysts be able to say (and prove) about the analyses they have run?


The new, eighth Caldicott Principle is “Inform patients and service users about how their confidential information is used”. In future work we will look at how this goes beyond existing legal requirements under the 2018 Data Protection Act, what Data Usage Reports (or Data Release Statements) should look like to the NHS in 2021, and what patients should see. For now, though, we want to take a look at the other end of the process.

Analyses, Analysts, and their readers

Public bodies (and indeed everyone) buying AI and Machine Learning products need to know what it is they are buying, and how it has been developed and tested. Ethically, they must be able to know the equivalent of “This was not tested on animals”, i.e. “No data subject was harmed in the making of this AI”.

We covered a lot of the procurement side of this in our recent work on AI, data and business models. But that raised a question: what is it that procurers should ask for when procuring data-driven products and services? And what does good look like – or, at a bare minimum, what does adequate look like?

At the most practical level, what should someone wanting to follow best practice actually do?

And just as importantly, who should do what?

In a world of the Five Safes, Trusted Research Environments (TREs) and openSAFELY, and as the role of independent third parties becomes increasingly viable, those who wish to follow more dangerous ‘legacy practices’ with data will be unable to provide and evidence equivalent assurances – and their offerings will therefore be at a significant disadvantage in the market.  

A trustworthy TRE records exactly what data was used in each analysis, and can report that back to its users and to those who read their analyses. Academic journals often require copies of data to be published alongside an academic paper, which is not possible for health data (and would be catastrophic if someone were to make that mistake), but such a certificate could act as a sufficient proxy for confidence and reproducibility.

If you are running the data ‘in your own basement’, there’s no way for anyone to know what you did with it beyond simply trusting you. In health analyses and with health and care data, that isn’t enough – and it should certainly not be the basis for procurement decisions.

So, as before, we decided to mock something up.
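(To illustrate the idea – though this is not our actual mock-up, nor any published NHS or openSAFELY schema – here is a minimal sketch, in Python, of the kind of ‘Analysis and Inputs’ record a Trusted Research Environment could generate automatically for every analysis it runs. All of the field names below are assumptions made purely for the example.)

```python
# Illustrative only: a minimal sketch of the kind of record a TRE could emit
# for every analysis it runs. The field names are assumptions for this
# example, not a published NHS or openSAFELY schema.
import hashlib
import json
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class AnalysisInputReport:
    project_id: str              # reference to the approved project / IG approval
    analyst: str                 # who ran the analysis (an organisation, not a named individual)
    datasets: List[str]          # the dataset extracts (and versions) actually read
    opt_outs_applied: bool       # were National Data Opt-outs / Type 1 objections honoured?
    rows_read: int               # how much data the analysis touched
    code_sha256: str             # hash of the analysis code that was executed
    outputs_released: List[str]  # only disclosure-checked aggregate outputs leave the TRE
    run_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def certificate(self) -> str:
        """Serialise the report so it can be published alongside the analysis."""
        return json.dumps(asdict(self), indent=2, sort_keys=True)


# Example: the environment (not the analyst) would generate this automatically.
analysis_code = open(__file__, "rb").read()  # stand-in for the real analysis script
report = AnalysisInputReport(
    project_id="EXAMPLE-0001",
    analyst="Example University research group",
    datasets=["gp_events_v3 (pseudonymised extract)"],
    opt_outs_applied=True,
    rows_read=123_456,
    code_sha256=hashlib.sha256(analysis_code).hexdigest(),
    outputs_released=["table1_aggregate_counts.csv"],
)
print(report.certificate())
```

The detail of the fields matters far less than who produces them: if the environment itself generates and publishes such a record, readers of an analysis do not have to take the ‘inputs’ on trust.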

Trusted Research Environments which facilitate transparent data assurance like this, and which automate the provision of evidence of compliance with the rules – Data Protection, Information Governance, Equality, or otherwise – will be offering advantages for their users over those which do not. And any TRE that does not report back to its users how its safety measures were used will clearly not be helping its users build confidence in the entire research process.

While they may claim to be “trusted”, organisations that fail to provide every project with an ‘Analysis and Input report’ cannot be seen as genuinely trustworthy.

[2021 blog post in the series]

The National Data Strategy for Health and Care (and the other one for everything else)

Across Government, data and digital are too often used solely to help civil servants make decisions, rather than benefitting all stakeholders.

There is little sign this inequity will be addressed under current structures or priorities, but as Government thinking evolves around the structure of the new Information Commissioner, CDEI should also be fundamentally restructured so as to receive and consolidate a much wider range of inputs – including lay members (DHSC’s former National Information Board had six, for example). Without wide-ranging input, data in Government shall continue to make rookie mistakes such as those of the ONS / GDS Data Standards Authority.

There are alternatives to creating many ‘pools’ of data around Whitehall and simply hoping no-one makes a mistake. Built for the pandemic, and with appropriate governance within and beyond it, the model of openSAFELY could apply across the rest of Government – especially for monumental failures like the National Pupil Database at DfE.

While it is self-evident that the vision of the forthcoming National Health Data Strategy should be to maximise the health of patients within the NHS, the vision of a National Care Data Strategy is less clear. Is it only to maximise the health and health outcomes of those to whom care is provided, or do quality of care and quality of life have other dimensions? Whatever is decided, as the Health and Care systems move towards integration, those two goals must align – but it shows how far apart things are that to talk of (the state of) Health and Care data as if they are even remotely equivalent is quite clearly nonsense.

As the pandemic has brutally illustrated, there is no data strategy for social care – and no evident plan to move towards one. Given every journey must begin with a single step, something like this might work.

Health and Care ‘moving parts’

Whenever NHS legislation is next put to Parliament, the National Data Opt-out should be placed on a statutory footing. Aside from guaranteeing patient choice and underpinning trust, this will provide proper democratic scrutiny of official choices such as the one which the National Data Guardian highlighted in her recent annual report – page 11, right column – where NHS Digital, NHSX, and DHSC decided it wasn’t in their interests for patients to see how data about them is used. Should government attempt to defend that position, when the push-poll and focus group used to come up with it are more widely known, the u-turn will be more embarrassing than fixing it now. 

In a similar vein, transparency on access to patients’ details via APIs (whether new ones for COVID-19 or pre-existing ones, such as for the Summary Care Record) would also begin to address the ‘creepy single doctors’ problem that has been exacerbated by the widening of access in a time of reduced oversight. And that some in Government still wish to use patient records for funding and “decommissioning” decisions (para 2) is unlikely to be wise.

Government argues that new business models are the way the NHS and the Life Sciences Industrial Strategy will get them out of the hole they’ve created. Trillion-dollar tech fantasies abound. But while the conflicts of interest amongst advocates of this strategy are clear, whether it will work is not. 

National Data Strategy (outside of health)

The NDS is a “pro-growth” data strategy, which is an entirely appropriate mission for DCMS – but it creates a fundamental conflict of interest in DCMS’s sponsorship of the ICO as regulator, and its role in choosing the replacement for the current Information Commissioner. For this reason, if not others as well, the ICO should move back to being Departmentally sponsored by the Ministry of Justice, to underline the fundamental importance of following the law, and to ensure the principles of justice apply to all data use as well as to quasi-judicial decision-making by the regulator.

A data strategy for the UK should first and foremost respect the rights and freedoms of every data subject, and aim to provide the greatest net-benefit to the whole of the UK – yet there is no compelling vision in the strategy; no clarion call. There is also no testable hypothesis in the strategy, by which its success (or otherwise) can be known. It is likely no single vision acceptable to all stakeholders could have got through write-round – not least because the unreformed, institutionally-ignorant Home Office will not accept a nuance that is in the public’s interest (for example, PHE / Test&Trace / police data sharing).

As written, there is no explicit difference in the National Data Strategy between personal data and data about objects. Lacking specificity, much of the strategy that is intended for one could be used for the other, thereby creating effects entirely unintended by the authors. A recent misstep by the new “Data Standards Authority” illustrates the sort of harm that can be caused when ‘generic intent’ overrides substantive nuance.

After a summer tainted by “mutant algorithms” in education, nothing would say “we don’t understand data” more loudly than NHS England agreeing to run the COVID-19 vaccination database off a 66 million row Excel spreadsheet. (While a single worksheet can have a million rows, losing 65 million people should be relatively noticeable – plus they know to look… now.)

Enclosures:

Towards making the pandemic response data changes safe for the longer term

HSJ reports a belief within Government that some current data practices, changed dramatically with emergency powers to meet the needs of the urgent pandemic response, should now become ‘the new normal’. While some of these changes might indeed be welcome, and some probably should remain, others need to end – and others must be significantly amended if they are to become anything like ‘normal’. 

It is not news that some status quo practices in the NHS around digital records were not entirely safe; this was for many reasons, not least the motivations and incentives of a range of actors – from multinational corporations to creepy single doctors – who want access to people’s direct care records for reasons beyond direct care.

A net assessment should be conducted of the goals and proposed ‘end state’ around health and care data (medConfidential will do one too) to provide a comparison with our net assessment from before COVID-19.

Digital and Direct Care

DHSC and the NHS did what they could in the circumstances, but access to digital services for those who are digitally disengaged continues to be a problem across Government – especially where community access points such as libraries are closed, either temporarily or permanently. A Whole of Government approach should be taken (possibly in the spending review) to assess and improve the piecemeal work done by Departments.

Mobile phone networks providing free data access to NHS.UK was a milestone in access to digital services, but many digital approaches across the NHS are not via zero-rated services: probably the starkest example of this is video consultations, which are a postcode lottery of apps and charging models – while the much-vaunted NHS app* still lacks video consultations for those situations where it helps both GPs and NHS 111. (*: No, not the (contact tracing) app. Rather, the good one that NHS Digital built as a core service; the NHS app which acts as a ‘front end’ to NHS.UK)

As COVID-19 de-escalates, and as NHS Test and Trace capacity therefore becomes available, the newly-NHS parts of PHE should address the mess – including the ongoing postcode lottery – of digital services that facilitate STD testing. NHS T&T will need something to do with its capacity after COVID, and the country requires a testing infrastructure to remain.

There will likely be a range of additional tests which can be moved to the ‘post-back and test’ approach of Test and Trace; SH:24 has shown how to do this at scale, but the broken model of Public Health England prevented equal benefit for all. And when such testing moves into the NHS, all of the existing Public Health safeguards and ring fencing around such data collected by NHS T&T will be required.

As with every new technology innovation requiring personal data, these can be used as a mechanism to get laid: creepy single doctors (and others without clear direct care purposes) should not have the ability to view the STD history of those they treat – or those they go on dates with, having met outside of work – in the way that, due to COVID reforms and the removal of safeguards, creepy single doctors can currently view someone’s full medical history, with no means for a patient to know when their record was accessed.

Access to individual records for care

The widening of access to records has long been debated within the NHS. And while some clinicians will say how much it helps them, and while some of that may indeed be true, it is far from clear whether the patients involved can know whether their records were accessed where they should have been – i.e. that the wider access was actually useful – or whether their records were accessed when they should not have been – i.e. where wider access was harmful.

NHS Digital keeps records of every Summary Care Record access; these should be made available to each patient within the NHS app (and on NHS.UK when the NHS Login launches there) in order that verified patients can see how their record was used. Without providing that evidence base, any argument for any use of patients’ data will likely be some form of special pleading.

If the public is to have confidence in the broader uses of their data, the ‘new normal’ is going to require the NHS and wider public services to provide the evidence and information people require to assess their trustworthiness. Absent such information, and with decisions being made or influenced by those with other agendas, public trust will continue to degrade. Whether incrementally or catastrophically (as with another care.data) remains to be seen.

The decision to provide this evidence can no longer be ‘kicked into the long grass’; the information vacuum is already being filled. And where NHS IT suppliers such as TPP – which, with its GP Connect Access Record: HTML service, makes information on how a patient’s record has been accessed available to people outside of TPP’s service – do this in ways which patients themselves cannot see, even if they use the NHS app, it is being filled in ways that are potentially explosive.

Access to records (in bulk) for secondary uses

ONS recently published a new re-identification process for ‘anonymised’ administrative data, which demonstrates that data even less detailed and less specific than data that is currently disseminated by NHS Digital is still open to re-identification – in practice, as well as in theory.

Even if some still assert that pseudonymised data is “not identifiable” – as contradictory as that opinion is to GDPR and the DPA 2018 – it is now clear that pseudonymised data can be re-identified. NHS policy and practices of dissemination can no longer ignore the law, or the published work of the Office for National Statistics.

Some developments during the pandemic, such as openSAFELY – which would have been impossible even to establish without emergency COVID powers – probably should be incorporated into the ‘new normal’. But not simply as they are. Each such initiative must have a proper ongoing legal basis – by which we do not mean infinitely-extended exemptions, such as perpetually renewed s251 support, but proper involvement of data controllers – and robust information governance for every project: all projects being approved by a statutory public body with a reputable, transparent process approved by data controllers.

Consensual, safe and transparent use of patients’ data is the only sustainable long-term model: completely lawful, and with the appropriate governance and patient visibility to be trustworthy – governance and visibility that are absent from the cabal of friends we see around some entities.

Public bodies can Improve The Foundations of other priorities

The move of (much of) PHE into the NHS is not new. The cancer registry was moved from PHE to NHS Digital due to the failures of PHE, and the opportunities available for better cancer data within the NHS are already being delivered, following that move. That the cancer registry has applied the National Data Opt-out since 2018 did not cause harm to data users, so there is little cause to worry that any other lawfully-operating disease registry will lose out by moving within NHS Digital.

As the future location for all of PHE’s other responsibilities remains unclear, an approach based on ‘offline harms’ would – given the new bodies’ remits – allow a new advisory committee to cover anything beyond DHSC’s National Institute for Health Protection and the NHS, and ensure no gaps.

NHSX / NHS Digital reforms: One cannot build on toxic foundations. Any ‘reform’ that merged NHS Digital and/or NHSX into NHS England (and Improvement?), would be fundamentally unworkable. The body that makes commissioning and decommissioning decisions cannot credibly claim to both make decisions based on evidence and be the statutory safe haven for medical records, without patients equally credibly believing their records were used to close their hospital – even if such a belief is incorrect.

‘Artificial Intelligence’: Using its purchasing power to insist on a scheme of commodity pricing, the NHS can ensure a competitive market for health AI – giving patients the benefits of new services, NHS medics the tools and diagnostic assistance they can use, and innovators the confidence that they will be able to get a reasonable return on a good investment – while also opening up the worldwide use of NHS-class services and tools.

Documents:

The data flows of Universal Credit

[this was written in stages from 2020 to June 2024 – edits after 2020 are dated]

Over the last year, medConfidential has been examining the systems and information flows in and around Universal Credit – a key example of what the former UN Rapporteur on Extreme Poverty, Philip Alston, calls the ‘Digital Welfare State’.

Phil and Sam would like to thank the many organisations working on UC for their help and support, especially the Child Poverty Action Group (CPAG). The project – funded by the Open Society Foundations, and working alongside front-line support services, charities, campaigners and lawyers – will continue.

Our first report looked at four key areas:

One focus (Annex 1) covers what DWP knows, and what it should know, about how and when claimants are paid. DWP officials have denied to Parliament and the High Court that UC has access to information that HMRC holds; whether DWP didn’t ask, or HMRC withheld that information from DWP, is as yet unclear.

DWP’s own documentation tells claimants which months they will be thrown off Universal Credit due to DWP’s systems’ inability to read a calendar. And indeed the Court recently ruled, in a case brought forward by CPAG, that DWP’s “refusal to put in place a solution to this very specific problem is so irrational that I have concluded that the threshold is met because no reasonable [Secretary of State for Work and Pensions] would have struck the balance in that way.”

Annex 2 examines how risk based verification (RBV) has been used in the benefit system – in particular around housing benefit and council tax support, but elsewhere too – and what that reveals about DWP and fraud.

As we cover in Annex 2A, DWP documents imply UC has been designed so that GOV.UK Verify can never work for 20% of claimants, i.e. those deemed ‘high risk’, against whom 75% of ‘counter-fraud’ resources are targeted – whose cases may in fact be more complex than genuinely risky. (This mandatory ranking process, also known as ‘stack ranking’, is the same process that led to the 2020 A-level grading fiasco.)

Annex 3 takes a deeper dive into how the wider Government fraud agenda impacts on DWP, noting that while DWP may keep trying to keep things secret from both the public and Parliament, the Cabinet Office has recently instituted Government-wide oversight of ‘Fraud and Error’…

Annex 4 addresses DWP’s response to COVID-19, and its effects on UC information flows; what changed, and what didn’t. Annex 4B also briefly covers the ‘home testing’ process for COVID, which uses DWP and Government ‘counter-fraud norms’ – i.e. a credit history check on every applicant – a process DWP is considering as its ‘in-house’ identity approach “to replace Verify”.

Annex 5 lists the burdens upon burdens loaded by digital services onto the poorest and least able to cope; Annex 6 looks at what can be done for those whom UC currently doesn’t serve well, with Annex 7 focussing on new mothers, who have both a new baby and much new bureaucracy. Annex 8 looks at the new development of Superapps for Government.

And, bringing it all together, the main report covers the core parts of UC and serves as a reference point for our ongoing and future work, followed by a short closing report.

As this work was done chronologically, you may wish to begin with the Annex that interests you most – or, if you prefer, read the Annexes in order and then the main report.

Wider lessons for Government

DWP chose what to automate, and those choices primarily benefited DWP.


Annexes 2A, 3, and 5 contain wider lessons for government that Departments would already know if they had cared to look; but sometimes the best internal lessons come from outside.

DWP’s own documentation tells claimants which months they will be thrown off Universal Credit due to DWP’s systems’ inability to read a calendar. CPAG recently won a case against DWP, where the judge said:

DWP’s “refusal to put in place a solution to this very specific problem is so irrational that I have concluded that the threshold is met because no reasonable [Secretary of State for Work and Pensions] would have struck the balance in that way.”

UC is a large canary in the coal mine, but the ‘fraud’ approach is metastasising across government and will have consequences for all Government uses of data.

COVID’s Butler Review

The Butler Review into Intelligence on Weapons of Mass Destruction (i.e. the Government’s decision to invade Iraq) had one meaningful outcome – it obliged the creation of the Chilcot Inquiry. The current Review of the UK’s response to COVID-19 by the All-Party Group on Coronavirus must be given the evidence to do the same.

This Review has other important matters to attend to, so its remit will naturally be constrained. Its main focus while we are still in the crisis must of course be forward planning for this winter, and our future response to COVID-19.

While there will – quite rightly – be much wailing and gnashing of teeth about the history of this pandemic, including the contact tracing app debacle, this will in large part be academic except in what it contributes to the primary goal of getting the Review to require an Inquiry.

History has shown this can come from civil servants, who already know this Government will leave them unsupported within processes they built.

The truth will come out, it always does; the question is, will you help?

medConfidential will publish our draft submission here in due course, and we are happy to help others with theirs. 

P.S. We take donations.

Rest of Government: Data misuse as “Missed Use”

Even during the height of the pandemic, DCMS continued (and continues) its work to share data, including and especially your medical records.

Not all data projects are a good idea – such as when a Home Office civil servant tried arguing it is part of the Home Office’s ‘public task’ to copy the medical records of all women in an attempt to discover sham marriages. In Government, not using data in that way is known as “missed use”.

Powers in the 2017 Digital Economy Act made it easier for most data in Government to be used in that way, with the exception of NHS data. DCMS has now conducted a ‘review’ of how those powers are used, and how they are not, which will recommend removing that safeguard. DCMS didn’t ask for our input, so we did our own review.

In another example of “missed use”, the UCAS form includes a question that asks whether an applicant’s parents went to university. The question is optional, and was originally added for statistical purposes. However, the data generated by that question soon came under pressure to be used for admissions decisions. The rules were changed, and then behaviour changed too – for if your child answers truthfully ‘yes’, they may be ‘penalised’ in favour of those whose parents did not go to university. As it is an optional question, there is a third choice which every university application coach tells their teenage clients to use: just don’t answer that optional question – thereby being both truthful (by omission) and avoiding any penalty for their parents’ education. 

Those who cannot afford such coaching, but who go to schools where having a parent who went to university is common, are therefore at a significant disadvantage. This is a stark illustration of the way in which flawed incentives, created entirely because of claims of “missed use”, can destroy the integrity and utility of the data itself – and now skew the statistics about intergenerational access to university.

Widening participation may be important, but administrative idiocy is inevitable.

Both of the examples above might appear to be clear, logical, even defensible decisions by civil servants following a public task – albeit with narrow definitions and no obligation to the bigger picture, or even any assessment that there might be a bigger picture. Such uses are driven by the simplistic view that “Administrative data is an invaluable resource for public good. Let’s use it.”

“Let’s use it”

As pre-Covid DHSC paved the way for the new grab of your GP records, the “public debate” about uses of data continued without any meaningful public debate, and DCMS carried on its work as if the pandemic never happened. Pre-pandemic, the number of “round tables” and “expert panels” advancing cases that were entirely guessable from their titles was already ramping up, as useful idiots (and those chasing funding) made themselves useful to a wide-ranging programme in the data world, e.g. page 8.

Meanwhile, the “public awareness” effort was (and is) more subtle and better planned than that of the early days of care.data in 2013-14, but it is no less flawed. If you happen to attend one of those events, when they restart, one good question for the panel would be this: “What would make you take a different view on this project – is there space for diverse views, based on information you’ve not yet seen?”

In another example, it was perceived risk of ‘missed use’ that undermined appropriate scrutiny of a data request for a ‘causes of cancer’ study. That the study was run by a tobacco company didn’t stop Understanding Patient Data later explaining why cancer patients’ data should be used by a tobacco company. Will the gatekeepers and their defenders keep justifying anything that passes through their gates? The tobacco industry faux-naïvely asks how a richer, more statistically-informed ‘debate’ about harms can be anything other than a good thing – while the debating points are based on cherry-picked data, with misleading or disingenuous framing. 

The only way to avoid such issues is to tell every patient how (their) data is used, what their choices are and how they work. Because there will always be incentives that make it in someone’s interests to use data in ways that undermine the promises made to data subjects.

An ‘academic project’ to look at the effectiveness of justice and education incentives could be a good thing – especially when subject to peer review, published in a peer-reviewed academic journal, and subject to usual academic funding rules. But this project was commissioned by the Home Office, and its ‘academic’ input appears limited to that of a librarian wearing a fig leaf.

Data librarians do a vital job, and there is great value in well-curated data. Academics too are vital, but policy-based evidence-making – where the only real choice is to do what the Home Office has commissioned – is not academic research. Such ‘research’ may be legitimate to inform civil service and/or political action, but it is not an academically led process. So why is it being funded by UKRI / ADR, using resources that should be going to bona fide academic research?

The data to which ADR has access is detailed and sensitive personal data, for legitimate research. It is self-evidently wrong for ADR to claim the data it holds “is no longer classed as personal data” – classification by ADR is irrelevant, what matters is the law. And even their own academic specialists disagree: “Safe Data. This is a misnomer” – both GDPR and DPA 2018 are explicit that pseudonymised data is personal data. 

Always even more data

There are endless claims that “more data” is needed. Always more data, and “legal barriers” are forever being cast as the biggest problem – hence the Digital Economy Act Part 5. But three years after the DEA was passed, how many pilot projects for public service delivery in England did Government (or government) approve in the last year? Zero. None. Zilch. And how many have Departments proposed to run? Zero. Nada. None. (Although there is one single project in Scotland.)

Despite not even having made use of existing powers, Whitehall now wants to ‘loosen’ the rules around access to health data in a very similar way. Having provided no evidence of benefit – nor even of pressing need – the problem claimed is still “not enough data”.

It is inconvenient facts such as these that are most often omitted from, or masked in, the endless briefing packs for ‘public acceptability debates’. It’s almost as if the most toxic desires and projects are pushing the agenda, while never actually themselves going forwards. And, as history has shown, focusing solely upon a singular objective or key result as justification to expand powers or drive policy is fundamentally toxic.

One good thing about the Digital Economy Act was going to be that all of the decisions taken, the minutes and related papers were supposed to be made public in a single register – but they aren’t. It should not be only the privacy campaigners and their pesky FOIs that get to see what is going on; DCMS has gone back on its promised approach, and is hiding as much as it can.

medConfidential has been asking since January 2019, and has had a stream of excuses – the latest being that, while DCMS’ data work and agenda continues apace during the pandemic, officials can’t publish anything because of it. We wrote to the Secretary of State in May, and were told that the DEA ‘review’ would be published “shortly”. We won’t be holding our breath. And once Government believes Covid has passed, we can start chasing these again.

Everyone should be able to see what is done with their data. In that clarity lies an informed debate based on how data is actually used – not just special interests pleading to do more of whatever it is they want to do. The decision makers should not be the op-ed pages of the national newspapers, but rather properly-informed data subjects, making choices based on the truth.

The first care.data programme collapsed after official assurances were shown to be false. The problem was not the fact that many believed, in good faith, that what they were being told was true. It was the fact that it wasn’t.

“Why should I care?” is an entirely legitimate question about data use. A better question, however, is: why should anyone have to take on trust what the public briefings say about it – or what they choose to exclude?

Six years on, we have a new Government, a new data environment, and exactly the same policy debates. Do you really think – as HMG privately does – that people care less about their data now than they did six years ago? And do those putting their professional reputations on the line (again) really believe they’ve been told everything?

It is one thing to be opaque and deceptive; it is far worse to be transparent and deceptive. If you combine a barrel full of wine and a teaspoon of sewage, what you get still looks like a barrel full of wine. And public bodies will spin their figures to say that it’s statistically safe to drink – while they reach for their gin…


Documents:

  1. medConfidential’s Review of Powers in the Digital Economy Act (which DCMS didn’t ask for). (This was written before the ADR RCB was abolished, but our comments are on the process and the information provided to decision makers, not the name of their committee.)
  2. How HMG used credit checks for covid test eligibility