Towards making every data flow in the NHS consensual, safe and transparent

This page was first posted during the care.data debacle and amended after the Goldacre Review. It will be updated following the 2024 general election.


Out of the wreckage of care.data, GPDPR, and FDP, something will emerge. It can be better, but that will not happen without a multi-organisation discussion led by medical professionals and patients, with everyone at the table being heard. If that doesn’t happen, what replaces them will be ‘care.data two’ – then three, then four – without the culture change that Dame Fiona Caldicott has called for since 2013.

medConfidential has several proposals, which generally look at processes designed to enhance trust and help make all flows of data in and around the NHS and wider care system consensual, safe and transparent:

1) Tell patients exactly where their data was used, for both direct care and secondary uses, in a Personalised Data Usage Report delivered through the app.

Tell patients about every access to their medical record for direct care, and about every use of information from their record for secondary uses. It’s that simple: hide nothing from the patient. It is the secret corners that cause concern. This should be achievable for most of the areas of greatest concern, for all patients, within 6 months (as we first proposed in 2014).

Start with a coalition of the willing and the capable – Summary Care Records, GPConnect, the FDP, and (most) Shared Care Records all keep track of when a record is accessed for direct care: tell the patient where and when their record was accessed. It will show them when their records are accessed properly, and they’ll be able to see if a record is accessed improperly, or, more commonly, isn’t accessed when it arguably should be.
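To make this concrete, here is a minimal sketch of what one line of such a Personalised Data Usage Report might contain. The field names, organisation, and legal-basis wording are illustrative assumptions for discussion, not any existing NHS schema or API.

```python
# Illustrative sketch only: fields, values and sources below are assumptions,
# not a real NHS interface or audit schema.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class RecordAccessEvent:
    """One line of a hypothetical Personalised Data Usage Report."""
    accessed_at: datetime   # when the record was accessed
    organisation: str       # who accessed it (e.g. an A&E department)
    system: str             # which system logged it (e.g. a Shared Care Record)
    purpose: str            # "direct care" or a named secondary use
    legal_basis: str        # the basis relied upon, stated in plain terms

    def as_report_line(self) -> str:
        return (f"{self.accessed_at:%d %b %Y %H:%M} – {self.organisation} "
                f"viewed your record via {self.system} for {self.purpose} "
                f"(legal basis: {self.legal_basis})")


# Example: what a patient might see in the app.
event = RecordAccessEvent(
    accessed_at=datetime(2024, 3, 2, 21, 40),
    organisation="St Elsewhere A&E",
    system="Shared Care Record",
    purpose="direct care",
    legal_basis="implied consent for direct care",
)
print(event.as_report_line())
```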

2) Finish the transition to the exclusive use of Trusted Research Environments as a ‘safe setting’

Where patients are happy to allow their data to be linked and used for research, it must be done safely as proposed by the Goldacre Review.

A Trusted Research Environment should be used for all individual-level data access, and we will continue to engage to ensure it delivers the necessary benefits and the necessary protections. Such an approach would also improve the work of the disease registries (as we proposed in 2016, and which is now in sight as NHS England has taken over the cancer registries and will increasingly integrate more).
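The pattern is that analysis goes to the data rather than data going to the analyst, and only checked results leave the safe setting. The toy sketch below is only an illustration of that pattern under assumed rules; the function names and the small-number threshold are ours, not those of any real Trusted Research Environment.

```python
# Illustrative sketch only: a toy version of the TRE pattern. Names and the
# disclosure-control threshold are assumptions, not any real environment's rules.
from typing import Callable

SMALL_NUMBER_THRESHOLD = 10  # assumed small-number suppression rule for this sketch


def run_in_tre(analysis: Callable[[list[dict]], dict], records: list[dict]) -> dict:
    """Run an approved analysis inside the environment; release only aggregate
    counts that pass a simple small-number check."""
    results = analysis(records)  # individual-level data never leaves this function
    return {k: v for k, v in results.items() if v >= SMALL_NUMBER_THRESHOLD}


def count_by_region(records: list[dict]) -> dict:
    counts: dict = {}
    for r in records:
        counts[r["region"]] = counts.get(r["region"], 0) + 1
    return counts


records = [{"region": "North"}] * 12 + [{"region": "South"}] * 3
print(run_in_tre(count_by_region, records))  # {'North': 12} – 'South' suppressed
```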

3) A change process into the future

NHS England can commit to a change process which embeds public confidence in their uses of data. The engagement from NHS England on its purchase of the Palantir platform was based primarily on one focus group of only 11 people. Any promise made today can be taken away tomorrow without a trustworthy change process.

The NHS app can have new features added purely because a Minister wants them: government needs are prioritised irrespective of clinical or patient needs. There needs to be clarity about change processes.

4) Fix consent – use the Spine

To have privacy, a citizen should not have to know how the NHS works. Hence, consent choices about the secondary use of patients’ information should cover all data flows for purposes other than their direct care. In 2015, we proposed a single NHS Spine-based consent setting: “Dissent from disclosure of individual-level data for uses other than my care and treatment” with the power and detail of the GP data opt out and the scope implied by the name of the National Data Opt Out.

The National Data Opt Out is full of loopholes and doesn’t cover GP data. It should encompass any data leaving the GP practice for purposes beyond direct care (i.e. it should replace the GP Data Opt Out without patients having to use a different form). The online process should also be made to work for dependent children living at home; currently, NHS England forces a paper form – how very digital…
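The single Spine-based setting we propose amounts to one rule applied everywhere, which the hedged sketch below tries to show; the function and field names are hypothetical, not part of any Spine interface.

```python
# Illustrative sketch only: function and field names are hypothetical, assuming
# a single Spine-held consent flag as proposed above.
from enum import Enum


class Purpose(Enum):
    DIRECT_CARE = "direct care"
    SECONDARY_USE = "secondary use"   # research, planning, invoicing, etc.


def disclosure_permitted(purpose: Purpose, patient_dissents: bool) -> bool:
    """Apply one rule everywhere: individual-level data may leave only for
    direct care, or for secondary uses where the patient has not dissented."""
    if purpose is Purpose.DIRECT_CARE:
        return True
    return not patient_dissents


# A GP system, a national collection, or a registry would all ask the same
# question of the same Spine-held setting before releasing data.
assert disclosure_permitted(Purpose.DIRECT_CARE, patient_dissents=True)
assert not disclosure_permitted(Purpose.SECONDARY_USE, patient_dissents=True)
```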

5) A modern Lloyd George Envelope?

A thought experiment: given their value and potential sensitivity, when medical records are not being edited by a care provider, they should be classified and subject to the highest legal protections that the UK can provide. Government has the Official Secrets Act. So which needs better protection – official memos, or all of our consolidated medical records?

6) Digital services

The NHS Login and associated national digital services should be transparent about prioritisation and decision-making before there can be an informed discussion about choices. Currently, NHS England does what it wishes to do, and others have to beg for features that may not be in NHS England’s political interests and so will never be developed: why doesn’t the NHS app do video calling to 111 for consultations?

When these are (all) in place, a new data programme?

Once the first four of these are in place, there can be a full and open discussion about what the new data framework for the future of research and other secondary uses should be. The only data programmes that work, like openSAFELY, are those where everyone gets a seat at the table.

One description of care.data that has been used is that it was “the most data they could sneak past without anybody noticing” – and it is a terrible one. This does not mean that a proper programme, with properly managed access to the full care episode history of individuals who wish to be included, could not be done safely, in time.

Secondary uses of patients’ data are clearly necessary to achieve specific, clearly-defined benefits – but the use of individual-level data and/or linkable identifiers should be done only in ways that respect the digital equivalent of the 3 Rs: replacement (using personal data only where absolutely necessary), reduction (using the minimum amount of data necessary) and refinement (using methods that minimise the risk of harm and distress), and as part of an ongoing national conversation.

When examined on a case-by-case basis, an appropriate data product with deconflated requirements can be defined for each use case. We have produced worked examples for two of the most complex and most frequently cited scenarios: risk stratification and invoicing (2018).

The important question still remains: what would a good, high-quality dataset that didn’t try to sneak anything anywhere look like? With properly managed consent, and with the full knowledge and support of the public, what would meet – or possibly even exceed – the needs and wishes of the UK research community, for the problems they are trying to solve?