Scorecard: A List of Loopholes and Unpleasant Surprises (in England)

As a patient, can you know how your medical records are used, and whether those uses are all consensual, safe, and transparent?

This is the current and publicly-proposed state of play as of mid-November 2019, following both the GDPR and the UK’s Data Protection Act 2018 coming into force, and the introduction of the National Data Opt-out. Large parts remain undelivered and late. Covid then changed some things; that, together with the subsequent takeover of NHS Digital by NHS England and the complete opacity of their Palantir procurement, means there is not enough information in the public domain (or even in secret) to meaningfully update this scorecard before (likely) summer 2024.

(N.B. This scorecard was first published in September 2017, following the Government’s response to Caldicott 3.)

|  | Consensual | Safe | Transparent |
| --- | --- | --- | --- |
| GPs – Direct Care | ✔︎ | ✔︎ | Depends on GP |
| GPs – local CCGs / councils | ✔︎ | Varies by recipient | Depends on GP |
| GPs – research copies | ✔︎ | Unknown |  |
| Hospital – Direct Care | ✔︎ | ✔︎ | ✔︎ |
| Hospital – local sharing | By 2020 | Varies by recipient | By 2020? |
| NHS Digital: SCR controls | ✔︎ | ✔︎ | In 2018… |
| NHS Digital: Safe Setting | ✔︎ | ✔︎ | ✔︎ |
| NHS Digital: Sale of hospital records | ✘ (opt outs ignored) | Partial: now; Full: Late 2018? |  |
| NHS Digital: Commercial reuse of hospital records | ✘ (opt outs ignored) |  |  |
| NHS England: CSUs / councils / national | Variable | Unlikely |  |
| DH family: PHE disease registries | ✘ (no fair processing); opt outs as for hospital data | Partial |  |
| DH family: CPRD @ MHRA | ✔︎ (Type 1) | Partial (unknown: will it be included in the NHS lists?) |  |
| Genomics England | ✔︎ | ✔︎ | ✔︎ |
| Chief Medical Officer 2017 Annual Report plan for Cancer care | ✔︎ | ✔︎ | ✔︎ |

It is still the case that, in practice, an individual institution or organisation may fall short of being consensual, safe, and/or transparent in any particular instance. This scorecard covers what the current rules intend to be the case. medConfidential believes it is possible for every trustworthy organisation to meet the requirements of consensual, safe, and transparent.

The scorecard above relates to patients’ data within the boundary we define as ‘being under NHS data controllership’, which includes data processed under an NHS contract. This may, for example, exclude apps “signposted” by the NHS to which you provide data directly. For such apps, we have a page covering ‘The questions you should ask about apps’ (coming soon).

What the NHS terms “Risk Stratification” – activities such as calling in for screening those people with particular characteristics – can be done in a number of ways. The scorecard considers such activities according to where the decisions are taken: consensual and transparent risk stratification requires different types of decision to be made at different levels, e.g. the CCG may choose the characteristics of those who should be called in, but the GP should select which of their own patients actually meet those characteristics. (The transparency and accountability of such a model sometimes works against the narrower interests of some local decision makers, who do not wish to make a clear decision based on the best evidence.)
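The decision split described above can be sketched in code. This is a hypothetical illustration only – the patient fields, the ‘over-65 with diabetes’ criterion, and all names are invented for the sketch, not taken from any actual NHS specification. The point it shows is that the commissioner’s rule travels to the practice, while identifiable records stay at the GP:

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # Illustrative fields only; a real GP record holds far more.
    patient_id: str
    age: int
    conditions: set

def ccg_criteria(p: Patient) -> bool:
    """The rule chosen at CCG level, e.g. over-65s with diabetes.
    The CCG decides *what* the characteristics are, nothing more."""
    return p.age >= 65 and "diabetes" in p.conditions

def gp_select(patients: list, criteria) -> list:
    """Run at the GP practice, against the practice's own list.
    Only the GP sees which of their patients matched the rule."""
    return [p for p in patients if criteria(p)]

# Invented example practice list:
practice_list = [
    Patient("p1", 70, {"diabetes"}),
    Patient("p2", 40, {"asthma"}),
]
to_call = gp_select(practice_list, ccg_criteria)
```

In this arrangement the identifiable data never leaves the practice; only the criteria move, which is what makes the model auditable and accountable at each level.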

medConfidential strongly believes that there should be a single consent choice for patients – one tick box covering all secondary uses. However, ‘consensual’ here covers any legal basis, not just consent or dissent – including exceptions provided for in law by Parliament, such as in the case of a public health emergency.

An acid test of what is ‘safe’ is: can you find a record and medical history based on reasonably-known information (e.g. information available to neighbours, or from an article in a newspaper) and then successfully cover it up? It is unreasonable to expect every member of the public to keep their children’s ages and dates of birth secret from their friends, classmates, and ex-husbands. The only truly safe way to handle patient-level data for secondary uses is in a fully-auditable Safe Setting, or by providing statistical outputs only.

Public Health England: PHE refuses to consider effective patient dissent, and does not do fair processing – so at least some of its data processing could not be lawful under the Data Protection Act 1998; this is still the case under GDPR / DPA 2018. Additionally, due to the conflation of direct care and secondary uses, the current data model for, e.g. the Cancer Registry, cannot deliver a true Caldicott Consent Choice to patients without a fundamental redesign that takes the law into account (even the law as it was when the current registry was built).