Ahead of a Comprehensive Spending Review, the Government has decided that a large share of discretionary government spending will go to health and social care by the end of the next spending period. That is probably a better choice than the US, which spends a similar share on its military.
In light of that decision, there are rational consequences which require thought if perverse incentives are to be avoided:
- Data available to life sciences and research: For there to be public confidence in data use, every patient should be able to know how the NHS and others use data about them, and how their wishes are respected. The NHS has established clear processes for the use of data for legitimate research – these do not need to be changed. However, the implementation of the National Data Opt-out remains hamstrung by legacy data disseminations. This, the first spending review since the 2018 Data Protection Act, allows for a clearer formulation when communicating with the public: “If you want your data to be used for research and for other purposes beyond your care, it will be; if you don’t, it won’t.” (Any exceptions being decided solely by the explicit approval of the Confidentiality Advisory Group – which was placed on a statutory footing in 2014, yet still has no Regulations governing its work.) Past and current heavy reliance on (DPA98) ‘anonymous’ data as the basis for dissemination both undermines public confidence and limits the data available to research. The spending review offers an opportunity to reconsider that failed approach, improving public confidence and making more high quality data available to researchers and the life sciences – both underpinned by a commitment that whatever a patient wishes, they will be able to see how their wishes were respected. Any suggestion of ‘data trusts’ for NHS patients’ data requires as a prerequisite the admission that the NHS itself will never get data dissemination right in patients’ interests. Public confidence in data for life sciences and research would be higher if the message was clear, simple, and accurate: if you want us to use your data in legitimate projects, we will; if you don’t, we won’t. (A minimal sketch of what honouring that promise looks like in practice follows this list.)
- Technology in the NHS: Clinicians use technology when it helps them with patients; when it doesn’t, they don’t – no matter how hard NHS England may push it. The FHIR (Fast Healthcare Interoperability Resources) standard is now internationally recognised as the standard for interoperability between health systems – yet the first version was only published after the last spending round. Treasury / DH / NHSE should ensure that companies cannot use contracts to limit or prohibit interoperability, or to require bulk data copying from core hospital systems into commercial companies. Where they propose new national programmes, chopped up into parts, what happens at the boundaries between those parts? (A short sketch of the FHIR interface, which makes interoperability a contractual choice rather than a technical barrier, follows this list.)
- Prevention is cheaper than cure: In advance of the spending review, HMT should commission an independent assessment of the ‘DH vision’ on prevention to answer two critical questions: will it do what it claims to do? And if not, how and where does it fall short? (Page 14 of the vision shows the disconnects.) The assessment should be published alongside the DH green paper, and show what questions must be considered across Whitehall to avoid other departments causing what DH seeks to prevent.
- New forms of transport: Will DfT allow self-driving cars to operate where their stopping distance is greater than their effective sensor range? Will equivalent assessments be made for other technologies; and if not, what will the consequent effects be on the health of the nation?
- Procurement incentives for competitive markets: Where an NHS body wishes to procure an AI to assist in diagnosis, it should be required to procure 3 – effectively requiring 3 diverse analyses rather than one, replicating the medical norm of a ‘second opinion’ from a human doctor. That may be extensible to other public bodies. (An illustrative sketch of how three diverse analyses might be combined follows this list.)
- AI and algorithms in the public sector: For all bodies subject to judicial review, any AI or algorithm that provides input to such a decision must satisfy the explainability requirements of judicial review. Should there be a clear public sector mandate that algorithms will only be used if they satisfy existing legal obligations, and that technology tools will need to be procured to satisfy those obligations, that will create a market in which the UK is possibly uniquely placed to lead.
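On the first point, the promise “if you don’t, it won’t” is, operationally, a single check applied before any dissemination, plus a record the patient can later inspect. The sketch below is purely illustrative – the field names and audit trail are assumptions, not the actual National Data Opt-out implementation:

```python
# Purely illustrative: the field names ("nhs_number", "opted_out") and the audit
# trail are assumptions, not how the National Data Opt-out is actually implemented.
from dataclasses import dataclass


@dataclass
class PatientRecord:
    nhs_number: str
    opted_out: bool       # True if the patient has set the National Data Opt-out
    clinical_data: dict


def release_for_research(records: list[PatientRecord]) -> tuple[list[PatientRecord], list[str]]:
    """Release only records whose owners have not opted out, keeping an audit
    trail so every patient could later see how their wishes were respected."""
    released, audit = [], []
    for record in records:
        if record.opted_out:
            audit.append(f"{record.nhs_number}: withheld (opt-out respected)")
        else:
            released.append(record)
            audit.append(f"{record.nhs_number}: released (no opt-out set)")
    return released, audit
```

On technology in the NHS, what FHIR standardises is deliberately mundane: a conformant system exposes resources such as Patient at predictable URLs, so a contract that prohibits that access is a commercial choice, not a technical necessity. A minimal sketch, assuming a placeholder server address rather than any real NHS endpoint:

```python
# A minimal sketch of FHIR's standard REST interface.
# The base URL is a placeholder, not a real NHS endpoint.
import requests

FHIR_BASE = "https://example-fhir-server.nhs.uk/baseR4"   # placeholder


def fetch_patient(patient_id: str) -> dict:
    """Retrieve a Patient resource as FHIR JSON from any conformant server."""
    response = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()   # the same Patient structure, whichever vendor built the system
```

And on procurement, requiring three diverse analyses is straightforward to express: run independently procured systems on the same case and escalate to a clinician when they disagree. The interface below is an assumption for illustration only, not any vendor’s actual API:

```python
# Illustrative only: the model interface is an assumption, not a vendor API.
from collections import Counter
from typing import Callable

Diagnoser = Callable[[dict], str]   # takes case data, returns a diagnosis label


def diagnose_with_second_opinions(case: dict, models: list[Diagnoser]) -> str:
    """Combine the opinions of (at least) three independently procured systems,
    escalating to a clinician when they cannot agree - the 'second opinion' norm."""
    if len(models) < 3:
        raise ValueError("procurement rule: at least three diverse systems")
    opinions = [model(case) for model in models]
    label, votes = Counter(opinions).most_common(1)[0]
    return label if votes >= 2 else "no consensus - refer to a clinician"
```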
The first two points have strong equivalents across all departments.
Tests for the spending review: Balancing mental health, parity of esteem, and Public Health
The spending review is the primary administrative mechanism for cross-government prioritisation.
The largest public health concerns differ from one local area to another – will the spending review (and MHCLG priorities) reduce or exacerbate those differences?
Will (tech) companies assessed to be causing mental health issues be required to take steps to reduce the harms they cause in future, and to mitigate harms already caused? If they are not, these costs will have to come from the NHS budget – effectively a commercial subsidy paid from the public purse. By comparison, for each of alcohol, tobacco, other substances with consequences for human health, and digital companies – does each contribute in tax revenue what it creates in direct and indirect costs?
Some areas of this post were elaborated in our submission to the Digital Competition Expert Panel.