
medConfidential comments ahead of the Spending Review

Ahead of a Comprehensive Spending Review, the Government has decided that a large percentage of discretionary government spending will go on health and social care by the end of the next spending period. That is probably a better choice than the US, which spends about that amount on its military.

In light of that decision being made, there are rational consequences which require thought to avoid perverse incentives:

  • Data available to life sciences and research: For there to be public confidence in data use, every patient should be able to know how the NHS and others use data about them, and how their wishes are respected. The NHS has established clear processes for the use of data for legitimate research – these do not need to be changed. However, the implementation of the National Data Opt-out remains hamstrung by legacy data disseminations. This, the first spending review since the 2018 Data Protection Act, allows for a clearer formulation when communicating with the public: “If you want your data to be used for research and for other purposes beyond your care, it will be; if you don’t, it won’t.” (Any exceptions being solely decided by the explicit approval of the Confidentiality Advisory Group – which was placed on a statutory footing in 2014, yet still has no Regulations governing its work.) Past and current heavy reliance on (DPA98) ‘anonymous’ data as the basis for dissemination both undermines public confidence and limits the data available to research. The spending review offers an opportunity to reconsider that failed approach, improving public confidence and making more high-quality data available to researchers and the life sciences – both underpinned by a commitment that whatever a patient wishes, they will be able to see how their wishes were respected. Any suggestion of ‘data trusts’ for NHS patients’ data requires as a prerequisite the admission that the NHS itself will never get data dissemination right in patients’ interests. Public confidence in data for life sciences and research would be higher if the message was clear, simple, and accurate: if you want us to use your data in legitimate projects, we will; if you don’t, we won’t.
  • Technology in the NHS: Clinicians will use technology when it helps them with patients; when it doesn’t, they don’t – no matter how hard NHS England may push it. The FHIR (Fast Healthcare Interoperability Resources) standard is now internationally recognised as the standard for interoperability between health systems – yet the first version was only published after the last spending round. Treasury / DH / NHSE should ensure that companies cannot use contracts to limit or prohibit interoperability, or to require bulk data copying from core hospital systems into commercial companies. Where they are proposing new national programmes, chopped up into parts, what happens at the boundaries between parts? 
  • Prevention is cheaper than cure: In advance of the spending review, HMT should commission an independent assessment of the ‘DH vision’ on prevention to answer two critical questions: will it do what it claims to do? And if not, how and where does it fall short? (Page 14 of the vision shows the disconnects.) The assessment should be published alongside the DH green paper, and show what questions must be considered across Whitehall to avoid any other department causing what DH seek to prevent. 
  • New forms of Transport: Will DfT allow self-driving cars to operate in a way where their stopping distance is greater than their effective sensor range? Will equivalent assessments be made for other technologies; and if not, what will the consequent effects be on the health of the nation? 
  • Procurement incentives for competitive markets: Where an NHS body wishes to procure an AI to assist in diagnosis, it should be required to procure three – effectively requiring three diverse analyses rather than one, replicating the medical norm of a ‘second opinion’ from a human doctor. That may be extensible to other public bodies. 
  • AI and algorithms in the public sector: For all bodies subject to judicial review, any AI or algorithm that provides input to a reviewable decision must satisfy the explainability requirements of judicial review. Should there be a clear public sector mandate that algorithms will only be used if they satisfy existing legal obligations, and that technology tools must be procured to satisfy those obligations, that will create a market in which the UK is possibly uniquely placed to lead.

The first two points have strong equivalents across all departments.

 

Tests for the spending review: Balancing mental health, parity of esteem, and Public Health

The spending review is the primary administrative mechanism for cross-government prioritisation. 

The largest public health concerns are different in different local areas – will the spending review (and MHCLG priorities) reduce or exacerbate those differences?

Will (tech) companies assessed to be causing mental health issues be required to take steps to reduce the harms they cause in future, and mitigate harms already caused? If they are not, these costs will have to come from the NHS budget, and are effectively a commercial subsidy paid by the public purse. By comparison, for each of alcohol, tobacco, other substances with consequences for human health, and digital companies – does each contribute in tax revenue what they create in direct and indirect costs?


Some areas of this post were elaborated in our submission to the Digital Competition Expert Panel.

The Opt Out process for those with children

This page has been superseded by a 2021 page

Below is left here for historical purposes

medConfidential’s opt-out form for GP data continues to allow you to protect the information held by your family GP as it has always done – but the only way to protect your data collected by hospitals and in other care contexts is now via NHS Digital.

NHS Digital does not (yet) have direct access to your GP record, so its opt-out process requires you to take several steps – and officials have decided that people who have children must make a third.

Additional steps forced on those who care for children

If you have children, or other dependents, and wish to express a choice about the use of their medical records for purposes beyond their direct care, the process is needlessly complex.

NHS Digital provides no online process for families with children – even those registered with the NHS at the same home address as their parent or carer – so you must use its online process for yourself alone, and then use a separate postal process for any children under 13.

(Children aged 13 or over can express their own choice about their medical records for themselves online.)

So, instead of the single form you could use to express your choice for yourself and your family back in 2014, you must now do THREE things:

1) For your own hospital and other non-GP data: opt out (or opt back in) online

2) For your dependents’ hospital and other non-GP data: post this form to NHS Digital

3) For your and your dependents’ GP data: give this form to your GP

If you don’t have access to a working printer, e-mail children@medConfidential.org with your postal address and we will post you copies of the paper forms, for free, no questions asked. If you don’t have e-mail, you can text your address to 07980 210 746.

If you can afford to make a small donation to support us in offering this service to others, we have a donation page.

(medConfidential is registered with the ICO to process personal data in this way.)

Why?

It appears the “most digital” Secretary of State the NHS has ever had would rather place the burden of understanding and action onto individual patients – especially those with families – than solve the problem his officials have created.

And the fix is simple: NHS Digital’s online process confirms who you are to the extent that you are able to express your choice online, yet it chooses not to ask if you have any children – who could then be confirmed in exactly the same way the process confirms that you are who you say you are. The move to ‘digitise’ the National Data Opt-out left YOU with more work to do…

This extra work for you is a deliberate choice; it didn’t have to be this way.

Once you have expressed your wishes about your and your family’s data, you may also wish to write to your MP and ask them to ask the Department of Health why you are being forced to jump through these hoops.

Because while healthy, young, and particularly male protagonists commonly have little understanding or insight into the types of sensitive information they will one day have to divulge to their doctor – and the consequences of confidentiality not being respected – others do not have the luxury of such ignorance.

Late October update

At the start of October, the Department of Health took away your ability to opt out via your GP from having information about you, collected by the rest of the NHS, being used for purposes beyond your direct care. (The option to prevent information from your GP record leaving your GP practice remains. For now.) The new process is so ‘hip and digital’ that you also have to use the Royal Mail if you wish to make a consent choice for your children, as well as visiting your GP practice to make a choice for your GP data that the online process tells you nothing about.

Is this Matt Hancock’s view of a digital NHS?

We are testing a new trifold to guide families through expressing their full opt out choices – which is now a three step process: online, post box, and at the GP. This may be simpler for NHS Digital, but it’s a lot harder for you – a choice with which Matt Hancock seems to be entirely happy.

NHS Digital was apparently very proud that more people opted in via the digital service than opted out in its first 2 months – though sending out 1.6 million letters could be said to have tipped the scales somewhat – but that represents at most a few hundred people a month, whereas 5,000-10,000 people a month were still opting out via their GP until the Secretary of State took that choice away from you.

We have previously given a commitment that there will be a functional digital opt-out process for patients, and that if NHS Digital wasn’t going to deliver one, then medConfidential would have to (though this will likely be very analogue on their side…).

Data rights and proper information can together empower every patient and citizen to have more confidence in those who use their data. NHS Digital seems to want to make it more complicated. Though official information is published in various forms, in various places, the only way a patient can currently read how their wishes were respected is to visit TheySoldItAnyway.com.

If you didn’t receive a letter from NHS Digital about the new ‘National Data Opt-out’, and since you’re reading this on our website, you should check the online process to see if your choice disappeared somewhere in the machine (and, if so, to set it to what you want). You’ll then need to set it for your children too by post – and at your GP, for your GP data, to ensure that too is set.

 

Consultation Responses, etc.

With the National Data Guardian Bill having its second reading in the Lords this week, medConfidential has published a letter of support for the Bill. Meanwhile, the Organ Donation Bill contains a supposed safeguard that is overly complex and will not provide reassurance to those who wish to see how their organs will be used after death. We have drafted an amendment for the Lords to fix the broken Bill, if the Commons does not.

As part of the next piece of NHS legislation, the National Data Opt-out should be placed on a statutory footing. The next legislation will likely be the result of NHS England’s consultation on “integrated care providers” (our response) and the “long term plan” (our response), which also referenced the need to reform invoice reconciliation.

Our friends at dotEveryone published their views on digital harms and responsible technology, suggesting that data and ethics in Government should be led by someone “relatable … charismatic and imaginative”. Which would be better than the current person, whose company created the problems around commercial abuses of data in the NHS, and which is still causing problems 20 years later. The current ‘imagination’ at CDEI (the ‘Centre for Data Ethics and Innovation’) seems to be to repeat the sort of data sale scandals in Government they already caused in the NHS. The Information Commissioner also sought views on a ‘regulatory sandbox’, where companies can experiment with personal data – we had views.

Data use across the rest of Government has also been keeping us occupied. Our evidence to the House of Commons Science and Technology Committee contains some new thinking on the failures of agile in public bodies. Some of that thinking was also in our response to the call for evidence ahead of the UK visit of the UN Special Rapporteur on Extreme Poverty and Human Rights, who is looking at algorithms and digital effects around Universal Credit.

 

Data and the rule of law

The data sharing powers under the Digital Economy Act 2017 are still not fully in force. This did not prevent the Ministry of Housing, Communities and Local Government (MHCLG) demanding data on every homeless person in the country, such as in Camden. The secrecy of data use in such cases must be addressed by the UK Statistics Authority / Office for National Statistics – it is doubly disturbing that MHCLG used the research process to evade the scrutiny that would have applied via other routes.

Decisions by public bodies must, today, comply with the standards of the rule of law. As we move towards more automated decision-making, how will those standards be maintained?

The tech companies and their apologists want the approach to be one defined by ‘ethics’ – as if no tyrant ever failed to justify their crimes. “The computer says no” (or “DeepMind says no”) is wholly insufficient for suppliers of data processing functions to government making decisions about citizens.

All reputable companies will be entirely willing to explain how their “AI” systems arrive at the suggestions or decisions they make – including the (sources of) data on which they were trained. Disreputable companies will be evidenced by their failure or inability to do so.

Government Departments should deliver accountability to ‘their’ data subjects (currently they don’t). But beyond accountability to individuals on how data about them is used, there are standards that must be followed by institutions – especially those which govern.

The Venice Commission has produced a ‘Rule of Law checklist’, covering the context of decision-making. We’ll be taking a look at a couple of Government automated processing plans, and seeing how they conform – and how the checklist applies to digital projects, probably starting with Universal Credit and Settled Status, based on past work. We anticipate identifying holes in some of the frameworks currently used by Government, as compared with the standards required by the rule of law and judicial review. Comments are very welcome to sam@medConfidential.org.

Initial response to new ‘tech vision’ from DHSC

Update: longer response.

The current Westminster penchant for speeches with applause lines but without details has reached DH… 

Update: NHS Digital has now published some details – which are utterly underwhelming when compared with the Secretary of State’s hyperbole. “Use the NHS number” and “upgrade from ICD-10 to ICD-11” are not the radical changes the Secretary of State appeared to suggest. Although with the promise of registers, we might dust off the amendment we suggested to the Lefroy Bill (which mandated NHS numbers by law) in 2014. We will update this document when NHS Digital publishes the document that should have appeared at the same time.

Notes:

  • Data: “We are supportive of … Data Trusts” – does the SofS/DH have so little confidence in the NHS getting data right that he/it is supportive of stripping the NHS of that governance role?
  • DH blindspots: “We will know we have achieved our goals when”… does not mention patients other than suggesting they use apps for self-care…
  • Privacy: There is no reference to the duty of confidence every clinician is under to their patients (it instead points at the Data Protection Act)
  • Google: The obligation on all systems to use FHIR interoperability removes the figleaf behind which DeepMind insisted on data for all patients in the Royal Free for 5 years.
  • Google: It also proposes outlawing the monopoly line in Google’s standard contract that forces users of the Streams app to only connect to DeepMind’s servers. It is unclear whether that line will survive the lobbying it is about to receive.
  • Amazon: Case study 7 is as true as leaving a note on the fridge, but there are other effects of giving Amazon/Alexa such information. Facebook’s new Portal offers the same functionality, and will explicitly be used to target ads at users.

 

Below quotes can be attributed to Sam Smith, coordinator of medConfidential, which works to improve uses of data and technology in the NHS.

The NHS knows what good working technology looks like, but to get there, you can’t just turn A&E off and on again and see if it helps.

Mr Hancock says “we are supportive” of stripping the NHS of its role in oversight of commercial exploitation of data. That should be a cause for widespread concern. If Matt thinks the NHS will never get data right, what does he know that the public don’t?

The widely criticised National Programme for IT also started out with similar lofty vision. This is yet another political piece saying what “good looks like”, but none of the success criteria are about patients getting better care from the NHS. For that, better technology has to be delivered on a ward, and in a GP surgery, and the many other places that the NHS and social care touch. Reforming procurement and standards does matter, and will help – but it helps in the same way a good accountant helps, and that’s not by having a vision of better accounting.

There’s not much detail in here. It’s not so much ‘jam tomorrow’, as ‘jam… sometime’ – there’s no timeline, and jam gets pretty rancid after not very long. He says “these are standards”, but they’re just a vision for standards – all the hard work is left to be done.

-ends-

More DeepMind secrecy – What the lawyers didn’t look at

The Royal Free has been recommended by ‘independent’ lawyers to terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims the ‘vital interests’ (i.e. remaining alive) of patients is justification to protect against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long running saga.

Where are the CAG regulations?

We talk a lot about NHS Digital, and its data releases that continue to ignore opt-outs. But 4 years ago today, Royal Assent of the Care Act 2014 gave NHS Digital a “general duty” to “respect and promote the privacy of recipients of health services and of adult social care in England” – which clearly hasn’t been honoured in some areas of its work. The Act also changed the law specifically so that the Confidentiality Advisory Group (CAG) of the Health Research Authority has the power to advise NHS Digital; advice to which NHS Digital must listen.

Caldicott 3 itself does not require dissent to be honoured when data is disseminated in line with the ICO’s Code of Practice on Anonymisation. (The National Data Guardian – who has been given no enforcement powers – is very careful not to ‘cross wires’ with the UK’s data regulator, who does have such powers.) And, despite well over a million patients clearly indicating their wishes to the contrary, NHS Digital continues to argue its dissemination of pseudonymised data is “anonymised” to the satisfaction of the 1998 Data Protection Act.

The UK is about to get a new Data Protection Act, aligned with and based on the EU General Data Protection Regulation, which says:

(26) … Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information, should be considered to be information on an identifiable natural person.

The UK’s Information Commissioner will update the Code of Practice on Anonymisation in due course – she’s just a little busy right now, and the new Data Protection Act is not yet on the statute book – but the Irish Commissioner has already said: (emphasis added)

“Although pseudonymisation has many uses, it should be distinguished from anonymisation, as it only provides a limited protection for the identity of data subjects in many cases as it still allows identification using indirect means. Where a pseudonym is used, it is often possible to identify the data subject by analysing the underlying or related data.”

Current practice will have to change.
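The Irish Commissioner’s point – that a pseudonym often still allows identification via related data – can be shown with a toy example. Everything below (names, fields, values) is invented purely for illustration; real linkage attacks work the same way, just at scale:

```python
import hashlib

# Toy 'pseudonymised' health records: the direct identifier (NHS number)
# is replaced by a hash, but indirect identifiers remain in each row.
def pseudonymise(nhs_number: str) -> str:
    return hashlib.sha256(nhs_number.encode()).hexdigest()[:12]

health_rows = [
    {"pseudonym": pseudonymise("943 476 5919"),
     "dob": "1972-03-14", "postcode": "NW3 2QG", "diagnosis": "N17"},
    {"pseudonym": pseudonymise("112 233 4455"),
     "dob": "1985-11-02", "postcode": "E1 6AN", "diagnosis": "J45"},
]

# A separate, non-medical dataset (electoral roll, marketing list, ...)
aux = [
    {"name": "A. Patient", "dob": "1972-03-14", "postcode": "NW3 2QG"},
    {"name": "B. Other",   "dob": "1985-11-02", "postcode": "E1 6AN"},
]

# Linking on the indirect identifiers re-attaches a name to every
# 'anonymous' row - no one ever needed to reverse the hash.
reidentified = {}
for row in health_rows:
    for person in aux:
        if (person["dob"], person["postcode"]) == (row["dob"], row["postcode"]):
            reidentified[person["name"]] = row["diagnosis"]

print(reidentified)
```

Note that the pseudonym itself is never attacked: date of birth plus postcode is enough to identify most people, which is exactly why pseudonymised data counts as personal data under Recital 26.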

While IGARD may be the appropriate body to advise whether a request meets NHS Digital’s standards for dissemination, it is not an appropriate body to advise on releasing data which does not honour patients’ objections. The adjudication of the principles of those decisions, by statute, belongs to CAG.

There are legitimate instances where patients’ dissent may be overridden – but IGARD is not, and never should have been, the body to decide that.

The opt-out is there to protect patients who decide that the safeguard the NHS currently relies upon – pieces of paper, which include for example commercial re-use contracts with commercial companies that service other commercial companies, including pharmaceutical companies that use the data for promoting their products (i.e. marketing) to doctors – is not sufficient for their situation. As is their right.

Another example: in 2014, the Health Select Committee asked for a safe setting for researchers. Only in April 2018 did a remote safe setting begin to be piloted for researchers – that work not only needs to be completed, but it should become the standard means of access.

NHS Digital continues to insist that a piece of paper is sufficient safeguard under which to release copies of the entire nation’s lifelong, linked medical histories to hundreds of organisations. Its own published records show that two-thirds of NHS Digital’s data releases do not respect patient dissent.

It should be CAG which makes such decisions, whenever and wherever it is necessary. The CAG Regulations will make that clear, when they exist. Assurances to patients are less than meaningful when the Regulations to which they relate do not yet exist.

If someone applying for patients’ data cannot do what they need with only 98% of people’s data, they should simply explain to a responsible body why this is the case. Public Health England’s cancer registry already takes this approach with the choice of protections it offers for event dates. NHS Digital simply releases data on every patient, with the medical event dates completely unprotected.
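Honouring dissent before release is not technically difficult. A minimal sketch of such a dissemination step – field names and data are invented for illustration, and a real pipeline would check against the National Data Opt-out register rather than a hard-coded set:

```python
# Records as they exist inside the safe haven (invented sample data).
records = [
    {"nhs_number": "111", "event_date": "2018-01-05", "code": "E11"},
    {"nhs_number": "222", "event_date": "2018-02-09", "code": "I10"},
    {"nhs_number": "333", "event_date": "2018-03-20", "code": "J45"},
]
opted_out = {"222"}  # patients who have registered a dissent choice

def release(records, opted_out, blur_dates=True):
    """Return only the rows eligible for dissemination."""
    out = []
    for r in records:
        if r["nhs_number"] in opted_out:
            continue  # honour the patient's expressed wish
        r = dict(r)
        if blur_dates:
            r["event_date"] = r["event_date"][:7]  # year-month only
        del r["nhs_number"]  # no direct identifier leaves the building
        out.append(r)
    return out

released = release(records, opted_out)
print(released)
```

The `blur_dates` option mirrors the kind of event-date protection the cancer registry offers; applicants who genuinely need full dates or all patients would have to justify setting it aside to a competent body.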

The National Data Guardian was asked to determine a single choice by which patients could express their dissent from their data being used for purposes beyond their direct care. When that choice is disregarded, it must be on a basis clearly and specifically defined in statute, and approved by CAG.

As it is doing around the world, the introduction of the GDPR will force a change, and that change should protect patients’ data that under the new Data Protection Act will be considered identifiable. Those who still need everyone’s data will have to explain why to a competent body – which really isn’t too much to ask.

Given the clear promises given as a consequence of the care.data and HES data scandals – promises much repeated, but yet to be delivered – we’ve been waiting a long time for this to be fixed.

Instant Messaging in Clinical Settings

November 2024: slight tidy ups.

June 2023: NHS England recently made their guidance less comprehensible (again). The 2023 wording is entirely compatible with the much clearer 2018 wording; paragraphs in blue below are omissions from that official guidance.


As a clinician or nurse, you should not have to keep up with the latest fluff of the apps you might use for work. But teams need to talk to each other (without using Microsoft Teams for everything!)

NHS England has put out several attempts at ‘guidance’ on using instant messaging apps. Its previous guidance said WhatsApp was not banned, but failed to say what you actually should use. It still hasn’t. There was a Do & Don’t list, which was better than nothing, but it isn’t in the latest version, and was almost impossible to put into practice in the real world.

If asked, we would suggest something like this:

Summary

  1. If your employer offers an instant messaging solution, use that.
  2. If you are picking apps to use yourself, you are safest with Signal.
  3. If you are not picking the apps you use, you will probably have to use WhatsApp or Skype. But be aware that someone will be held responsible when Facebook or Skype change their rules – and it’s probably not going to be the person who picked the app…
  4. Don’t use Facebook Messenger, Instagram, or Telegram.

Whatever app you use for work, chats in that app should be set to expire in a few days (possibly less), and the vast majority of people should avoid having their phone going ding for work purposes while they are not at work. For most apps, a swipe left on the main list of ‘chats’ should show an option to “hide alerts” for some time period – this should ensure that if you do give your personal number to work colleagues, it doesn’t end up driving you to distraction outside work. If someone really wants to get in touch, they can always just call you normally.

The reasoning behind our suggestions: Doctor-to-Doctor encryption

The important step in secure messaging is something called “end-to-end” encryption, which prevents anyone – a third party ‘listening in’, or even the service making the connection – knowing what you said. It’s the equivalent of having a conversation in a private consultation room, rather than standing next to the nurses’ station or in a waiting room. But even with Signal, if you are messaging using your personal device, you should treat any conversation as if it were in a lift where another person might be listening.
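“End-to-end” simply means the relaying service only ever handles ciphertext. A toy model of the idea – a one-time pad over a pre-shared key; real apps such as Signal use far more sophisticated key exchange and protocols, so this is illustration only, not how any real app works:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the matching key byte (a one-time pad)."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"Patient in bay 4 needs review"
# Pre-shared secret, same length as the message (one-time pad requirement).
key = secrets.token_bytes(len(message))

ciphertext = xor(message, key)   # what leaves the sender's phone
# --- the server in the middle relays `ciphertext`; without `key`,
# --- it learns nothing about the message contents.
received = xor(ciphertext, key)  # what the recipient's phone recovers

assert received == message
```

The point of the model is the middle line: everything the service sees is random-looking bytes, which is why a compromised or nosy relay cannot read the conversation – the equivalent of the private consultation room.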

Signal allows you to decide for how long you will keep messages from any particular person or group, and will automatically delete the stored messages after that. But what happens with the stored message history in other apps? WhatsApp, for example, wants you to give it a full copy of all your messages and send them to its servers as a ‘backup’ (though at some point it will show you ads against them – it is part of Facebook after all).

You may also have set your phone itself to backup to somewhere. Do you know where the backup goes, and what’s in it? If chats don’t auto-delete in minutes, your backups will need to be carefully managed.

Of course, it is best practice to backup everything on your phone, and most apps assume (probably correctly) that you don’t want to lose every message or photo you receive of your kids. This doesn’t necessarily translate neatly to a clinical setting – anything that must be kept should be recorded elsewhere, so that if you lose your phone, the only thing you won’t have kept was ward chit-chat. WhatsApp wants everything – it doesn’t offer clinical reassurance. And while Snapchat has deletion as a feature, it has other problems akin to Facebook and Skype.

The longer-term security of your messaging is dependent upon who makes the app – and when, and why, they will change the rules on you. We (also) recommend Signal because it is produced by a charitable foundation whose sole mission is to provide secure, usable, communications. One key reason why the NHS England guidance is so terrible is that WhatsApp has lobbyists telling NHS England that it should allow their product; Signal doesn’t.

Since Facebook (the owner of WhatsApp) lies to regulators about its intentions, you clearly cannot rely on the company not to do tomorrow what it denies it will do today. As a consequence of this, any official guidance must in future be kept up to date by NHS Digital. And, as corporate policies change, so must the guidance – removing from the equation NHS England’s fear of the deluge of lobbying that created this mess in the first place.

Clinicians deserve better tools than those that NHS England chooses to recommend, where a national body prioritises its own interests over the needs of those delivering direct care. The NHS England guidance is the output of meetings and committees whose every iteration gets progressively less useful for those who need something to help them – and the people they work with – practise medicine.

(This post will be kept under review as technologies change; it was last updated in October 2024)

October 2024: slight tweaks, primarily doctor-to-doctor encryption and deleting messages.

June 2023:  The December 2022 guidance from NHS England is split over a page about messaging, with key parts on the page about devices.

March 2021: added link to common definition and tests for a secure app.

March 2020 Update: custom apps are now in the NHS Apps library, and so apps that your staff routinely use for other purposes shouldn’t be used.

Data and AI in the Rest of Government: the Rule of Law

medConfidential spoke about the Framework for Data Processing by Government at the All Party Parliamentary Group on the Rule of Law. The topic of the APPG provides a useful perspective for much work on data in the public sector, and the wider use of AI by anyone. The meeting was on the same day as the launch of the AI Select Committee Report, which addresses similar key issues of  ‘data ethics’.

The ‘Rule of Law’ is defined by the eight principles identified by Lord Bingham. The principles are not themselves law, but rather describe the process that must be followed for the Rule of Law to be respected.

Public bodies must already follow that process, and also be able to show how that process has been followed. As a result, those developing AIs (and data processing tools) for use by public bodies must also show how these processes have been followed. This is necessary to satisfy the lawful obligations of the bodies to which they are trying to sell services.

The principles identified by Lord Bingham are a model for testing whether an explanation of an AI and its output, or a data model, is sufficient for use by a public body.

While debates on ethics and society, and on politics and policy, focus on whether a technology should be used, the Rule of Law is about the evidence for and integrity of that debate. As Departments implement the Framework for data processing to deliver on their obligations under the Rule of Law, that implementation must be compliant with the Principles identified by Lord Bingham – not just the ethics and policies of the Minister in charge that day.

Public bodies are already bound by these rules – unless Parliament legislates to escape them. The principles are widely understood, they are testable, and they are implementable in a meaningful way by all necessary parties, with significant expertise available to aid understanding.

 

Companies and other non-public bodies

Companies (i.e. non-public bodies) are not subject to the same legal framework as public bodies. A Public Body must be able to cite in law the powers it uses; a Private Body may do (almost) anything that is not prohibited by law. This is why Facebook’s terms and conditions are so vague and let it get away with almost anything – such a data model does not apply to the tax office.

Some of those looking to make money – to “move fast and break things” – would like the standard to be ethics, and ethics alone. There are currently many groups and centres having money poured into them, with names involving ‘data and society’, ‘ethics and society’, and DCMS’s own ‘Centre for Data Ethics’. The latter is led by a Minister in a Government that will always have political priorities, and – given recent revelations about Facebook – the consequences of incentives to lower standards should be very clear.

Ethics may contribute to whether something should be done – but they are not binding on how it is done, and they offer no actual accountability. After all, no tyrant ever failed to justify their actions; it is the rule of law that ultimately holds them accountable, and leads to justice for those harmed. Ethics alone do not suffice, as Facebook and others have recently shown.

There is a great deal more work to do in this area. But unlike other AI ‘ethics’ standards which seek to create something so weak no-one opposes it, the existing standards and conventions of the Rule of Law are well known and well understood, and provide real and meaningful scrutiny of decisions – assuming an entity believes in the Rule of Law.

The question to companies and public bodies alike is therefore simple: Do you believe in the Rule of Law?

[notes from APPG talk]
[medConfidential (updated) portion of the APPG briefing]

Response to the House of Lords AI Select Committee Report

The AI Select Committee of the House of Lords published their report this morning.

In respect of the NHS, it suggests nothing the NHS wasn’t already doing anyway.

The suggestion that ‘data trusts’ be created for public sector datasets – such as tax data – will likely cause fundamental distrust in AI amongst the public (paragraphs 82 & 84). The NHS has shown how that model ends badly when the prime drivers are commercial, not ‘human flourishing’.

Sam Smith, a coordinator at medConfidential, said (referring to paragraphs 99, 129, 317-318, 386, 419-420):

“A week after Facebook were criticised by the US Congress, the only reference to the Rule of Law in this report is about exempting companies from liability for breaking it.

“Public bodies are required to follow the rule of law, and any tools sold to them must meet those legal obligations. This standard for the public sector will drive the creation of tools which can be reused by all.”

 

-ends-

medConfidential are speaking at the APPG Rule of Law in Parliament from 11:00 to 12:30, and more details are now available.

NHS Digital failing to uphold patient interest


The Health Select Committee has published a report on data sharing which “raises serious concerns about NHS Digital’s ability to protect patient data” under the headline “NHS Digital failing to uphold patient interest”. The Home Office is “treating GP patient data like the Yellow Pages”, according to the RCGP.

The NHS has been trying to rebuild trustworthiness around data since the last big NHS data project collapsed in 2014. This report shows that all those promises can be undermined by the narrow-minded view of one office in Whitehall.

The Health Select Committee is clear that NHS Digital has again failed in its statutory duties, and has put patients at risk by the processes it has adopted and refuses to change.

HSCIC rebranded itself as NHS Digital in an attempt to escape the history of past failures, but this report shows its actions are unchanged.

We submitted written evidence to the inquiry.