
Caldicott Review – Future Fiascos?

(this is part 4 in a series – you may want to begin with part 1)

For some, care.data wasn’t a mess. It was practice...

But practice for what?

The new patient choice suggested on page 41 of the Review is: “information about me can only be used by the people directly providing my care.” That meets the promise in the Conservatives’ election manifesto, and it is a promise that is capable of being kept.

A longer-term aim of the Review was to begin to restore public trust. Only after the consultation, when we see the implementation, will we be able to judge whether it succeeded. Implementation was outside the Terms of Reference, which is why the Review offers so little detail on it. It is also why the “NHS/non-NHS” proposal could be considered at all: no one had to think about how to do it.

This is the shadow boxing that has gone on for nearly two years, since it became clear that care.data wasn’t coming back. But the problems remain unresolved.

To prevent the outcome of this consultation being changed at the whim of a future Secretary of State, the outcome has to be put into legislation. Given the slow process of putting the National Data Guardian on a statutory footing, that is entirely possible. If there are to be new changes to patient dissent, they must be given a democratic basis by Parliament.

The Review has decided that there should be a “Caldicott Reset of Consent”, even if the mechanism is currently unclear. Will the changes be positive, or will they turn out, in retrospect, to be as compromised as the care.data “vision”? It could go either way.

Consensual, Safe, and Transparent?

Consensual: What is the promise to patients, and how will they know it was kept? How does it change from the current offer? How will patients be able to make an informed choice about whether to express dissent? The option of maintaining the current opt out for GP data, and extending the HSCIC opt out to cover all other care, is still possible.

Safe: “don’t leave medical records on trains.” The Caldicott Review is strong here; it just needs to happen. How will patients know if it hasn’t?

Transparent: Patients should know how their data gets used. At the moment, the process is secretive. If DH wishes to override patients’ dissent (whatever form that takes), patients should be able to know about it. Transparency makes all of the discussions practical. It is the informed part of informed consent (or dissent), and also the mechanism for the continuing conversation Dame Fiona envisages.


The opt out language

The “hard” part of Dame Fiona’s task was to suggest new language on opt outs that would both command public trust and be capable of being delivered. The Review succeeded in offering that option, amongst others.

How that is interpreted is currently solely up to the Department of Health.

Data sharing will continue, as will AI, genomics, and whatever comes after them. At some point, Google DeepMind will do something creepy, or the next startup in Shoreditch will do something stupid. If a media firestorm is the only way patients find out what happened, then eventually public confidence will be lost, and lost badly.

Every patient should know how their data is used, and from that, innovation can be celebrated. At the moment, everyone prays really really hard that nothing goes wrong.

A transparent communications mechanism can build trust. It can also be used to talk about all the other innovations and benefits that already happen – it’s just that no one knows about them.

Currently, the best way to find out whether your retina scan was fed to an AI is to be on Google’s press list. That cannot be right. If the only way Jeremy Hunt wants patients to hear how their medical records were used is via the press, that is sheer lunacy.

There are few opportunities to make changes such as this. The Caldicott Reset of Consent, coming after the collapse of care.data, is likely to be one of them. We hope the outcome will reflect the choices and interests of everyone, not just those who wish to ignore consent and profit from patients.

Patients have been given an option to opt out. Any attempt to remove it is likely to exacerbate the concerns of the vast majority who have so far not used it.

Caldicott Review – The Ugly

(This is part 3 of a 4 part series looking at the future. If you are concerned about the privacy of your health data now, our advice remains unchanged)

The expectation of the Review was that it would create new opt out wording for the public – Care.data had many false starts as “mistakes” kept being discovered.

The new patient choice suggested on page 41 of the Review is: “information about me can only be used by the people directly providing my care.” That meets the promise in the Conservatives’ election manifesto, and it is a promise that is capable of being kept.

But what may happen in practice? Everything below is covered in the consultation that we link to – it is not yet certain; it is just a proposal that can be changed. It may turn out the way you want, or the way you don’t – that depends on whether you make your views heard. For now, nothing changes, but it may in the future, and how it changes is something you can help with.

 

Will Parliament be involved?

You may have already told your GP that you don’t want your data to leave their practice. But the Department of Health believes it alone can decide to ignore what you have told your Doctor… What happens when “too many” people opt out again?

The only safe way to “improve” past arrangements is for Parliament to put the opt-out into law. That should include how to get to the new model from where we are – topics that were outside the Terms of Reference of the Review.

1.2 million people have explicitly ticked a box and personally delivered a form to their GP practice that says to their doctor “don’t pass on my records”. NHS managers want to instruct doctors to ignore it. For the medical profession, this is an ethical question, not a data protection question. Parliament therefore has to be involved.

Resolving this must be done democratically, with debate, if it is done at all. Otherwise trust in the NHS will be fundamentally damaged. What will stop a future Secretary of State changing the rules underneath patients and doctors, yet again?

This issue goes to patient and professional trust — will the public have reason to believe the system will do what it says it will do?

 

Proposal: Your data will still leave your GP…

The Review suggests removing your existing opt out for GP data going to the HSCIC (page 31). Information you share with only your GP, will be copied into the HSCIC against any wishes you have already expressed.

This is the removal of an option that pre-dates care.data – and breaches many promises of confidentiality that GPs have made to patients in the past. Currently, these promises are being kept by the GPs, but the new proposal is that GPs will be required to break them.

We have seen no compelling case for this to be necessary.

 

…and then data about you will leave the HSCIC…

Back in 2014, you may have opted out of “data about you leaving the HSCIC for purposes beyond your direct care”. If you did, data about you is still included in the hospital data sold for commercial reuse. The review proposes this continues (page 34).

As we showed in part 2, de-identification is not anonymisation. The Government continues to pretend that it is, so they can continue to sell your medical data and ignore the opt out you used to prevent that. While the Government say “marketing” is banned, use for “market access” is not – as the latest data release register shows.

When it was announced that opt outs were being respected, the exclusions were kept secret. It was not announced that the scope of the opt out had been slashed to exclude the data that was the primary source of patient concern. This is half the basis of our complaint to the ICO: the opt out does not currently do everything you (and we) were told it would do.

 

…and you may have to opt out again

If the “two box” model discussed in the second blog post is implemented, the Government will require you to opt out again. It will ignore your past choices, and you will have to fill in another form. That was the intent of those who didn’t want you to be able to opt out in the first place.

Question 15 of the consultation shows whether the Government has thought about how to do what it is consulting on: it hasn’t. Which is a recipe for the repeat of care.data.

No one can know the problems with what is proposed until the Government is clear on what it wishes to do. If they respect your wishes, that’s pretty easy to tell patients… If they choose to ignore them, the chances of them telling you are somewhat lower.

The question the consultation should ask, is what will give you confidence that your wishes were honoured?

How will the outcome of this consultation help when the next data project is mishandled, at a high price in patient trust?

Nothing has changed yet, but it will in the future; how it changes is something you can help with.

The proposal in the review: “Information about me can only be used by the people directly providing my care” is strong, simple, and deliverable. You can help it become reality.

 

All parts can change – Here’s how you can help:

You can respond to the consultation online. You don’t need to answer every question – you can answer only question 15 if you wish.

You might want to mention some of these points in your own words:

  • Why is what you tell your GP private for you?
  • Why must the NHS keep the promises it makes to you?
  • Is this promise clear: “information about me can only be used by the people directly providing my care”?
    • Do you want that promise to be given and kept?

You can say as much or as little as you like.  

You may also want to tell your friends…

(we also take donations)

 

See part 4 of the series

Caldicott Review – The Bad…

In part 2 of this short series, we look at the “Bad” parts of The Caldicott Review of Data Security, Consent, and Opt-Outs. (link to part 1)

A good part was the suggestion of a continuing conversation with professionals and the public. What will inform that continuing conversation?

 

Silence on transparency to patients

While the Review suggests a range of improvements, there is no recommendation that patients should be told what happens to data about them – ie whether their wishes have been honoured or ignored.

There is nothing in the Review to require transparency, and nothing to prevent it; but nothing, either, to change the status quo of secrecy. We will see in the next part of this series how the bureaucracy wishes to continue to do things to your data without your knowledge. All future scandals, concerns, and catastrophes will flow from this decision, and it will also limit innovation and harm research for every patient who does wish their data used in new ways.

As part of the “paperless 2020” agenda, the Secretary of State has told the National Information Board to look at telling patients how data about them is used. But this Review has committed to nothing, beyond a recognition that there should be a “continuing conversation”. A conversation requires parties that are willing to listen, and to change based on what they have heard…

Hospital Episode “statistics” – Privacy Impact Assessment

Also currently out for consultation is a Privacy Impact Assessment on the Hospital Episode Statistics. The Hospital Episode Statistics are patient-level, unprotected, individual records covering hospitals in England over 25+ years. They are not statistics in the Office for National Statistics sense – they are the raw data on your treatments, linked over time.

The assessment was written in 2014, when public disquiet at how their hospital records were being used led 1.2 million people to opt out; the document has only now been published for consultation. What is most disappointing is how little has changed. Companies are still getting data for commercial re-use, and copies are still being sent outside of HSCIC’s control to be lost, stolen, or abused. According to HSCIC, it’s “anonymous”…

Whether those data are anonymous is covered by the UK Anonymisation Network, which says quite clearly on page 16:

“Anonymisation – refers to a process of ensuring that the risk of somebody being identified in the data is negligible. This invariably involves doing more than simply de-identifying the data, and often requires that data be further altered or masked in some way in order to prevent statistical linkage.

We can highlight further the difference between anonymisation and de-identification (including pseudonymisation) by considering how re-identification might occur:

 

  • Directly from those data.
  • Indirectly from those data and other information which is in the possession, or is likely to come into the possession, of someone who has access to the data.

 

The process of de-identification addresses no more than the first, i.e. the risk of identification arising directly from data. The process of anonymisation, on the other hand, should address both 1 and 2. Thus the purpose of anonymisation is to make re-identification difficult both directly and indirectly. In de-identification – because one is only removing direct identifiers – the process is unlikely to affect the risk of indirect re-identification from data in combination with other data.”
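The indirect risk UKAN describes can be illustrated with a toy sketch in Python. All of the data here is invented: the point is only that removing names (de-identification) does nothing about linkage via the attributes that remain.

```python
# Toy sketch with invented data: direct identifiers (names) have been removed,
# but the quasi-identifiers that remain (date of birth + postcode) are enough
# to re-identify a record by linkage with another dataset.

# "De-identified" health records: no names, diagnosis attached.
deidentified = [
    {"dob": "1972-03-14", "postcode": "SW1A 1AA", "diagnosis": "diabetes"},
    {"dob": "1985-11-02", "postcode": "M1 2AB", "diagnosis": "asthma"},
]

# A separate dataset an attacker might plausibly hold (e.g. an electoral roll).
public_roll = [
    {"name": "Alice Example", "dob": "1972-03-14", "postcode": "SW1A 1AA"},
    {"name": "Bob Example", "dob": "1990-07-21", "postcode": "M1 2AB"},
]

def reidentify(records, roll):
    """Attach names to records wherever dob + postcode match uniquely."""
    matches = []
    for rec in records:
        hits = [p for p in roll
                if p["dob"] == rec["dob"] and p["postcode"] == rec["postcode"]]
        if len(hits) == 1:  # a unique hit re-identifies the record
            matches.append((hits[0]["name"], rec["diagnosis"]))
    return matches

print(reidentify(deidentified, public_roll))
# → [('Alice Example', 'diabetes')] – re-identified despite containing no name
```

De-identification addressed only the direct route; the second, indirect route was left wide open, which is precisely the distinction UKAN draws.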

As such, claims in the Caldicott Review that the ongoing release of HES data is compliant with the ICO’s Anonymisation Code are deeply flawed. We have complained to the Information Commissioner; according to the PIA, the entire assessment of the re-identification risk to patients is: “This may happen in future” (risk 7).

A Two box model – “NHS” vs “non-NHS”

This choice is fundamentally confusing.

This suggestion allows aggressively commercial entities to acquire a fig-leaf NHS contract and reuse data for any purpose they wish, while legitimate and public-spirited academics and not-for-profit researchers are stuck in the “non-NHS” box. For those worried about the privatisation of the NHS, this proposal should be of deep concern.

In a review with such limited time, it was never going to be possible to fully design a new consent model. But existing data projects have no desire to improve their ways, and there is no political leadership to enforce such improvement; those who got to lobby at the table made sure their interests were protected – patients, not so much. This intentionally murky choice is the result. It is right that it was put to public consultation, to demonstrate the problems with the approach of existing commercial data use.

It is right that patients who choose to opt in to particular studies do not have their preferences overridden; yet the Review doesn’t do the same for those who chose to opt out.

It gets worse in part 3

Caldicott Review – The Good…

 

The Caldicott Review of Data Security, Consent, and Opt-Outs was published a few weeks ago. Commissioned by the Secretary of State after Tim Kelsey lied to the Care.Data Advisory Group, it was tasked with solving the outstanding problems of care.data.

In this series of blog posts, we’ll look at the outcomes, and other related issues. The Caldicott Review was a look at a large set of concerns, without enough time for consideration of implementation. The Review was finished before it emerged that Google DeepMind wasn’t entirely accurate about what it was doing.

Page 40 of the Review offers an example: “Restricted setting – information about me can only be used by the people directly providing my care”. It seems like a watertight opt out of all other uses, but it is potentially undermined by other parts. What will turn out to be accurate?

After all, if the nuances of the Review have to be relied upon, that means all the political promises and systematic improvements have failed. The system should be so good that no patient has unaddressed concerns; the opt out should be there but, personal circumstances aside, you shouldn’t need to use it. Will the implementation of the Review fall short?

All data flows in the NHS should be consensual, safe, and transparent. Let’s see how this measures up…

Professional continuing education

The long term solution to all these issues has to be education. This is fundamentally a human problem.

Professionals understand what a duty of confidence is, they understand direct care, and they are trusted by patients in ways few others are. That trust may be undermined.

The seventh Caldicott principle is “The duty to share information can be as important as the duty to protect patient confidentiality” – and knowing the difference requires knowing what the words involved mean. There are many examples of past failures on this topic.

Education of non-Professionals

Everyone in the NHS is committed to improving the health of the nation; not everyone does direct care.

Direct care can be described as an Identified Patient receiving Individual Care from an Identified Clinical Professional. Many other people are necessary to support Direct Care by providing tools, but they do not provide it themselves.

Providing a working computer system, or electricity, or cleaning services is a necessary task, it is improving the health of the nation, but it is not providing direct care. To summarise our presentation on the topic – not everyone gets to be an astronaut.

Other clinical professionals in an organisation, while they are doctors, are not your doctor. Someone can be a father doing childcare, but that’s not the same as being your child’s father. That they provide care to some does not necessarily mean they provide care to you. The only reason others argue that there is a “grey area” here is to justify the ignorance behind decisions already made.

 

Opt-Out coverage should be NHS wide

Because HSCIC was not involved in giving 5+ years of hospital data to DeepMind, the opt out didn’t apply – and couldn’t apply, because the hospital didn’t know who had opted out. The Review recommends that, just as any part of the NHS can find out who you are when you walk in, every part of the NHS should know and respect your objection to data about you being used beyond your direct care.

This is important, and means other problems can be solved.

Clinician led

We welcome that clinicians – doctors – should be partially responsible for explaining to patients how data is used. However, that requires doctors to be told how the bureaucracy uses data, and to have control over it.

The situation where Doctors are responsible for explaining the decisions of the Secretary of State is unlikely to turn out well for patients, for Doctors, or the Secretary of State. What Doctors tell patients has to be true, and those promises have to be kept into the future, otherwise patient trust will suffer.

 

Data Safety

Data security is the majority of the work, and so the majority of the Review looked at it, working with other bodies to ensure that standards are followed. Patients rightly just assume this happens, in the same way no patient should need to check that the surgeon is using a sterile scalpel.

The Review found good practice, but not everywhere. It is the CQC’s job to assess and improve. The CQC has broad powers to assess GPs, and can also look at medical records in the practices it is inspecting. A high standard of safe data practice is necessary and important, but inspectors should not be able to rummage through medical records without those patients knowing it happened and why. Transparency protects all sides.

Handling of Patient data, and Information Governance as the NHS terms it, must improve, and the Review is a necessary step in that process.

It is highly welcome that the Review has patient agency as a key theme throughout. But what of accountability back to those patients: will the benefits be seen to be delivered, or will it all happen in secret?

Continue reading part 2…

Data in the Rest of Government – the Cabinet Office Data Programme

If you see care.data as anything other than a complete success of vision and implementation, this Cabinet Office “process” should cause you concern.

Organisations that want to copy your data around Government have developed a figleaf to allow it. It forms the basis for the “Digital Economy” Bill that has been laid before Parliament, and which will be debated after the summer.

To inform that debate, we’ve used NHS England’s public comments to answer the Cabinet Office “Data Science” “Ethical” “Framework”.

Because we exclusively use NHS England quotes, this runs to 2 sides of paper in length. The Cabinet Office version is one side long, so this is twice as long as they think it should be.

If you’re wondering whether care.data could happen again, this is how:


Cabinet Office Data Science Ethical Framework: justification for care.data

  1. Start with a clear public benefit:

– How does the public benefit outweigh the risks to privacy and the risk that someone will suffer an unintended negative consequence? (PIA step 1)

“NHS England is introducing a modern information service on behalf of the NHS called care.data. The service will use information from a patients’ medical record to improve the way that healthcare is delivered for all.” (Source: NHS England)

– Brief description of the project, including data to be used, how will it be collected and deleted. (PIA step 2)

“The care.data programme will link information from different NHS providers to give healthcare commissioners a more complete picture of how safe local services are, and how well they treat and care for patients across community, GP and hospital settings.” (Source: NHS England)

    – What steps are you taking to maximise the benefit of the project outcome?

“At the moment, the NHS often doesn’t have the complete picture as information lies in different parts of the health services and isn’t joined up.  This programme will give NHS commissioners a more complete picture of the safety and quality of services in their local area which will lead to improvements to patient outcomes.”  (Source: NHS England)

“The information can also be used by NHS organisations to plan and design services better, using the best available evidence of which treatments and services have the greatest impact on improving patients’ health.”(Source: NHS England)

 

  2. Use data and tools which have the minimal intrusion necessary

– What steps are you taking to minimise risks to privacy? (for example using less intrusive data, aggregating data etc)

“The HSCIC has been handling hospital data securely in this way for decades.  The system is designed to be extremely secure, with a suite of safeguards to protect confidentiality.” (Source: NHS England)

“The service will only use the minimum amount of information needed to help improve patient care and the health services provided to the local community. A thorough process must be followed before any information can be shared and strict rules about how information is stored and used are followed.” (source: NHS England)

  3. Create robust data science models

– What steps have you taken to make sure the insight is as accurate as possible and there are minimal unintended consequences? (for example thinking through quality of the data, human oversight, giving people recourse)

“Everyone making healthcare decisions needs access to high quality information: clinicians need it to inform their decision making; patients need it when deciding which treatment option is best for them; and commissioners need it when making decisions about which services are right for their populations.” (source: NHS England)

 

  4. Be alert to public perceptions:

– How have you assessed what the public or stakeholders would think of the acceptability of the project? What have you done in addition to address any concerns?

“Materials and guidance have been developed in collaboration with the Health and Social Care Information Centre (HSCIC), British Medical Association (BMA) and the Royal College of General Practitioners (RCGP), to support practices to raise awareness. Patients who are not happy for their data to be used in this way can ask their GP practice to make a note of this in their medical record and this will prevent their information leaving the practice.”  (source: NHS England)

 

  5. Be as open and accountable as possible:

– How are you telling people about the project and how you are managing the risks?

“NHS England, together with the Health and Social Care Information Centre, announced that throughout January, all 22 million households in England will receive a leaflet explaining how the new system will work and the benefits it will bring.  The leaflet drop is the next stage of NHS England’s public awareness plan and follows wide consultation with a range of stakeholders including GPs and patient groups.”  (source: NHS England)

– Who has signed this off within your organisation? Who will make sure the steps are taken and how? (PIA Step 5)

“This programme is too important to get wrong, and while I think that there is understanding on both sides of the House about the benefits of using anonymised data properly, the process must be carried out in a way that reassures the public.” (Source: Secretary of State, Jeremy Hunt, to Parliament)

 

  6. Keep data secure

    – What steps are you taking to keep the data secure?

“The NHS is very good at preserving the privacy of people in analysing that kind of data.” … “in 25 years there’s never been a single episode where the very strict rules have ever compromised the patient’s privacy,” (source: Mr Kelsey of NHS England on BBC Radio 4)


According to the Cabinet Office, that’s all you need to do, as “answering these questions will also act as your Privacy Impact Assessment” (top of page 6). That is clearly ridiculous – the above is as false and misleading as it is entirely accurate. The actual care.data privacy impact assessment was 32 pages long, plus other supporting documents.

The Digital Economy Bill would make the above superficiality entirely legal, allowing any part of Government to acquire data from any other; it will be discussed by Parliament in September.

Reporting to a new Minister and a new Director General, the GDS data programme needs an external review to provide constructive input from outside the existing Whitehall silo. Otherwise, across Government, the public-facing legacy of GDS may become care.data-style fiascos.

Health data and blockchains

There’s a lot of buzz in the digital health world about “blockchains” – unalterable records of history. Those looking to make money are looking adoringly at the health IT budgets.

No health app, data, or service, involving blockchains, should be considered credible without publishing specific worked examples of what data is written to the blockchain.

That must be the key test to allow a discussion of privacy. Without that, no credible assessment can be conducted. Is there a worked example of what will be recorded, for each of 7 entities involved, for an average of 5 transactions each?

If you don’t know what information is recorded, it’s impossible to analyse whether the promise is a sweet dream or a beautiful nightmare. For different scenarios, it will be different – beauty is in the eye of the beholder.
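To show the kind of worked example we are asking for, here is a hypothetical sketch in Python. The entities, field names, and salting scheme are our illustrative assumptions, not any real system’s published design; the point is that the privacy properties differ entirely depending on what is actually written to the chain.

```python
# Hypothetical worked example of "what is written to the blockchain" for one
# data-access event. Entities, fields, and the salting scheme are illustrative
# assumptions, not any real system's design.
import hashlib
import json

event = {
    "patient_id": "NHS-1234567890",
    "record": "retina-scan-0042",
    "accessed_by": "research-app",
    "timestamp": "2016-07-01T10:00:00Z",
}

# Design A: the event, patient identifier included, goes on-chain in
# plaintext. "Unalterable" then also means "undeletable".
on_chain_plaintext = json.dumps(event, sort_keys=True)

# Design B: only a salted hash goes on-chain; the event itself stays in an
# off-chain audit log that can honour corrections and deletions.
salt = "per-record-random-salt"  # in practice: fresh random bytes per record
on_chain_hash = hashlib.sha256(
    (salt + json.dumps(event, sort_keys=True)).encode()
).hexdigest()

print("NHS-1234567890" in on_chain_plaintext)  # True  – identifier exposed
print("NHS-1234567890" in on_chain_hash)       # False – only 64 hex digits
```

Both designs could be marketed as “health records on the blockchain”; only a published worked example like this reveals which one a supplier has actually built.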

In the 1990s, Iceland decided that it would give “a single company monopoly control of the country’s health records”. The system was cancelled when it was demonstrated that individuals could be identified.

As so often in the tech world, there is an incentive for a shiny press release that ignores past failures – failures that are forgotten until they are repeated. With blockchains, the repeat may be a much less private event.

Bulletin – July 2016

A New Government…

We wait to see what will happen with Theresa May as Prime Minister, and her appointment of Ministers. The Home Secretary focuses on national security – the Prime Minister will focus on what is in the wider national interest.

The Conservative Manifesto said: “We will give you full access to your own electronic health records, while retaining your right to opt-out of your records being shared electronically”.

Will this be done, and will this be seen to be done?

 

…but the spirit of care.data continues?

In the overview of her recent report, Dame Fiona Caldicott quoted the (then) Health Secretary saying: “Exciting though this all is, we will throw away these opportunities if the public do not believe they can trust us to look after their personal medical data securely. The NHS has not yet won the public’s trust in an area that is vital for the future of patient care.”

As such, we’re disappointed in the “keep going” approach of the Department of Health. These are issues covered in the current public consultation, so they aren’t in the immediate in-tray of new Ministers. We’ll cover the details next time.

Care.data was the spark that created widespread interest, but the fuel for the fire was the surprising uses of data much more widely. Adding a care.data nameplate just showed that the data-governance emperor was naked – with the health data of everyone on display.

Snuck out in a long announcement, the care.data name has gone, but the plans continue as they were originally designed back in 2013.

A simple name swap for the same goal might have worked with the last Prime Minister; we’re not sure it will work for this one.

Patients should not be surprised by what happened with data about them. Will the surprises continue?

What’s next?

If, as Recommendation 11 says, “There should be a new consent/opt-out model to allow people to opt out of their personal confidential data being used for purposes beyond their direct care. This would apply unless there is a mandatory legal requirement or an overriding public interest.” – then that must be true.

The new focus on using doctors and trusted individuals to explain the arrangements to patients is important. As care.data showed, what they say has to be true, to avoid great harm to those relationships. The researcher community was burnt supporting care.data; hopefully it will not make the same mistake twice.

Government promises being explained by your doctor will mean those who make the promises will have no ability to ensure they are kept.

We’ll cover the details of the consultation in the next newsletter, and how you can respond to say why promises made to you should be kept.

Government may want doctors to make promises to patients, but it will remain politicians and accountants breaking them.

We’ll be here.

medConfidential – mid June update

We’ll have more on implementation of the hospital data opt-outs when the dust has settled after the referendum.

“Intelligent Transparency”

According to a letter from a Minister, “Intelligent Transparency” is the new goal. We hope that all Department of Health decisions will prove “intelligent” from a patient view, and not just the political priorities of their desk in Whitehall.

Will transparency extend to telling you how your data has been used?  Or is that the sort of intelligence they don’t want you to have?

Tech startups are no magic bullet

We’re waiting for a response from the Regulators about DeepMind’s project at the Royal Free Hospital Trust. Whatever they say, we note that Google has now made public commitments to move towards the transparency expected of them. Regulators are still investigating, and given the contradictory statements, it may take some time.

We look forward to seeing what they will tell the public about their experiments to replace doctors.

What can you do: The Hospital Episode Statistics consultation

The Hospital Episode Statistics cover data from the nation’s hospitals for over 25 years. The HSCIC is looking for everyone’s views on privacy in the data. We’ll have a long response in a few weeks, but you can quickly complete their survey (or just email enquiries@hscic.gov.uk with a subject of “HES PIA consultation”). You don’t need to answer all the questions – you can just say why it matters to you that your privacy and opt out applies to hospital data. 

Investigatory Powers Bill – Protections for Medical Records?

We welcome Home Office Minister John Hayes’ statements that additional protections for medical records will be added to the Investigatory Powers Bill.

He said: “I am prepared in this specific instance to confirm that the security and intelligence agencies do not hold a bulk personal dataset of medical records. Furthermore, I cannot currently conceive of a situation where, for example, obtaining all NHS records would be either necessary or proportionate.”

Additionally, because he “felt that it was right in the national interest, with the benefit of the wisdom of the Committee” … “I feel that the public expect us to go further” than currently on the face of the bill, because he “cannot bind those who hold office in the future, so it is important that we put additional protections in place.”

Having agreed in principle that there should be “additional protections”, there are multiple ways to implement them.

For these purposes, it is sufficient to consider that Bulk Personal Datasets are used where the identities of the individuals being targeted are unknown, and you need to search by attributes across whole databases rather than names. Think of it like searching your phone book by phone number, rather than by name.
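The phone book analogy can be made concrete with a toy sketch (purely illustrative; the names and numbers are invented, and this resembles no real system): a targeted query uses a known identity as the key, while a bulk-dataset query must scan every record to match an attribute.

```python
# Toy "phone book": names map to numbers, so lookup by name is targeted.
phone_book = {
    "Alice": "01632 960001",
    "Bob": "01632 960002",
    "Carol": "01632 960003",
}

def lookup_by_name(name):
    """Targeted query: you already know whose record you want."""
    return phone_book.get(name)

def lookup_by_number(number):
    """Bulk-style query: you only know an attribute, so every
    record in the database has to be examined to find a match."""
    return [name for name, n in phone_book.items() if n == number]

print(lookup_by_name("Bob"))             # one record consulted
print(lookup_by_number("01632 960003"))  # whole database scanned
```

The point of the analogy is the second function: answering an attribute-based question requires access to, and a pass over, the entire dataset, which is why bulk personal datasets raise different concerns from targeted requests.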

 

Existing mechanisms to get this information

As a Home Office Minister speaking in Committee, he would have had no reason to be aware of the existing gateways for doing precisely the things he thought might need to be done, in the rare circumstances and for the exceptional reasons he had in mind.

In the course of an investigation, especially in a terrorism incident, the police can ask the NHS questions. The police or agencies won’t be able to go fishing for answers, but they can put questions to the relevant hospitals, and the hospitals can take a view on whether it is appropriate to answer based on the full details. A process can be followed that commands public confidence.

Doctors are permitted to override the common law duty of confidentiality and release such information to the police when they “believe that it is in the wider public interest,” under GMC guidance. After a terrorism event, it is inconceivable they would not do so. When investigators know what to ask for, they have the ability to use existing processes for those individuals’ details on a targeted basis, should they be relevant.

There is existing guidance on this, and if it needs to be updated, that does not prevent stronger protections on bulk access to medical records being added to the Bill.

Even if there is only a “risk” that those individuals may have been involved, or may be involved in terrorism in future, the duty of confidence for providing information to the Agencies was lifted in part 5 of the 2015 Counter Terrorism and Security Act.

The Home Office has lowered the bar of confidentiality protection dramatically over the last several years. Unamended, these powers remove it entirely.

 

What the protection must cover

The committee rightly identified that there must be protections “for material relating to “patient information” as defined in section 251(10) of the National Health Service Act 2006, or relating to “mental health”, “adult social care”, “child social care”, or “health services””. All parts of that are important, although there are different ways to put them together.

It is insufficient to simply exempt data held by DH/NHS data controllers, as that does not cover social care, nor does it cover data with data processors contracted to the NHS (which is a different loophole of concern to the ICO).

The Agencies should also never be permitted to use covert means against the NHS or health professionals to acquire patient information.

Should the Agencies envisage a scenario in which there has been a secret incident, where medical professionals are not allowed to know the characteristics of a suspect, and where the search can only be done at some future point by the Agencies rather than now by the medical staff, then some mechanism may be appropriate. This seems highly unlikely, but the Home Office may be able to make such a case to the satisfaction of both Houses of Parliament. We invite them to do so.

In that scenario, multiple levels of protection are likely to be necessary: a general ban on warrantry for such data, except where the Secretary of State responsible for the data has submitted to the Judicial Commissioner an approval for its handover and retention, for a defined period and a defined investigation, and no others.

In effect, this removes from the Agencies the permission to acquire the data, while retaining the ability of the Secretary of State elsewhere in Government to hand it over should they believe it appropriate. The Commissioner and the Intelligence Services Committee should then be required to be notified that this has been approved, and to state how many individual-level records were affected in any annual report covering the period.

Whatever the Home Office come up with, it must be robust and be seen to be robust. We remain happy to discuss this further with all parties.

Update on Google Deepmind’s NHS app – is it “just resting” ?

It appears Google Deepmind has suspended use/“pilots” of their experimental app until they have received regulatory clarification – ie, until it is legal.

This whole incident was harmful to patient trust, harmful to the hospital, and harmful to Google. All because, it appears, there was a desire to go faster than waiting a few weeks for regulatory and data approvals, and so a bizarre cut’n’shut agreement was used.

The controversy has never been about whether the app would help clinicians with their patients. It has been entirely about what happened to the data of people who were not patients of those clinicians. Some of the questions from over a week ago remain unanswered. “Collect it all” might have applied to version 1 of the app, but they now have first-hand experience of how it makes things harder, not easier.

Tech teams often like naming their work. Perhaps the next version will be “Streams 2: This time we read the regulations”…

Google Deepmind could have followed the rules about applications for direct care, and the usage of data for “development work” (ie, “secondary uses”). They just didn’t, for some reason that we will ask their independent reviewers to get to the bottom of.

The app deserves to come back safely, if the humans running the project can follow the rules to get the data and processes that they wish to bring to the NHS. Nothing in our understanding of what they were doing, and of the existing rules, should have prevented them from doing this “pilot” (apparently) entirely legally, with conventional legal agreements. They simply didn’t do so.

If Google Deepmind choose to walk away from the project, it won’t be because they wanted to help the NHS; it’ll be because they wanted to help only on their terms. For the hospital, and the NHS more widely, it is yet another reminder that some offers of help may come with too high a price.