Category Archives: News

Bulletin – July 2016

A New Government…

We wait to see what will happen with Theresa May as Prime Minister, and her appointment of Ministers. The Home Secretary focuses on national security – the Prime Minister will focus on what is in the wider national interest.

The Conservative Manifesto said: “We will give you full access to your own electronic health records, while retaining your right to opt-out of your records being shared electronically”.

Will this be done, and will this be seen to be done?

 

…but the spirit of care.data continues?

In the overview of her recent report, Dame Fiona Caldicott quoted the (then) Health Secretary saying: “Exciting though this all is, we will throw away these opportunities if the public do not believe they can trust us to look after their personal medical data securely. The NHS has not yet won the public’s trust in an area that is vital for the future of patient care.”

As such, we’re disappointed in the “keep going” approach of the Department of Health. These are issues covered in the current public consultation, so aren’t on the immediate in tray of new Ministers. We’ll cover details next time.

Care.data was the spark that created widespread interest, but the fuel for the fire was the surprising data uses much more widely. Adding a care.data nameplate just showed that the data governance emperor was naked – with the health data of everyone on display.

Snuck out in a long announcement, the care.data name has gone, but the plans continue as they were originally designed back in 2013.

A simple name swap for the same goal might have worked with the last Prime Minister; we’re not sure it will work for this one.

Patients should not be surprised by what happened with data about them. Will the surprises continue?

What’s next?

If, as Recommendation 11 says, “There should be a new consent/opt-out model to allow people to opt out of their personal confidential data being used for purposes beyond their direct care. This would apply unless there is a mandatory legal requirement or an overriding public interest.” – then that must be true.

The new focus on using doctors and trusted individuals to explain the arrangements to patients is important. As care.data showed, what they say has to be true, to avoid great harm to those relationships. The research community was burnt supporting care.data; hopefully it will not make the same mistake twice.

Government promises being explained by your doctor will mean those who make the promises will have no ability to ensure they are kept.

We’ll cover the details of the consultation in the next newsletter, and how you can respond to say why promises made to you should be kept.

Government may want doctors to make promises to patients, but it will remain politicians and accountants breaking them.

We’ll be here.

“NHS England is closing the much criticised care.data programme”.

According to the Health Service Journal, “NHS England is closing the much criticised care.data programme”.

(Update: A written ministerial statement has now appeared on Parliament’s site)

Responding to the news, Phil Booth, Coordinator of medConfidential said:

“One toxic brand may have ended, but Government policy continues to be the widest sharing of every patient’s most private data.

“Launched this morning, the Government’s consultation on consent asks the public to comment on how Government should go about ignoring the opt outs that patients requested.

“The programme did exist, and whatever data the Government may wish to continue to sell to their commercial friends, patients dissented from data about them being shared. Their wishes must be honoured.

Notes to editors

Q15 of the Consultation on New data security standards for health and social care puts the onus on consultation respondents to work out how Government could implement the policy it wishes to follow, irrespective of the consultation.

 

[Press Release] The National Data Guardian for Health and Care Review of Data Security, Consent and Opt-Outs was published this morning.

“The NHS has not yet won the public’s trust in an area that is vital for the future of patient care” — Secretary of State Jeremy Hunt quoted in paragraph 1.5

From the report:

“4.2.1 This has been a report about trust. It is hard for people to trust what they do not understand, and the Review found that people do not generally understand how their information is used by health and social care organisations.”

About the existing opt outs that patients have expressed:

“the Review recommends that, in due course, the opt-out should not apply to any flows of information into the HSCIC.” (p31, 3.2.31, second column)

About the 25+ years of hospital data that continues to be sold:

“The Review recognises that the new opt-out should not cover HSCIC’s already mandated data collections, such as Hospital Episode Statistics (HES) data. The Review believes it is important that there is consistency and therefore where there is a mandatory legal requirement for data in place, opt-outs would not apply.” (p34, 3.2.41, bottom right)

 

We entirely agree with the Association of Medical Research Charities when they say:

“People need to feel that they can trust the system to handle their information with care and competence, and respect their wishes. If the public do not trust the system, they will be unwilling to share health information for medical research and this will seriously hinder progress on new treatments and cures of diseases such as cancer, dementia, rare conditions and many more.” http://www.amrc.org.uk/news/amrc-statement-on-the-caldicott-review

 

Phil Booth, coordinator of medConfidential said:

“Patient trust is vital. The NHS should win the public’s trust by being seen to follow each patient’s wishes. However, yet again, the existing commercial entities demand leadership from others so they can continue feeding on patient data, despite the wishes of patients.

“The last data release register from HSCIC contains continued releases to commercial companies. One, Beacon Consulting, advertises on its homepage that “we help our pharmaceutical clients solve difficult commercial problems”. Their commercial access was renewed in the most recent HSCIC data release register.”

“It seems the Department of Health is trying to have it both ways – tell patients one thing and commercial entities the other. When the consultation comes out, the public can have their say and the Department of Health will have to finally decide.”

“There has to be a better way to find out how your data has been used than reading Google’s press releases.”

Notes

The Hospital Episode Statistics now contain 1.5 billion patient hospital events, linked to each patient across a lifetime. According to the review, the 1.2 million patients who have opted out of their data being included in the Hospital Episode Statistics continue to have their data included – their choice has been ignored.

 

Caldicott Review and Government Consultation – 1st thoughts

This post will continue to be updated.

medConfidential welcomes the publication of the National Data Guardian Review of Data Security, Consent, and Opt-Outs and the Government consultation on the findings.

  1. In practice, it matters most what the Government response and consultation says. Dame Fiona’s Review, while vital, may in practice end up as disregarded as the recommendations of her previous review.
  2. What is the change patients will see?  Will each patient know how their consent choice has been honoured? Will “make informed choices about how their data is used” be made real?  “The public is increasingly interested in what is happening to their information” (video 4)
  3. Being published in weeks where political promises have barely lasted hours after people resigned, and with the current opt out being the gift of the Secretary of State, what basis will the new consent language have? Is it comprehensive?
  4. “There has been little positive change in the use of data across health and social care since the 2013 Review and this has been frustrating to see.” — Dame Fiona Caldicott
  5. “The NHS has not yet won the public’s trust in an area that is vital for the future of patient care” – Secretary of State
  6. Will the National Data Guardian be put on a statutory footing? It was due to happen in the Digital Economy Bill, but the Bill has been published, and it’s not there. Another broken promise from the Secretary of State? The National Data Guardian consultation response is out, again promising legislation.
  7. If there are two opt outs, the “NHS” and “research” boxes may be overly confusing. Dodgy commercial projects will find an NHS figleaf to sneak into the “NHS” preference, while legitimate and bona fide academics will be left in the “research” box with its potentially radioactive commercial examples — this is the opposite of what a quick read by a busy citizen would expect. (page 39 – top right for the commercial project on radiation)
  8. medConfidential welcomes the proposal that the opt out will be comprehensive across the NHS. This is an important simplification for patients, unless it is badly mishandled.
  9. Recommendation 18: “The Health and Social Care Information Centre (HSCIC) should develop a tool to help people understand how sharing their data has benefited other people. This tool should show when personal confidential data collected by HSCIC has been used and for what purposes.”
  10. Paragraph 1.35  “the opt-out should not apply to all flows of information into the HSCIC” — that’s GP data
  11. “The Review recognises that the new opt-out should not cover HSCIC’s already mandated data collections, such as Hospital Episode Statistics (HES) data. The Review believes it is important that there is consistency and therefore where there is a mandatory legal requirement for data in place, opt-outs would not apply.” – that’s all data going to most commercial entities.
  12. Video 4 is of most interest.

First press comment is now up.

With a pending consultation, it matters that the people who wish their data is used, and those who wish it not to be used, can both know, based on evidence, that their wishes were each honoured.

There has to be a better way to find out how your data has been used than reading google’s press releases.

2016 Digital Economy Bill

On the day that Tory MPs vote on a new leader, with the Home Secretary who tore up an ID card on her first day in office in the lead, the Government has introduced legislation to bring the database state back via the side door.

s38 of the Digital Economy Bill may require sharing of births, marriages, and deaths across the public sector in bulk without individual consent.

s29 as written allows sharing of medical information to anywhere in the public sector, or commercial companies providing public services, if it may increase “contribution to society”.

The National Data Guardian is not placed on a statutory footing.

As the Conservative leadership election moves forward, it seems that the database state is back.

 


 

Update: The Cabinet Office have been in touch to say:

Para 18 of the government response clearly states:
18.       The Government acknowledges the importance of health and social care data in multi-agency preventative approaches and early intervention to prevent harm. We will do further work with the National Data Guardian following the publication of her review/report to consider how health data is best shared in line with her recommendations.

As a result health bodies are out of scope of the powers in the draft regulations.

The Bill itself contains no such exclusion, and many local authorities have been lobbying for precisely that access. We will look to clarify with a probing amendment at committee stage, but appreciate the press office getting in touch.

medConfidential – mid June update

We’ll have more on implementation of the hospital data opt-outs when the dust has settled after the referendum.

“Intelligent Transparency”

According to a letter from a Minister, “Intelligent Transparency” is the new goal. We hope that all Department of Health decisions will prove “intelligent” from a patient view, and not just the political priorities of their desk in Whitehall.

Will transparency extend to telling you how your data has been used?  Or is that the sort of intelligence they don’t want you to have?

Tech startups are no magic bullet

We’re waiting for a response from the Regulators about DeepMind’s project at the Royal Free Hospital Trust. Whatever they say, we note that Google has now made public commitments to move towards the transparency expected of them. Regulators are still investigating, and given the contradictory statements, it may take some time.

We look forward to seeing what they will tell the public about their experiments to replace doctors.

What can you do: The Hospital Episode Statistics consultation

The Hospital Episode Statistics cover data from the nation’s hospitals for over 25 years. The HSCIC is looking for everyone’s views on privacy in the data. We’ll have a long response in a few weeks, but you can quickly complete their survey (or just email enquiries@hscic.gov.uk with a subject of “HES PIA consultation”). You don’t need to answer all the questions – you can just say why it matters to you that your privacy and opt out applies to hospital data. 

Investigatory Powers Bill – Protections for Medical Records?

We welcome Home Office Minister John Hayes’ statements that additional protections for medical records will be added to the Investigatory Powers Bill.

He said: “I am prepared in this specific instance to confirm that the security and intelligence agencies do not hold a bulk personal dataset of medical records. Furthermore, I cannot currently conceive of a situation where, for example, obtaining all NHS records would be either necessary or proportionate.”

Additionally, because he “felt that it was right in the national interest, with the benefit of the wisdom of the Committee” … “I feel that the public expect us to go further” than currently on the face of the bill, because he “cannot bind those who hold office in the future, so it is important that we put additional protections in place.”

Having agreed in principle that there should be “additional protections”, there are multiple ways to implement them.

For these purposes, it is sufficient to consider that Bulk Personal Datasets are used where the identities of the individuals being targeted are unknown, and you need to search by attributes across whole databases rather than names. Think of it like searching your phone book by phone number, rather than by name.

 

Existing mechanisms to get this information

As a Home Office Minister speaking in Committee, he could not reasonably be expected to know of the existing gateways for doing precisely the things he envisaged needing to do, in the rare circumstances and for the exceptional reasons he had in mind.

In the course of an investigation, especially a terrorism incident, the police can ask the NHS questions. The police or agencies won’t be able to go fishing for answers; they can ask the relevant hospitals questions, and the hospitals can take a view on whether it is appropriate to answer, based on full details. A process can be followed which can command public confidence.

Doctors are permitted to override the common law duty of confidentiality and release such information to the police when they “believe that it is in the wider public interest,” under GMC guidance. After a terrorism event, it is inconceivable they would not do so. When investigators know what to ask for, they have the ability to use existing processes for those individuals’ details on a targeted basis, should they be relevant.

There is existing guidance on this, and if it needs to be updated, that does not prevent stronger protections on bulk access to medical records being added to the Bill.

Even if there is only a “risk” that those individuals may have been involved, or may be involved in terrorism in future, the duty of confidence for providing information to the Agencies was lifted in part 5 of the 2015 Counter Terrorism and Security Act.

The Home Office has lowered the bar of confidentiality protection dramatically over the last several years. Unamended, these powers remove it entirely.

 

What the protection must cover

The committee rightly identified that there must be protections for material relating to “patient information” as defined in section 251(10) of the National Health Service Act 2006, or relating to “mental health”, “adult social care”, “child social care”, or “health services”. All sections of that are important, although there are different ways to put them together.

It is insufficient to simply exempt data held by DH/NHS data controllers, as that does not cover social care, nor does it cover data with data processors contracted to the NHS (which is a different loophole of concern to the ICO).

The Agencies should also never be permitted to use covert means against the NHS or health professionals to acquire patient information.

Should the Agencies envisage a scenario in which there has been a secret incident, where medical professionals are not allowed to know the characteristics of a suspect and the search could only be done at some future point by the Agencies rather than now by medical staff, then some mechanism may be appropriate. This seems highly unlikely, but the Home Office may be able to make such a case to the satisfaction of both Houses of Parliament. We invite them to do so.

In that scenario, multiple levels of protection are likely to be necessary: a general ban on warrantry for such data, except where the Secretary of State responsible for the data has submitted to the Judicial Commissioner an approval for its handover and retention for a defined period, for a defined investigation, and no others.

In effect, this removes the Agencies’ permission to acquire the data, while retaining the ability of the Secretary of State elsewhere in Government to hand it over should they believe it appropriate. The Commissioner and the Intelligence and Security Committee should then be notified that this has been approved, and should state how many individual-level records were affected in any annual report covering the period.

Whatever the Home Office come up with, it must be robust and be seen to be robust. We remain happy to discuss this further with all parties.

Update on Google Deepmind’s NHS app – is it “just resting”?

It appears Google Deepmind has suspended use/“pilots” of their experimental app until they have received regulatory clarification – ie, until it is legal.

This whole incident was harmful to patient trust, harmful to the hospital, and harmful to Google. All because, it appears, of a desire to go faster than waiting a few weeks for regulatory and data approvals, using a bizarre cut’n’shut agreement.

The controversy has never been about whether the app would help clinicians with their patients. It has been entirely about what happened for people who were not patients of those clinicians. Some of the questions from over a week ago remain unanswered. “Collect it all” might have applied to version 1 of the app, but they now have first-hand experience of how it makes things harder, not easier.

Tech teams often like naming their work. Perhaps the next version will be “Streams 2: This time we read the regulations”…

Google Deepmind could have followed the rules about applications for direct care, and the usage of data for “development work” (ie, “secondary uses”). They just didn’t, for some reason that we will ask their independent reviewers to get to the bottom of.

The app deserves to come back safely, if the humans running the project can follow the rules to get the data and processes that they wish to bring to the NHS. Nothing in our understanding of what they were doing, and the existing rules, should have prevented them from doing this “pilot” (apparently) entirely legally, with conventional legal agreements. They simply didn’t do so.

If Google Deepmind choose to walk away from the project, it won’t be because they wanted to help the NHS; it’ll be because they wanted to help only on their terms. For the hospital, and the NHS more widely, it is yet another reminder that some offers of help may come with too high a price.

MedConfidential comment on Friday’s New Scientist revelations about Google Deepmind

 

Extraordinarily, the New Scientist has quoted Google as having used an unregulated algorithm in the direct care of patients[1].

This follows up on previous news that Google Deepmind had acquired millions of detailed patient histories for unclear purposes[2]. Google Deepmind’s response was to focus on the fact that they were keeping the data safe[3], and to ignore questions over what they were doing with it and whether they should have had it in the first place[4].

MedConfidential has long argued that every patient should be able to know how data about them has been used. If there had been a Ministerial commitment to do that, this mess of unanswered questions would not have happened.[5]

Announced yesterday, it is Government policy to “encourage and support data-driven techniques in policy and service delivery”. Innovation is welcome and vital, but it should be grounded in medical ethics and a clinical relationship, and not ride roughshod over processes in place to protect all involved.[6]

Responding to the latest information, MedConfidential coordinator Phil Booth said:

“Deepmind has spent a fortnight hiding behind the NHS. It’s now clear that this was an unregulated “development” project for Deepmind, but a patient care project for the NHS.

“These algorithms evolve: errors get fixed, improvements get made. What approvals did Deepmind have from the medical regulators at the early stages? As the provider of a tool used in direct care, they are responsible for ensuring it meets all safety standards.

“Training doctors to make safe decisions takes years, and requires many exams to be passed. Have Google shown that each version used in direct care met all relevant grades, standards, and regulations?”

-ends-

For immediate or future interview, please email coordinator@medconfidential.org 

Notes to editors:

 

  1. See https://www.newscientist.com/article/2088056-exclusive-googles-nhs-deal-does-not-have-regulatory-approval/ “We [Deepmind] and our partners at the Royal Free are in touch with MHRA regarding our development work.”

 

  2. See https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/ and http://techcrunch.com/2016/05/04/concerns-raised-over-broad-scope-of-deepmind-nhs-health-data-sharing-deal/

 

  3. Google’s self-defence https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder refers to their self-reported scores in the IG Toolkit https://www.igt.hscic.gov.uk/AssessmentReportCriteria.aspx?tk=424999242358961&lnv=3&cb=e8c1aaf1-c40d-45af-9bb9-adc46c712924&sViewOrgId=49979&sDesc=8JE14. Those scores have not yet been audited by the HSCIC.

 

  4. The question of why Google Deepmind had the histories of people who never had a blood test at the relevant hospital, and who may never return to the hospital, remains unanswered.

 

  5. Much like a bank statement, every patient should be able to see a data usage report, which tells them where data about them has been used, and why, and what the benefits of that usage were. A commitment to investigate implementation was made in late 2014, but remains delayed by the Caldicott Review of Consent. For more, see https://medconfidential.org/2014/what-is-a-data-usage-report/

 

  6. MHRA rules require medical devices to have appropriate pre-approved procedures in place to confirm they’re working as expected, and to ensure any conceivable failures have mitigations considered in advance. The New Scientist article confirms they do not have those approvals as the algorithms in their software develop.

Google Deepmind – part 1

 

[this piece covers the state of play as on Sunday 8th May. It may be updated or replaced as new facts emerge]

If you are unwell: seek medical attention. These issues should not prevent you getting the care you need. The below discussion only relates to one Trust, the Royal Free in London, for all patient hospital events since sometime in 2010.

Last summer, following medConfidential’s work on care.data, Dame Fiona Caldicott was asked to review consent in the NHS. That report, which provides recommendations, has still not been published. Patients should be able to know every way data about them has been used, as a condition of using that data – contracts shouldn’t allow secrets from patients.

Following a New Scientist article, there’s been a lot of press discussion about google deepmind receiving over 5 years of detailed medical data from the Royal Free NHS Trust in London. This project is steeped in secrecy, hiding details from patients and the public.

Concerns have not been about the patients whose information would be displayed in this app. Concerns are solely the data of the patients whose data could never be displayed in the app, as they have never had any of the blood tests (etc) it displays. That is 5 in every 6 patients. For the other 1 in 6, there is a potential benefit.

When we were first approached, our initial question was “what are they doing with this?” – details were hidden and emerged only through press investigations.

It looked like what Deepmind were doing should have been a research project – but it had not followed any ethics or research processes. It was using a dataset from the “Secondary Uses Service” – which strongly suggested this was a secondary use.

Data can be used for direct care – the care given to you by a doctor or other clinician. It is also used for other purposes, called “secondary uses”. These include purposes such as research, and the design of models for calling people in for screening (including for detection of kidney problems).

The New Scientist piece was published last Friday, and the question remained unanswered until Wednesday. In an appearance on Radio 4, it emerged that the reason they had followed none of the research processes was simple: it wasn’t research. It was claimed to be for direct care. The Professor speaking went on to detail the limits that clinical rules and ethics place on who can access data for direct care.

As a result, on Wednesday afternoon, the question changed: who is the direct care (ie clinical) relationship between?

Deepmind have made a case that they will look after the data – we have no reason to question that separate point. This is not about losing data; it’s about whether they should have had most of it in the first place. What data should they have, and how should they have got it?

To answer that question, it has to be clear what they are doing. It is not.

More generally, to have confidence, patients should know how data about them has been used. What is Deepmind hiding in this case? And why? Will they give a full accounting of how they’ve used patient data, and what for, and what happened in direct care as a result?

Every data flow in the NHS should be consensual, safe, and transparent.

Why does google think what it does with the medical history of patients can be secretive, invasive, and possibly harmful?

Throughout most of medConfidential’s work, we are able to say “opting out will not affect the care you receive”, because large amounts of work have been done by all sides to make sure it does not. If you opt out of “secondary uses” of your data released by HSCIC, it does not affect care compared to someone who did not opt out. Due to the lack of process, and the corners cut by google deepmind in avoiding all the relevant processes, that may not necessarily be true here. We hope the Trust will clarify what their opt out does. If you didn’t want your data handed to google for speculative purposes, what happens if you get injured and show up at the Royal Free’s A&E? How is your care affected? Did they cut that corner too?

Patients should not be punished for deepmind’s cut corners.

Scalpels Save Lives

Our friends in the research world promote that #datasaveslives, and it does, just like scalpels do.

To be completely clear, deepmind have said that their project is “not research”. That’s why they didn’t follow any research processes. There are 1500 projects which followed the proper processes and appear on the “approved data releases” register – the Deepmind project is not one of them.

Data, and good data hygiene, is as much a requirement of a modern hospital as sterile scalpels. Following the right processes to provide sterile instruments is not seen as an “unnecessary burden”, even if accountants may wish to cut costs due to the expense. Scalpels have to be sterile for a very good reason.

Similarly, the processes put in place to protect data are of around the same level of importance as adequate cleaning. They may seem like an unnecessary burden to some – until too little cleaning causes problems that clearly demonstrate the necessity of what was previously decried as too much. Those who cut corners are rarely the ones who suffer from the decision. There is a fundamental difference between causation and correlation.

Deepmind seem to be a powerful new tool.

Were it an instrument to be used in surgery, it would not be enough for it to be powerful and new; it must also be safe. Otherwise the harm can be significant.

Rather than clean and safe, it seems deepmind is covered in toxic waste.

It’s not that deepmind couldn’t go through the processes to ensure safety. We don’t know why they didn’t.

Deepmind might be a better instrument, or it might be the new nightmare drug. Technology tools aren’t a panacea. Have lessons been learnt after the “epic failure” of “Google flu trends”?

Research, testing, and regulatory oversight are designed to prove that changes are safe, and to correct any unintended harms to patients as the process proceeds.

How much of that happened in this case? 

If Google DeepMind publish attributable and citable comments in response to these questions, we’ll link to them.