
medConfidential – mid June update

We’ll have more on implementation of the hospital data opt-outs when the dust has settled after the referendum.

“Intelligent Transparency”

According to a letter from a Minister, “Intelligent Transparency” is the new goal. We hope that all Department of Health decisions will prove “intelligent” from the patient’s point of view, and not just according to the political priorities of a desk in Whitehall.

Will transparency extend to telling you how your data has been used?  Or is that the sort of intelligence they don’t want you to have?

Tech startups are no magic bullet

We’re waiting for a response from the Regulators about DeepMind’s project at the Royal Free Hospital Trust. Whatever they say, we note that Google has now made public commitments to move towards the transparency expected of them. Regulators are still investigating, and given the contradictory statements, it may take some time.

We look forward to seeing what they will tell the public about their experiments to replace doctors.

What can you do: The Hospital Episode Statistics consultation

The Hospital Episode Statistics cover data from the nation’s hospitals for over 25 years. The HSCIC is looking for everyone’s views on privacy in the data. We’ll have a long response in a few weeks, but you can quickly complete their survey (or just email enquiries@hscic.gov.uk with a subject of “HES PIA consultation”). You don’t need to answer all the questions – you can just say why it matters to you that your privacy and opt out applies to hospital data. 

Investigatory Powers Bill – Protections for Medical Records?

We welcome Home Office Minister John Hayes’ statements that additional protections for medical records will be added to the Investigatory Powers Bill.

He said: “I am prepared in this specific instance to confirm that the security and intelligence agencies do not hold a bulk personal dataset of medical records. Furthermore, I cannot currently conceive of a situation where, for example, obtaining all NHS records would be either necessary or proportionate.”

Additionally, because he “felt that it was right in the national interest, with the benefit of the wisdom of the Committee”, he accepted that “I feel that the public expect us to go further” than what is currently on the face of the Bill, since he “cannot bind those who hold office in the future, so it is important that we put additional protections in place”.

Having agreed in principle that there should be “additional protections”, the question becomes how to implement them; there are multiple ways to do so.

For these purposes, it is sufficient to consider that Bulk Personal Datasets are used where the identities of the individuals being targeted are unknown, and you need to search by attributes across whole databases rather than names. Think of it like searching your phone book by phone number, rather than by name.
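As a purely illustrative sketch (toy data and field names of our own invention, not any real dataset or system), the difference between a targeted lookup by name and a bulk, attribute-based search looks roughly like this:

```python
# Illustrative only: a toy "phone book" showing the difference between
# a targeted lookup (you know who you are asking about) and a bulk,
# attribute-based search (you trawl the whole dataset for a pattern).

phone_book = {
    "Alice": {"number": "020 7946 0001", "area": "London"},
    "Bob":   {"number": "0161 496 0002", "area": "Manchester"},
    "Carol": {"number": "020 7946 0003", "area": "London"},
}

# Targeted request: ask about one named person
# (like asking a specific hospital about a specific suspect).
print(phone_book["Alice"]["number"])

# Bulk, attribute-based search: scan every record for a characteristic
# (like searching an entire database by phone number or area, not by name).
matches = [name for name, record in phone_book.items()
           if record["number"].startswith("020")]
print(matches)  # ['Alice', 'Carol']
```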

 

Existing mechanisms to get this information

As a Home Office Minister speaking in Committee, he would have had no reason to be aware of the existing gateways that already allow precisely the things he envisaged needing to do in rare circumstances, for the exceptional reasons he thought they might need to be done.

In the course of an investigation, especially a terrorism incident, the police can ask the NHS questions. The police or agencies cannot go fishing for answers, but they can put questions to the relevant hospitals, and the hospitals can take a view on whether it is appropriate to answer based on the full details. A process can be followed which commands public confidence.

Under GMC guidance, doctors are permitted to override the common law duty of confidentiality and release such information to the police when they “believe that it is in the wider public interest”. After a terrorism event, it is inconceivable that they would not do so. When investigators know what to ask for, they can use existing processes to obtain those individuals’ details on a targeted basis, should they be relevant.

There is existing guidance on this, and if it needs to be updated, that does not prevent stronger protections against bulk access to medical records being added to the Bill.

Even if there is only a “risk” that those individuals may have been involved, or may be involved in terrorism in future, the duty of confidence that would otherwise restrict providing information to the Agencies was lifted by Part 5 of the Counter-Terrorism and Security Act 2015.

The Home Office has lowered the bar of confidentiality protection dramatically over the last several years. Unamended, these powers remove it entirely.

 

What the protection must cover

The Committee rightly identified that there must be protections “for material relating to ‘patient information’ as defined in section 251(10) of the National Health Service Act 2006, or relating to ‘mental health’, ‘adult social care’, ‘child social care’, or ‘health services’”. All sections of that are important, although there are different ways to put them together.

It is insufficient to simply exempt data held by DH/NHS data controllers, as that does not cover social care, nor does it cover data with data processors contracted to the NHS (which is a different loophole of concern to the ICO).

The Agencies should also never be permitted to use covert means against the NHS or health professionals to acquire patient information.

Should the Agencies be able to construct a scenario in which there has been a secret incident, where medical professionals are not allowed to know the characteristics of a suspect, and where the search can only be done at some future point by the Agencies rather than immediately by the medical staff, then some mechanism may be appropriate. This seems highly unlikely, but the Home Office may be able to make such a case to the satisfaction of both Houses of Parliament. We invite them to do so.

In that scenario, multiple levels of protection are likely to be necessary: a general ban on warrantry for such data, except where the Secretary of State responsible for the data has submitted to the Judicial Commissioner an approval for its handover and retention for a defined period for a defined investigation, and no others.

In effect, this removes from the Agencies the permission to acquire the data themselves, while retaining the ability of the Secretary of State elsewhere in Government to hand it over should they believe it appropriate. The Commissioner and the Intelligence and Security Committee should then be required to be notified that this has been approved, and any annual report covering the period should state how many individual-level records were affected.

Whatever the Home Office come up with, it must be robust and be seen to be robust. We remain happy to discuss this further with all parties.

Update on Google Deepmind’s NHS app – is it “just resting”?

It appears Google Deepmind has suspended use/“pilots” of their experimental app until they have received regulatory clarification – ie, until it is legal.

This whole incident was harmful to patient trust, harmful to the hospital, and harmful to Google. All because, it appears, there was a desire to move faster than waiting a few weeks for regulatory and data approvals would allow, and so a bizarre cut’n’shut agreement was used instead.

The controversy has never been about whether the app would help clinicians with their patients. It has been entirely about what happened to the data of people who were not patients of those clinicians. Some of the questions from over a week ago remain unanswered. “Collect it all” might have applied to version 1 of the app, but they now have first-hand experience of how that approach makes things harder, not easier.

Tech teams often like naming their work. Perhaps the next version will be “Streams 2: This time we read the regulations”…

Google Deepmind could have followed the rules about applications for direct care, and the usage of data for “development work” (ie, “secondary uses”). They just didn’t, for some reason that we will ask their independent reviewers to get to the bottom of.

The app deserves to come back safely, if the humans running the project can follow the rules for getting the data and processes that they wish to bring to the NHS. Nothing in our understanding of what they were doing, and of the existing rules, should have prevented them from doing this “pilot” (apparently) entirely legally, with conventional legal agreements. They simply didn’t do so.

If Google Deepmind choose to walk away from the project, it won’t be because they wanted to help the NHS; it’ll be because they wanted to help only on their terms. For the hospital, and the NHS more widely, it is yet another reminder that some offers of help may come with too high a price.

MedConfidential comment on Friday’s New Scientist revelations about Google Deepmind

 

Extraordinarily, the New Scientist has quoted Google as having used an unregulated algorithm as part of the direct care of patients[1].

This follows previous news that Google Deepmind had acquired millions of detailed patient histories for unclear purposes[2]. Google Deepmind’s response was to focus on the fact that they were keeping the data safe[3], and to ignore questions about what they were doing with it, and whether they should have had it in the first place[4].

MedConfidential has long argued that every patient should be able to know how data about them has been used. If there had been a Ministerial commitment to do that, this mess of unanswered questions would not have happened.[5]

As announced yesterday, it is Government policy to “encourage and support data-driven techniques in policy and service delivery”. Innovation is welcome and vital, but it should be grounded in medical ethics and a clinical relationship, and not ride roughshod over the processes in place to protect all involved.[6]

Responding to the latest information, MedConfidential coordinator Phil Booth said:

“Deepmind has spent a fortnight hiding behind the NHS. It’s now clear that this was an unregulated “development” project for Deepmind, but a patient care project for the NHS.

“These algorithms evolve: errors get fixed, improvements get made. What approvals did Deepmind have from the medical regulators at the early stages? As the provider of a tool used in direct care, they are responsible for ensuring it meets all safety standards.

“Training doctors to make safe decisions takes years, and requires many exams to be passed. Have Google shown that each version used in direct care met all relevant grades, standards, and regulations?”

-ends-

For immediate or future interview, please email coordinator@medconfidential.org 

Notes to editors:

 

  1. See https://www.newscientist.com/article/2088056-exclusive-googles-nhs-deal-does-not-have-regulatory-approval/  “We [Deepmind] and our partners at the Royal Free are in touch with MHRA regarding our development work.”

  2. See https://www.newscientist.com/article/2086454-revealed-google-ai-has-access-to-huge-haul-of-nhs-patient-data/ and http://techcrunch.com/2016/05/04/concerns-raised-over-broad-scope-of-deepmind-nhs-health-data-sharing-deal/

  3. Google’s self-defence (https://www.theguardian.com/technology/2016/may/06/deepmind-best-privacy-infrastructure-handling-nhs-data-says-co-founder) refers to their self-reported scores in the IG Toolkit (https://www.igt.hscic.gov.uk/AssessmentReportCriteria.aspx?tk=424999242358961&lnv=3&cb=e8c1aaf1-c40d-45af-9bb9-adc46c712924&sViewOrgId=49979&sDesc=8JE14). Those scores have not yet been audited by the HSCIC.

  4. The question of why Google Deepmind had the histories of people who never had a blood test at the relevant hospital, and who may never return to the hospital, remains unanswered.

  5. Much like a bank statement, every patient should be able to see a data usage report, which tells them where data about them has been used, and why, and what the benefits of that usage were. A commitment to investigate implementation was made in late 2014, but remains delayed by the Caldicott Review of Consent. For more, see https://medconfidential.org/2014/what-is-a-data-usage-report/

  6. MHRA rules require medical devices to have appropriate pre-approved procedures in place to confirm they’re working as expected, and to ensure any conceivable failures have mitigations considered in advance. The New Scientist article confirms they do not have those approvals as the algorithms in their software develop.

Google Deepmind – part 1

 

[This piece covers the state of play as of Sunday 8th May. It may be updated or replaced as new facts emerge.]

If you are unwell: seek medical attention. These issues should not prevent you getting the care you need. The discussion below relates only to one Trust, the Royal Free in London, and covers all patient hospital events since sometime in 2010.

Last summer, following medConfidential’s work on care.data, Dame Fiona Caldicott was asked to review consent in the NHS. That report, which will provide recommendations, has still not been published. Patients should be able to know every way data about them has been used, as a condition of using that data – contracts shouldn’t allow secrets from patients.

Following a New Scientist article, there has been a lot of press discussion about Google DeepMind receiving over 5 years of detailed medical data from the Royal Free NHS Trust in London. This project is steeped in secrecy, hiding details from patients and the public.

Concerns have not been about the patients whose information would be displayed in this app. They are solely about the data of the patients whose data could never be displayed in the app, because they have never had any of the blood tests (etc.) it displays. That is 5 in every 6 patients. For the other 1 in 6, there is a potential benefit.

When we were first approached, our initial question was “what are they doing with this?” – details were hidden and emerged only through press investigations.

It looked like what DeepMind were doing should have been a research project – but it had not followed any ethics or research processes. It was using a dataset from the “Secondary Uses Service” – which strongly suggested this was a secondary use.

Data can be used for direct care – the care given to you by a doctor or other clinician. It is also used for other purposes, called “secondary uses”. These include purposes such as research, and the design of models for calling people in for screening (including for detection of kidney problems).

The New Scientist article was published last Friday, and the question remained unanswered until Wednesday. In an appearance on Radio 4, it emerged that the reason they had followed none of the research processes was simple: it wasn’t research. It was claimed to be for direct care. The professor speaking went on to detail the limits that clinical rules and ethics put on who can access data for direct care.

As a result, on Wednesday afternoon, the question changed to: who is the direct care (ie clinical) relationship between?

DeepMind have made a case that they will look after the data – we have no reason to question that separate point. This is not about losing data; it’s about whether they should have had most of it in the first place. What data should they have, and how should they have got it?

To answer that question, it has to be clear what they are doing. It is not.

More generally, to have confidence, patients should know how data about them has been used. What is Deepmind hiding in this case? And why? Will they give a full accounting of how they’ve used patient data, and what for, and what happened in direct care as a result?

Every data flow in the NHS should be consensual, safe, and transparent.

Why does Google think that what it does with the medical history of patients can be secretive, invasive, and possibly harmful?

Throughout most of medConfidential’s work, we are able to say “opting out will not affect the care you receive”, because large amounts of work have been done by all sides to make sure it does not. If you opt out of “secondary uses” of your data released by the HSCIC, it does not affect your care compared to someone who did not opt out. Due to the lack of process, and the corners cut by Google DeepMind in avoiding all the relevant processes, that may not necessarily be true here. We hope the Trust will clarify what their opt-out does. If you didn’t want your data handed to Google for speculative purposes, what happens if you get injured and show up at the Royal Free’s A&E? How is your care affected? Did they cut that corner too?

Patients should not be punished for DeepMind’s cut corners.

Scalpels Save Lives

Our friends in the research world promote the message that #datasaveslives – and it does, just as scalpels do.

To be completely clear, DeepMind have said that their project is “not research”. That’s why they didn’t follow any research processes. There are 1,500 projects which followed the proper processes and appear on the “approved data releases” register – the DeepMind project is not one of them.

Data, and good data hygiene, is as much a requirement of a modern hospital as sterile scalpels. Following the right processes to provide sterile instruments is not seen as an “unnecessary burden”, even if accountants may wish to cut costs due to the expense. Scalpels have to be sterile for a very good reason.

Similarly, the processes put in place to protect data are of around the same level of importance as adequate cleaning. They may seem an unnecessary burden to some – until too little cleaning causes problems that clearly demonstrate the necessity of what was previously decried as too much. Those who cut corners are rarely the ones who suffer from the decision. There is a fundamental difference between causation and correlation.

DeepMind seems to be a powerful new tool.

Were it an instrument to be used in surgery, it would not be enough for it to be powerful and new; it must also be safe. Otherwise the harm can be significant.

Rather than clean and safe, it seems DeepMind is covered in toxic waste.

It’s not that DeepMind couldn’t go through the processes to ensure safety. We don’t know why they didn’t.

DeepMind might be a better instrument, or it might be the new nightmare drug. Technology tools aren’t a panacea. Have lessons been learnt after the “epic failure” of Google Flu Trends?

Research, testing, and regulatory oversight are designed to prove that changes are safe. They also correct any unintended harms to patients as the process proceeds.

How much of that happened in this case? 

If Google DeepMind publish attributable and citable comments in response to these questions, we’ll link to them.

MedConfidential Update – Opt outs being honoured

If you have opted out, recently or before, your choices are now being honoured.

Thanks to all those who helped make this happen – especially you, our supporters, donors and friends.

The institutions involved did the right thing in the end, even if they tried all the other things first.

 

What just happened? Your opt out honoured

On Wednesday, the HSCIC announced that they had received permission from the Secretary of State to finally honour his promise to you. You can opt out of data leaving the HSCIC for purposes beyond your direct care, and that is now what happens. When he created the opt-out that you took up, NHS England, which was then responsible for it, didn’t think it would matter.

The tickbox that you and 1.2 million other people filled in is now being honoured. The announcement says it must be done by this time next week; in practice, we are happy that it is effective immediately.

Until the public consultation on the Caldicott Review, there are a small number of narrow temporary exceptions (3), and some temporary grey areas (5). But in the main, it is now done. If any of those exceptions or grey areas particularly concern you, please let us know. We’ll be writing to the HSCIC with some clarification questions next week.

The next hospital dataset to be released will be the cleaned-up “full year” data, which replaces the individual monthly releases covering April 2015 to March 2016. This is the critical release which really matters. Consent will be respected for this release, and data about those who have opted out will not be included.

The HSCIC has also undertaken with the Information Commissioner to reissue the 2014 – 2015 data to those who already received it. By contract, they are required to replace old data with new.  That undertaking is the direct result of a medConfidential complaint to the ICO.

GPs have been able to honour their part since you gave them the form.

In effect, for current and future projects, it is – as far as it could be – as if your opt-out for data leaving the HSCIC for purposes beyond your direct care had been honoured in April 2014.

What’s next?

The announcements this week are not the end of this process – there is a great deal left to do.

The Caldicott Review of Consent is going to propose a comprehensive and permanent solution. That solution should satisfy concerned patients into the long term, resolve the grey areas, and simplify the whole thing. It will be the subject of a public consultation, and then legislation.

But as of Wednesday, the current state is now consensual, increasingly safe, and somewhat transparent. Reducing the number of copies of data that are made will reduce the number that can be lost or stolen. More transparency will mean that you will know that your wishes have been honoured – you won’t have to trust that they have.

What else?

If you’ve previously had a discussion with your MP on this topic, you may wish to get back in touch with them and thank them for their help, now that the Department of Health has done the right thing, and your wishes are being respected.

MPs often hear about problems, and less often hear about what happened as a result of their help, especially in a long term project like this has been. (You should probably make clear that this is a thank you note – it might confuse their busy offices if it’s unclear…) Also, there was an election in the interim, and some MPs will have changed.

For us, it’s not getting any quieter. There are other organisations that don’t wish to act as if their world has changed. Most seriously, there are a few other projects that see the style-first approach of care.data as a handbook, not a cautionary tale…

It never ends. But this week, a lot got better as a result of our work and your help. Thank you for your support until now, and hopefully into the future.

 

 

PS – our especially deep gratitude to all those whose donations also helped. We couldn’t have done this without you.

MedConfidential comment welcoming the Wellcome Trust’s “One Way Mirror” Report

Today, the Wellcome Trust publish a new report on data sharing.

The name says everything data sharing shouldn’t be – and the report shows why.

We welcome another confirmation that organisations can maintain trust via transparency and shared knowledge. Data projects, including commercial data projects, can be handled safely, if the people in charge choose to do so. When they don’t, patients and citizens get nervous and trust collapses.

Care.data and others tried the “One Way Mirror” approach, and this report names “context collapse” as the point of public concern. Patients care what happens to their data and are wary about how it could be used beyond the context of their own healthcare, and so simple, complete, accessible and truthful explanations to patients are necessary. Otherwise, context collapse is certain, and like care.data, confidence collapse is sure to follow.

 

(MedConfidential Coordinator Sam Smith sat on the advisory group for this study)

First Thoughts: Government data: Copies of more than medical records?

The consultation is supposed to be about using data to help citizens; but the proposals and principles are about how Government thinks it can do one thing to help all citizens – that seems unlikely.

Yesterday, the Cabinet Office opened their consultation on copying everything but medical records. It is a consultation not about data, not about citizens, but about Government. It’s officially about “better” use of data, but “better” here seems to mean “more”, not “improved”.

Just as care.data was about NHS England rather than patients, the same #datacopying mistake has been made here.

In short, this consultation is the latest step in the ongoing data debacle of Government. Rather than suggest learning the lessons of care.data, most of it doubles down on repeating the failures by institutions and their shared worldview of an office near the Thames.

We find out within days what the Caldicott Review will recommend, and see where the NHS thinks this should go. If the Cabinet Office were accurate about having worked closely with DH, then this consultation does not look positive. 

A blog post by the Data Sharing network will appear shortly (we’ll update this post) on how the process reached this point.

The relationship to medical records

At the launch meeting for the consultation, the Cabinet Office said that the lessons of the Caldicott Review of consent had been considered, and that this consultation was produced working with the Department of Health team. I can only hope that the Cabinet Office paid as little attention to what DH were saying as they have paid to others.

The NHS number makes an oblique appearance, in part 3 below; although you will only spot it in the original consultation document if you already know that it’s there.


Newsletter: Care.Data’s suspension enters the terrible twos

It’s 2 years to the day since Care.Data was suspended amongst public outrage. The failed programme is showing no signs of restarting, as NHS England and the Department of Health continue to sift through old pampers, and keep finding yet more problems.

The Caldicott Review of Consent, which began after NHS England lied to the Care.Data Advisory Group, should report soon – unless those who want to water it down, to avoid having to make uncomfortable decisions, get their way. Why might they do that? Well…

 

Another Jeremy Hunt promise is broken – Your Hospital Data is still being sold

Before their January deadline, HSCIC finished the testing needed to implement the hospital data consent promise that Jeremy Hunt made to every patient – which 1 million patients who opted out took him up on. The final step was for Jeremy Hunt to give the go ahead to keep his promise. He didn’t.

Let us be clear: Jeremy Hunt made the patient promise 2 years ago, and it appears in the 2015 Conservative manifesto (pg 38): “We will give you full access to your own electronic health records, while retaining your right to opt-out of your records being shared electronically.” Only he can break his promise, and he has chosen to do so.

So when will the opt-outs be implemented? We look forward to hearing whatever answer the ICO receives shortly on exactly that question, as it responds to our complaint. The Department of Health are refusing to answer questions – which is understandable, as they don’t have any answers.

Your GP will honour your request for data not to leave your GP practice, both because of medical ethics and because of their direct connection to you. Who is Jeremy Hunt connected to?

The interim-type-2 opt out can be implemented tomorrow if Jeremy Hunt tells HSCIC to do it. Why hasn’t he?

You may wish to write to your MP, and ask the question, “when will the Secretary of State for Health implement patients’ choices to prevent data about them leaving the HSCIC for purposes beyond direct care?” – please also say why this matters to you. (and sorry the question is a bit of a mouthful)

This can be fixed. The Health Secretary just has to take the single action necessary to fix it, permanently.

A perfect overarching consent flag is something we support; but at best, it is a year away from being something a patient can ask their GP to do. No scenario, other than immediate implementation of the interim-type-2s, addresses the gap between now and then. A long-term, maybe-mythical “perfect” solution is currently the weapon of choice of those who want to prevent any patient choice over data usage at all – the choice in question being the consent choice (aka “interim-type-2”) which 1 million patients have requested be actioned, and for which they are all waiting patiently. When the first step down the path to consent has been taken for national datasets, there can be confidence that subsequent steps will be taken. If not, and the Department of Health breaks Jeremy Hunt’s promise this time, why should anyone believe them next time?

What’s next: Care.Data Everywhere?

On Friday, we’re expecting the Cabinet Office to launch their data copying consultation, which probably won’t have the subheading “care.data everywhere”, but unless they’ve fixed their compulsion to copy, it probably should have. It’s not all terrible news; the worst projects (probably) didn’t get this far – what the consultation will show is the stuff that they don’t think is terrible (that’s probably not reassuring).

Every project involved has had to explain how “it’s not like care.data because…”, but the Cabinet Office has seemingly learnt only the lessons convenient for it to learn. It’s hard for everyone to learn the right lesson when institutional incentives encourage people to learn easier ones.

The lack of critical thought across the programme appears in Parliament’s report on the “Big Data Dilemma”, which says the NHS could save £66bn from more data copying. Saving about two thirds of the NHS budget (equivalent to getting rid of all staff from the NHS) seems… unlikely.

We’ll see what the Cabinet Office consultation says over the weekend, and any health implications will appear in the next newsletter. The Caldicott Review is also due to be consulted on, if it ever gets published.

What’s Next: Saatchi Bill returns to the Lords

With the most problematic bits of the bill removed by MPs, the Saatchi Bill on “medical innovation” is now a mechanism to create new databases, and do so only with the approval of Parliament.  How is this different to care.data, which Tim Kelsey repeatedly said was “the will of Parliament”?

That’s a very good question. The main difference is whether Parliament says yes, or whether it chooses not to say anything. Currently, silence means support, which was the approach that failed catastrophically with care.data.

We’ll be looking to have conversations with their Lordships about an amendment to require Parliament to approve any plans, rather than simply not objecting. Especially as this Government is looking to remove the ability for the Lords to object to anything…

More soon, and we especially thank all those who have made donations.

 

MedConfidential Christmas Bulletin: Freedom, Care.Data and Space

It’s been a busy few weeks, as the Government came back from Conference season, and kicked their various schemes into high gear. In 2016, we’ll see data sharing across the NHS and Government taking up time: care.data may become a ministerial playbook.

Your support is greatly appreciated; and thanks to you and your loved ones at this time of year. But here’s where we are at the moment, if you wish to delay Christmas cheer just a little longer:

Care.Data.

Care.Data is still suspended while Dame Fiona Caldicott tries to unwrap Tim Kelsey’s leaving present. The programme will enter 2016 as it left 2014: still digging itself in deeper. New leadership for care.data was an opportunity to change that approach.

We’ve heard secondhand that a new Senior Responsible Owner, obliged to hold this poisoned chalice, has been handpicked from the few loyal bag carriers left in the care.data bunker. Which means he’ll have repeatedly made valiant attempts at defending the inept and the ill-considered. Indeed, the job description practically required blindly ignoring the fact that the ship was sinking until bailed out by his boss. With the Admiral’s hat now his to don, it will be interesting to see whether it is full steam ahead into the iceberg of public rejection, yet again.

Dame Fiona Caldicott’s review of consent reports at the end of January, with Ministerial decisions in the months after that. Past NHS management has been good at persuading ministers to put their reputation behind the publicly indefensible until it becomes evident, even to the Department of Health, that perhaps that was unwise. At the last Care.Data Advisory Committee meeting, it was grudgingly admitted that the September roll out was halted by Jeremy Hunt himself…

Given Cabinet level discussions about data sharing, and the scope of opt-outs and consent, 2016 should be a busy year for data in the NHS and beyond. It seems some see care.data as a model to be copied. As always, the first question is whether the Government or NHS England wishes to constructively engage, or cower in a corner and ignore those who will point out necessary implementation changes. That choice is entirely up to them.

Your Right To Know

The CoverUp Commission has found that the public quite like the ability to request copies of Government documents, in acts of citizen-driven, focussed transparency. Thank you for helping with that…

MedConfidential submitted a brief note of our own experiences of FOI, and also a saveFOI.uk submission of 260 different successful FOI requests (or outcomes from multiple requests), many submitted by you and others. The saveFOI.uk submission asked a simple question: which of these questions does Lord Burns think shouldn’t have been answered?

Power likes secrecy, and “Burns it” would have been a common refrain in Tim Kelsey’s archipelago of NHS England. Freedom of Information is how the details of care.data were forced to be published. The deep veil of official secrecy continues to hide the bulk of Tim Kelsey’s legacy, which hopefully will start to burn up over time.

Not everyone gets to be an astronaut.

Everyone in the NHS wants to help improve the health of the nation, but that’s not the same thing as giving direct care. In the same way, lots of people helped put a man on the moon without being astronauts. Every child eventually learns that not everyone gets to be an astronaut; and sometimes it’s a hard transition.

Tim Kelsey, who wanted to sell all medical records before his term was out, leaves NHS England today to take up a new post in Australia, but assured us he “will be back”.

Transitioning to consensual, safe and transparent data handling practices is as important for a hospital as good cleaning or sterile instruments – and the same thing happens when you disregard it too much. “Sufficient” cleaning is too much of a burden until it’s self-evident that it was too little, and harm occurs. Hopefully, in 2016, NHS England will learn about data hygiene and air quality. The astronaut programme had the literal version of the same problem. Will there be a systematic response to a politically driven digital-MRSA infecting the NHS and beyond? If the problem is left to go away of its own accord, it always comes back.

Consensual, Safe and Transparent Christmas sharing

It’s been a busy few months, but we’re still here, and would like to continue to be. If you wish to support our work, a donation is always greatly appreciated.

With best wishes to you and your loved ones for Christmas and for the new year. May 2016 bring consensual, safe and transparent data flows throughout the NHS and beyond.

See you next year – we really couldn’t do this without you. Best wishes to one and all.

Sam and Phil