Category Archives: News

DeepMind try again – November 2016

DeepMind this morning re-announced their partnership with the Royal Free Hospital. Updates are at the bottom – details are in the 9:50 and 10:10 updates.

There’s apparently a new legal agreement to copy exactly the same data that caused so much controversy over the summer. We have not yet seen the new legal agreement, so can’t comment on what it permits or disallows.

Responding to the press release, Phil Booth, Coordinator of medConfidential said:

“Our concern is that Google gets data on every patient who has attended the hospital in the last 5 years and they’re getting a monthly report of data on every patient who was in the hospital, but may now have left, never to return.

“What your doctor needs to be able to see is the up-to-date medical history of the patient currently in front of them.

“The DeepMind gap – because the patient history is up to a month old – makes the entire process unreliable, and potentially makes the fog of unhelpful data even worse.

As DeepMind publish the legal agreements and PIA, we will read them and update comments here.


8:50am update. The DeepMind legal agreement was expected to be published at midnight. As far as we can tell, it wasn’t. Updated below.

TechCrunch have published a news article, and helpfully included the DeepMind talking points in a list. The two that are of interest (emphasis added):

  • An intention to develop what they describe as “an unprecedented new infrastructure that will enable ongoing audit by the Royal Free, allowing administrators to easily and continually verify exactly when, where, by whom and for what purpose patient information is accessed.” This is being built by Ben Laurie, co-founder of the OpenSSL project.
  • A commitment that the infrastructure that powers Streams is being built on “state-of-the-art open and interoperable standards,” which they specify will enable the Royal Free to have other developers build new services that integrate more easily with their systems. “This will dramatically reduce the barrier to entry for developers who want to build for the NHS, opening up a wave of innovation — including the potential for the first artificial intelligence-enabled tools, whether developed by DeepMind or others,” they add.

Public statements about Streams (an iPhone app for doctors) don’t seem to explain what that is. What is it?


9:30 update: The DeepMind website has now been updated. We’re reading.

The contracts are no longer part of the FAQ, they’re now linked from the last paragraph of text. (mirrored here)


9:40 update: MedConfidential is greatly helped in its work by donations from people like you.


9:50 update: Interesting what is covered by what…

[Three screenshots from the published documents, 22 November 2016]


10:10 update: What data does the DeepMind FHIR API cover? What is the governance of that API? Is it contractually, legally, and operationally independent of the Streams app?

(it’s clearly none of those things, as the above screenshots say).

DeepMind have made great play of their agreement being safe, but consent is determined in a Google meeting room, and the arrangements for the “FHIR API” are secretive and far from transparent.

There is likely to be only one more update today, around 1pm – unless Google make an announcement that undermines their contractual agreements.


1pm update: The original information sharing agreement was missing Schedule 1, and has been updated.


3:30 update: DeepMind have given some additional press briefings to Wired (emphasis added):

“Suleyman said the company was holding itself to “an unprecedented level of oversight”. The government of Google’s home nation is conducting a similar experiment…

“Approval wasn’t actually needed previously because we were really only in testing and development, we didn’t actually ship a product” – which is what they said last time, and the MHRA told them otherwise.

Apparently, negative headlines surrounding his company’s data-sharing deal with the NHS are being “driven by a group with a particular view to pedal” [sic]. The headlines are being driven by the massive PR push they have done since 2:30pm on Monday, when they put out a press release which talked only about the app, and mentioned data as an aside only in the last sentence of the first note to editors. Beware of the leopard.

As to our view, MedConfidential is an independent non-partisan organisation campaigning for confidentiality and consent in health and social care, which seeks to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Does Google Inc disagree with that goal? 

Data in the rest of Government: A register of data sharing (managed, rather than inadvertent)

The Codes of Practice for the Digital Economy Bill aren’t worth the paper they’re (not) printed on. They aren’t legally binding (para 11), and are effectively subservient to the ICO’s existing code, even while paragraph 60 pretends a single side of A4 is a valid Privacy Impact Assessment for data sharing for operational purposes.


As this is the case, why is a code of practice necessary under the legislation? It does nothing new. Is it solely to make it look more respectable than the dark and dank dodgy deal that it actually is?

In places such as supermarkets, you have a choice of whether to use a clubcard, and can easily use one of the other supermarkets that are available – Government doesn’t have competition promoting integrity. To ensure a citizen can see how Government uses data about them, there should be a statutory register of data sharing agreements (involving public sector data). A register prevents nothing (which seems to be the policy intent of the Bill), but is simply a list of stated intents. From the Register comes an informed discussion of what organisations are actually doing and sharing, rather than excessive secrecy and double dealing.

Opposition to a register comes from fear, based on Government’s lack of knowledge of what data it has, or currently shares. If you don’t have a clue where your data is, or why it’s there, you oppose a register because you don’t want to find out.

How this state of affairs came about is at the heart of this Bill.

We’ve previously posted about the definition of Personal Data in the Investigatory Powers Bill. What about in the non-secret parts of Government?

In 2010, the Cabinet Office told GCHQ that “to be considered personal data, a dataset has to contain at least the actual names of individuals.” GCHQ is subject to the national security exceptions of the Data Protection Act.

In March 2015, the term “bulk personal datasets” was used by Parliament, and entered common terminology, but it wasn’t until November 2015 that the full definition of the Data Protection Act was restored (with DPA exceptions for National Security).

But in the intervening seven months, the term gained increased currency within Government and was used much more widely as it crossed into the non-secret sphere. The Cabinet Office took the existing meaning and thinking and applied it elsewhere.

It was never noted that the definitions in the non-secret parts of Government should have been different, likely weren’t, and hence are possibly invalid under the DPA, because the narrow definition used for GCHQ was classified, and hence restricted. I.e. “actual names” is not the DPA standard.

So what effect did this have?

Following the TalkTalk hack, Government ran an exercise described as the “Cabinet Office audit 2016”, looking at what each Department held, and the impact of them losing it.

We made FoI requests about what each department held, and got very interesting answers (we excluded national security or serious crime).

The Cabinet Office hold no bulk personal data (apparently… ).

DCMS hold some bulk personal datasets – on people who have responded to their consultations (and some data from the Olympics)…  (erm…  what?)

The Department for Transport gave a much longer answer (but didn’t know how much was in each dataset).

Does the Government know what personal data it has, uses and shares, and where it keeps it? If so, did Departments share that list with the Cabinet Office when asked?

Since we can see they probably didn’t, are all uses of large datasets (whether they are considered personal data or otherwise) fully compliant with the definitions in the Data Protection Act?

How does the Bill and associated work help resolve this mess? 

Data in the rest of Government: a fertile breeding ground for fraud and misery

We have now published our submission on Part 5 of the Digital Economy Bill to the Committee scrutinising the Bill.

The Digital Economy Bill (Part 5) promotes a fertile breeding ground for fraud and misery: excessive data sharing, but only in secret.

The protections on sharing data with academic researchers are strict – people who work to increase human knowledge have to jump through a range of hoops to get funded, with critique and transparency; and then even more hoops to get data, again with critique and transparency; and even then, the Digital Economy Bill does not apply that process to health data for research until we see the outcome of the Caldicott process.

Yet, the Digital Economy Bill contains no prohibition on civil servants secretly sharing any data, including medical records. This includes secret sharing by Government Departments to inform operational decisions. There are no meaningful restrictions on copying of any data held by any public body. From anywhere, to anywhere except independent experts.

“If you give me six lines written by the hand of the most honest of men,
I will find something in them which will hang him.” – Cardinal Richelieu

In recent weeks, it has emerged that a citizen was stripped of housing benefits because data showed they were cohabiting with “Joseph Rowntree”, the 19th Century philanthropist whose modern legacy includes a Housing Trust which bears his name, and which was that woman’s social landlord. The DWP contractors used just enough data to create “evidence” that reinforced their prejudice, but not enough to realise their “evidence” was lunacy. This Bill makes those events more common, more harmful, and more opaque.

The Cabinet Office believe that they should be trusted with everything, and independent academics can be trusted with almost nothing. Neither of those are likely to be right.

That is learning the wrong lesson from care.data. Academic use of data to improve the public knowledge was not what people objected to. It was the secrecy and other purposes.

The Cabinet Office clearly don’t understand that their approach is part of the problem. More copying with the same rules was care.data; this Bill is more copying with fewer rules, less transparency, and less oversight. No one suggests that fraud prevention should be subject to consent, but Parliament should be able to assess whether the approach has worked.

The existing minimal oversight of Parliament goes away, and Departments can do what they wish. A statutory basis for disclosure to prevent fraud is entirely sensible – individuals shouldn’t get that choice – but in addition to sharing all the data, the Government demands secrecy as to whether that sharing was effective.

If governance is effective, then it should apply to all. If it’s ineffective, then it shouldn’t happen at all. Parliament needs to pick one.

The reason the last Government didn’t want good governance and accountability to apply to Government is that the governance and accountability process works.

An accountability process that is entirely absent from most parts of this Bill.

Which way is the new Government going to go?

Mid-September Update

It’s been a busy few weeks for consultations and briefings. While links are (usually) tweeted as we publish them, the below is a consolidated summary of events. We’ll send a newsletter in the next week or so, with other information, including last week’s re-announcement of the new NHS apps library.

Following the change in Prime Minister, there is a new Minister for these issues in the Department of Health. We welcome Nicola Blackwood MP into the post. Her track record as chair of the Science and Technology committee stands her in very good stead, as does her first public speech on her new remit.

The Caldicott Consultation

NHS England has announced a series of three “discussion events” in London, Southampton, and Leeds. They start three weeks after the consultation closed, and end three weeks before the Department aims to respond.

Our main submission to the Caldicott Consultation was submitted before the deadline. A supplementary will follow, because of subsequent events.

Updated October 10th: first supplementary publication.

Public Health England

The contents of this consultation, and this Report, suggest that the lessons of care.data have not yet reached Public Health England. With the Government response to the Caldicott Review, they will have to. This will be a focus of our first supplementary submission to the Caldicott Consultation.

Update (19/Sep): We’ve published our blood spot consultation response.

Legislation

The Investigatory Powers Bill has added an extra tickbox if the Agencies want access to health data. Ironically, this is one area where those overseas have greater protections than those in the UK – there is far more concern about the reaction of foreign governments/press than there is about our own.

The Digital Economy Bill has also begun lumbering through Parliament, published shortly after the Referendum with relatively little content. We’ll have more than our initial briefing in due course. The Cabinet Office’s “data science” work could easily justify care.data, and the Cabinet Office doesn’t seem to understand what personal data is. The problems with this Bill are solely of the Cabinet Office’s making.

Anonymisation

As HSCIC has now confirmed (page 246), we have a complaint in with the Information Commissioner. It covers the “definition” of anonymisation used to exclude opt outs being applied to hospital data. Here is a short summary of what anonymisation is (and isn’t) in 2016. It is easy to forget the full meaning of “personal data” when there’s a desire to ignore it.

We also responded to the Privacy Impact Assessment consultation on the Hospital Episode Statistics.

What’s in a name?

HSCIC has changed its trading name to NHS Digital. For the next little while, we will use NHS Digital to refer to NHS services provided by the organisation, and HSCIC to refer to actions as an Arms Length Body of the Department of Health. Despite the name, NHS Digital is not an NHS body: it is accountable to Secretary of State, not NHS England.

How much confidence does the Department of Health have in Electronic Health Records? Invoice Reconciliation and the DH Caldicott Consultation.

“By 2018, clinicians in primary care, urgent and emergency care and other key transitions of care contexts will be operating without the use of paper records” says the Department of Health. MedConfidential agrees that Electronic Health Records to pass information electronically along a patient’s care pathway will lead to better care and better patient outcomes (and better privacy), but that’s not all they do.

To ensure the uptake of flows along care pathways, there should be transparency of process. As part of a Data Usage Report, patients should be able to see where their EHRs have gone and why. The concern that data could go anywhere is mitigated by telling patients exactly where it did go, so patients can have confidence it didn’t go elsewhere. The requirement for all care providers to use the NHS number makes this feasible.

There will also be published statistics on the adoption of EHRs. Those statistics should also include what percentage of patients arrive at an organisation with an EHR, or how many need an NHS number lookup to create a record (organisations with walk-in patients, including A&E, should be excluded). By institution, they don’t tell us very much, but when you look at pairs of institutions, you can see patterns. How many EHRs did hospital A send to care provider B? How many did B receive from A? Where does this process go wrong, and is there anyone chasing up why and fixing it?

In practice, no. Because there’s no strong incentive to.

The place that the chasing up does happen, predictably, is when money becomes involved. The NHS has a balance of accountants whose job it is to make sure that one bit of the NHS bills the other bits the right amounts for the care provided. Given NHS bodies don’t trust each other, those other bits of the NHS then have a different balance of accountants to check all the invoices.

For years, there has been a “temporary” arrangement where those accountants could see identifiable data, because that’s what had to be on invoices otherwise the accountants wouldn’t necessarily pay them. That temporary arrangement should have expired many times, but it keeps being renewed, as the accountants have never changed how they work, and the system doesn’t trust itself. The current incentives are wrong.

It was outside the Caldicott Review’s remit (and time) to look at them, and so the Review had no choice but to suggest that the accountants can continue to look at identifiable information.

In a world with Electronic Health Records that flow along care pathways with patients, that doesn’t have to happen – the constraint on the Review should not apply to the Department. The reporting on those flows can include a summary of care provided at the previous stage from that provider, which provides a separately accountable (to the CCG via HSCIC) reporting stream which the accountants can rely on. As it is derived from clinical data, any fraud by commercial actors in the system would require clinical fraud as well as accounting fraud, with its far higher penalties. If the counts of care provided from one side are very different from the other, that can be examined as a clinical issue as well as a financial one.

“Should we pay this invoice?” becomes a simple question based on audited information from multiple automated sources. The counts will show whether all care provided along pathways has yet been paid for by the relevant CCG. Where there are queries, it means there was an EHR flaw which should be addressed, not just for financial reasons, but to improve care. (This is in need of refinement, but the likely question for adoption is “What percentage of new records have an NHS number entered manually, rather than via an electronic records transfer?”. Care per provider per CCG is derivable from EHR flows by third parties).
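The cross-check described above – comparing what one provider reports sending against what the next provider (or the paying CCG) reports receiving – could be sketched roughly as follows. This is purely illustrative: the provider names, counts, and data shapes are our own assumptions, not anything specified by the Review or the Department.

```python
from collections import Counter

def reconciliation_queries(sent, received):
    """Compare two independently reported flows of care records.

    `sent` and `received` are lists of (provider, payer) pairs: one list
    built from the sending provider's EHR transfers, the other from the
    records the receiving side logged. Any pair whose counts disagree is
    flagged for follow-up as both a financial and a clinical query,
    without anyone needing to see identifiable patient data.
    """
    sent_counts = Counter(sent)
    received_counts = Counter(received)
    queries = []
    for pair in sent_counts.keys() | received_counts.keys():
        a, b = sent_counts[pair], received_counts[pair]
        if a != b:
            queries.append((pair, a, b))
    return queries

# Hypothetical counts: hospital A's EHR system says it sent 120 records
# under CCG X, but only 118 were logged as received; the 2-record gap is
# an EHR flaw to chase up, for care reasons as well as financial ones.
sent = [("Hospital A", "CCG X")] * 120 + [("Hospital B", "CCG X")] * 40
received = [("Hospital A", "CCG X")] * 118 + [("Hospital B", "CCG X")] * 40
print(reconciliation_queries(sent, received))
# → [(('Hospital A', 'CCG X'), 120, 118)]
```

The point is only that counts from two independent, separately audited sources suffice for the “Should we pay this invoice?” question; identifiable data never needs to reach the accountants.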

Invoice reconciliation has been a thorn in the side of privacy and good governance since the inception of the internal market in the NHS – the Department of Health has never bothered to fix it. As EHRs roll out on the same timescale as the Caldicott Review, there is the opportunity to do so.  Besides legislative changes in the consent model, when the CAG regulations are finally laid, they should prevent the approval of s251 for invoice reconciliation “by 2018”. If the organisations of the NHS care about high quality EHRs because Treasury cares about accounting, incentives will be aligned to resolve problems along care pathways, which will improve direct care at the same time.

This is the administrative backwater of the NHS, that only cares about money (which makes it very important to the Department). Disease registries or the GP opt out are far higher profile and more important. But if the destruction of the GP opt out is the primary signal of intent to patients, data flowing to the accountants is the primary signal of intent to NHS managers and institutions as to whether the Secretary of State means to deliver on his promises, or whether he will give up if the system ignores him.

This is not a new change – it has long been trailed as coming soon, yet the bullet has never been bitten, and the identifiable data has never been turned off, even though it has repeatedly been decided that it should be. Given the changes in the consent model proposed by the Review: if not now, then when?

No evidence was provided by the Caldicott Review for the override of the GP opt out, for the exception of the disease registers, or the override for invoice reconciliation. The Department of Health seems to think it’s easy to override your wish that your data does not leave your GP, but will they ask the accountants to change what it is they count? Or will the Department give itself an opt out on a major flagship policy?

Data in the rest of Government: When is personal data not personal data?

This forms a background note for our Investigatory Powers Bill Briefing.

We asked a selection of Government departments what “Bulk Personal Datasets” they hold. These are collections of personal data on a lot of people, which inadvertently answer the question: what data does Government use to make decisions?

So here are the lists, by Department, and how many datasets each holds:

Apparently, the Cabinet Office doesn’t use data on people for anything other than “National Security” or the “prevention and detection of crime”.

If you look at the Education and Transport lists, they include the things you would expect – databases on car registrations, and drivers, or of pupils, schools and teachers. This is the sort of data that Government should be using, transparently and accountably, to make decisions. 

But just as Justice is blind, Health and DWP apparently make decisions without recourse to data on the population of the country. Does DH really not use any data on doctors, or on patients? Does the MoJ not use data on prisoners? According to them, no, they don’t.

Of course, they actually do. When DWP counted, they had 15,000+ copies of their Customer Information System lying around on their analysts’ computers, each of which had data on up to 120 million citizens (there aren’t that many people in the country, but they had a lot of duplicates; it’s now down to about 80 million records for 65 million people). Just lying around – the loss of any one of those copies (for which there are no records) would have dwarfed the HMRC child benefit data loss of 2007.

None of those copies are accounted for, or considered personal data. Your pension contribution history may be personal data to you, but not according to DWP.

Why is that?

From the work of Privacy International, we see:

“…agreed with Cabinet Office in 2010, as part of the Review of Agency Handling of Bulk Personal Data, that, to be considered personal data, a dataset has to contain at least the actual names of individuals.”

On that basis, all of the answers we’ve received may be “accurate” if you use unstated definitions: DWP’s analysts don’t get names (just addresses, dates of birth, and detailed employment/NI history, etc).
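That dropping names does not anonymise a dataset is easy to demonstrate. A minimal illustration, with entirely made-up records: joining a “name-free” dataset to any other source that shares a date of birth and address recovers the individual, which is exactly the situation clause (b) of the DPA definition below covers.

```python
# Hypothetical illustration: a dataset stripped of names is still
# personal data under the DPA, because individuals can be identified
# from it combined with other information the data controller holds
# or is likely to come to hold.
no_names = [
    {"dob": "1970-01-02", "address": "1 High St", "ni_history": "…"},
    {"dob": "1980-03-04", "address": "2 Low Rd", "ni_history": "…"},
]

# Any other source with names, dates of birth, and addresses
# (an electoral roll, a marketing list) completes the picture.
public_source = [
    {"name": "A. Citizen", "dob": "1970-01-02", "address": "1 High St"},
]

# Re-identify by joining on the shared fields.
reidentified = [
    {**record, "name": person["name"]}
    for record in no_names
    for person in public_source
    if (record["dob"], record["address"]) == (person["dob"], person["address"])
]
print(reidentified[0]["name"])
# → A. Citizen
```

Nothing here depends on “actual names” being in the first dataset – which is why that test is not the DPA standard.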

The Cabinet Office’s working definition of bulk data on page 2 says simply “personal data”, and gives no indication that the second clause of the definition in the Data Protection Act has been dropped as it was inconvenient:

“personal data” means data which relate to a living individual who can be identified—
(a) from those data, or
(b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller,

Perhaps the current Digital Economy Bill should sneak “unless it’s inconvenient” onto the end…

MedConfidential Bulletin – August 2016: Do you want your GP records shared, even if you’ve opted out?

MedConfidential newsletter – Do you want your GP records shared, even if you’ve opted out?

Care.data may be gone, but Jeremy Hunt is asking whether you want to keep your opt out of your medical records leaving your GP’s practice. Will you tell him what you think?

There’s a government consultation going on into the future sharing of your medical records. It doesn’t say it clearly, but what they are asking is: do you want your GP to keep your medical history private?

If you do, please respond to the consultation, and tell your friends:

You can respond to the consultation online. You don’t need to answer every question, and can answer only question 15 if you wish.

You might want to mention some of these points in your own words:

  • Why is what you tell your GP private for you?
  • Why must doctors and the NHS keep the promises they make to you?
  • Is this promise clear: “information about me can only be used by the people directly providing my care”?
    • Do you want that promise to be given and kept?

Previously, those questions have been ignored in private. Now they’re public, you get your say. The people who want to use your health data will reply – will you?

Our longer analysis of the Caldicott Review that led into the consultation is online in 4 parts.


(You may have noticed the new format and process for our newsletter – it hopefully works out far cheaper. It’s still the same information you subscribed to about keeping your NHS records confidential. We won’t pass on your email address. If you no longer wish to hear from us, just email unsubscribe@medconfidential.org or click the link below)

Caldicott Review – Future Fiascos?

(this is part 4 in a series – you may want to begin with part 1)

For some, care.data wasn’t a mess. It was practice...

But practice for what?

The new patient choice suggested on page 41 of the Review is: “information about me can only be used by the people directly providing my care.” That meets the promise in the Conservatives’ election manifesto, and it is a promise that is capable of being kept.

A longer term aim of the Review was to begin to restore public trust. It is only after the consultation, when we see the implementation, that we will be able to judge whether it was successful. Details of implementation were outside of the Terms of Reference, which is why there is so little detail on implementation.  It is also why the “NHS/non-NHS” proposal could be considered, as there was no need for anyone to think about how to do it.

This is the shadow boxing that has gone on for nearly two years, since it became clear that care.data wasn’t coming back. But the problems remain unresolved.

To prevent the outcome of this consultation being changed at the whim of a future Secretary of State, the outcome has to be put into legislation. Given the slow process to put the National Data Guardian on a statutory footing, that is entirely possible. If there are going to be new changes to patient dissent, they must be given a democratic basis by Parliament.

The Review has decided that there should be a “Caldicott Reset of Consent”, even if the mechanism is currently unclear. Will the changes be positive, or will they turn out, in retrospect, to be as compromised as the care.data “vision”? It could go either way.

Consensual, Safe, and Transparent?

Consensual: What is the promise to patients, and how will they know it was kept? How does it change from the current offer? How will patients be able to make an informed choice of whether to express dissent? The option of maintaining the current opt out for GPs, and extending the HSCIC opt out to cover all other care, is still possible.

Safe: “don’t leave medical records on trains”. The Caldicott Review is strong here, and it just needs to happen. How will patients know if it hasn’t?

Transparent: Patients should know how their data gets used. At the moment, it’s a secretive process because there is no transparency. If DH wishes to override patients’ dissent (whatever that is), the patients should be able to know about it. It makes all of the discussions practical. This is the informed part of informed consent (dissent), and also the mechanism for Dame Fiona’s conversation into the future.


The opt out language

The “hard” part of Dame Fiona’s task was to suggest new language on opt outs that would both command public trust and be capable of being delivered. And it succeeded in offering that option, amongst others.

How that is interpreted is currently solely up to the Department of Health.

Data sharing will continue, as will AI, genomics, and whatever comes after AI and genomics. At some point, Google DeepMind will do something creepy, or the next startup in Shoreditch will do something stupid. If a media firestorm is the only way to tell patients what happened, then eventually public confidence will lose, and lose badly.

Every patient should know how their data is used, and from that, innovation can be celebrated. At the moment, everyone prays really really hard that nothing goes wrong.

A transparent communications mechanism can build trust. It can also be used to talk about all the other innovations and benefits that already happen – it’s just no one knows about them.

Currently, the best way to find out whether your retina scan was fed to an AI is by being on Google’s press list. That cannot be right. If the only way Jeremy Hunt wants patients to hear how their medical records were used is via the press, then that is sheer lunacy.

There are few opportunities to make changes such as this. The Caldicott Reset of Consent, coming after the collapse of care.data, is likely to be one of them. We hope the outcome will reflect the choices and interests of everyone, not just those who wish to ignore consent and profit off patients.

Patients have been given an option to opt out. Any attempt to remove it is likely to exacerbate the concerns of the vast majority who have so far not used it.

Caldicott Review – The Ugly

(This is part 3 of a 4 part series looking at the future. If you are concerned about the privacy of your health data now, our advice remains unchanged)

The expectation of the Review was that it would create new opt out wording for the public – care.data had many false starts as “mistakes” kept being discovered.

The new patient choice suggested on page 41 of the Review is: “information about me can only be used by the people directly providing my care.” That meets the promise in the Conservatives’ election manifesto, and it is a promise that is capable of being kept.

But what may happen in practice? Everything below is covered in the consultation that we link to – it is not yet certain; it is just a proposal that can be changed. It may turn out the way you want, or the way you don’t – it depends on whether you make your views heard. For now, nothing changes, but it may in the future, and how it changes is something you can help with.

 

Will Parliament be involved?

You may have already told your GP that you don’t want your data to leave their practice. But the Department of Health believes it alone can decide to ignore what you have told your doctor… What happens when “too many” people opt out again?

The only safe way to “improve” past arrangements is for Parliament to put the opt-out into law. That should include how to get to the new model from where we are – topics that were outside the Terms of Reference of the Review.

1.2m people have explicitly ticked a box and personally delivered a form to their GP that says to their doctor “don’t pass on my records”. NHS managers want to instruct doctors to ignore it. For the medical profession, this is an ethical question, not a data protection question. Parliament therefore has to be involved.

Resolving this must be done democratically, with debate, if it is done at all. Otherwise trust in the NHS will be fundamentally damaged. What will stop a future Secretary of State changing the rules underneath patients and doctors, yet again?

This issue goes to patient and professional trust – will the public have reason to believe the system will do what it says it will do?

 

Proposal: Your data will still leave your GP…

The Review suggests removing your existing opt out for GP data going to the HSCIC (page 31). Information you share with only your GP will be copied into the HSCIC against any wishes you have already expressed.

This is the removal of an option that pre-dates care.data – and breaches many promises of confidentiality that GPs have made to patients in the past. Currently, these promises are being kept by the GPs, but the new proposal is that GPs will be required to break them.

We have seen no compelling case for this to be necessary.

 

…and then data about you will leave the HSCIC…

Back in 2014, you may have opted out of “data about you leaving the HSCIC for purposes beyond your direct care”. If you did, data about you is still included in the hospital data sold for commercial reuse. The review proposes this continues (page 34).

As we showed in part 2, de-identification is not anonymisation. The Government continues to pretend that it is, so they can continue to sell your medical data and ignore the opt out you used to prevent that. While the Government say “marketing” is banned, use for “market access” is not – as the latest data release register shows.

When it was announced that opt outs were being respected, the exclusions were kept secret. It was not announced that the scope of the opt out had been slashed to exclude the data that was the primary source of patient concern. This is half the basis of our complaint to the ICO: the opt out does not currently do everything you (and we) were told it would do.

 

…and you may have to opt out again

If the “two box” model discussed in the second blog post is implemented, the Government will require you to opt out again. It will ignore your past choices, and you will have to fill in another form. That was the intent of those who didn’t want you to be able to opt out in the first place.

Question 15 of the consultation shows whether the Government has thought about how to do what it is consulting on: it hasn’t. That is a recipe for a repeat of care.data.

No one can know the problems with what is proposed until the Government is clear on what it wishes to do. If they respect your wishes, that’s pretty easy to tell patients… If they choose to ignore them, the chances of them telling you are somewhat lower.

The question the consultation should ask is: what will give you confidence that your wishes were honoured?

How will the outcome of this consultation help when the next data project gets mishandled, at a high price in patient trust?

Nothing has changed yet, but it will in the future; how it changes is something you can help with.

The proposal in the review: “Information about me can only be used by the people directly providing my care” is strong, simple, and deliverable. You can help it become reality.

 

All parts can change – Here’s how you can help:

You can respond to the consultation online. You don’t need to answer every question – you can answer only question 15 if you wish.

You might want to mention some of these points in your own words:

  • Why is what you tell your GP private for you?
  • Why must the NHS keep the promises it makes to you?
  • Is this promise clear: “information about me can only be used by the people directly providing my care”?
    • Do you want that promise to be given and kept?

You can say as much or as little as you like.  

You may also want to tell your friends…

(we also take donations)

 

See part 4 of the series

Caldicott Review – The Bad…

In part 2 of this short series, we look at the “Bad” parts of the Caldicott Review of Data Security, Consent, and Opt-Outs. (link to part 1)

A good part was the suggestion of a continuing conversation with professionals and the public. What will inform that continuing conversation?

 

Silence on transparency to patients

While the Review suggests a range of improvements, there is no recommendation that patients should be told what happens to data about them – i.e. whether their wishes have been honoured or ignored.

There is nothing in the Review to require transparency, and nothing to prevent it; but nothing to change the status quo of secrecy either. We will see in the next part of this series how the bureaucracy wishes to continue to do things with your data without your knowledge. All future scandals, concerns, and catastrophes will flow from this decision, and it will also limit innovation and harm research for every patient who does wish their data used in new ways.

As part of the “paperless 2020” agenda, the Secretary of State has told the National Information Board to look at telling patients how data about them is used. But this Review has committed to nothing beyond a recognition that there should be a “continuing conversation”. A conversation requires parties that are willing to listen, and to change based on what they have heard…

Hospital Episode “statistics” – Privacy Impact Assessment

Also currently out for consultation is a Privacy Impact Assessment on the Hospital Episode Statistics. The Hospital Episode Statistics are patient-level, unprotected, individual records covering hospitals in England over 25+ years. They are not statistics in the Office for National Statistics sense – they are the raw data on your treatments, linked over time.

The assessment was written in 2014, when public disquiet at how their hospital records were being used led 1.2 million people to opt out; the document has only now been published for consultation. What is most disappointing is how little has changed. Companies are still getting data for commercial re-use, and copies are still being sent outside of HSCIC’s control to be lost, stolen, or abused. According to HSCIC, it’s “anonymous”…

Whether those data are anonymous is addressed by the UK Anonymisation Network, which says quite clearly on page 16:

“Anonymisation – refers to a process of ensuring that the risk of somebody being identified in the data is negligible. This invariably involves doing more than simply de-identifying the data, and often requires that data be further altered or masked in some way in order to prevent statistical linkage.

We can highlight further the difference between anonymisation and de-identification (including pseudonymisation) by considering how re-identification might occur:

 

  1. Directly from those data.
  2. Indirectly from those data and other information which is in the possession, or is likely to come into the possession, of someone who has access to the data.

 

The process of de-identification addresses no more than the first, i.e. the risk of identification arising directly from data. The process of anonymisation, on the other hand, should address both 1 and 2. Thus the purpose of anonymisation is to make re-identification difficult both directly and indirectly. In de-identification – because one is only removing direct identifiers – the process is unlikely to affect the risk of indirect re-identification from data in combination with other data.”

As such, claims in the Caldicott Review that the ongoing release of HES data is compliant with the ICO’s Anonymisation Code are deeply flawed. We have complained to the Information Commissioner; according to the PIA, the entire assessment of the re-identification risk to patients is: “This may happen in future” (risk 7).
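To make the UKAN distinction concrete, here is a minimal sketch of the *indirect* re-identification it describes – a linkage attack. Everything in it is invented for illustration (the names, postcodes, datasets, and the `link` helper are all hypothetical): removing direct identifiers such as names and NHS numbers does nothing to stop matching on quasi-identifiers like postcode, birth year, and sex.

```python
# Hypothetical illustration of a linkage attack. All records are invented;
# no real dataset, field layout, or person is represented here.

# A "de-identified" hospital extract: names and NHS numbers removed,
# but quasi-identifiers (postcode, birth year, sex) retained.
deidentified_episodes = [
    {"postcode": "NW3 2QG", "birth_year": 1957, "sex": "F", "diagnosis": "diabetes"},
    {"postcode": "E1 6AN",  "birth_year": 1984, "sex": "M", "diagnosis": "asthma"},
]

# A second dataset that an attacker "is likely to come into the possession"
# of (e.g. an open electoral register), which does carry names.
public_register = [
    {"name": "A. Patient", "postcode": "NW3 2QG", "birth_year": 1957, "sex": "F"},
    {"name": "B. Someone", "postcode": "SW1A 1AA", "birth_year": 1990, "sex": "M"},
]

def link(episodes, register):
    """Join the two datasets on the quasi-identifier triple, re-attaching
    names (and hence identities) to the 'de-identified' medical records."""
    matches = []
    for ep in episodes:
        for person in register:
            if (ep["postcode"], ep["birth_year"], ep["sex"]) == (
                person["postcode"], person["birth_year"], person["sex"]
            ):
                matches.append((person["name"], ep["diagnosis"]))
    return matches

print(link(deidentified_episodes, public_register))
# The first patient is re-identified, diagnosis and all.
```

This is why UKAN says anonymisation must address both routes to identification: de-identification only strips the direct identifiers, leaving linkage like this untouched.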

A Two box model – “NHS” vs “non-NHS”

This choice is fundamentally confusing.

This suggestion allows aggressively commercial entities to acquire a fig-leaf NHS contract and reuse data for any purpose they wish, while legitimate and public-spirited academics and not-for-profit researchers are stuck in the “non-NHS” box. For those worried about the privatisation of the NHS, this proposal should be of deep concern.

In a review with such limited time, it was never going to be possible to fully design a new consent model. But existing data projects have no desire to improve their ways, and there is no political leadership to enforce such improvement; those who got to lobby at the table made sure their interests were protected – patients, not so much. This intentionally murky choice is the result. It is right that it was put to public consultation, as that will demonstrate the problems with the approach of existing commercial data use.

It is right that patients who choose to opt in to particular studies do not have their preferences overridden; yet the Review does not do the same for those who chose to opt out.

It gets worse in part 3