Tag Archives: digitaleconomy

Digital Economy Bill: Part 5, Chapter 1, clause 30 and Part 5, Chapter 2 from a health data perspective

medConfidential asks Peers to:

  • Express support for Baroness Finlay’s amendment on Part 5 (NC213A-D)
  • Express support for either amendment to Part 5 Chapter 2 (Clause 39)
  • Oppose current Clause 30 of Part 5 in Committee and on Report

We attach a briefing, with a more detailed consideration of these points, but in summary:

In 2009, the then Government removed clause 30’s direct predecessor – clause 152 of the Coroners and Justice Bill – because the single safeguard offered then was ineffective. In bringing that clause back, this Government has not only excluded important aspects of Parliamentary scrutiny, it is also trying to introduce “almost untrammeled powers” (para 21) that would “very significantly broaden the scope for the sharing of information” (para 4) without transparency, and with barely any accountability. The policy intent is clear:

“the data-related work will be part of wider reforms set out in the Digital Economy Bill. [GDS Director General Kevin] Cunnington said as an example that both DWP and the NHS have large databases of citizen records, and that ‘we really need to be able to match those’.” (interview)

While there is a broad prohibition on the use of data from health and social care for research further down on the face of this Bill, in Chapter 5, the approach taken in clause 30 is very different, and contains no such prohibition. Regulations (currently draft) published under clause 36 simply omit the Secretary of State for Health from the list of Ministers, thereby excluding NHS bodies – but not copies of health data that other bodies can require to be provided. This is another fatal flaw in clause 30.

medConfidential is deeply concerned that Chapter 2 of Part 5 contains no safeguards against bulk copying. We accept the case for a power to disclose civil registration information on an individual consented basis – a citizen should be able to request the registrar informs other bodies of the registration – but, just as clause 30 contains insufficient safeguards and is designed to enable bulk copying, so is Chapter 2. One of the amendments laid to Part 5 Chapter 2 should be accepted.

Governments have had since 2009 to solve the problems that clause 30 not only leaves unaddressed, but exacerbates. The Government should either heavily amend Clause 30 at Report stage, or ensure it is removed before Third Reading. This clause is a breeding ground for disaster and a further collapse in public trust, and it simply doesn’t have to happen.

While medConfidential is open to legislation that treats sensitive and confidential personal data in a consensual, safe and transparent manner, this legislation does not. Despite more than 2 years of conversations about accessing data through systems that respect citizens and departments (i.e. data subjects and data controllers) and the promises they make to each other, the Cabinet Office instead took a clause from 2009 off the shelf, and has been actively misleading about the process.

Briefing for Committee stage

Your Records in Use – Where and When… — Political will (or won’t) for telling you how your data has been used.

The NHS changes greatly over time, but there are few “big bang” changes that happen overnight without involving the patient. Your health context can change in the course of a single consultation, but the system does not change – only how you interact with it. Press releases may suggest that the NHS is rushing towards genomics and AI, but it’s much more a slow stroll.

The publication of Caldicott 3 called for an “informed” “continuing conversation” about health data. We agree – the best way for a patient to understand how their data may be used next month, is to be able to see how it was used last month. But if there are caveats that remain hidden from the public, a dishonest entry is worse than no entry.

Every patient has a personal lived experience of the NHS, and using that as the starting point for accountability of data use is vital. Data usage reports can give a patient the information about how data is used, in a context that directly relates to their personal experience of the NHS. Some of that they were involved in, and some of it is the system doing its thing and hoping no one notices.


Databases: poor and past?

Why are some patients being told to bring their passport to receive care, even though the NHS was there when they were born?

Databases that have benefits will receive public support for doing what they were supposed to do, but there is a widespread recognition that some past data choices by the NHS may have not been wise.

Whether that legacy will be repaired, or left to fester, is now up to the Department of Health, when it responds to the Caldicott Review. The Review left a number of hard questions unanswered, including the abuse of some patients that has been described as tantamount to “blackmail”. Care.data was just one of those. There are others that have hidden under a rock for some time, and followed care.data as if it were a guidebook.

The databases proliferate, yet there is almost no evidence of whether they are useful. Is the energy spent on them worthwhile? Is there a better way of delivering the goals they were designed to meet? There is an opportunity cost to doing anything…

There are many good reasons to use data, but just because a data collection has existed for decades, doesn’t mean it’s still the best way to deliver on the goals. Continued secrecy about the effectiveness of some data projects suggests that perhaps the claims of benefits are overblown, and are not supported by the evidence of what actually happened.

A continuing conversation requires ongoing evidence of reality, not political hyperbole.


Will patients be shown the benefits?

Will patients be provided with the evidence to show how their wishes have been implemented? What was the outcome of projects where their data was included?

What was the outcome of the “necessary” projects where dissent was ignored?

Will the Caldicott Consent Choice ignore the choices patients were previously offered?

In 2016, NHS Digital took the final preparatory steps towards telling patients how their data is used: first, keeping track (a side effect of beginning to honour objections); and now, publishing a detailed data release register – with sufficient detail for you to work out where some of your data went and why. Such a register allows for independent scrutiny of any data flow, and is a necessary prerequisite to a data usage report.

It does not tell an individual whether their data was used, nor what knowledge was generated (e.g. see the notices tab), but it is the key step. And while two thirds of the data sold by NHS Digital does not honour your opt out, Public Health England sneak a copy of NHS data, refuse to honour objections, and hide those actions from their data release register. (As of December 2016, some administrators pretend that there was no opt out offered from “anonymised” hospital data… here’s the video from Parliament.)


Digital, DeepMind, and beyond

How AI will support care is a choice for the future, but if there is going to be any move towards that world (and there already is), the transparency of all digital services must be fundamental, inviolable, and clear — it can include AI, but can’t include dodgy caveats.

If there is any secrecy about how patient data is used, NHS institutions may hope to be given the benefit of the doubt; Google, not so much. If there is secrecy for NHS organisations, companies will try to sneak in too.

Similarly, if patients are to be offered digital services that they can use without fear, there must be an accountability mechanism for when those services were accessed, that they can view when they wish. Otherwise, the lowest form of digital predators will descend on health services like it’s feeding time. It doesn’t have to happen – unless there is a political decision that mistakes can be covered up.

When companies put out a press release, we often get called for comment and insight  on what is actually going on. That’s a journalist’s job, and ours, because some good intentions come with too high a price.

Will the mistakes of the past begin to be rectified, creating the consensual, safe, and transparent basis for the (digital) health service of the future?


Demonstrations of Delivery on promises

There will always be a demand to do more with data – but any framework has to respect that some things will not be permitted.

As Caldicott 3 recognised, telling patients how their data has been used is necessary for public confidence in the handling of data. If there is to be confidence in the system, allowing data to be used to its full potential, then there must be a recognition that when an individual objects to a use of their data, that objection is respected.

We focus on health data, but this applies across the public sector, where there is a desire to make data great again in 2017…


Briefing for the Digital Economy Bill – House of Lords 2nd Reading

Our 3 page briefing is here.


Given the obstinacy of the Cabinet Office, Part 5 of this Bill has been offered to Parliament on a take-it-or-leave-it basis. If it is not improved at Committee stage, we suggest you leave it.

A major hospital in London has a deal with Google to produce an app to tell doctors which patients are in the most urgent need. This is a good thing. But to produce it, Google insisted on having a copy of the main dataset covering every patient in the hospital – which is only available up until the end of the previous calendar month. The appropriate way to get the information needed was to get up-to-the-minute information on the patient whose details they were going to display. However, Google wanted all the data, and insisted on it if the hospital wanted to work with them.

It’s not the creation or production of a pretty app that’s the problem – it’s the demand for excessive data in return for using the app. It’s entirely rational for the hospital to accept the app as it may lead to marginally better care for their patients; but the price is being paid in their patients’ data. The Bill applies this principle across Government: third parties want the benefits of having the data, because this Bill does not require any protections.

The Minister was asked a simple question about safeguards: “Could you explain where they are and what they look like?” – and gave no answer, because there are none.

Characterising Chapters 1 and 2, it can be said they “will have the effect of removing all barriers to data-sharing between two or more persons, where the sharing concerns at least in part the sharing of personal data, where such sharing is necessary to achieve a policy objective…”

Unfortunately for the Government, that characterisation is quoting from the Government’s explanatory notes for s152 of the Coroners and Justice Bill (para 962). Nothing has changed in Government thinking since 2009, when the House of Lords threw out that clause.

Our 3 page explanatory briefing is here.

Data in the rest of Government: A register of data sharing (managed, rather than inadvertent)

The Codes of Practice for the Digital Economy Bill aren’t worth the paper they’re (not) printed on. They aren’t legally binding (para 11), and are effectively subservient to the ICO’s existing code, even while paragraph 60 pretends a single side of A4 is a valid Privacy Impact Assessment for data sharing for operational purposes.

As this is the case, why is a code of practice necessary under the legislation? It does nothing new. Is it solely to make it look more respectable than the dark and dank dodgy deal that it actually is?

In places such as supermarkets, you have a choice of whether to use a clubcard, and can easily use one of the other supermarkets that are available – Government doesn’t have competition promoting integrity. To ensure a citizen can see how Government uses data about them, there should be a statutory register of data sharing agreements (involving public sector data). A register prevents nothing (which seems to be the policy intent of the Bill), but is simply a list of stated intents. From the Register comes an informed discussion of what organisations are actually doing and sharing, rather than excessive secrecy and double dealing.
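To make the idea concrete, a register need be nothing more complicated than a public list of structured entries, one per data sharing agreement. The sketch below is purely illustrative – every field name and value is our own assumption, not anything specified in the Bill:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SharingAgreement:
    """One entry in a hypothetical statutory register of data sharing
    agreements. A register like this prevents nothing - it is simply a
    list of stated intents, open to independent scrutiny."""
    sender: str          # public body disclosing the data
    recipient: str       # body receiving the data
    datasets: list[str]  # which datasets are covered
    purpose: str         # the stated policy objective
    legal_basis: str     # the power relied upon
    starts: date
    expires: date        # agreements should lapse, not run forever

# An informed discussion starts from what bodies say they are doing:
register = [
    SharingAgreement(
        sender="Department X", recipient="Local Authority Y",
        datasets=["address history"], purpose="debt management pilot",
        legal_basis="(statutory power cited here)",
        starts=date(2017, 1, 1), expires=date(2018, 1, 1),
    ),
]

# Anyone can then ask simple questions of the published list,
# e.g. "who receives data from Department X?"
recipients = {a.recipient for a in register if a.sender == "Department X"}
print(recipients)  # → {'Local Authority Y'}
```

Nothing in such a list is secret or technical; the point is that scrutiny becomes possible only once the stated intents are written down in one place.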

Opposition to a register comes from fear, based in Government’s lack of knowledge of what data it holds, or with whom it currently shares it. If you don’t have a clue where your data is, or why it’s there, you oppose a register because you don’t want to find out.

How this state of affairs came about is at the heart of this Bill.

We’ve previously posted about the definition of Personal Data in the Investigatory Powers Bill. What about in the non-secret parts of Government?

In 2010, the Cabinet Office told GCHQ that “to be considered personal data, a dataset has to contain at least the actual names of individuals” – GCHQ being subject to the national security exemptions of the Data Protection Act.

In March 2015, the term “bulk personal datasets” was used by Parliament, and entered common terminology, but it wasn’t until November 2015 that the full definition of the Data Protection Act was restored (with DPA exceptions for National Security).

But in the intervening seven months, the term gained increased currency within Government and was used much more widely as it crossed into the non-secret sphere. The Cabinet Office took the existing meaning and thinking and applied it elsewhere.

Because the narrow definition used for GCHQ was classified, and hence restricted, it was never noted that the definitions used in the non-secret parts of Government should have been different – they likely weren’t, and hence some uses are possibly invalid under the DPA. “Actual names” is not the DPA standard.

So what effect did this have?

Following the TalkTalk hack, Government ran an exercise described as the “Cabinet Office audit 2016”, looking at what each Department held, and the impact of losing it.

We made FoI requests about what each department held, and got very interesting answers (we excluded national security or serious crime).

The Cabinet Office hold no bulk personal data (apparently… ).

DCMS hold some bulk personal datasets – on people who have responded to their consultations (and some data from the Olympics)…  (erm…  what?)

The Department for Transport gave a much longer answer (but didn’t know how much was in each dataset).

Does the Government know what personal data it has, uses and shares, and where it keeps it? If so, did Departments share that list with the Cabinet Office when asked?

Since we can see they probably didn’t, are all uses of large datasets (whether they are considered personal data or otherwise) fully compliant with the definitions in the Data Protection Act?

How does the Bill and associated work help resolve this mess? 

Data in the rest of Government: a fertile breeding ground for fraud and misery

We have now published our submission on Part 5 of the Digital Economy Bill to the Committee scrutinising the Bill.

The Digital Economy Bill (Part 5) promotes a fertile breeding ground for fraud and misery: excessive data sharing, but only in secret.

The protections on sharing data with academic researchers are strict – people who work to increase human knowledge have to jump through a range of hoops to get funded, with critique and transparency; and then even more hoops to get data, again with critique and transparency. And even then, the Digital Economy Bill does not apply that process to health data for research until we see the outcome of the Caldicott process.

Yet, the Digital Economy Bill contains no prohibition on civil servants secretly sharing any data, including medical records. This includes secret sharing by Government Departments to inform operational decisions. There are no meaningful restrictions on copying of any data held by any public body. From anywhere, to anywhere except independent experts.

“If you give me six lines written by the hand of the most honest of men,
I will find something in them which will hang him.” – Cardinal Richelieu

In recent weeks, it has emerged that a citizen was stripped of housing benefits because data showed they were cohabiting with “Joseph Rowntree”, the 19th Century philanthropist whose modern legacy includes a Housing Trust which bears his name, and which was that woman’s social landlord. The DWP contractors used just enough data to create “evidence” that reinforced their prejudice, but not enough to realise their “evidence” was lunacy. This Bill makes those events more common, more harmful, and more opaque.

The Cabinet Office believe that they should be trusted with everything, and independent academics can be trusted with almost nothing. Neither of those is likely to be right.

That is learning the wrong lesson from care.data. Academic use of data to improve the public knowledge was not what people objected to. It was the secrecy and other purposes.

The Cabinet Office clearly don’t understand that their approach is part of the problem. More copying with the same rules was care.data; this Bill is more copying with fewer rules, less transparency, and less oversight. No one suggests that fraud prevention should be subject to consent, but Parliament should be able to assess whether the approach has worked.

The existing minimal oversight by Parliament goes away, and Departments can do what they wish. A statutory basis for disclosure to prevent fraud is entirely sensible – individuals shouldn’t get that choice – but in addition to sharing all the data, the Government demands secrecy as to whether that sharing was effective.

If governance is effective, then it should apply to all. If it’s ineffective, then it shouldn’t happen at all. Parliament needs to pick one.

The reason the last Government didn’t want good governance and accountability to apply to Government is because the governance and accountability process works.

An accountability process that is entirely absent from most parts of this Bill.

Which way is the new Government going to go?

Data in the rest of Government: When is personal data not personal data?

This forms a background note for our Investigatory Powers Bill Briefing.

We asked a selection of Government departments what “Bulk Personal Datasets” they hold. These are collections of personal data on a lot of people, which inadvertently answer the question, what data does Government use to make decisions?

So here are the lists, by Department and how many datasets:

Apparently, the Cabinet Office doesn’t use data on people for anything other than “National Security” or the “prevention and detection of crime”.

If you look at the Education and Transport lists, they include the things you would expect – databases on car registrations, and drivers, or of pupils, schools and teachers. This is the sort of data that Government should be using, transparently and accountably, to make decisions. 

But just as Justice is blind, Health and DWP apparently make decisions without recourse to data on the population of the country. Does DH really not use any data on doctors, or on patients? Does MoJ not use data on prisoners? According to both of them, no, they don’t.

Of course, they actually do. When DWP counted, they had 15,000+ copies of their Customer Information System lying around their analysts’ computers, each of which had data on up to 120 million citizens (there aren’t that many people in the country, but they had a lot of duplicates; it’s now down to about 80 million records for 65 million people). Just lying around – the loss of any one of those copies (for which there are no records) would have dwarfed the HMRC child benefit data loss of 2007.

None of those copies are accounted for, or considered personal data. Your pension contribution history may be personal data to you, but not according to DWP.

Why is that?

From the work of Privacy International, we see:

“…agreed with Cabinet Office in 2010, as part of the Review of Agency Handling of Bulk Personal Data, that, to be considered personal data, a dataset has to contain at least the actual names of individuals.”

On that basis, all of the answers we’ve received may be “accurate” if you use unstated definitions: DWP’s analysts don’t get names (just addresses, dates of birth, and detailed employment/NI history, etc).

The Cabinet Office’s working definition of bulk data on page 2 says simply “personal data”, and gives no indication that the second clause of the definition in the Data Protection Act has been dropped as it was inconvenient:

“personal data” means data which relate to a living individual who can be identified—
(a) from those data, or
(b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller,

Perhaps the current Digital Economy Bill should sneak “unless it’s inconvenient” onto the end…
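The point of limb (b) can be shown with a toy example (all data here is invented for illustration): stripping names from a dataset does not stop it being personal data if the controller also holds, or is likely to obtain, other information that re-links it to individuals.

```python
# A row from a "bulk dataset" with names stripped, plus other
# information a data controller might plausibly also hold.
bulk_row = {"dob": "1970-05-01", "postcode": "SW1A 1AA",
            "ni_history": ["employer A", "employer B"]}
other_info = [{"name": "J. Smith", "dob": "1970-05-01",
               "postcode": "SW1A 1AA"}]

# Limb (b) of the DPA definition asks: can a living individual be
# identified from these data AND other information in (or likely to
# come into) the data controller's possession?
matches = [p["name"] for p in other_info
           if (p["dob"], p["postcode"]) == (bulk_row["dob"],
                                            bulk_row["postcode"])]

print(matches)  # → ['J. Smith']: the nameless row still identifies someone
```

A date of birth and postcode pair is enough to single out most people in the country, which is exactly why “actual names” is not, and never was, the DPA threshold.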

Data in the Rest of Government – the Cabinet Office Data Programme

If you see care.data as anything other than a complete success of vision and implementation, this Cabinet Office “process” should cause you concern.

Organisations that want to copy your data around Government have developed a figleaf to allow it. It forms the basis for the “Digital Economy” Bill that has been laid before Parliament, and which will be debated after the summer.

To inform that debate, we’ve used NHS England’s public comments to answer the Cabinet Office “Data Science” “Ethical” “Framework”.

Because we exclusively use NHS England quotes, this runs to 2 sides of paper in length. The Cabinet Office version is one side long, so this is twice as long as they think it should be.

If you’re wondering whether care.data could happen again, this is how:

Cabinet Office Data Science Ethical Framework: justification for care.data

  1. Start with a clear public benefit:

– How does the public benefit outweigh the risks to privacy and the risk that someone will suffer an unintended negative consequence? (PIA step 1)

“NHS England is introducing a modern information service on behalf of the NHS called care.data. The service will use information from a patients’ medical record to improve the way that healthcare is delivered for all.” (Source: NHS England)

– Brief description of the project, including data to be used, how will it be collected and deleted. (PIA step 2)

“The care.data programme will link information from different NHS providers to give healthcare commissioners a more complete picture of how safe local services are, and how well they treat and care for patients across community, GP and hospital settings.” (Source: NHS England)

– What steps are you taking to maximise the benefit of the project outcome?

“At the moment, the NHS often doesn’t have the complete picture as information lies in different parts of the health services and isn’t joined up.  This programme will give NHS commissioners a more complete picture of the safety and quality of services in their local area which will lead to improvements to patient outcomes.”  (Source: NHS England)

“The information can also be used by NHS organisations to plan and design services better, using the best available evidence of which treatments and services have the greatest impact on improving patients’ health.”(Source: NHS England)


  2. Use data and tools which have the minimal intrusion necessary

– What steps are you taking to minimise risks to privacy? (for example using less intrusive data, aggregating data etc)

“The HSCIC has been handling hospital data securely in this way for decades.  The system is designed to be extremely secure, with a suite of safeguards to protect confidentiality.” (Source: NHS England)

“The service will only use the minimum amount of information needed to help improve patient care and the health services provided to the local community. A thorough process must be followed before any information can be shared and strict rules about how information is stored and used are followed.” (source: NHS England)

  3. Create robust data science models

– What steps have you taken to make sure the insight is as accurate as possible and there are minimal unintended consequences? (for example thinking through quality of the data, human oversight, giving people recourse)

“Everyone making healthcare decisions needs access to high quality information: clinicians need it to inform their decision making; patients need it when deciding which treatment option is best for them; and commissioners need it when making decisions about which services are right for their populations.” (source: NHS England)


  4. Be alert to public perceptions:

– How have you assessed what the public or stakeholders would think of the acceptability of the project? What have you done in addition to address any concerns?

“Materials and guidance have been developed in collaboration with the Health and Social Care Information Centre (HSCIC), British Medical Association (BMA) and the Royal College of General Practitioners (RCGP), to support practices to raise awareness. Patients who are not happy for their data to be used in this way can ask their GP practice to make a note of this in their medical record and this will prevent their information leaving the practice.”  (source: NHS England)


  5. Be as open and accountable as possible

– How are you telling people about the project and how you are managing the risks?

“NHS England, together with the Health and Social Care Information Centre, announced that throughout January, all 22 million households in England will receive a leaflet explaining how the new system will work and the benefits it will bring.  The leaflet drop is the next stage of NHS England’s public awareness plan and follows wide consultation with a range of stakeholders including GPs and patient groups.”  (source: NHS England)

– Who has signed this off within your organisation? Who will make sure the steps are taken and how? (PIA Step 5)

“This programme is too important to get wrong, and while I think that there is understanding on both sides of the House about the benefits of using anonymised data properly, the process must be carried out in a way that reassures the public.” (Source: Secretary of State, Jeremy Hunt, to Parliament)


  6. Keep data secure

– What steps are you taking to keep the data secure?

“The NHS is very good at preserving the privacy of people in analysing that kind of data.” … “in 25 years there’s never been a single episode where the very strict rules have ever compromised the patient’s privacy.” (source: Mr Kelsey of NHS England on BBC Radio 4 – audio)

According to the Cabinet Office, that’s all you need to do as “answering these questions will also act as your Privacy Impact Assessment” (top of page 6). That is clearly ridiculous – the above is as false and misleading as it is entirely accurate. The care.data privacy impact assessment was 32 pages long plus other supporting documents.

The Digital Economy Bill makes the above superficiality entirely legal for any part of Government to acquire data from any other, and will be discussed by Parliament in September.

Reporting to a new Minister, and a new Director General, the GDS data programme needs an external review to provide constructive input from outside the existing Whitehall silo. Otherwise, across Government, the public-facing legacy of GDS may become care.data-style fiascos.

2016 Digital Economy Bill

On the day that Tory MPs vote on a new leader, with the Home Secretary who tore up an ID card on her first day in office in the lead, the Government has introduced legislation to bring the database state back via the side door.

s38 of the Digital Economy Bill may require sharing of births, marriages, and deaths across the public sector in bulk without individual consent.

s29 as written allows sharing of medical information to anywhere in the public sector, or commercial companies providing public services, if it may increase “contribution to society”.

The National Data Guardian is not placed on a statutory footing.

As the Conservative leadership election moves forward, it seems that the database state is back.



update: The Cabinet Office have been in touch to say:

Para 18 of the government response clearly states:
18.       The Government acknowledges the importance of health and social care data in multi-agency preventative approaches and early intervention to prevent harm. We will do further work with the National Data Guardian following the publication of her review/report to consider how health data is best shared in line with her recommendations.

As a result, health bodies are out of scope of the powers in the draft regulations.

The Bill itself contains no such exclusion, and many local authorities have been lobbying for precisely that access. We will look to clarify with a probing amendment at committee stage, but appreciate the press office getting in touch.

Data use in the rest of Government: Where is the consultation on any ethics?

Where is the consultation on any ethics?

As care.data was in the NHS bureaucracy, this consultation is about doing more of what Government has been doing already: not better sharing, just more copying.

If this wasn’t about databases, the same consultation could be had about buying more filing cabinets, ink, and scribes.

Data in the rest of Government: Put data to good use?

{this is a background reference blog post, ahead of more on the Cabinet Office’s data copying consultation. The call to action will be in the next newsletter.}

“Let’s make data easy to put to good use,” says the Cabinet Office. But good for whom? Good for the civil service? Good for each citizen? Who makes sure the balance is right?

Care.data was claimed as a “good use” of data. The details showed it to be something radically different. The Cabinet Office consultation launched last week is about bureaucracy as usual. The mantra is reform, but the reform is to bring all the benefits to Government, and the downsides for citizens.

Digital transformation, this is not.
