
NHS England will go fishing in a “Data Lake”, but says “let them eat APIs” to doctors

An “Emerging Target Architecture” from NHS England aims to direct all NHS patient data into a new “national data lake” (page 14). This involves taking genomic, GP, and other health data for direct care, and then going fishing in that dataset for commissioning purposes, while keeping such actions secret from the patients whose data they access.

The inclusion of the data lake and claims to be ‘direct care’ show NHS England has no faith that the tools they propose to doctors will work. The fig-leaf of “localisation” is undermined by the “national” “data lake”, and it seems unlikely that DH and NHS England will cease meddling if a local area decides not to rifle through patient records.

NHS England’s approach does not fix any problem that exists: there is no analysis this model will allow that cannot be done now, if someone cared to do it. What the approach does do is sweep away patient privacy, safeguards and oversight, and allow nefarious access that is currently prohibited. It does nothing to solve the actual problem, which is the need for more analysis: there is already an excess of data that no one is looking at, and this simply creates more. No matter how much data there is, “more data” will remain an analyst’s desire. Patients, and the clinicians who treat them, don’t have such luxuries.

Conflating direct care and secondary uses will cause pain throughout the NHS for as long as it persists; it is the legacy of the thinking behind care.data.

Direct care?

For direct care, the idea of patient-visible, audited, “near real time” access to records held elsewhere is neither novel nor necessarily problematic in principle (though the details often fall short).

The Lefroy Act from 2015 requires hospitals to use the NHS number to identify patients, which makes data easy to link. The use of near-real-time access where there is a clinical need is not necessarily a problem everywhere, but there are clearly some areas where very great care is needed – and the ‘Emerging Target Architecture’ document shows none at all.

There are benefits to using FHIR APIs (or equivalent) as the definition of a “paperless” NHS (currently conveniently undefined). But this “target architecture” is not about that, and notably doesn’t say so. The APIs proposed can help patients, and do not require new data pools; the “national data lake” is included not because the APIs need it, but to allow fishing expeditions by NHS England itself and its customers – an “NHS England datamart”.
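To make the distinction concrete – record-level API access versus bulk copying – here is a minimal sketch of a FHIR read. The base URL and patient ID are hypothetical, and a real deployment would add clinician authentication and audit logging:

```python
import json
import urllib.request

# Hypothetical endpoint -- a real deployment would use the provider's base
# URL and an access token issued to the treating clinician.
BASE_URL = "https://fhir.example.nhs.uk"

def patient_url(base_url: str, patient_id: str) -> str:
    """RESTful FHIR address of a single Patient resource."""
    return f"{base_url}/Patient/{patient_id}"

def read_patient(base_url: str, patient_id: str) -> dict:
    """Fetch one record, on demand -- no bulk copy of everyone's data."""
    req = urllib.request.Request(
        patient_url(base_url, patient_id),
        headers={"Accept": "application/fhir+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The point is that each request names one patient at the moment of care, and so can be logged and shown to that patient; a bulk extract into a data lake cannot.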

NHS England’s desire for unlimited access to data for direct care is, in practice, a route to unlimited access for other purposes. The document claims that “privacy by design” is important, but goes no further than the words, and excludes privacy from its worldview entirely.

Where is the transparency?

Access to records to provide direct care is valid – but at the scale of the entire NHS, how will a patient know whether their records have been accessed by someone on the other side of the country? The system says nothing about transparency to patients.

While such an architecture can do good, it can also be abused, and the worldview of NHS England offers no potential for dissent.

Open Data and dashboards on current status are necessary for transparency in the NHS. However, paragraph 3.29.3 of the ‘Emerging Target Architecture’ suggests that open data can be recombined into a patient record – which indicates something has gone very wrong in the “understanding” behind the document.

NHS England will go fishing in the genetic data lake

Because all patients’ records will be included in the data lake, NHS England will then be able to extract anything for which it can provide a plausible justification. But, as the care.data Expert Reference Group showed, anything can be justified if you use the right words and no one asks questions, e.g. “national data lake” and “privacy by design”.

The existence of a data lake means people will go fishing. You can put up “no fishing” signs, but we all know how that plays out with people who have good intentions, but priorities that undermine the larger goals.

The paper does not talk about genomic data, but Genomics England (GEL) is envisaged as an inflow. Was this a deliberate choice?

This free-for-all stands in contrast to the transparency of the current NHS Digital processes. We may fundamentally disagree with some of those decisions, but there is at least transparency on what decision was made and why.

“Datamart”

The idea of a “datamart” is the clear reappearance of the care.data principle of taking all the data from patients and clinicians, and selling it to anyone who might offer a few beans to get the detailed medical histories of patients.

The conflation of direct care and (dissentable) secondary uses now looks less accidental, and more like an end state goal – for which ignoring patient opt outs was a necessary means to an end.

There must continue to be rigorous and transparent processes for accessing patient level data – and that should include transparency to patients of which organisations have accessed their data. APIs may help care, but they also help those with other intentions.

This proposal also does nothing to reduce the administrative overhead of the NHS billing bureaucracy, nor does it reduce the requirement for identifiable information to be shown to accountants at multiple NHS bodies, simply because they don’t trust each other. A “national data bus” architecture could address that problem, but NHS England has chosen not to care about reducing the burden on others.

There should be no third party access protocols – statistics should be published, or data to solve a specific problem should be made available within a safe setting to appropriate analysts whose questions have received appropriate review, who have the data appropriate to answer them, and who publish their results.

Drug companies should be prevented from changing the questions they ask after they know what the results of their trials are. And CQC shouldn’t be allowed to pretend they never asked a question, purely because they don’t like the answer they got. Analysis of the data may lead to new questions; but it should never lead to the original question not being answered. And all questions asked of the data should be published.

The future of (Fax) Machines

There is still no clarity on what will replace the fax machine for one clinician sending information along a care pathway to a department in another organisation. The desire to abolish fax machines isn’t unwise, but they serve a clinical purpose that e-mail demonstrably doesn’t fulfil.

Wither Summary Care Record?

The Summary Care Record could perform many of the direct care features, had NHS England not decided upon an “all or nothing” approach to having a SCR. Had the enhancements to Summary Care Records been done on an iterative and consented basis, it would have been simpler to widen SCR to the new areas proposed. But NHS England, with the bureaucratic arrogance and technical mediocrity that pervades this proposal, simply insisted on the same “all or nothing” approach to the enhanced SCR. That being the case, it insists on all patient data being included in a data lake, as the access of last resort to data for clinicians.

Some of the proposals in this document clearly have merit, but when claims are made for “privacy by design” alongside such a fundamentally misconceived and diametrically opposed notion as a “national data lake”, the vision articulated is shown to be incoherent at best.

Prioritising a data copying exercise over actual care repeats exactly the same errors in thinking that set care.data on its path to failure. And, published just weeks after it emerged that patients’ objections to their data being used for purposes beyond their care are being ignored, this looks even more like a deliberate attempt to ignore that there are – and always will be – valid objections.

Ignoring the past in this way puts at risk access to the data of those who would be happy for their medical records to be used, given sufficient safeguards and transparency. Unfortunately, a data lake can never meet those requirements.

The “Emerging Target Architecture” document is here, and NHS England is taking comments until the end of the week…

Your Records in Use – Where and When… — Political will (or won’t) for telling you how your data has been used.

The NHS changes greatly over time, but there are few “big bang” changes that happen overnight without involving the patient. Your health context can change in the course of a single consultation, but the system does not change – only how you interact with it. Press releases may suggest that the NHS is rushing towards genomics and AI, but it’s much more a slow stroll.

The publication of Caldicott 3 called for an “informed” “continuing conversation” about health data. We agree – the best way for a patient to understand how their data may be used next month, is to be able to see how it was used last month. But if there are caveats that remain hidden from the public, a dishonest entry is worse than no entry.

Every patient has a personal lived experience of the NHS, and using that as the starting point for accountability of data use is vital. Data usage reports can give a patient the information about how data is used, in a context that directly relates to their personal experience of the NHS. Some of that they were involved in, and some of it is the system doing its thing and hoping no one notices.

 

Databases: poor and past?

Why are some patients being told to bring their passport to receive care, even though the NHS was there when they were born?

Databases that have benefits will receive public support for doing what they were supposed to do, but there is a widespread recognition that some past data choices by the NHS may not have been wise.

Whether that legacy will be repaired, or left to fester, is now up to the Department of Health, when they respond to the Caldicott Review. The Review left a number of hard questions unanswered, including the abuse of some patients that has been described as tantamount to “blackmail”. Care.data was just one of those. There are others that have hidden under a rock for some time, and followed care.data as if it were a guidebook.

The databases proliferate, yet there is almost no evidence of whether they are useful. Is the energy spent on them worthwhile? Is there a better way of delivering the goals they were designed to meet? There is an opportunity cost to doing anything…

There are many good reasons to use data, but just because a data collection has existed for decades, doesn’t mean it’s still the best way to deliver on the goals. Continued secrecy about the effectiveness of some data projects suggests that perhaps the claims of benefits are overblown, and are not supported by the evidence of what actually happened.

A continuing conversation requires ongoing evidence of reality, not political hyperbole.

 

Will patients be shown the benefits?

Will patients be provided with the evidence to show how their wishes have been implemented? What was the outcome of projects where their data was included?

What was the outcome of the “necessary” projects where dissent was ignored?

Will the Caldicott Consent Choice ignore the choices patients were previously offered?

In 2016, NHS Digital took the final preparatory steps towards telling patients how their data is used. The first was keeping track (a side effect of beginning to honour objections); they also now publish a detailed data release register – with sufficient detail for you to work out where some of your data went and why. Such a register allows for independent scrutiny of any data flow, and is a necessary prerequisite to a data usage report.

It does not tell an individual whether their data was used, nor what the knowledge generated was (e.g. see notices tab), but it is the key step. And while two thirds of data sold by NHS Digital does not honour your opt out, Public Health England sneak a copy of NHS data, refuse to honour objections, and hide those actions from their data release register. (As of December 2016, some administrators pretend that there was no opt out offered from “anonymised” hospital data… here’s the video from Parliament).

 

Digital, Deepmind, and beyond

How AI will support care is a choice for the future, but if there is going to be any move towards that world (and there already is), the transparency of all digital services must be fundamental, inviolable, and clear — it can include AI, but can’t include dodgy caveats.

If there is any secrecy about how patient data is used, NHS institutions may hope to be given the benefit of the doubt; Google, not so much. And if there is secrecy for NHS organisations, companies will try to sneak in too.

Similarly, if patients are to be offered digital services that they can use without fear, there must be an accountability mechanism for when those services were accessed, that they can view when they wish. Otherwise, the lowest form of digital predators will descend on health services like it’s feeding time. It doesn’t have to happen – unless there is a political decision that mistakes can be covered up.

When companies put out a press release, we often get called for comment and insight  on what is actually going on. That’s a journalist’s job, and ours, because some good intentions come with too high a price.

Will the mistakes of the past begin to be rectified, creating the consensual, safe, and transparent basis for the (digital) health service of the future?

 

Demonstrations of Delivery on promises

There will always be a demand to do more with data – but any framework has to respect that some things will not be permitted.

As Caldicott 3 recognised, telling patients how their data has been used is necessary for public confidence in the handling of data. If there is to be confidence in the system, and data is to be used to its full potential, then there should be a recognition that when a use is objected to by an individual, that objection is respected.

We focus on health data, but this applies across the public sector, where there is a desire to make data great again in 2017…

 

Jeremy Hunt has changed his mind

Welcome to another newsletter from medConfidential.

Jeremy Hunt changed his mind and is still selling your medical records

If you opted out of your hospital records being sold, Jeremy Hunt has changed his mind about your choice.

At the time, he said in Parliament (emphasis added):

“…this Government decided that people should be able to opt out from having their anonymised data used for the purposes of scientific research, which the previous Labour Government refused to do? When they extended the programme to out-patient data in 2003 and to A and E data in 2008, at no point did they give people the right to opt out. We have introduced that right…”

The right Jeremy Hunt was so publicly proud of introducing, he has secretly taken away again. He was right to give it to you – his election manifesto promised it would be there.

Over 1.2 million people, just like you, opted out of their hospital records being sold. The opt out has begun to work, but the NHS confirms hospital records are still being sold.

The opt out process you followed in 2014 was the easiest way to opt out, but was not the only way. It was what the Government said would work. They have now changed their minds. We complained to the ICO, and they agreed with the Government.

As a result, we will have more details on what you can do to protect yourself in the new year. The Government had to perform a pirouette to pull this off, and may still have fallen flat on their face.

For now, you may wish to write to your MP and ask about this change. Ask your MP why the Government has gone back on its manifesto promise to let you opt out. Tell them why confidence in the privacy of your medical records matters to you.  More details of the change are on our website.

Other steps you may wish to take to protect your medical records will become clear in the new year. If you are in immediate distress, our website contains a longer route to doing so now if necessary. If that is not the case for you, we’d suggest you wait until our full response is available. There is more to come on this, and the shabby secret is now out.

Jeremy Hunt offered you a convenient route which didn’t place an undue burden on you or the NHS. If you took him up on that, he should keep his word. He retracted it in secret, and it took six months of work to find out what had actually happened. The opt out you took up for hospital data has begun to be implemented, but is not yet fully in place. The opt out of your GP data, which is a separate tick box on the form you used, is not affected. The GP opt out is working, as it has been since you handed in your form.

Where does data go?

NHS Digital publishes details of where they send data each month, and why. Now that they publish detailed official spreadsheets, we turn them into simple webpages. They are at https://dataregister.medconfidential.org

That gives a list of which projects honoured your opt out, and which companies got data on you anyway.
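Turning the register into those lists is, at heart, a simple filter. A minimal sketch is below – the column names (“recipient”, “opt_out_honoured”) and the sample rows are illustrative, not the headings NHS Digital actually uses:

```python
import csv
import io

# Illustrative sample only -- the real register uses NHS Digital's own
# spreadsheet headings and lists real recipients.
SAMPLE = """recipient,opt_out_honoured
Some University,yes
Some Insurance Ltd,no
"""

def recipients_ignoring_opt_out(csv_text: str) -> list[str]:
    """Return recipients of releases where the opt out was not honoured."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["recipient"] for row in reader
            if row["opt_out_honoured"].strip().lower() != "yes"]
```

That the question can be answered with a few lines of filtering is the point: once the register exists in machine-readable form, anyone can scrutinise it.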

Merry Christmas

2017 is looking busy. The Government will announce what it is going to do. We hope they will do the right thing and honour your opt out (even if they try to do everything else first).

We rely on donations for some of our work, and anything you wish to offer in support will be put to good use. We have some fun plans for ensuring your choice is respected, and donations help them happen.

We will still be here. The Government know we will still be here, and know we will do what we say we will do. We work to ensure that your medical records are only used in a way which is consensual, safe, and transparent.

You can help make that happen.

We wish you and your loved ones a Merry Christmas, and we’ll have more in the New Year. The next newsletter will have better news than this one. We hope.


Thanks for helping

Best wishes, for a Merry Christmas, and a consensual, safe, and transparent New Year.

From Phil, Sam, and all at medConfidential.

Briefing for the Digital Economy Bill – House of Lords 2nd Reading

Our 3 page briefing is here.

Summary

Given the obstinacy of the Cabinet Office, Part 5 of this Bill has been offered to Parliament on a take-it-or-leave-it basis. If it is not improved at Committee stage, we suggest you leave it.

A major hospital in London has a deal with Google to produce an app to tell doctors which patients are in the most urgent need. This is a good thing. But to produce it, Google insisted on having a copy of the main dataset covering every patient in the hospital, which is only available up until the end of the previous calendar month. The appropriate way to get the information needed was to get up-to-the-minute information on the patient whose details they were going to display. However, Google wanted all the data, and insisted on it if the hospital wanted to work with them.

It’s not the creation or production of a pretty app that’s the problem – it’s the demand for excessive data in return for using the app. It’s entirely rational for the hospital to accept the app as it may lead to marginally better care for their patients; but the price is being paid in their patients’ data. The Bill applies this principle across Government: third parties want the benefits of having the data, because this Bill does not require any protections.

The Minister was asked a simple question about safeguards – “Could you explain where they are and what they look like?” – and gave no answer, because there are none.

Characterising Chapters 1 and 2, it can be said they “will have the effect of removing all barriers to data-sharing between two or more persons, where the sharing concerns at least in part the sharing of personal data, where such sharing is necessary to achieve a policy objective…”

Unfortunately for the Government, that characterisation is quoting from the Government’s explanatory notes for s152 of the Coroners and Justice Bill (para 962). Nothing has changed in Government thinking since 2009, when the House of Lords threw out that clause.

Our 3 page explanatory briefing is here.

medConfidential statement on continued sale of hospital records

During the failed Care.Data project, NHS England and the Department of Health said “patients have a choice” about how their data is used – they could opt out if they wished.

NHS Digital, the bit of the Department of Health that sells data to companies, has gone back on the Secretary of State’s word on a critical detail, and Jeremy Hunt has given up. To the Information Commissioner, they now say: there is no choice about whether your hospital data is sold. NHS Digital admit and demonstrate that it continues to be sold.

The opt out was the gift of the Secretary of State, and he has taken part of it away again. Merry Christmas everyone.

On that basis, other legal options remain open to patients. This is not the end, but it is the end of the beginning.

The opt out has begun to be implemented – it does do some things – but the main purpose of opting out of your hospital data being sold, is that your hospital data doesn’t get sold. That is the part that continues to happen in spite of the NHS promise to you as a patient.

We are obviously disappointed that Jeremy Hunt has chosen to go back on his word, and continue selling the nation’s private hospital history to anyone who fills in a form correctly, after he offered patients a choice to opt out of that.

The ICO has ruled that it was the Secretary of State’s choice, and he was entitled to make it. This does not affect rights available to patients under the Data Protection Act.

If patients are concerned, we suggest they join our newsletter at www.medConfidential.org, and we will provide a detailed update shortly – it is likely to involve a trip to the post box.

We will have a more detailed analysis of the contradictory parts of the ICO response in due course.

medConfidential

Notes to Editors

    1. Care.data was the extension of GP data to link it with Hospital data, and continue the practices used in ongoing releases of hospital data. The Government was very clear that if patients didn’t want their hospital data used, they could opt out:
      Parliament: https://www.theyworkforyou.com/whall/?id=2014-03-25a.49.0#g56.7
      NHS England: https://www.dropbox.com/s/qaax5zj77zxddwz/leaflet-manchester.jpg?dl=0 
    2. NHS Digital’s convoluted policy statement is the 5th bullet point here: http://content.digital.nhs.uk/article/7092/Information-on-type-2-opt-out 
    3. For alternate approaches, we note s10 of the Data Protection Act allows a person to dissent from processing, and purposes beyond direct care are subject to legal dissent. The opt out was supposed to be the convenient way of expressing dissent; it is not the only way. 
    4. This decision is about data flows as they exist today. Looking forwards to future changes, NHS Digital argue that this implementation is entirely consistent with the future Caldicott Consent Choice under review by the Government following a public consultation. That is in the hands of the Government. 
    5. The NHS Digital Privacy Impact Assessment for the Hospital Episode Statistics shows that reidentification from this data could happen: http://content.digital.nhs.uk/article/7116/Consultation-on-the-Hospital-Episode-Statistics-Privacy-Impact-Assessment-Report
    6. The recipients of data releases, which includes releases containing data on patients who had opted out, can be seen here: https://dataregister.medconfidential.org
    7. For what patients can do about this change, see: https://medconfidential.org/2016/opt-out-process-update-december-2016/ 

-ends-

PHE Board papers

In 2016, Public Health England stopped publishing linkable PDFs for their board papers, and started hiding them in zip files. As we have previously done with HSCIC, we’ll publish the zip archives here until they switch back (which they may now have done).

PHE Board May 2018

PHE Board February 2018

PHE Board November 2017

PHE Board September 2017

PHE Board June 2017

PHE Board April 2017

PHE Board February 2017

PHE Board January 2017

These papers were published 36 hours after the meeting happened.

PHE Board November 2016

PHE Board September 2016

PHE Board July 2016


PHE Board May 2016

Showing the Thing: Jeremy Hunt’s desire for an “NHS App”

Every flow of data across the NHS should be consensual, safe and transparent. Following the Caldicott Review call for a continuing conversation, Secretary of State Jeremy Hunt has asked for an “NHS App” by September 2017. Since he made that request, there has been no visible progress.

Additionally, patients should not have to work out how the NHS works in order to use digital services.

One of the pre-requisites is a login system for patients that works across the NHS – how many usernames and passwords should people have to remember?

The savings from avoiding missed appointments (via the “choose and book” system) can only come if patients can log in to change their appointments, and aren’t put off by terrible technology.

We’ll have more on patient identity shortly, but the GP managed login, and password reset process, is sufficient for now (it’s not perfect everywhere, but the other suggestions of perfection are untarnished by the requirements of delivery or reality).

Using Gov.UK Verify’s providers would require every patient to only receive care in the name they have legal documents for. Facebook may insist on a real-name policy; the NHS should not. There are many reasons it does not.

So what might a good digital app start to look like?

Here’s our thought experiment:

https://nhsapp.experiment.medconfidential.org

Comments welcome.

Deepmind try again – November 2016

DeepMind this morning reannounced their partnership with the Royal Free Hospital. Updates are at the bottom – details are in the 9:50 and 10:10 updates.

There’s apparently a new legal agreement to copy exactly the same data that caused so much controversy over the summer. We have not yet seen the new legal agreement, so can’t comment on what it permits or disallows.

Responding to the press release, Phil Booth, Coordinator of medConfidential said:

“Our concern is that Google gets data on every patient who has attended the hospital in the last 5 years and they’re getting a monthly report of data on every patient who was in the hospital, but may now have left, never to return.

“What your Doctor needs to be able to see is the up to date medical history of the patient currently in front of them.

“The Deepmind gap, because the patient history is up to a month old, makes the entire process unreliable and makes the fog of unhelpful data potentially even worse.”

As Deepmind publish the legal agreements and PIA, we will read them and update comments here.


8:50am update. The Deepmind legal agreement was expected to be published at midnight. As far as we can tell, it wasn’t. Updated below.

TechCrunch have published a news article, and helpfully included the DeepMind talking points in a list. The two that are of interest (emphasis added):

  • An intention to develop what they describe as “an unprecedented new infrastructure that will enable ongoing audit by the Royal Free, allowing administrators to easily and continually verify exactly when, where, by whom and for what purpose patient information is accessed.” This is being built by Ben Laurie, co-founder of the OpenSSL project.
  • A commitment that the infrastructure that powers Streams is being built on “state-of-the-art open and interoperable standards,” which they specify will enable the Royal Free to have other developers build new services that integrate more easily with their systems. “This will dramatically reduce the barrier to entry for developers who want to build for the NHS, opening up a wave of innovation — including the potential for the first artificial intelligence-enabled tools, whether developed by DeepMind or others,” they add.
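What an infrastructure for “ongoing audit” means, at minimum, is a tamper-evident log of accesses. A hash chain is the simplest sketch of the idea; a production design of the kind associated with Certificate Transparency would use a Merkle tree for efficient verification, and everything below is illustrative, not DeepMind’s actual design:

```python
import hashlib
import json

def append_entry(log: list, event: dict) -> None:
    """Append an access event, chaining it to the hash of the previous entry.

    Altering any earlier entry changes its hash and breaks every link
    after it, so tampering is detectable on verification.
    """
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    log.append({"event": event, "prev": prev_hash,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(log: list) -> bool:
    """Recompute every link; True only if no entry has been altered."""
    prev_hash = "0" * 64
    for entry in log:
        body = json.dumps({"event": entry["event"], "prev": prev_hash},
                          sort_keys=True)
        if (entry["prev"] != prev_hash
                or entry["hash"] != hashlib.sha256(body.encode()).hexdigest()):
            return False
        prev_hash = entry["hash"]
    return True
```

The design question that matters is who holds and verifies the log: an audit trail kept by the party being audited is only as trustworthy as their willingness to be caught.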

Public statements about Streams (an iPhone app for doctors) don’t seem to explain what it actually is. So what is it?


9:30 update: The Deepmind website has now been updated. We’re reading.

The contracts are no longer part of the FAQ, they’re now linked from the last paragraph of text. (mirrored here)


9:40 update: medConfidential is greatly helped in its work by donations from people like you.


9:50 update: Interesting what is covered by what…

[Screenshots from the newly published agreements, 22 November 2016, showing what each document does and does not cover]


10:10 update: What data does the DeepMind FHIR API cover? What is the governance of that API? Is it contractually, legally, and operationally independent of the Streams app?

(it’s clearly none of those things, as the above screenshots say).

Deepmind have made great play of their agreement being safe, but consent is determined in a Google meeting room, and the arrangements for the “FHIR API” are secretive and far from transparent.

There is likely to only be one more update today around 1pm. Unless Google make an announcement that undermines their contractual agreements.


1pm update: The original information sharing agreement was missing Schedule 1, and has been updated.


3:30 update: DeepMind have given some additional press briefings to Wired (emphasis added):

Suleyman said the company was holding itself to “an unprecedented level of oversight”. The government of Google’s home nation is conducting a similar experiment…

“Approval wasn’t actually needed previously because we were really only in testing and development, we didn’t actually ship a product” – which is what they said last time, and the MHRA told them otherwise.

Apparently, “negative headlines surrounding his company’s data-sharing deal with the NHS are being ‘driven by a group with a particular view to pedal’”. The headlines are being driven by the massive PR push they have done since 2:30pm on Monday, when they put out a press release which talked only about the app, and mentioned data as an aside only in the last sentence of the first note to editors. – Beware of the leopard.

As to our view, medConfidential is an independent non-partisan organisation campaigning for confidentiality and consent in health and social care, which seeks to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent. Does Google Inc disagree with that goal?

Data in the rest of Government: A register of data sharing (managed, rather than inadvertent)

The Codes of Practice for the Digital Economy Bill aren’t worth the paper they’re (not) printed on. They aren’t legally binding (para 11), and are effectively subservient to the ICO’s existing code, even while paragraph 60 pretends a single side of A4 is a valid Privacy Impact Assessment for data sharing for operational purposes.


As this is the case, why is a code of practice necessary under the legislation? It does nothing new. Is it solely to make it look more respectable than the dark and dank dodgy deal that it actually is?

In places such as supermarkets, you have a choice of whether to use a clubcard, and can easily use one of the other supermarkets that are available – Government doesn’t have competition promoting integrity. To ensure a citizen can see how Government uses data about them, there should be a statutory register of data sharing agreements (involving public sector data). A register prevents nothing (which seems to be the policy intent of the Bill), but is simply a list of stated intents. From the Register comes an informed discussion of what organisations are actually doing and sharing, rather than excessive secrecy and double dealing.

Opposition to a register comes from fear, based in Government’s lack of knowledge of what data it holds, or who it currently shares it with. If you don’t have a clue where your data is, or why it’s there, you oppose a register because you don’t want to find out.

How this state of affairs came about, is at the heart of this Bill.

We’ve previously posted about the definition of Personal Data in the Investigatory Powers Bill. What about in the non-secret parts of Government?

In 2010, the Cabinet Office told GCHQ that “to be considered personal data, a dataset has to contain at least the actual names of individuals”, GCHQ being subject to the national security exceptions of the Data Protection Act.

In March 2015, the term “bulk personal datasets” was used by Parliament, and entered common terminology, but it wasn’t until November 2015 that the full definition of the Data Protection Act was restored (with DPA exceptions for National Security).

But in the intervening months, the term gained increased currency within Government and was used much more widely as it crossed into the non-secret sphere. The Cabinet Office took the existing meaning and thinking and applied it elsewhere.

It was never noted that the definitions in the non-secret parts of Government should have been different – and likely weren’t, and hence possibly are invalid under the DPA – because the narrow term for GCHQ was classified, and hence restricted. “Actual names” is not the DPA standard.

So what effect did this have?

Following the TalkTalk hack, Government ran an exercise described as the “Cabinet Office audit 2016”, looking at what each Department held, and the impact of losing it.

We made FoI requests about what each department held, and got very interesting answers (we excluded national security or serious crime).

The Cabinet Office hold no bulk personal data (apparently… ).

DCMS hold some bulk personal datasets – on people who have responded to their consultations (and some data from the Olympics)…  (erm…  what?)

The Department for Transport gave a much longer answer (but didn’t know how much was in each dataset).

Does the Government know what personal data it has, uses and shares, and where it keeps it? If so, did Departments share that list with the Cabinet Office when asked?

Since we can see they probably didn’t, are all uses of large datasets (whether they are considered personal data or otherwise) fully compliant with the definitions in the Data Protection Act?

How does the Bill and associated work help resolve this mess?