
Rest of Government: Data misuse as “Missed Use”

Even during the height of the pandemic, DCMS continued (and continues) its work to share data, including and especially your medical records.

Not all data projects are a good idea – such as when a Home Office civil servant tried arguing it is part of the Home Office’s ‘public task’ to copy the medical records of all women in an attempt to discover sham marriages. In Government, not using data in that way is known as “missed use”.

Powers in the 2017 Digital Economy Act made it easier for most data in Government to be used in that way, with the exception of NHS data. DCMS has now conducted a ‘review’ of how those powers are used, and how they are not, which will recommend removing that safeguard. DCMS didn’t ask for our input, so we did our own review.

In another example of “missed use”, the UCAS form includes a question that asks whether an applicant’s parents went to university. The question is optional, and was originally added for statistical purposes. However, the data generated by that question soon came under pressure to be used for admissions decisions. The rules were changed, and then behaviour changed too – for if your child answers truthfully ‘yes’, they may be ‘penalised’ in favour of those whose parents did not go to university. As the question is optional, there is a third choice, which every university application coach tells their teenage clients to use: just don’t answer it – thereby being truthful (by omission) while avoiding any penalty for their parents’ education.

Those who cannot afford such coaching, but who go to schools where having a parent who went to university is common, are therefore at a significant disadvantage. This is a stark illustration of the way in which flawed incentives, created entirely because of claims of “missed use”, can destroy the integrity and utility of the data itself – and now skew the statistics about intergenerational access to university.

Widening participation may be important, but administrative idiocy is inevitable.

Both of the examples above might appear to be clear, logical, even defensible decisions by civil servants following a public task – albeit with narrow definitions and no obligation to the bigger picture, or even any assessment that there might be a bigger picture. Such uses are driven by the simplistic view that “Administrative data is an invaluable resource for public good. Let’s use it.”

“Let’s use it”

As pre-Covid DHSC paved the way for the new grab of your GP records, the “public debate” about uses of data continued without any meaningful public debate, and DCMS carried on its work as if the pandemic never happened. Pre-pandemic, the number of “round tables” and “expert panels” advancing cases that were entirely guessable from their titles was already ramping up, as useful idiots (and those chasing funding) made themselves useful to a wide-ranging programme in the data world, e.g. page 8.

Meanwhile, the “public awareness” effort was (and is) more subtle and better planned than that of the early days of care.data in 2013-14, but it is no less flawed. If you happen to attend one of those events, when they restart, one good question for the panel would be this: “What would make you take a different view on this project – is there space for diverse views, based on information you’ve not yet seen?”

In another example, it was the perceived risk of ‘missed use’ that undermined appropriate scrutiny of a data request for a ‘causes of cancer’ study. That the study was run by a tobacco company didn’t stop Understanding Patient Data later explaining why cancer patients’ data should be used by a tobacco company. Will the gatekeepers and their defenders keep justifying anything that passes through their gates? The tobacco industry faux-naïvely asks how a richer, more statistically-informed ‘debate’ about harms can be anything other than a good thing – while the debating points are based on cherry-picked data, with misleading or disingenuous framing.

The only way to avoid such issues is to tell every patient how (their) data is used, what their choices are and how they work. Because there will always be incentives that make it in someone’s interests to use data in ways that undermine the promises made to data subjects.

An ‘academic project’ to look at the effectiveness of justice and education incentives could be a good thing – especially when subject to peer review, published in a peer-reviewed academic journal, and subject to usual academic funding rules. But this project was commissioned by the Home Office, and its ‘academic’ input appears limited to that of a librarian wearing a fig leaf.

Data librarians do a vital job, and there is great value in well-curated data. Academics too are vital, but policy-based evidence-making – where the only real choice is to do what the Home Office has commissioned – is not academic research. Such ‘research’ may be legitimate to inform civil service and/or political action, but it is not an academically led process. So why is it being funded by UKRI / ADR, using resources that should be going to bona fide academic research?

The data to which ADR has access is detailed and sensitive personal data, for legitimate research. It is self-evidently wrong for ADR to claim the data it holds “is no longer classed as personal data” – classification by ADR is irrelevant; what matters is the law. And even their own academic specialists disagree: “Safe Data. This is a misnomer” – both GDPR and the DPA 2018 are explicit that pseudonymised data is personal data.
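To make the point concrete, here is a minimal Python sketch – with hypothetical field names, a made-up key, and illustrative values only – of why a pseudonym is still personal data. The pseudonym is merely a stable indirect identifier: anyone holding the key (the GDPR’s “additional information”) or a lookup table can re-attribute the released record at will.

    import hmac
    import hashlib

    SECRET_KEY = b"held-only-by-the-data-controller"  # hypothetical; the GDPR's "additional information"

    def pseudonymise(nhs_number: str) -> str:
        # Replace a direct identifier with a stable pseudonym.
        return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()[:16]

    record = {"nhs_number": "943 476 5919", "diagnosis": "C50"}  # made-up values
    released = {"pseudo_id": pseudonymise(record["nhs_number"]),
                "diagnosis": record["diagnosis"]}

    # Whoever holds the key (or a pseudonym-to-person lookup table) can
    # re-link the 'de-identified' release to the individual:
    assert released["pseudo_id"] == pseudonymise("943 476 5919")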

Always even more data

There are endless claims that “more data” is needed. Always more data, and “legal barriers” are forever being cast as the biggest problem – hence the Digital Economy Act Part 5. But three years after the DEA was passed, how many pilot projects for public service delivery in England did Government (or government) approve in the last year? Zero. None. Zilch. And how many have Departments proposed to run? Zero. Nada. None. (Although there is one single project in Scotland.)

Despite not even having made use of existing powers, Whitehall now wants to ‘loosen’ the rules around access to health data in a very similar way. Having provided no evidence of benefit – nor even of pressing need – the problem claimed is still “not enough data”.

It is inconvenient facts such as these that are most often omitted from, or masked in, the endless briefing packs for ‘public acceptability debates’. It’s almost as if the most toxic desires and projects are pushing the agenda, while never actually themselves going forwards. And, as history has shown, focusing solely upon a singular objective or key result as justification to expand powers or drive policy is fundamentally toxic.

One good thing about the Digital Economy Act was supposed to be that all of the decisions taken, with the minutes and related papers, would be made public in a single register – but they aren’t. It should not be only the privacy campaigners and their pesky FOIs that get to see what is going on; DCMS has gone back on its promised approach, and is hiding as much as it can.

medConfidential has been asking since January 2019, and has had a stream of excuses – the latest being that, while DCMS’ data work and agenda continue apace, the pandemic means officials can’t publish anything. We wrote to the Secretary of State in May, and were told that the DEA ‘review’ would be published “shortly”. We won’t be holding our breath. And once Government believes Covid has passed, we can start chasing these again.

Everyone should be able to see what is done with their data. In that clarity lies an informed debate based on how data is actually used – not just special interests pleading to do more of whatever it is they want to do. The decision makers should not be the op-ed pages of the national newspapers, but rather properly-informed data subjects, making choices based on the truth.

The first care.data programme collapsed after official assurances were shown to be false. The problem was not the fact that many believed, in good faith, that what they were being told was true. It was the fact that it wasn’t.

“Why should I care?” is an entirely legitimate question about data use. A better question, however, is: why should anyone have to take on trust what the public briefings say – or what they choose to exclude?

Six years on, we have a new Government, a new data environment, and exactly the same policy debates. Do you really think – as HMG privately does – that people care less about their data now than they did six years ago? And do those putting their professional reputations on the line (again) really believe they’ve been told everything?

It is one thing to be opaque and deceptive; it is far worse to be transparent and deceptive. If you combine a barrel full of wine and a teaspoon of sewage, what you get still looks like a barrel full of wine. And public bodies will spin their figures to say that it’s statistically safe to drink – while they reach for their gin…


Documents:

  1. medConfidential’s Review of Powers in the Digital Economy Act (which DCMS didn’t ask for). This was written before the ADR RCB was abolished, but our comments are on the process and the information provided to decision makers, not the name of their committee.
  2. How HMG used credit checks for Covid test eligibility

The Data Protection Bill reaches the Commons

Updated 16 April: the Bill has been renumbered again; clauses 185-188 are now numbered 183-186. No other meaningful changes…

Updated 11 March: short briefing for commons committee stage

The Data Protection Bill has reached the Commons. We have 3 briefings on the Bill and an annex on the proposal to make DCMS the lead department for data processing by Government:

(We were expecting 2nd reading this Tuesday/Wednesday, but it’s possible the Whitehall bickering over the DCMS data grab has delayed it; if DCMS has put the politics of empire-building ahead of the legislative schedule, that is a really good indicator that it shouldn’t take over the GDS data function…) Those two links (which were published after the briefing was first circulated) confirm that the Cabinet Office’s ‘data science ethics framework’ may get rewritten by DCMS to become the ‘Framework for Data Processing by Government’. For that task, even the iteration that has been discussed is entirely unfit for purpose.

GDPR and Transparency in Government

The EU’s Article 29 Working Party held a consultation on their transparency guidance and, with an efficiency that probably infuriates Boris Johnson, ignored late submissions.

For the UK’s NHS, the GDPR is generally just a restatement of the existing ethical good practice that medical bodies should have been following anyway – but it does provide an opportunity (and necessity) to review past decisions and bring them up to scratch (and blame the EU for having to do it).

The main new provisions for the NHS – and the topic of the A29WP’s recent transparency consultation – concern what transparency and the provision of information to the data subject mean in practice. Even that isn’t that new – but it is something that Government has paid only lip service to for some time (remember the care.data junk mail leaflets?). That leaves a simple question:

What should transparency look like in practice?

For the NHS, there must be an electronic report on how data was used. NHS Digital keeps track, and with a digital login to the NHS (via Patient Online), the patient can see where their data went, why, and what the benefits of those projects turned out to be – and, if they wish, read the published papers (and simpler explanations) that resulted from those uses.

The rest of UK Government lags behind the NHS and is far more murky. Clearly stated in the “Better Use of Data” section of the Technology Code of Practice is a requirement that “the service should clearly communicate how data will be used” – a requirement akin to the GDPR’s. Unusually for a GDS recommendation, there is no exemplar given – here is ours.

The best way for an ongoing transactional service to communicate how data will be used next month is to show how it was used last month. For any data derived from a digital service behind a login (e.g. any eIDAS-compliant system, such as GOV.UK Verify), a full accounting of how data on that data subject was accessed, copied, analysed or disseminated should be available to that data subject on any subsequent login.

Processes will change over time, but not that rapidly – so last month’s uses are a reliable guide to next month’s.
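As a minimal sketch – not any department’s actual system, with every name below invented for illustration – that accounting needs little more than an append-only log of who touched whose data and why, replayed to the data subject on their next login:

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class DataAccessEvent:
        subject_id: str   # whose data was touched
        accessed_by: str  # which organisation, team or service touched it
        purpose: str      # the stated reason for the access
        action: str       # "accessed" | "copied" | "analysed" | "disseminated"
        when: str         # ISO 8601 timestamp

    AUDIT_LOG: list = []  # in reality an append-only store, not a Python list

    def record_access(subject_id: str, accessed_by: str, purpose: str, action: str) -> None:
        # Every touch of a record appends an event; nothing is ever deleted.
        AUDIT_LOG.append(DataAccessEvent(
            subject_id, accessed_by, purpose, action,
            datetime.now(timezone.utc).isoformat()))

    def report_for(subject_id: str) -> list:
        # The accounting shown to the data subject on any subsequent login.
        return [asdict(e) for e in AUDIT_LOG if e.subject_id == subject_id]

    record_access("subject-42", "DWP payments team", "Universal Credit entitlement check", "accessed")
    print(report_for("subject-42"))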

This information must also be accurate. It is unclear what the consequences of providing misleading information currently are, but there should be some in a post-GDPR world. Mistakes are a prima facie breach of fair processing, and can cause serious distress – which is a clear breach of current law.

Taking an example of where information could and should be provided, let’s look at Universal Credit: How much burden is placed on the entire system by the fact that how data is used inside UC & DWP is clouded in secrecy and consequent distrust?

The transparency obligations from GDPR do not extend to the investigation of fraud or crimes, so the duty is not universal, but there are many other consequences of the current system which can be mitigated by informing citizens. UC is already a fully digital service, where users log in repeatedly, and access and reuse of data by DWP is already (mostly) logged.

UC used to have such a screen visible to claimants – but the DWP civil servants insisted it be turned off as the Minister might like it. Of course the Minister would like it, as it would be an evidence base of facts and accurate information for a citizen on what the Department actually did – the thing for which the Minister gets held publicly accountable. With an audit trail, visible to those involved, there will be fewer scandals that land on the Secretary of State’s desk when the stated policy was one thing but the actions of the Department were contradictory.

It is only where ministers deliberately mislead the House that GDPR accountability is a negative…

Access to Individual-level Data

As part of transparency, it must be clear how promises to citizens are met. While the NHS does audit the recipients of data, companies regularly fail those audits with negligible consequences.

Population-scale, citizen-level datasets include administrative censuses such as the cancer registry (everyone with any cancer for the last ~30 years), HES (everyone who has been treated in hospital since the early ’90s), and the National Pupil Database (everyone who has attended a state school since the mid-’90s), as well as other large-scale sensitive datasets (the rest of the NHS data estate).

When population-scale data (that does not respect dissent) is copied out of the infrastructure of the data controller, it is impossible to ensure that promises to patients are kept. There are no technical measures which can provide assurance that what should have happened actually did. That assurance is what the ‘limited environment’ of an independently run safe setting provides.

It is already standard process to allow access to detailed (decennial population) Census data in a safe setting where queries can be audited. The transparency and information provisions of GDPR should be read as requiring that, where queries on a dataset cannot be audited, that fact must be made known to the data subject – since it makes it much more likely that the promises of a data controller will be broken, because the controller has no means to know whether they are kept.
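A minimal sketch of that property, under an entirely hypothetical safe-setting design (no real system’s names or interfaces): every query is written to an audit log before it runs, so an unlogged query is impossible by construction – the controller always has the means to know what was asked of the data.

    import sqlite3
    from datetime import datetime, timezone

    class AuditedSafeSetting:
        # Researchers receive answers, never a copy of the data;
        # every question they ask is on the record.
        def __init__(self, db_path: str, audit_path: str):
            self.db = sqlite3.connect(db_path)
            self.audit = open(audit_path, "a")

        def query(self, researcher: str, project: str, sql: str) -> list:
            # The audit entry is written and flushed *before* execution.
            self.audit.write(f"{datetime.now(timezone.utc).isoformat()}\t"
                             f"{researcher}\t{project}\t{sql}\n")
            self.audit.flush()
            return self.db.execute(sql).fetchall()

    # Hypothetical usage (the database and table are assumed to exist):
    # setting = AuditedSafeSetting("census.db", "queries.log")
    # setting.query("researcher-17", "approved-study-042",
    #               "SELECT region, COUNT(*) FROM responses GROUP BY region")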

The 2017 Annual Report from the National Data Guardian again calls for “no surprises”. As the GDPR brings more data controllers closer to the standards already required in the NHS, the best way to inform a data subject how their data is likely to be used next month is to show how it was used last month. From accountability can come trustworthiness.

As the Whitehall machine grinds on, as the opt-out moves to DH from NHS England, and as data moves from CO to DCMS, the forgetting happens: institutions forget what happened, and institutional memory becomes what they wished had happened. Care.data was ‘just a communications failure’, not a policy failure; etc. Where they forget, we will have to remind them.

Data Protection Bill: “Framework for data processing by Government”

Update February 2018: The Bill has now moved to the Commons, and the clauses have been edited and renumbered 185-188. Updates are on the Commons page. This page covers the Bill as it was in the Lords.

Updated early-January 2018 – briefing, including the AI and ethics unit.

Update December 2017: Wider updated briefing for report stage

Imagine a data processing framework for social networks, where Facebook gets told: don’t worry about all those laws, the framework will take care of what you need to follow; the ICO, judges, courts, and human rights can’t touch you; you don’t need to worry about anything pesky like following the law, or checking that election ads aren’t paid for in roubles – since that’s just too hard for you to do, you don’t have to do it.

The first statutory “Framework for data processing” (in Government), snuck into the Data Protection Bill (clauses 175-178, page 99), legalises government using any data for anything it wishes (such as Home Office typos or punitive DWP processing). None of the other rules apply besides what Ministers write into the framework, and they can change it at whim.

The framework is only 23 sub-clauses, but 10 of them remove rights, scrutiny, consultation or oversight. It seems this Government has lessons for Henry VIII on using power to show contempt for both citizens and Parliament.

Of course, Government rarely does data processing itself these days, and instead outsources most of it. So this is not just about data processing, but a framework about data controllers and the merging of data for data processing. We have spoken of these risks before (and pages 12-13).

If “Data Trusts” replicate the model of tax havens, then this is the framework that lets any sector, starting with Government, be exempted from the law. We have seen the effects of tax loopholes, this creates data loopholes. The next “frameworks” will apply to AI or health data.

Clauses 175-178 and Schedule 2 paragraph 4 must be removed from the Data Protection Bill.

Data in the rest of Government: AI, and today’s laws for tomorrow’s benefits

AI has finally got Government to take data seriously.

Information is the blood of any bureaucracy – and copying is the circulatory system. “Digital” in its broadest form is just the latest wave of faster photocopiers – decisions keep getting made no matter how fast the machines work. Any good private secretary knows: if you control the paper flow, you steer the decisions.

Just as the Cabinet Office has “spend controls” for technology, there should be flow controls for data. Current data practice in Government is 5 different scandals away from adequacy. As with our work in the NHS, some of those will be public, some of those will be private – the scandal is optional, the improvements are inevitable.

Even where there is a fundamental disagreement about a policy in the non-secret parts of Government, there should be the ability to have a shared factual understanding of how data is used. But even in the “non-secret” parts of Government, there are legitimate reasons for some projects to have limited information disclosed (fraud detection being an obvious one where some information should be withheld, or generalised). The recent Data Sharing Code of Practice consultation from the Cabinet Office seems to get that balance right for fraud data.

It would be helpful to have political leadership stand up and say (again) that “Citizens should know how data about them is used, in the same way taxpayers should know how taxpayers’ money is spent.” (quoting Matt Hancock MP – then Minister for the Cabinet Office). But that is only helpful, not necessary, and there are sub-political choices which deliver benefits for the civil service and Departmental priorities absent political leadership.

The Spring 2017 Conservative Manifesto gave a strong and clear vision of how Verify could be at the heart of a Government that was accountable to its citizens (page 3). The question is whether new guidance lets that be implemented, or stymies it. The Article 29 Working Party has yet to issue full guidance on the transparency requirements of GDPR – but waiting to do the minimum is not in the spirit of the UK’s desire for leadership in AI, nor its goals regarding data.

Government has a range of data sharing powers, and they should all be subject to transparency – otherwise the failings of one will infect public confidence in all.

Fortunately, the range of discussions currently ongoing gives the opportunity for the choices of the future to be better than those of the past – if that is the desire. The National Statistician’s Data Ethics Committee is a good start, addressing the highest-profile and precedent-setting issues across Government. However, as with other parts of the Digital Economy Act (Part 5), there should be a Data Review Board for all data sharing decisions that don’t reach NSDEC: it gives a process by which data sharing decisions can be reviewed.

However, if there is an informed citizenry, with citizens able to see and understand how their data has been used by government, the more complex questions of AI and algorithms become tractable. The status quo will not lead to a collapse in public services, and they will always be able to catch up; the question is only the nature of the political pain that Ministers will suffer because of their civil servants.

A number of Departments believe that “digital transformation” has either failed or is not for them, and they wish to go another way. But the target was always the outcome, not the method; and the test is not the pathway, but delivery. How do Departments transform to reflect their current situation? Will they be accountable, and to whom?


Bad ideas beyond the AI Review

The recent “AI Review” talks about how “Navigating a complex organisation like the NHS is an unfathomable task for small startups like Your.MD”. Your.MD being a company which hosts the data it collects in the US (i.e. subject to US law), outsources its coding to eastern Europe (it’s cheaper), and generally cuts every corner that a startup cuts (the corners being things required to protect NHS patients). It should not be too much to ask that anyone wishing to use NHS patient data is capable of hiring someone who can use Google to find the NHS data rules. Although, as that is a test that DeepMind catastrophically failed, maybe Monty Python was right to hope for intelligence somewhere out in space.


Loopholes (and the Data Protection Bill)

There are some areas where narrow special interests still see themselves as more important than the promises made to patients or citizens, and as more important than the principle of no surprises for patients. No bureaucracy can rid itself of the temptation to do what is in the interests of only the bureaucracy. However, it can decide to hold itself to a higher standard of transparency to the people it serves, and let them make the decisions.

With clause 15, it is Government’s demonstrable intent to carve holes into data protection law for its own purposes. To balance such attempts, across the many gateways the Bill makes possible, there must be transparency to a citizen about how their data is copied, even when it is entirely lawful. That allows a separation between whether data is copied and the rules that cover data copying and access – and an informed democratic debate.

AI has finally got institutions to take data seriously. In doing so, it has created a clear distinction between those who understand data and those who do not (the transition from the latter to the former is incentivised, as the latter are easier to replace with an AI). As yet, the AI companies don’t understand (or wish to understand) the institutions they want data from – which suggests those companies too are easily replaceable (paras 35-49). The AI review also suggests “data trusts” that mirror other dodgy kinds of trust, replacing the existing principle of safe havens. While some of the large charities can look at that approach as insurance should public confidence in a particular disease registry collapse – and they are entirely wise to do so – a lawful disease registry should command public confidence.

The dash to big data and AI does not mean everything we have learnt about confidentiality, institutions, and public confidence should be thrown away to satisfy startups with less history than a Whitehall cat.

Any external body which seeks to prevent misuse of data will likely fail over time. It is easy for mediocre managers to believe the sales pitch and buy a big system that will “do everything” – to flood a data lake – while earnestly convincing others that this approach will solve whatever problem they think you have. Care.data was supported by many sectors long after the flaws were undeniable; it was only when the public became aware that their tune changed. How will the new bodies learn from that mistake? Do they even think they have to?

The actions of the Home Office have destroyed the integrity of Country of Birth / ethnicity data in the National Pupil Database. At no point was that a discussion – just a directive. It is impossible to expect even the most privacy-interested civil servant to defend such a line – even if they remained implacably opposed, their successor eventually would not. There are 3.5 years before the next census. If the first thing the nation’s children know about a census is that it deports their classmates, the fundamental basis for all statistics about the UK will be fatally undermined for a decade. This isn’t counting cranes; it’s extra resources for the areas that think they have high levels of immigration…

Bad ideas never die until they are replaced by better ideas. The misstep in the life sciences strategy illuminates the way that the future may go wrong – there needs to be a way to course correct over time. Just as every use of data in the NHS should be consensual, safe, and transparent; every use of data by Government can be fair, safe, and transparent. That includes uses by any group who cares to assist and be accountable to the individuals whose data they desire.

Is there an interest in a strategic, practical, and available solution? If not, then how many more data scandals will it take, and how high will the associated price be?

There is a better approach, using today’s laws for tomorrow’s benefits.

Overview of Current Data Discussions – October 2017

Two weeks after our annual report and rest of government supplement, there are now a number of data consultations ongoing. We attempt to summarise them all here.

Data Protection Bill

The Data Protection Bill is passing through the House of Lords. Clause 15 is of significant concern, giving Ministers the ability to carve a hole in the Data Protection Act at will – something this Government claimed it wouldn’t do, as it was a key safeguard in the Digital Economy Act earlier this year. As written, it is a dramatic change from the data protection status quo, and gives the Government broad powers to exempt itself from the rule of law.

We have a briefing on the Bill for Second Reading in the Lords.

As the NHS moves towards transparency over medical records, the very information provided via transparency must be subject to the same protections against enforced SAR as the records themselves. It’s unclear whether clause 172(1) does this sufficiently.

Implementing the Digital Economy Act: “Better Use of Data”

To plagiarise Baroness O’Neill, whose approach is very relevant here: better than what?

The Cabinet Office are consulting on the Digital Economy Act Codes of Practice. We have a draft response to that consultation, which goes into more detail on a number of issues raised in our rest of government supplement.

As for how that will be used in practice, the Cabinet Office are having meetings about updating their data science ethics framework, and the ODI is seeking views on their proposed data canvas. The canvas is better, but to qualify as science it can’t just be some Greek on a whiteboard: it must include a notion of accountability for outcomes, and falsifiability of hypotheses.

Otherwise, it’s not science, it’s medieval alchemy – with similar results.

Most interestingly, it appears that despite all its flaws, the current “data science ethics framework” is in use by Departments, and they do find it useful for stopping projects that are egregiously terrible. So while the framework allows unlawful and unethical projects through, preventing those was not its goal – the hidden goal was to stop the worst projects, where every other “safeguard” has demonstrably failed. This is a good thing; it’s just a pity that the previous team denied it existed. The honesty from the post-reset team is welcome – the previous approach included denying to our face that a meeting like this one was taking place, after someone else had already told us the date.

… part 2 is now here

Any Data Lake will fail; there is an alternative

We’ve added some new words to our front page.

Any attempt to solve the problem of records following patients along a care pathway by putting all those records into a big pile will either fail – or first breach the Hippocratic Oath, and then fail.

A Data Lake does not satisfy the need for doctors to reassure their patients (e.g. false positive tests), does not satisfy the need for doctors to hold information confidentially from others (e.g. in the case of Gillick competency, or on the request of a patient), or when institutions cannot tell doctors relevant details, e.g. in situations where there is “too much data, but no clear information”.

From the NHS’ national perspective, micromanagers at NHS England will get to reach into any consultation room and read the notes – especially in the most controversial cases. They might be trying to help, and while members of Jeremy Hunt’s Office itself might not reach in (to be fair, they probably wouldn’t), do you believe the culture at NHS England is such that some NHS middle-manager wouldn’t think that is what they were expected to do, urgently, under the pressure of a crisis?

This is also why any ‘blockchain approach’ to health (specifically) will fail. Such technologies don’t satisfy the clinical and moral need to be opaque – deniability is not a user need of your bank statement.

Just as every civil servant recognises aspects of Sir Humphrey in their colleagues, it is the eternal hope of the administrator – however skilled, and especially when more so – that if a complex system worked just as they think it should, everything would be eternally perfect.

Such a belief, whether held by NHS England, DH, or the Cabinet Office is demonstrable folly. If you build a better mousetrap, the system will evolve a better mouse; everything degrades over time.

It was a President of the Royal Statistical Society who talked about “eternal vigilance”. This is why, and it also provides the solution.

As we’ve outlined before, the alternative approach to a leaky Data Lake is to add accountability to the flow of data along a care pathway.

The system already measures how many patients are at each stage, and their physical transfers; it should give the same scrutiny to measuring how many records follow electronically. Where the patient goes, but their data doesn’t, should be as clear to patients as statistics on clinical outcomes – because access to accurate data is necessary for good clinical outcomes.

Interoperability of systems, in a manner that is monitored, is already being delivered by care providers up and down the country. Creating lakes of records is simply an administrator’s distraction from what we already know works for better care.



medConfidential comment on DCMS Data Protection “Statement of Intent”

DCMS’s intent is clearly to pay more attention to Civil Service silos than citizens’ data.

Sometimes you reveal as much in what you don’t say, as in what you do. Or in what you pointedly ignore…

The ‘Statement of Intent’ document suggests that the confidential information in your medical records deserves no better protection than your local council’s parking list. This is contradicted by both the Conservative Party Manifesto, and the pre-election commitment around Jo Churchill MP’s Bill in the last Parliament to put the National Data Guardian on a statutory footing. So why is DCMS saying no?

DCMS says it intends this to be a “world leading” Data Protection regime. Even if this weren’t the UK’s implementation of the General Data Protection Regulation, DCMS would know its intent falls short had its Ministers and officials paid any attention to what’s happening outside their own offices.

Three weeks ago, the Government and the NHS committed to telling data subjects when their NHS medical records have been used, and why; and multinationals such as Telefonica have argued clearly and cogently that full transparency to data subjects is the only way forwards with innovation and privacy, without pitchforks.

The Government, however, is doing the minimum legally necessary – and already failing to meet the promises that it was elected on.

Given the Government’s manifesto and the Government’s commitments elsewhere, it is entirely possible for the UK to use digital tools to implement a world class data transparency and protection framework… So why is DCMS saying no?

On what principles will data be used in the Single Government Department?

Whitehall proceeds step-wise, and ever more rapidly, towards an end state of a “Single Government Department”. This is Sir Humphrey’s decades-old vision of “Joined-up Government”, predicated upon Government doing whatever the hell it likes with your data, wherever and however it gets it, in flagrant disregard of Data Protection and Human Rights, Articles 8 and 14 (at least). User needs or departmental needs?

In a world where there’s a single Government (and government) data controller – the Data Controller in Chief; the Prime Minister, the final arbiter – will a single Department’s policies, practices and prejudices determine the list of Government policies?

We don’t see how it doesn’t.

It may be useful to begin with an NHS analogy. It’s a gross simplification, but it carries the necessary meaning.

There are multiple hospitals in Manchester – Royal Manchester Children’s Hospital, Manchester Royal Infirmary, and St Mary’s – all on the same site, with interconnected modern buildings, all built at the same time. Why are there three hospitals? Because when the new buildings were constructed and everything was consolidated on one site, treating them as a single hospital might have seemed most sensible – but that would (to many people) effectively be “closing two hospitals”. Hence, there are three.

What about Government departments?

In a Britain with a single government department, what is currently the Home Office – with its particular approach (covered elsewhere) – will go on the rampage across all areas of everything.

For how will weaker policy goals be defended against stronger ones? “It’s a matter of national security, don’t you know…”

Clause 38 of the Digital Economy Bill “solves” this problem by simply ignoring it – those with the highest bureaucratic power will win the fight; we’ve seen this already with the Home Office demanding the names and addresses (p16) of patients – and it’s quite clear they’d have grabbed everything if they’d wanted to.

In this context, with the Digital Strategy of DCMS, and Cabinet Office’s warmed-over Government Transformation Strategy in play, what should happen to make the world they’re trying to build safe?

The greatest concerns must be with the Transformation Strategy; the current “ethics framework” suggested by GDS (the part of Cabinet Office responsible for writing the Strategy) is so flawed, for example, that it suggests a Privacy Impact Assessment can fit on a single sheet of A4 – the self-same strategy used to justify care.data, relying on NHS England’s public statements. Thus far, the country has been saved from a systemic collapse in trust by the fact that this “ethics framework” isn’t actually used by departments.

So what’s the alternative? A citizen view of Government.

Government insists it should be able to copy our data – whatever it wants, wherever and whenever it likes – including to its commercial partners, e.g. Google (or rather, Alphabet) DeepMind and Palantir, for whatever policy whim catches the interest of any particular official. Proportionality and public acceptance are irrelevant; these are not what the civil service is set up to do.

As we saw with DeepMind at the Royal Free Hospital, one person with power can torpedo years of careful and diligent work in order to meet their own short-term, narrow perspective, self-interested goals.

The single Government department makes this worse, if left unaddressed. What should replace it is a citizen view of Government.

This conversation has never been had. The discussions that have been facilitated were designed to get to the pre-conceived end state of the Cabinet Office. As such, the answer was given and civil society time was wasted on a ‘debate’ that was entirely pointless; any wider opportunity to improve the use of data in Government through the Digital Economy Bill was lost.

As an example, well-defined APIs might work for departments – but if departmental silos weaken (as is the explicit goal: “to remove barriers to data sharing”) then things begin to fail. Citizens should not have to rely on how Government talks to itself.

The start of the conversation has to be complete transparency to citizens – with the likes of Verify and public bodies being accountable to the citizens they work for. Citizens can now be shown what data is required for their transactions, from where it will be accessed, and why. Operational decisions should inform democratic debate, both by policy makers and by citizens who wish to engage on the services that affect them.

Civil servants all work for the Crown and not the public – whatever ‘flavour’ of Government is in power – and this may be a tension that needs consideration. What happens when the political will meets the public won’t? How is trust in institutions maintained?

Because without action, continued secrecy and the drip drip of cockup will undermine all trust.

This works in practice

Fortunately, some NHS GPSoC IT Providers (the data processors who provide IT systems to your GP) have taken the lead in fixing the systems from within the system. How many decades will it take Whitehall to catch up?

We have already demonstrated what this looks like – with Verify and other tools.

Rather than a “single government department”, the principle should be a “Citizen View of Government” – where every service a citizen has touched can be seen, with accountability for how they used data and why. This would make Government accountable to the citizen, as it should be – without the citizen having to understand the intricacies of how Government works.

In a “Citizen View” world, whether Government is one Department or many doesn’t matter as much. If civil servants want to justify access to data, they can – but they must be aware that citizens will be told what data and why, and might become unhappy about it if the reasons aren’t just.

Any Government that fails to tell its citizens what it is doing and why, or which doesn’t really want them to know, will not be wanted in return – as the EU discovered with Brexit. This is what the open policy making process should have prepared the groundwork for; the price of that failure keeps going up as digital continues its march.

Unless we wish to treat data about human beings with less care than we treat the data about carcasses in our food supply chain, ‘Globalisation 2.0’ will be based on registers and code – determining risk and eligibility for consumers and for regulators. This simply does not square with a world of copying data; it can only work in a world of APIs to data where there is a lawful, published case for each access, grounded in fundamental accountability to citizens about their data.

It is obvious that data about the food we eat should not be locked in a filing cabinet in Whitehall. It should be equally obvious that “taking back control” shouldn’t mean giving every civil servant a copy of all the data on every citizen.



The Home Office: Secretive, Invasive, and Nasty

In various guises, those who coordinate medConfidential have been dealing with the effects of Home Office missteps for what now in total amounts to decades.

Liberty Human Rights Awards 2010

Here is some of what we have learnt:

Home Office is the part of Government that must confront and ‘deal with’ the absolute worst in the world: murder, rape, terrorism, paedophilia – the stuff no-one really wants to have to know about; things from which civilised people prefer to turn their eyes. There are obvious – and legitimate – reasons that some of what the Home Office does must be confidential or classified.

The people who we task with dealing with these terrible issues deserve to work in a culture of compassion and competence, with solid foundations in Justice – the current Home Office has none of these.

Secret: Hiding errors in a file marked Secret harms the public good.

As can happen with bureaucracies more generally, the hint of secrecy at the Home Office has spread into an all-encompassing security blanket around any information that might be helpful to an informed debate in a democracy.

Treating information about every offence and misdemeanour as if they were the worst, keeping arbitrary secrets, and hiding your actions while telling others they must simply trust that “It’s for your own good” are the actions of someone who has lost perspective. Lost a sense of proportion. And lost the ability to discriminate, except in the prejudicial sense.

The examples of this are countless – from Ministers’ refrain of “trust us” about the ID scheme to “We know but we can’t tell you” about the Communications Data Bill; from petty refusals to extreme resistance to simply ignoring requests for information; and as evidenced by the secret ‘National Back Office’ in the NHS, only exposed in 2014, when Sir Nick Partridge reviewed what happened in the building where the previous-but-one Home Office-administered ID card scheme ended up.

Worse than that, on getting information via a backdoor into people’s medical records, the Home Office wrote in secret to people’s doctors, telling them to deny treatment.

Invasive: No consideration of innocence, or the consequences of action

The political culture pervading the Home Office has led to an organisation which cannot consider side-effects.

It sent round “Go home” vans because they might contribute to a “hostile environment” for illegal immigrants, without any regard to the effects of that hostile environment on innocent parties.

And it’s lost the ability to discriminate: to Home Office, everything it looks at is a crime – or a potential crime – so it is prejudicial towards everyone.

In being unable to discriminate between ‘crimes’ – including thoughtcrime – and perfectly normal behaviour, such as trying to keep your personal communications private, Home Office discriminates wildly and inappropriately against whole classes of people, and against individuals who have in fact done nothing (or very little) wrong.

And, in pursuit of its obsessions, it considers nowhere, and nothing, sacred (Q78).

If it will not respect the boundary of the confidential relationship between you and your doctor, where is it that you believe the Home Office will not go?

In this world view, the entire country gets treated the way the Home Office treats illegal immigrants (which it claims is “respectful”!) and – after many attempts, including a RIP Act that for years emboldened nasty, technocratic petty-mindedness down to the local council level – it has finally got its Investigatory Powers Act, so it can snoop on all our communications data.

Nasty: Fear breeds paranoia, and suspicion is contagious

Bullies are fearful. They don’t always appear to be – especially when they get themselves a gang. But you can tell bullies by the way they pick on people, and who they pick on; the weak, the odd, the vulnerable. People who can’t put up a fight.

The Home Office delivers little itself; it cannot act directly in many of the areas for which it is responsible. For these areas of concern, it develops policy, dispenses budgets for various programmes, commissions systems, lobbies for legislation, and more – but it assumes everything will fail, which leads to suggestions like a 15-foot-high concrete wall around Parliament: “Operation Fortress Commons”.

But the few things it can do corrupt everything. It tries to turn everyone it leans on in every part of the public services into a border guard, or a snitch. Demanding the Met hand over details of those who witness crimes makes everyone less safe – if you are the victim of a crime, you want those who know something to share what they know with the police, without fear that it may be used against them. In this case, the hostile environment is hostile against innocent victims of street crime, because the Home Office has harmful priorities.

There are countless examples of each of these, which will appear over the course of the campaign. Some of them will even come from us…

The Home Office has been responsible for a string of high profile, national embarrassments in recent years. Flawed decisions by Home Office led to national humiliation at the opening of Terminal 5 – ever wondered why the baggage handlers couldn’t get to work? The shameful disarray of the G4S contract for the Olympic Games, from which Home Office had to be rescued by the military. The collapse of many criminal trials because policy at SOCA and NCA was simply unlawful. The harm to the UK’s economy, and international reputation, from the wrongful deportation of 48,000 students – because the Home Office panicked after watching a TV programme. And the harm to public safety and public confidence.

Shorn of Justice, the Home Office has lost touch with humanity, proportion, and the fundamentally positive spirit of Britain. Human Rights are pretty much all that protects you from excesses or mistakes by the Home Office.

How does this relate to the NHS and privacy?

The greatest hazard in this election comes not from Brexit, but from the deeper, more insidious threat to the autonomy of every citizen from the State. It forgets the worldview that created the NHS: that no matter how dark the world, there will always be people there helping.

In a Brexit world, the Home Office worldview offers the NHS just three choices: be nasty to ‘brown people’; be nasty to everyone; or ID cards. These are the only choices its worldview can see, while the perspective of the NHS is quite simple; healthcare, free at the point of use, for all those in need. Without discrimination.

