
The Opt Out process for those with children

This page has been superseded by a 2021 page

Below is left here for historical purposes

medConfidential’s opt-out form for GP data continues to allow you to protect the information held by your family GP as it has always done – but the only way to protect your data collected by hospitals and in other care contexts is now via NHS Digital.

NHS Digital does not (yet) have direct access to your GP record, so its opt-out process requires you to take several steps – and officials have decided that people who have children must make a third.

Additional steps forced on those who care for children

If you have children, or other dependants, and wish to express a choice about the use of their medical records for purposes beyond their direct care, the process is needlessly complex.

NHS Digital provides no online process for families with children – even those who are registered with the NHS at the same home address as their parent or carer – so you must use its online process for yourself alone, and then use a separate postal process for any children under 13.

(Children aged 13 or over can express their own choice about their medical records for themselves online.)

So, instead of the single form you could use to express your choice for yourself and your family back in 2014, you must now do THREE things:

1) For your own hospital and other non-GP data: opt out (or opt back in) online

2) For your dependants’ hospital and other non-GP data: post this form to NHS Digital

3) For your and your dependants’ GP data: give this form to your GP

If you don’t have access to a working printer, e-mail children@medConfidential.org with your postal address and we will post you copies of the paper forms, for free, no questions asked. If you don’t have e-mail, you can text your address to 07980 210 746.

If you can afford to make a small donation to support us in offering this service to others, we have a donation page.

(medConfidential is registered with the ICO to process personal data in this way.)

Why?

It appears the “most digital” Secretary of State the NHS has ever had would rather place the burden of understanding and action onto individual patients – especially those with families – than solve the problem his officials have created.

And the fix is simple: NHS Digital’s online process confirms who you are to the extent that you are able to express your choice online, yet it chooses not to ask if you have any children – who could then be confirmed in exactly the same way the process confirms that you are who you say you are. The move to ‘digitise’ the National Data Opt-out left YOU with more work to do…

This extra work for you is a deliberate choice; it didn’t have to be this way.

Once you have expressed your wishes about your and your family’s data, you may also wish to write to your MP and ask them to ask the Department of Health why you are being forced to jump through these hoops.

Because while healthy, young, and particularly male protagonists commonly have little understanding or insight into the types of sensitive information they will one day have to divulge to their doctor – and the consequences of confidentiality not being respected – others do not have the luxury of such ignorance.

Late October update

At the start of October, the Department of Health took away your ability to opt out via your GP from having information about you, collected by the rest of the NHS, being used for purposes beyond your direct care. (The option to prevent information from your GP record leaving your GP practice remains. For now.) The new process is so ‘hip and digital’ that you also have to use the Royal Mail if you wish to make a consent choice for your children, as well as visiting your GP practice to make a choice for your GP data that the online process tells you nothing about.

Is this Matt Hancock’s view of a digital NHS?

We are testing a new trifold to guide families through expressing their full opt-out choices – which is now a three-step process: online, post box, and at the GP. This may be simpler for NHS Digital, but it’s a lot harder for you – a choice with which Matt Hancock seems to be entirely happy.

NHS Digital was apparently very proud that more people opted in via the digital service than opted out in its first 2 months – though sending out 1.6 million letters could be said to have stacked the deck somewhat – but that represents at most a few hundred people a month, whereas 5,000-10,000 people a month were still opting out via their GP until the Secretary of State took that choice away from you.

We have previously given a commitment that there will be a functional digital opt-out process for patients, and that if NHS Digital wasn’t going to deliver one, then medConfidential would have to (though this will likely be very analogue on their side…).

Data rights and proper information can together empower every patient and citizen to have more confidence in those who use their data. NHS Digital seems to want to make it more complicated. Though official information is published in various forms, in various places, the only way a patient can currently read how their wishes were respected is to visit TheySoldItAnyway.com

If you didn’t receive a letter from NHS Digital about the new ‘National Data Opt-out’ then, since you’re reading this on our website, you should check the online process to see if your choice disappeared somewhere in the machine (and, if so, set it to what you want). You’ll then need to set it for your children too by post – and at your GP, for your GP data, to ensure that too is set.


Consultation Responses, etc.

With the National Data Guardian Bill having its second reading in the Lords this week, medConfidential has published a letter of support for the Bill. Meanwhile, the Organ Donation Bill contains a supposed safeguard that is overly complex and will not provide reassurance to those who wish to see how their organs will be used after death. We have drafted an amendment for the Lords to fix the broken Bill, if the Commons does not.

As part of the next piece of NHS legislation, the National Data Opt-out should be placed on a statutory footing. The next legislation will likely be the result of NHS England’s consultation on “integrated care providers” (our response) and the “long term plan” (our response), which also referenced the need to reform invoice reconciliation.

Our friends at dotEveryone published their views on digital harms and responsible technology, suggesting that data and ethics in Government should be led by someone “relatable … charismatic and imaginative”. Which would be better than the current person, whose company created the problems around commercial abuses of data in the NHS – problems that are still causing harm 20 years later. The current ‘imagination’ at CDEI (the ‘Centre for Data Ethics and Innovation’) seems to be to repeat the sort of data sale scandals in Government they already caused in the NHS. The Information Commissioner also sought views on a ‘regulatory sandbox’, where companies can experiment with personal data – we had views.

Data use across the rest of Government has also been keeping us occupied. Our evidence to the House of Commons Science and Technology Committee contains some new thinking on the failures of agile in public bodies. Some of that thinking was also in our response to the call for evidence ahead of the UK visit of the UN Special Rapporteur on Extreme Poverty and Human Rights, who is looking at algorithms and digital effects around Universal Credit.


Data and the rule of law

The data sharing powers under the Digital Economy Act 2017 are still not fully in force. This did not prevent the Ministry of Housing, Communities and Local Government (MHCLG) demanding data on every homeless person in the country, such as in Camden. The secrecy of data use in such cases must be addressed by the UK Statistics Authority / Office for National Statistics – it is doubly disturbing that MHCLG used the research process to evade the scrutiny that would have applied via other routes.

Decisions by public bodies must, today, comply with the standards of the rule of law. As we move towards more automated decision-making, how will those standards be maintained?

The tech companies and their apologists want the approach to be one defined by ‘ethics’ – as if no tyrant ever failed to justify their crimes. “The computer says no” (or “DeepMind says no”) is wholly insufficient for suppliers of data processing functions to government making decisions about citizens.

All reputable companies will be entirely willing to explain how their “AI” systems arrive at the suggestions or decisions they make – including the (sources of) data on which they were trained. Disreputable companies will be evidenced by their failure or inability to do so.

Government Departments should deliver accountability to ‘their’ data subjects (currently they don’t). But beyond accountability to individuals on how data about them is used, there are standards that must be followed by institutions – especially those which govern.

The Venice Commission has produced a ‘Rule of Law checklist’, covering the context of decision-making. We’ll be taking a look at a couple of Government automated processing plans, and seeing how they conform – and how the checklist applies to digital projects, probably starting with Universal Credit and Settled Status, based on past work. We anticipate identifying holes in some of the frameworks currently used by Government, as compared with the standards required by the rule of law and judicial review. Comments are very welcome to sam@medConfidential.org.

Initial response to new ‘tech vision’ from DHSC

Update: longer response.

The current Westminster penchant for speeches with applause lines but without details has reached DH… 

Update: NHS Digital has now published some details – which are utterly underwhelming when compared with the Secretary of State’s hyperbole. “Use the NHS number” and “upgrade from ICD-10 to ICD-11” are not the radical changes the Secretary of State appeared to suggest. Although, with the promise of registers, we might dust off the amendment we suggested to the Lefroy Bill (which mandated NHS numbers by law) in 2014. We will update this document when NHS Digital publishes the document that should have appeared at the same time.

Notes:

  • Data: “We are supportive of … Data Trusts” – does the SofS/DH have so little confidence in the NHS getting data right that he/it is supportive of stripping the NHS of that governance role?
  • DH blindspots: “We will know we have achieved our goals when”… does not mention patients other than suggesting they use apps for self-care…
  • Privacy: There is no reference to the duty of confidence every clinician is under to their patients (it instead points at the Data Protection Act)
  • Google: The obligation on all systems to use FHIR interoperability removes the figleaf behind which DeepMind insisted on data for all patients in the Royal Free for 5 years.
  • Google: It also proposes outlawing the monopoly line in Google’s standard contract that forces users of the Streams app to only connect to DeepMind’s servers. It is unclear whether that line will survive the lobbying it is about to receive.
  • Amazon: Case study 7 is as true as leaving a note on the fridge, but there are other effects of giving Amazon/Alexa such information. Facebook’s new Portal offers the same functionality, and will explicitly be used to target ads at users.


The quotes below can be attributed to Sam Smith, coordinator of medConfidential, which works to improve uses of data and technology in the NHS.

The NHS knows what good working technology looks like, but to get there, you can’t just turn A&E off and on again and see if it helps.

Mr Hancock says “we are supportive” of stripping the NHS of its role in oversight of commercial exploitation of data. That should be a cause for widespread concern. If Matt thinks the NHS will never get data right, what does he know that the public don’t?

The widely criticised National Programme for IT also started out with a similar lofty vision. This is yet another political piece saying what “good looks like”, but none of the success criteria are about patients getting better care from the NHS. For that, better technology has to be delivered on a ward, and in a GP surgery, and in the many other places that the NHS and social care touch. Reforming procurement and standards does matter, and will help – but it helps in the same way a good accountant helps, and that’s not by having a vision of better accounting.

There’s not much detail in here. It’s not so much ‘jam tomorrow’, as ‘jam… sometime’ – there’s no timeline, and jam gets pretty rancid after not very long. He says “these are standards”, but they’re just a vision for standards – all the hard work is left to be done.

-ends-

More DeepMind secrecy – What the lawyers didn’t look at

The Royal Free has been recommended by ‘independent’ lawyers to terminate its ‘Memorandum of Understanding’ with DeepMind (page 68, second bullet from bottom).

If the “research” agreement with DeepMind – the MoU covering “the use of AI to develop better algorithms” – isn’t terminated, the deliberate exclusions from the legal opinion can only be interpreted as an attempt to mislead the public, once again.

What is the legal basis for continuing to copy 8 years of data on every patient in the hospital? While DeepMind claims the “vital interest” of patients, it still keeps the data of over a million past patients whose interests it will never serve, because RFH’s systems cannot provide “live data” (para 26.1) – despite the report saying that is only temporary (para 15.1).

When RFH completes its move to “fully digital”, will the excessive data be deleted?

The biggest question raised by the Information Commissioner and the National Data Guardian appears to be missing – instead, the report excludes a “historical review of issues arising prior to the date of our appointment” (page 9, para 8.4, 5th bullet, and page 17, para 5, bullet 7).

The report claims the ‘vital interests’ (i.e. remaining alive) of patients are justification to protect against an “event [that] might only occur in the future or not occur at all” (page 43, para 23.2). The only ‘vital interest’ protected here is Google’s, and its desire to hoard medical records it was told were unlawfully collected. The vital interests of a hypothetical patient are not the vital interests of an actual data subject (and the GDPR tests are demonstrably unmet).

The ICO and NDG asked the Royal Free to justify the collection of 1.6 million patient records, and this legal opinion explicitly provides no answer to that question (page 75, para 5, final bullet).

The lawyers do say (page 23, para 12.1) “…we do not think the concepts underpinning Streams are particularly ground-breaking.” In Streams, DeepMind has built little more than a user-friendly iPhone app – under scrutiny, its repeated claims of innovation are at best misleading.

But Google DeepMind clearly still thinks it is above the law; it tries to defend all of the data it has by pointing at different justifications each time. Is this the ‘ethical’ ‘accountable’ approach we must accept from the company that wants to build dangerous AIs?

-ends-

Background to the long-running saga.

Where are the CAG regulations?

We talk a lot about NHS Digital, and its data releases that continue to ignore opt-outs. But 4 years ago today, Royal Assent of the Care Act 2014 gave NHS Digital a “general duty” to “respect and promote the privacy of recipients of health services and of adult social care in England” – which clearly hasn’t been honoured in some areas of its work. The Act also changed the law specifically so that the Confidentiality Advisory Group (CAG) of the Health Research Authority has the power to advise NHS Digital; advice to which NHS Digital must listen.

Caldicott 3 itself does not require dissent to be honoured when data is disseminated in line with the ICO’s Code of Practice on Anonymisation. (The National Data Guardian – who has been given no enforcement powers – is very careful not to ‘cross wires’ with the UK’s data regulator, who does have such powers.) And, despite well over a million patients clearly indicating their wishes to the contrary, NHS Digital continues to argue its dissemination of pseudonymised data is “anonymised” to the satisfaction of the 1998 Data Protection Act.

The UK is about to get a new Data Protection Act, aligned with and based on the EU General Data Protection Regulation, which says:

(26) … Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information, should be considered to be information on an identifiable natural person.

The UK’s Information Commissioner will update the Code of Practice on Anonymisation in due course – she’s just a little busy right now, and the new Data Protection Act is not yet on the statute book – but the Irish Commissioner has already said: (emphasis added)

“Although pseudonymisation has many uses, it should be distinguished from anonymisation, as it only provides a limited protection for the identity of data subjects in many cases as it still allows identification using indirect means. Where a pseudonym is used, it is often possible to identify the data subject by analysing the underlying or related data.”

Current practice will have to change.
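
To make the regulators’ point concrete, below is a minimal, hypothetical sketch (in Python) of re-identification by linkage. Every name, number, salt and dataset in it is invented for illustration – it describes no real NHS extract – but the mechanism is exactly the “indirect means” the Irish Commissioner describes:

```python
# Hypothetical sketch: why pseudonymisation is not anonymisation.
# All identifiers and records below are invented for illustration.
import hashlib

SALT = b"extract-2018"  # an assumed per-extract salt

def pseudonym(nhs_number: str) -> str:
    """Replace an NHS number with a salted hash, as a pseudonymising extract might."""
    return hashlib.sha256(SALT + nhs_number.encode()).hexdigest()[:12]

# A "pseudonymised" dissemination: the direct identifier is gone, but
# indirect identifiers (year of birth, postcode district) travel with each record.
extract = [
    {"pid": pseudonym("9434765919"), "born": 1957, "postcode": "NW3", "diagnosis": "CKD stage 3"},
    {"pid": pseudonym("4010232137"), "born": 1984, "postcode": "E8", "diagnosis": "asthma"},
]

# "Additional information" a recipient might already hold (GDPR Recital 26):
# an ordinary marketing-list or electoral-roll style dataset.
other_data = [{"name": "A. Patient", "born": 1957, "postcode": "NW3"}]

# Re-identification needs no code-breaking - just a join on the indirect identifiers.
for row in extract:
    for person in other_data:
        if (row["born"], row["postcode"]) == (person["born"], person["postcode"]):
            print(f'{person["name"]} is probably {row["pid"]}: {row["diagnosis"]}')
```

And if an extract also carries exact event dates (birth, admission, discharge), the join gets easier still – which is why the protections for event dates discussed below matter.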

While IGARD may be the appropriate body to advise whether a request meets NHS Digital’s standards for dissemination, it is not an appropriate body to advise on releasing data which does not honour patients’ objections. The adjudication of the principles of those decisions, by statute, belongs to CAG.

There are legitimate instances where patients’ dissent may be overridden – but IGARD is not, and never should have been, the body to decide that.

The opt-out is there to protect patients who decide that the safeguard the NHS currently relies upon – pieces of paper, which include for example commercial re-use contracts with commercial companies that service other commercial companies, including pharmaceutical companies that use the data for promoting their products (i.e. marketing) to doctors – is not sufficient for their situation. As is their right.

Another example: in 2014, the Health Select Committee asked for a safe setting for researchers. Only in April 2018 did a remote safe setting begin to be piloted for researchers – that work not only needs to be completed, but it should become the standard means of access.

NHS Digital continues to insist that a piece of paper is sufficient safeguard under which to release copies of the entire nation’s lifelong, linked medical histories to hundreds of organisations. Its own published records show that two-thirds of NHS Digital’s data releases do not respect patient dissent.

It should be CAG which makes such decisions, whenever and wherever it is necessary. The CAG Regulations will make that clear, when they exist. Assurances to patients are less than meaningful when the Regulations to which they relate do not yet exist.

If someone applying for patients’ data cannot do what they need with only 98% of people’s data, they should simply explain to a responsible body why this is the case. Public Health England’s cancer registry already takes this approach with the choice of protections it offers for event dates. NHS Digital simply releases data on every patient, with the medical event dates completely unprotected.

The National Data Guardian was asked to determine a single choice by which patients could express their dissent from their data being used for purposes beyond their direct care. When that choice is disregarded, it must be on a basis clearly and specifically defined in statute, and approved by CAG.

As it is doing around the world, the introduction of the GDPR will force a change, and that change should protect patients’ data that under the new Data Protection Act will be considered identifiable. Those who still need everyone’s data will have to explain why to a competent body – which really isn’t too much to ask.

Given the clear promises given as a consequence of the care.data and HES data scandals – promises much repeated, but yet to be delivered – we’ve been waiting a long time for this to be fixed.

Instant Messaging in Clinical Settings

November 2024: slight tidy ups.

June 2023: NHS England recently made their guidance less comprehensible (again). The 2023 wording is entirely compatible with the much clearer 2018 wording; paragraphs in blue below are omissions from that official guidance.


As a clinician or nurse, you should not have to keep up with the latest fluff of the apps you might use for work. But teams need to talk to each other (without using Microsoft Teams for everything!)

NHS England has put out several attempts at ‘guidance’ on using instant messaging apps. It previously said that WhatsApp was not banned, but failed to provide helpful guidance on what to actually use. It still hasn’t. There was a Do & Don’t list, which was better than nothing, but it isn’t in the latest version, and was almost impossible to turn into practice in the real world.

If asked, we would suggest something like this:

Summary

  1. If your employer offers an instant messaging solution, use that.
  2. If you are picking apps to use yourself, you are safest with Signal.
  3. If you are not picking the apps you use, you will probably have to use WhatsApp or Skype. But be aware that someone will be held responsible when Facebook or Skype change their rules – and it’s probably not going to be the person who picked the app…
  4. Don’t use Facebook Messenger, Instagram, or Telegram.

Whatever app you use for work, chats in that app should be set to expire in a few days (possibly less), and the vast majority of people should avoid having their phone going ding for work purposes while they are not at work. For most apps, a swipe left on the main list of ‘chats’ should show an option to “hide alerts” for some time period – this should ensure that if you do give your personal number to work colleagues, it doesn’t end up driving you to distraction outside work. If someone really wants to get in touch, they can always just call you normally.

The reasoning behind our suggestions: Doctor-to-Doctor encryption

The important step in secure messaging is something called “end-to-end” encryption, which prevents anyone – a third party ‘listening in’, or even the service making the connection – from knowing what you said. It’s the equivalent of having a conversation in a private consultation room, rather than doing it standing next to the nurses’ station, or in a waiting room. But even with Signal, if you are messaging using your personal device, you should treat any conversation as if it were in a lift where another person might be listening.
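
For the technically curious, here is a minimal sketch of what “end-to-end” means, using the PyNaCl library purely as an illustration. Real apps such as Signal layer much more on top (identity verification, forward secrecy), so treat this as a toy model of the principle, not a recipe:

```python
# Toy model of end-to-end encryption (pip install pynacl). Illustrative only:
# real messengers add key verification and forward secrecy on top of this.
from nacl.public import PrivateKey, Box

# Each clinician's device generates its own key pair; only public keys are shared.
dr_a_key = PrivateKey.generate()
dr_b_key = PrivateKey.generate()

# Dr A encrypts using their private key and Dr B's public key.
ciphertext = Box(dr_a_key, dr_b_key.public_key).encrypt(
    b"Please review the patient in bay 4"
)

# The messaging service relays (and may store) only this: opaque bytes it cannot read.
print(bytes(ciphertext).hex()[:48], "...")

# Only Dr B's device, holding the matching private key, can decrypt.
print(Box(dr_b_key, dr_a_key.public_key).decrypt(ciphertext).decode())
```

The point of the sketch is the middle step: whoever runs the server never holds a key that can open the message – which is the property the consultation-room analogy describes.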

Signal allows you to decide for how long you will keep messages from any particular person or group, and will automatically delete the stored messages after that. But what happens with the stored message history in other apps? WhatsApp, for example, wants you to give it a full copy of all your messages and send them to its servers as a ‘backup’ (though at some point it will show you ads against them – it is part of Facebook after all).

You may also have set your phone itself to back up to somewhere. Do you know where the backup goes, and what’s in it? If chats don’t auto-delete in minutes, your backups will need to be carefully managed.

Of course, it is best practice to back up everything on your phone, and most apps assume (probably correctly) that you don’t want to lose every message or photo you receive of your kids. This doesn’t necessarily translate neatly to a clinical setting – anything that must be kept should be recorded elsewhere, so that if you lose your phone, the only thing you won’t have kept is ward chit-chat. WhatsApp wants everything – it doesn’t offer clinical reassurance. And while Snapchat has deletion as a feature, it has other problems akin to Facebook and Skype.

The longer-term security of your messaging is dependent upon who makes the app – and when, and why, they will change the rules on you. We (also) recommend Signal because it is produced by a charitable foundation whose sole mission is to provide secure, usable, communications. One key reason why the NHS England guidance is so terrible is that WhatsApp has lobbyists telling NHS England that it should allow their product; Signal doesn’t.

Since Facebook (the owner of WhatsApp) lies to regulators about its intentions, you clearly cannot rely on the company not to do tomorrow what it denies it will do today. As a consequence of this, any official guidance must in future be kept up to date by NHS Digital. And, as corporate policies change, so must the guidance – removing from the equation NHS England’s fear of the deluge of lobbying that created this mess in the first place.

Clinicians deserve better tools than those that NHS England chooses to recommend, where a national body prioritises its own interests over the needs of those delivering direct care. The NHS England guidance is the output of meetings and committees that, with every iteration, get progressively less useful for those who need something to help them – and the people they work with – practise medicine.

(This post will be kept under review as technologies change; it was last updated in October 2024)

October 2024: slight tweaks, primarily doctor-to-doctor encryption and deleting messages.

June 2023: The December 2022 guidance from NHS England is split over a page about messaging, with key parts on the page about devices.

March 2021: added link to common definition and tests for a secure app.

March 2020 Update: custom apps are now in the NHS Apps library, and so apps that your staff routinely use for other purposes shouldn’t be used.

Data and AI in the Rest of Government: the Rule of Law

medConfidential spoke about the Framework for Data Processing by Government at the All Party Parliamentary Group on the Rule of Law. The topic of the APPG provides a useful perspective for much work on data in the public sector, and the wider use of AI by anyone. The meeting was on the same day as the launch of the AI Select Committee Report, which addresses similar key issues of ‘data ethics’.

The ‘Rule of Law’ is defined by the 8 principles identified by Lord Bingham. The principles are not themselves law, but rather describe the process that must be followed for the Rule of Law to be respected.

Public bodies must already follow that process, and also be able to show how that process has been followed. As a result, those developing AIs (and data processing tools) for use by public bodies must also show how these processes have been followed. This is necessary to satisfy the lawful obligations of the bodies to which they are trying to sell services.

The principles identified by Lord Bingham are a model for testing whether an explanation of an AI and its output, or a data model, is sufficient for use by a public body.

While debates on ethics and society, and on politics and policy, focus on whether a technology should be used, the Rule of Law is about the evidence for and integrity of that debate. As Departments implement the Framework for Data Processing, the Framework must – if Departments are to deliver on their obligations under the Rule of Law – be compliant with the Principles identified by Lord Bingham, not just the ethics and policies of the Minister in charge that day.

Public bodies are already bound by these rules – unless Parliament legislates to escape them. The principles are widely understood, they are testable, and they are implementable in a meaningful way by all necessary parties, with significant expertise available to aid understanding.


Companies and other non-public bodies

Companies (i.e. non-public bodies) are not subject to the same legal framework as public bodies. A Public Body must be able to cite in law the powers it uses; a Private Body may do (almost) anything that is not prohibited by law. This is why Facebook’s terms and conditions are so vague and let it get away with almost anything – such a data model does not apply to the tax office.

Some of those looking to make money – to “move fast and break things” – would like the standard to be ethics, and ethics alone. There are currently many groups and centres having money poured into them, with names involving ‘data and society’, ‘ethics and society’, and DCMS’s own ‘Centre for Data Ethics’. The latter is led by a Minister in a Government that will always have political priorities, and – given recent revelations about Facebook – the consequences of incentives to lower standards should be very clear.

Ethics may contribute to whether something should be done – but they are not binding on how it is done, and they offer no actual accountability. After all, no tyrant ever failed to justify their actions; it is the rule of law that ultimately holds them accountable, and leads to justice for those harmed. Ethics alone do not suffice, as Facebook and others have recently shown.

There is a great deal more work to do in this area. But unlike other AI ‘ethics’ standards which seek to create something so weak no-one opposes it, the existing standards and conventions of the Rule of Law are well known and well understood, and provide real and meaningful scrutiny of decisions – assuming an entity believes in the Rule of Law.

The question to companies and public bodies alike is therefore simple: Do you believe in the Rule of Law?

[notes from APPG talk]
[medConfidential (updated) portion of the APPG briefing]

Response to the House of Lords AI Select Committee Report

The AI Select Committee of the House of Lords published their report this morning.

In respect of the NHS, it suggests nothing the NHS wasn’t already doing anyway.

The suggestion that ‘data trusts’ be created for public sector datasets – such as tax data – will likely cause fundamental distrust in AI amongst the public (paragraphs 82 & 84). The NHS has shown how that model ends badly when the prime drivers are commercial, not ‘human flourishing’.

Sam Smith, a coordinator at medConfidential, said (referring to paragraphs 99, 129, 317-318, 386, 419-420):

“A week after Facebook were criticised by the US Congress, the only reference to the Rule of Law in this report is about exempting companies from liability for breaking it.

“Public bodies are required to follow the rule of law, and any tools sold to them must meet those legal obligations. This standard for the public sector will drive the creation of tools which can be reused by all.”


-ends-

medConfidential are speaking at the APPG Rule of Law in Parliament from 11 – 12:30, and more details are now available.

NHS Digital failing to uphold patient interest


The Health Select Committee has published a report on data sharing which “raises serious concerns about NHS Digital’s ability to protect patient data” under the headline “NHS Digital failing to uphold patient interest”. The Home Office is “treating GP patient data like the Yellow Pages”, according to the RCGP.

The NHS has been trying to rebuild trustworthiness around data since the last big NHS data project collapsed in 2014. This report shows that all those promises can be undermined by the narrow-minded view of one office in Whitehall.

The Health Select Committee is clear that NHS Digital has again failed in its statutory duties, and has put patients at risk by the processes it has adopted and refuses to change.

HSCIC rebranded as NHS Digital in an attempt to avoid the history of past failures, but this report shows its actions are unchanged…

We submitted written evidence to the inquiry.

medConfidential Bulletin, 9th March 2018

It has been a while since we last sent a newsletter. Our apologies for that – we have been kept busy on a number of fronts, but rather than spam you with speculations we believe it’s better to communicate when there are significant developments.


New national opt-out for medical records

An announcement has been delayed for some months and there’s still some time until action is taken, but to quote NHS Digital last week:

The Secretary of State has agreed that the national data opt-out will be introduced alongside the new data protection legislation on 25 May 2018. It has also been agreed to present the national data opt-out as a single question to cover both research and planning. Type 2 opt-outs (which currently prevent identifiable data from leaving NHS Digital) will be converted to the new national data opt-out when it is introduced in May. Patients with type 2 opt-out will be contacted directly about this change.

There are still a number of important questions to be answered, but we’re working on those for you. For example, at this point, the Government has not yet confirmed that every data release that would be covered by the Type 2 opt-out will be covered by the new opt-out.

medConfidential has yet to see the final wording of the question, but this announcement is clear confirmation that if you opted out in 2014 (or subsequently), you will be sent a letter about what happened. We also haven’t yet seen the wording of the letter, as we and the other members of CDAG (the care.data Advisory Group) would previously have done, but apparently we are to be consulted on that too. When we have the ability to cite formal statements on the new process, we will update our website – this is likely to be in May.

So, if you have already opted out, the NHS will write to you about the new opt-out model. Whether anyone will tell other people remains unclear. We do hope the Secretary of State won’t snatch defeat from the jaws of a victory which could improve patient confidentiality and everyone’s confidence in how the NHS uses data.


This week: Data Protection Bill

The Data Protection Bill was delayed by political squabbling, but must pass by early May, and is now on a very tight timescale.

medConfidential’s concerns with the Bill relate to something called the “Framework for Data Processing by Government” which, in effect, creates a ‘Data Controller in Chief’ who can ignore the Information Commissioner, and the fact that the Government wishes to deny your ability to access information on how your records are used, if that might be used by someone else at another time in a way which may “prejudice… effective immigration control”.

Thanks to a great deal of work by many concerned groups and organisations, the Government no longer considers this framework above the law, just above enforcement of the law. The Rule of Law requires that justice both be done, and be seen to be done – requiring transparency that Governments and companies often prefer to avoid.


What you can do

Many parts of England have local elections in May. The ongoing stealth reorganisation of the NHS in England (into 44 “Sustainability and Transformation Partnerships” and “Integrated Care Systems”) will give your local council more responsibility for data re-use in your area. No details will be given until after the elections – of course! – but if anything does emerge before that, we’ll let you know.

The health and care issues that most burden the NHS differ from place to place, sometimes quite widely. So when local politicians ask for your vote in the next few weeks, you might ask them what their council would do about the biggest issues in your area.

You can see the top three issues most impacting health in your local authority, and those nearby, on this map: http://bit.ly/2FVYVE1

(Created thanks to current data from Public Health England, and with the help of tools provided by Democracy Club whose volunteers collate and share information on elections across the UK.)


What’s next?

medConfidential keeps working even when we’re not sending newsletters; we won’t spam you if there’s nothing important to say. As you can see from this Bulletin, we are approaching another critical time for patient confidentiality that we hope can be negotiated with far greater success than in 2014! If you appreciate our ongoing efforts, we accept donations. Thank you for your support.


Phil Booth & Sam Smith
9th March 2018