If they’re ‘appy and you’re worried clap your hands…

medConfidential mostly works on issues to do with confidentiality and consent around what the NHS (and wider care system) do with your data beyond your direct care; what are called ‘secondary uses’.

However, the world of ‘health-enabled’ smartphones has slipped into almost everyone’s pocket, and the NHS is beginning to notice. Unfortunately, NHS England is starting from its usual cultural assumption that it can do things by diktat, ignoring the rules – even ones it made up – if they prove less than convenient.

Health apps are quite different to most of what the NHS does; in many ways they are more like a pharmacy than a hospital. Apps are something that patients do for themselves – possibly with professional advice, possibly without. Apps are done by patients, not something the doctor or the system does to, or for, the patient.

Apps are the rough equivalent of a prescription, in that it’s up to patients themselves to ‘take the pills’. Apps are not some sort of “machine doctors” that NHS England can bend to its will. (It rarely turns out well when NHS England tries to do this, but that doesn’t stop it trying again and again and again.)

For the main part, apps exist between a patient and a third party without a medical consent relationship. The Terms and Conditions of some (should you read them) set you up to have your data exploited and sold on – quite legally, under the contract you signed up to when you installed the app and gave it permissions – in ways even Pharmacy2U would never dream of.

Unfortunately, compliance with the Data Protection Act – a legal minimum – offers nothing like the standards of ethics and confidentiality you should expect for your medical records. And consent in the ‘planet of the apps’ is merely a tick box, or a flick of the finger.

That’s not to say that app providers can’t do “mass participation surveys” properly, ethically and in ways impossible by other means. Some certainly do. It’s just that – as with all innovative but immature markets – there needs to be guidance, and proper oversight, to help members of the public distinguish between legitimate research and profit-seeking charlatans.


In a future NHS world, if an app had access to an individual’s details and offered services which could receive that individual’s consent settings from the Spine, then their existing consent choices could, in principle, be honoured (though whether widening access to NHS Spine is a good idea or not is a subject for another blog post). What’s for certain now, though, is that app screw-ups and scams will continue until consent improves.

Most health apps don’t and will not connect to anything in the NHS, other than maybe allowing a patient to e-mail a standardised report to somewhere. In the Apple ecosystem, where health apps have to write data to the protected ‘HealthKit repository’, it’s at least possible that the 4 UK GP IT providers could handle reading and integration of your data with NHS systems, under the control of the patient. [UPDATE 7/8/15: EMIS already does something along these lines – thanks to @theABB for screenshots.] So building something useful doesn’t necessarily require dealing with the idiosyncrasies of the Directorate of Patients and Information at NHS England.

The NHS ‘Health Apps Library’ right now is in a mess. The positive intention may have been to help patients navigate shark-infested waters, but the reality in some cases is more like being left up a creek without a paddle.

For an app to be included in the NHS Apps Library, there must be far tighter restrictions on data transfer, sale and exploitation. Burying a statement somewhere on page 97 of the terms of use, because “this is part of our business model”, may suffice for the Android Play Store and the Information Commissioner – it cannot be sufficient for an endorsement by the NHS.

If an app is able to connect to the NHS infrastructure, it must honour the consent settings available to whatever NHS service it connects to – which includes providing a complete, patient-accessible audit trail. The vast majority of apps will not be connected, so they must proactively request consent – with informed opt-in (not opt-out) for any and all data transfers to third parties, and a separate opt-in for any sale of data.
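To make the prescription above concrete, here is a minimal sketch (in Python, with hypothetical names – nothing here is an actual NHS or app-store API) of what opt-in consent with a patient-accessible audit trail might look like: every permission defaults to off, sale of data requires its own separate opt-in, and each grant is logged.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Hypothetical consent model: every flag defaults to False,
    so all sharing is informed opt-in, never opt-out."""
    share_with_third_parties: bool = False
    allow_data_sale: bool = False  # sale needs its own, separate opt-in
    audit_trail: list = field(default_factory=list)

    def grant(self, permission: str) -> None:
        # Record the patient's explicit choice, with a timestamp,
        # so the patient can later inspect exactly what was agreed.
        if permission not in ("share_with_third_parties", "allow_data_sale"):
            raise ValueError(f"unknown permission: {permission}")
        setattr(self, permission, True)
        self.audit_trail.append(
            (datetime.now(timezone.utc).isoformat(), f"granted {permission}")
        )

    def may_transfer(self) -> bool:
        return self.share_with_third_parties

    def may_sell(self) -> bool:
        # Selling requires BOTH opt-ins: sale does not follow from sharing.
        return self.share_with_third_parties and self.allow_data_sale
```

The point of the sketch is the defaults: a fresh `ConsentRecord` permits nothing, and no single tick box can authorise both transfer and sale.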

In fact, good apps should probably follow Apple’s lead or equivalents that are beginning to emerge in other places: health data stays in a locked silo on your device, in your control, and all transfers and processing must honour your wishes. If you claim to be doing research, and you want to use the NHS brand, then your project must have received ethics approval.

When you walk into a pharmacy, if you look, there’s a sign which tells you the name and registration number of the professional currently responsible for dispensing from that pharmacy. On the page for each app in the NHS Apps Library, the equivalent information should be visible: who is responsible for the quality of this app? NHS England may decide that “no-one” is an acceptable answer – but patients deserve to know that.

If all these and the existing – and emerging – criteria for apps are not met, NHS England’s Apps Library (which sits on MPA Red-rated NHS Choices) will simply accelerate the race to the bottom for predatory data sale, and public confidence in its recommendations will collapse. Again.

You would hope by now that NHS England has been “listening” and learning enough to realise the very real risks of jumping feet-first into a “visionary” programme; there’s a lot at stake, but it’s your medical data they’re gambling with.
