Privacy is not just a technical problem

Today’s article, looking at the challenges facing the new government, turns to NHS IT and data. Emma Byrne is a software engineer and academic whose research interests include the NHS. She worked on the independent evaluation of the NHS Summary Care Record and is a member of ORG’s Board.

When your medical record was held in a brown cardboard envelope you could be fairly certain that you were confiding in your doctor or nurse. And if you did discover that the details of your health problems were doing the rounds, you’d at least have a pretty good idea who to blame. When we decide to consult a doctor, we all carry out a pretty sophisticated mental calculus that balances our need for help with the pain of disclosing intensely private information. That’s why “doctor-patient confidentiality” is a tenet of medical folklore as precious as “first do no harm.”

Let’s be fair: the people who look after your health are, on the whole, a well-meaning bunch. But the more people who have access to your information, and the less likely they are ever to meet you, the more likely it becomes that your most sensitive data will end up being misused. In 2009 a doctor in Scotland gained unauthorised access to the medical records of Gordon Brown, Alex Salmond and a number of other high-profile people. It’s unlikely that this will be a one-off incident.

Bizarrely enough, I’m slightly reassured by this story: we know about incidents in Scotland only because their Emergency Care Record system is set up to notify patients when their record has been accessed. In England, you have no idea whether your Summary Care Record (SCR) has been accessed or not. Even worse, in 2009 a GP testing the security of his own (opted-out) SCR found that he could “toggle” the opt-out flag, view his SCR, then change the flag back again, all without generating an alert. In Scotland there is some protection afforded by the fact that you know who has accessed your records. In England your (supposedly blank) record could be accessed and you would have no idea about it.

Even where medical professionals are acting in a patient’s best interests, there are still circumstances where patient record security is at risk. The National Programme for IT (NPfIT) has spent a lot of time and money on technical measures such as role-based access cards. However, it hasn’t been anywhere near as diligent about understanding how medical professionals actually work. In a busy A&E department, difficulties posed by poorly designed security measures may cost lives. Faced with a choice between loss of life and loss of confidentiality, South Warwickshire General Hospitals NHS Trust chose to share smartcards.

The problems are made far worse by the way NPfIT runs its key projects. When a project simultaneously manages to be “not much use” and “too big to fail”, you have a recipe for perverse incentives and disastrous privacy consequences. The biggest project in the NHS, the Summary Care Record, is a clear example of this.

The political pressure for the SCR to be seen as a success has always been immense: it was announced in 1997 as a personally favoured project of the then Prime Minister, Tony Blair. But this was never a vision shared by the doctors and nurses working in the NHS. When we studied how health care professionals felt about the SCR in 2008, most of them said that they didn’t really see the point of it: if you have an accident, they would much rather get the information from you directly, either by examining you or by talking to you or your carer.

Given that it’s not particularly effective at improving health care, the project has to be seen to be a success in some other way. As a result, the reported “benefits” of the SCR consist of things like “the growth in the number of patient records on the system” and “the number of times that SCRs have been accessed.” This, in turn, has led to some troubling developments.

The increase in shared medical information leaves us ever more exposed. I would suggest four urgent steps that the new administration should take to reverse this trend.

  1. Put the patient at the heart of the process: if we are to spot abuses of trust, we should be notified every time our records are used (a sketch of what this might look like follows this list).
  2. More users = less security. Don’t store or share information unless there is a clear clinical need.
  3. Don’t saddle projects with political baggage. It makes it nigh on impossible for the project to change in the face of honest criticism.
  4. Don’t assume that users will always behave as they should: people will circumvent technical security measures for all sorts of reasons. Foster a professional culture that rewards good information governance and punishes abuses of trust.
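
To make the first of these concrete, here is a minimal sketch, in Python, of what an “audit everything, notify the patient” record store might look like. It is purely illustrative: every name in it (AuditedRecordStore, notify_patient and so on) is my own invention, and it makes no claim about how the Spine or any NPfIT system is actually built.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    patient_id: str
    accessor_id: str
    action: str  # e.g. "view", "set_opt_out", "clear_opt_out"
    timestamp: datetime


def notify_patient(event: AccessEvent) -> None:
    # Stand-in for a real notification channel (letter, SMS, patient portal).
    print(f"[to patient {event.patient_id}] {event.action} by "
          f"{event.accessor_id} at {event.timestamp:%Y-%m-%d %H:%M}")


@dataclass
class AuditedRecordStore:
    records: dict = field(default_factory=dict)  # patient_id -> record data
    opt_outs: set = field(default_factory=set)   # patients who have opted out
    log: list = field(default_factory=list)      # append-only audit trail

    def _audit(self, patient_id: str, accessor_id: str, action: str) -> None:
        # Single choke point: nothing reads or changes a record without
        # passing through here, so nothing escapes the trail or the alert.
        event = AccessEvent(patient_id, accessor_id, action,
                            datetime.now(timezone.utc))
        self.log.append(event)
        notify_patient(event)

    def view(self, patient_id: str, accessor_id: str):
        self._audit(patient_id, accessor_id, "view")
        if patient_id in self.opt_outs:
            # An opted-out record stays hidden, but the attempt is still logged.
            return None
        return self.records.get(patient_id)

    def set_opt_out(self, patient_id: str, accessor_id: str,
                    opted_out: bool) -> None:
        # Toggling the flag is audited like any other access.
        action = "set_opt_out" if opted_out else "clear_opt_out"
        self._audit(patient_id, accessor_id, action)
        if opted_out:
            self.opt_outs.add(patient_id)
        else:
            self.opt_outs.discard(patient_id)


# Usage: every one of these calls generates a patient-visible alert.
store = AuditedRecordStore(records={"p1": "summary data"})
store.set_opt_out("p1", "gp42", True)   # flag change is logged and notified
store.view("p1", "gp42")                # returns None, but still leaves a trace
```

The point of the design is that there is no code path that touches a record, or its opt-out flag, without writing to the audit trail and notifying the patient, so a “toggle, view, toggle back” trick like the 2009 incident would always leave a trace.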

There’s a lot at stake here: when it’s business as usual for mental health records to be visible to council staff, the pain of disclosure becomes acute, and the drive to seek help a lot harder to act on.

(Editor’s note: Supporters can get practical steps on how to protect their privacy and campaign to preserve medical confidentiality from TheBigOptOut.org)

Correction:

I have been contacted by the Clinical Director for the Summary Care Record and have been asked to make the following corrections to my piece “Privacy is not just a technical problem”:

The Scottish version of the SCR, the Emergency Care Summary system, does not actively send patients an alert when records are viewed. The system maintains audit trails that can be seen by GPs and by patients on request. In the case referred to in the original article, it was a practice manager who spotted the security breach and alerted the patients.

In the English system, alerts are generated when someone “self-certifies” that they have a legitimate relationship with the patient. These are sent to privacy officers in the governance structure of the setting where the record was viewed. In a hospital trust, for example, a privacy officer is based in the hospital.

In the incident of “flag toggling” referred to above, the flag did not control the creation of the GP’s own record; it only changed whether or not it would be shared. The flag has been replaced with one that controls whether the record is created at all. This flag is currently maintained on GPs’ own systems but will eventually be stored on the Spine. There is no system alert to the patient when opt-out status changes on the GP system, nor will there be once the codes move to the Spine.

I am happy to make these corrections.