May 21, 2010 | Emma Byrne

Privacy is not just a technical problem

Today's article looking at the challenges facing the new government turns to NHS IT and data. Emma Byrne is a software engineer and academic whose research interests include the NHS. She worked on the independent evaluation of the NHS Summary Care Record and is a member of ORG's Board.

When your medical record was held in a brown cardboard envelope you could be fairly certain that you were confiding in your doctor or nurse. And if you did discover that the details of your health problems were doing the rounds, you'd at least have a pretty good idea who to blame. When we decide to consult a doctor, we all carry out a pretty sophisticated mental calculus that balances our need for help with the pain of disclosing intensely private information. That's why "doctor-patient confidentiality" is a tenet of medical folklore as precious as "first do no harm."

Let's be fair: the people who look after your health are a well-meaning bunch on the whole, but the more people who have access to your information, and the less likely they are to ever meet you, the more likely it becomes that your most sensitive data will end up being misused. In 2009 a doctor in Scotland gained unauthorised access to the medical records of Gordon Brown, Alex Salmond and a number of other high-profile people. It's unlikely that this will be a one-off incident.

Bizarrely enough, I'm slightly reassured by this story: we know about incidents in Scotland only because their Emergency Care Record system is set up to notify patients when their record has been accessed. In England, you have no idea if your Summary Care Record (SCR) has been accessed or not. Even worse, in 2009 a GP testing the security of his own (opted-out) SCR found that he could "toggle" the opt-out flag, view his SCR, then change the flag back again, all without generating an alert. In Scotland there is some protection afforded by the fact that you know who has accessed your records. In England your (supposedly blank) record could be accessed and you would have no idea about it.

Even where medical professionals are acting in a patient's best interests, there are still circumstances where patient record security is at risk. The National Programme for IT (NPfIT) has spent a lot of time and money on technical measures such as role-based access cards. However, they haven't been anywhere near as diligent about the need to understand how medical professionals work. In a busy A&E department, difficulties posed by poorly-designed security measures may cost lives. Faced with a choice between loss of life and loss of confidentiality, South Warwickshire General Hospitals NHS Trust chose to share smartcards.

The problems are made far worse by the way NPfIT runs its key projects. When a project simultaneously manages to be "not much use" and "too big to fail" you have a recipe for perverse incentives and disastrous privacy consequences. The biggest project in the NHS, the Summary Care Record (SCR) is a clear example of this.

The political pressure for the SCR to be seen as a success has always been immense: it was announced in 1997 as a personally favoured project of the then Prime Minister, Tony Blair. But this was never a vision shared by the doctors and nurses working in the NHS. When we studied the way health care professionals felt about the SCR in 2008, most of them said that they didn't really see the point of it: if you have an accident they would much rather get the information from you directly, either by examining you or by talking to you or your carer.

Given that it's not particularly effective at improving health care, the project has to be seen to be a success in some other way. As a result, the reported "benefits" of the SCR consist of things like "the growth in number of patient records on the system" and "the number of times that SCRs have been accessed." In turn, this has led to a number of troubling developments.

The increase in shared medical information leaves us ever more exposed. I would suggest four urgent steps that the new administration should take to reverse this trend.

  1. Put the patient at the heart of the process: if we are to spot abuses of trust, we should be notified every time our records are used.
  2. More users = less security. Don't store or share information unless there is a clear clinical need.
  3. Don't saddle projects with political baggage. It makes it nigh on impossible for the project to change in the face of honest criticism.
  4. Don't assume that users will always behave as they should: people will circumvent technical security measures for all sorts of reasons. Foster a professional culture that rewards good information governance and punishes abuses of trust.

There's a lot at stake here: when it's business as usual for mental health records to be visible to council staff, the pain of disclosure becomes acute, and the drive to seek help a lot harder to act on.

(Editor's note: Supporters can get practical steps on how to protect their privacy and campaign to preserve medical confidentiality from TheBigOptOut.org)

 


Correction:

I have been contacted by the Clinical Director for the Summary Care Record and have been asked to make the following corrections to my piece “Privacy is not just a technical problem”:

The Scottish version of the SCR, the Emergency Care Summary system, does not actively send patients an alert when records are viewed. The system maintains audit trails that can be seen by GPs and by patients on request. In the case referred to in the original article it was a practice manager who spotted the security breach and alerted the patients.

In the English system, alerts are generated when someone "self-certifies" that they have a legitimate relationship with the patient. These are sent to privacy officers within the governance structure of the setting where the record was viewed. In a hospital trust, for example, a privacy officer is based in the hospital.

In the incident of "flag toggling" referred to above, the flag did not control the creation of the GP's own record – it only changed whether or not it would be shared. The flag has since been replaced with one that controls whether the record is created at all. This flag is currently maintained on the GPs' own systems but will eventually be stored on the Spine. There is no system alert to the patient when opt-out status changes on the GP system, nor will there be when the codes move to the Spine.

I am happy to make these corrections.



Comments (7)

  1. Ben Toth:
    May 27, 2010 at 08:42 PM

    Good article, but there are a few points to make:

    - the first urgent step in the article is vital to reforming health informatics, but the idea that a patient has some ownership rights over the data about themselves is even less popular with clinicians than SCR (typo - patents should be patient)

    - the brown envelope was always just part of a person's medical record. And they weren't particularly secure (or accurate, or useful).

    - there will always be political baggage in any project about patient records, including from clinicians - it's not just politicians who have interests.

  2. Peter Singleton:
    Jun 17, 2010 at 11:14 AM

    The idea of alerting whenever a patient's consent flag is changed comes up every so often. The key questions with alerts are who gets them and what they would do about them.
    Sending it to the patient would make sense as they should know if they had had a recent encounter and had asked for changes. However, sending letters is expensive, not necessarily reliable, and open to misdirection and mishandling, which is probably why it hasn't been taken up with enthusiasm.
    Sending an alert to anyone else about a change in the patient's consent status is simply pointless - what are they to do to determine whether this is valid or not? If there was a recent encounter, then there may have been such a request; if there wasn't an encounter then there may still have been a letter from the patient asking for a change (this raises authentication issues), but the supposed auditor won't be able to tell that. Even so, how would they know what the request was: to opt-in or to opt-out? So they would have to contact the patient to check - which brings us back around to checking with the patient as being the more effective control in the first place.
    We could try using an email or SMS message instead of a letter - it would be cheaper, and could easily be piloted to see what did or didn't work.

  3. Emma Byrne:
    Jun 10, 2010 at 07:36 AM

    I totally concur with your points. The point about the brown envelope was not that it was particularly secure - it was just that you knew who to blame when your information leaked.

    I'm under no illusions that changing clinicians' culture will be any easier than changing civil service culture (or indeed, changing patient culture - in our first evaluation we found a surprisingly high proportion of people who actively don't care about their medical records.) Just because it's hard doesn't mean it's wrong.

  4. Peter Singleton:
    Jun 17, 2010 at 11:27 AM

    I think as an academic you should be more precise about your phrasing. Using 'it's business as usual for mental health records to be visible to council staff' is clearly meant to suggest that information is wholly unfettered, when even the 'blog' makes it clear that access is restricted to (admittedly a rather large number of) those that may need to know.
    The 'blog' itself does not seek to determine whether it was the whole of the patient's records which were available (which I doubt) or simply the core fact that she was being treated for depression - unfortunately, there is not enough evidence to make it clear whether it was or was not appropriate for this information to be shared. However, from what I can glean SWIFT is intended to support the Common Assessment Framework (CAF) process rather than being a primary system for mental health, so it would seem very unlikely that 'mental health records' are actually being shared with local authority staff.
    It is rather worrying for the validity of your recent SCR report if you play so fast and loose with both facts and phrasing.

  7. Atlas America Insurance:
    Feb 22, 2011 at 03:47 PM

    Privacy of medical records is a huge issue in N. America as well. When I worked in business development for a chain of hospitals in my area, everyone, from janitors to doctors, had to be immersed in the patient privacy regulations known as HIPAA. You can read more about those regulations for privacy protection here: http://www.hhs.gov/ocr/privacy/.



This thread has been closed from taking new comments.