Digital Privacy

What does David Cameron want?

On Monday, David Cameron declared war on encryption as the latest knee-jerk reaction to the atrocities committed in Paris against Charlie Hebdo journalists. He asked whether:

“we want to allow a means of communication between two people which even in extremis with a signed warrant from the home secretary personally that we cannot read? … My answer to that question is no, we must not. The first duty of any government is to keep our country and our people safe.”

On the face of it, he is pushing to ensure that encryption is always reversible once a warrant has been signed by the Home Secretary. We know very little about exactly what Cameron believes he can propose in order to access encrypted material, or even how much encrypted material is truly inaccessible. Instead, his vague and sweeping remarks have prompted an unhelpful debate about whether, in principle, law enforcement and the security services should ‘always’ be able to read communications.

Of course, that is impossible. You cannot ‘always’ be able to open, read, or find a record of a communication. Nor should it be compulsory for you and me to record every time we talk to someone, online or offline. But we should take a moment to consider what Cameron might actually be proposing.

The security services and police can try to access the plaintext content and metadata of your communications from at least four places.

  1. On your device, where you store email or other communications, or on the device of the person that you talked to
  2. In transit, when data moves from your device to a service or person
  3. At your ISP, where your metadata can be accessed if it has recorded details of your communication
  4. At the Internet platform, such as Google or Facebook, if they store a copy of your communications

It won’t always be true that a record will be kept at each or all of these points. The content may be encrypted by the end user at each point it is stored. The police or GCHQ might find it hard to decrypt information: Cameron appears to be demanding that it be made possible to decrypt any information at some point without the knowledge of the person who is under surveillance.

Encrypted information can always be accessed with the relevant private keys and/or a passphrase (for instance, the number or pattern you type into your phone to unlock it). It has been a criminal offence since 2007 to refuse to hand over keys or passphrases, and numerous people have been convicted (albeit some convictions seem unsatisfactory because the accused had significant mental health issues).

Let’s look at the different places data might be accessed in turn.

Devices:

Both Apple and Android phones now encrypt their storage by default, so you can be a little less worried if you lose a phone containing photos, banking details, contacts and email. That information could be useful to criminals, and you would rightly be concerned if it were not encrypted and safe.

The same applies to computers. You and your workplace should be encrypting your hard drives in case your computer is stolen.
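
To make this concrete, here is a minimal sketch of what encryption at rest looks like, assuming Python and the third-party cryptography package; the passphrase and data are purely illustrative. The point is simply that without the passphrase (and the key derived from it), the stored data is unreadable.

```python
# Minimal sketch: protecting data "at rest" with a key derived from a passphrase.
# Uses the third-party 'cryptography' package; passphrase and data are illustrative.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

passphrase = b"correct horse battery staple"   # illustrative only
salt = os.urandom(16)                          # stored alongside the ciphertext

# Derive a 32-byte key from the passphrase.
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
key = base64.urlsafe_b64encode(kdf.derive(passphrase))

# Encrypt the stored data; without the passphrase it is just ciphertext.
ciphertext = Fernet(key).encrypt(b"contacts, photos, banking details...")

# Only someone who knows the passphrase (and has the salt) can reverse this.
plaintext = Fernet(key).decrypt(ciphertext)
```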

In transit:

We do know that information in transit has been made more secure, which means that the intelligence agencies and police will have to go to the companies more often, rather than simply harvesting the data off the wire, as TEMPORA attempts to do (this is the GCHQ programme which takes over 30% of UK–US Internet traffic for analysis at Bude, Cornwall).

Encryption for in-transit communications also protects you against mobile operators and ISPs trying to read your communications, and it is vital when you transmit financial data, in case criminals try to intercept it. We know that GCHQ and others go to some lengths to circumvent technologies that protect communications in transit, but it remains important for people and businesses that communications are transmitted securely.
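
As a rough illustration of what encryption in transit looks like from the client side, here is a minimal sketch using Python's standard ssl module; the hostname is illustrative. A passive tap on the wire, of the TEMPORA kind, would see only ciphertext for the request sent here.

```python
# Minimal sketch: a TLS connection, so the request is encrypted on the wire.
# Standard-library only; the hostname is illustrative.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()   # verifies the server's certificate chain

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # Everything written to 'tls' is encrypted before it reaches the network,
        # so an intercept between client and server sees only ciphertext.
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        print(tls.version())                   # e.g. 'TLSv1.3'
        print(tls.getpeercert()["subject"])    # who the certificate identifies
```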

At your ISP:

Some records are kept at your ISP or by mobile providers. However, these are perhaps less relevant now that we rarely rely on ISPs to provide services such as email. This is one reason why the government wants the Snoopers’ Charter: it wants richer records of your online communications to be stored and easily available within the UK.

At the Internet platform:

Most services store information in ways they can access, so they can make commercial use of it. This information can be retrieved, although with some companies, it may be necessary to go through the US courts.

With some communications platforms, the end user might encrypt the contents, which makes them inaccessible to the platform. This includes the body of an email encrypted with PGP, or the content of a Google chat when a user runs “Off The Record” (OTR) software, which encrypts messages on certain chat platforms. Or you could store encrypted files at Dropbox: Dropbox cannot read a document you have encrypted with your own tools.
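
As a simplified sketch of the idea behind PGP-style end-user encryption: the sender encrypts to the recipient’s public key, so a platform holding the ciphertext cannot read it. Real PGP uses a hybrid scheme; this illustration, assuming Python and the cryptography package, encrypts a short message directly with RSA-OAEP.

```python
# Simplified sketch of public-key encryption by the end user: the platform stores
# only ciphertext it cannot read. Uses the third-party 'cryptography' package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The recipient generates a key pair and publishes only the public half.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender encrypts with the public key; this ciphertext is all the platform holds.
ciphertext = recipient_public.encrypt(b"the body of the email", oaep)

# Only the holder of the private key can recover the plaintext.
plaintext = recipient_private.decrypt(ciphertext, oaep)
```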

Some companies try to provide more private communications that they themselves cannot read, so these may be the target of Cameron’s complaint. Often the reason for wanting private communications is business security, because of sensitive information (such as trade secrets, confidential deals or intellectual property), or a desire for personal privacy, perhaps prompted by oversharing on platforms like Facebook. It is hard to argue that these groups do not deserve privacy, and it is very difficult to see how platforms could stop end users from encrypting their own content in any case.

The magic bullet

It should be obvious that there are good reasons for encrypting information at most of the points where it is transmitted or stored. Cameron argues, however, that privacy is not ‘absolute’ and that the police should therefore ‘always’ be able to break the encryption.

Requiring companies to provide back-door access is problematic because not everyone uses a commercial service to encrypt their data – you could use PGP on email, for instance. Companies cannot add back doors if users are running their own encryption tools.

He could require companies to store their users’ private encryption keys. This is obviously a bad idea, as your security is automatically compromised. It is also unenforceable: why would anyone comply with such a requirement?

Another means of gaining access would be to require ‘master keys’ that can unlock encrypted material. This is called ‘key escrow’.

The problem with key escrow, or the use of master keys, is that it leaves a particular encryption method with a secret backdoor, and gives every criminal the certain knowledge that this backdoor exists. Criminals then know that, given enough effort, they can find a way to break into encrypted material: the barrier becomes time and money, and therefore a question of how valuable the material is. A more general problem is that criminals do not have to use encryption that is compromised by escrow, leaving law-abiding citizens with the risks while criminals move to safer, perhaps illegal, technologies. Escrow is, again, unenforceable.
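
A minimal sketch may help show why escrow amounts to a backdoor. Assuming Python and the cryptography package (all names illustrative), the key protecting the data is wrapped both under the user’s key and under an escrow ‘master’ key, so whoever holds, or steals, the escrow key can decrypt without the user’s cooperation:

```python
# Minimal sketch of key escrow: the data key is wrapped under the user's key AND
# under a master 'escrow' key, creating a second way in. Names are illustrative.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()       # held by the user
escrow_key = Fernet.generate_key()     # held by the escrow authority

data_key = Fernet.generate_key()                        # key actually protecting the data
ciphertext = Fernet(data_key).encrypt(b"private message")

wrapped_for_user = Fernet(user_key).encrypt(data_key)      # normal access path
wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)  # the mandated backdoor

# Anyone with the escrow key recovers the data key, and then the plaintext.
recovered_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
plaintext = Fernet(recovered_key).decrypt(ciphertext)
```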

Cameron may be angling for more pragmatic measures, such as dissuading commercial platforms from storing encrypted material, or legal powers to compel companies to compromise someone’s security in certain circumstances. He could seek to mandate weak keys or weak encryption. Perhaps he wishes to require logging by VPNs, to ban Tor exit nodes, or to prohibit systems that are designed to prevent the provider from recording communications.

Measures like these are likely to be undesirable as well, but we need to know exactly what he believes the problem is, rather than hearing bland generalisations which inevitably sound incredibly dangerous to people’s everyday security. Only then can we assess how bad an idea it is, although it should already be clear that anything which compromises security is likely to adversely affect somebody with legitimate reasons to value their information.

If we find that Cameron is seeking to limit people’s access to safe and truly effective encryption technologies, then he will meet a great deal of resistance. People can write their own encryption software and run it themselves: this is hard to stop. Companies supply many markets, and may be unwilling to sacrifice the technologies that make their products effective. The prospect of lowering privacy and security across the globe, and of increasing the surveillance powers of states with less regard for human rights, may begin to look distasteful. But first Cameron needs to explain what he really means.