Blog


March 07, 2017 | Ed Johnson-Williams

Yes, the CIA can hack phones but Signal and WhatsApp are still safe for nearly everyone

The CIA can hack phones, but for nearly everyone Signal and WhatsApp remain very good ways to communicate on a mobile phone. The worst thing to do would be to throw our hands up in the air and give up on our digital security.

Wikileaks have published documents claiming that the CIA can use vulnerabilities in the iOS and Android operating systems to hack mobile phones and then monitor anything that happens on those phones. These vulnerabilities are expensive to buy or discover, and in order to keep their existence secret for as long as possible, they are likely to have been used only on a targeted basis.

Some journalists – working for newspapers including the New York Times, the Independent and the Telegraph – have reported this story as showing that the CIA can bypass the encryption on messaging apps like Signal and WhatsApp. This is emphatically not accurate; the apps themselves are secure. The journalists are probably uncritically repeating a Wikileaks tweet to that effect.

If the CIA (or the NSA, MI5 or GCHQ for that matter) hacks your phone then they will be able to read messages on any messaging app, regardless of how good the app's encryption is.

There is a big difference between phone operating systems being hacked and message encryption being broken. If a messaging app’s encryption has been broken, that would affect every user of the app. The encryption in Signal and WhatsApp has not been broken.
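To make the distinction concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library. This is purely illustrative: Signal and WhatsApp use the more sophisticated Signal Protocol, which adds forward secrecy among other things. The point is where encryption and decryption happen: only on the two phones.

```python
# Minimal sketch of end-to-end encryption with PyNaCl (illustrative only;
# Signal and WhatsApp use the more sophisticated Signal Protocol).
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()   # generated on Alice's phone
bob_key = PrivateKey.generate()     # generated on Bob's phone

# Encryption happens on Alice's device, using Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The server relaying `ciphertext` cannot read it; only Bob's phone can.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

Breaking the encryption would mean recovering the plaintext from `ciphertext` without either private key, and a technique for doing that would affect every user. Hacking a phone simply reads the plaintext at the endpoint, one target at a time.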

If the CIA is so interested in you personally that they would hack your phone, then yes, you are vulnerable to attack. This is not new. Most of us, however, are not national security journalists reporting on sensitive state secrets, so the CIA hacking our phone is very unlikely. We can and should still use encrypted messaging apps to help keep our messages private and secure from people who a) aren’t as powerful and well-resourced as the CIA and b) are far more likely to try to read our messages.

For nearly everyone, Signal and WhatsApp remain very good ways to communicate when using a mobile phone.[1] The worst thing to do would be to throw our hands up in the air and give up on our digital security.

If someone tunnels into a bank vault, the unbreakable lock on the vault's door doesn't help very much. But we know it's still a good idea to put a really good lock on your bank vault to deal with other break-in attempts.

There are issues with the way the CIA and other intelligence agencies hoard and use device vulnerabilities without reporting them to the manufacturers of the devices. If vulnerabilities in the devices remain unfixed it means that people’s devices are also open to attack from criminals and from other countries’ intelligence agencies.

From a personal security point-of-view it’s important to keep all of this in perspective. Most people are at far greater risk of their devices being infected from clicking a link in a phishing email than they are of being hacked by the CIA using a vulnerability in their device.

[1] Issues about WhatsApp's data collection and contribution to Facebook's business model are covered here.

Update 8/3/2017, 10:30am - Added links to further incorrect media coverage that questioned the security of Signal and WhatsApp.



March 07, 2017 | Jim Killock

Scrap the DEBill Age Verification and censorship—before it is too late

The Digital Economy Bill (DEBill) is at the report stage in the House of Lords, a long way down the parliamentary process, and the concerns around age verification (AV) and the censorship of legal pornographic websites are still there. Today, further amendments have been published but unfortunately they are too little and too late to limit the Bill’s harm to the free expression and privacy of UK citizens.

The Bill is an example of how not to legislate for the Internet and complex social issues. Among other things, the DEBill attempts to address the issue of under 18s seeing pornography by forcing porn sites to implement Age Verification (AV). This ‘simple’ solution has been fraught with problems from the start.

Age verification - an unseemly scramble to patch it up

Age Verification is in a terrible mess. There are no privacy safeguards on the face of the Bill, which means that UK citizens could be at risk of having private information about their porn habits and sexuality leaked, hacked or exploited. ORG has repeatedly called for the privacy concerns to be addressed.

The Government now says that it “agrees that parliamentary oversight is necessary”.[1] Its changes are published as amendments. One Statutory Instrument will spell out what is and is not to be censored (page one, altering Clause 15). A second sees the guidance from the AV regulator given statutory force (page four, adding a new clause). A third regulates the regulator, adding in duties that have until now been forgotten (page five, adding a new clause).

If this sounds like a dog’s breakfast, then that is because it is. The Government is admitting that it has done its work so badly it will need to patch up the DEBill with three pieces of secondary legislation at a later date.

Apart from the democratic deficit of leaving the big problems to future, as yet undrafted secondary legislation, we need to ask whether the problems with age verification can be fixed this way. We are not confident they can.

The Bill allows the regulator, the BBFC, to regulate pornographic websites and ensure that they verify the age of their customers. The new clause says that a Statutory Instrument (SI) will provide guidance for “types of arrangements for making pornographic material available that the regulator will treat as complying”.

The Government claims this statutory “guidance” means that the BBFC can regulate the age checking tools and their privacy requirements.

However, this is rather dubious. AV providers are not mentioned in the Bill, only porn providers; the Bill does not spell out any power to regulate AV providers or what the duties might be.

Can the contents of this SI be applied to AV providers and impose privacy duties on them, if neither is mentioned in the Bill? How would the SI do this fairly, without exceeding the remit laid out in the Bill?

Website operators and Age Verification providers may wish to check customers’ age in all kinds of novel and potentially intrusive ways. They will have strong commercial reasons for doing so. It may seem to them quite unreasonable for statutory instruments to go beyond the question of whether age is properly checked, and impose privacy duties not even referenced in the Bill that run against their commercial interests.

The Bill therefore needs to be clear that it can regulate the AV providers, and not just the websites.

Consent and free choice

The SI would also need to ensure that users can choose their own tools. This point is essential. If the websites choose the age checking tools, then these will be chosen for the benefit of the websites: either the cheapest, the best for tracking people, or simply the tool that everyone signed up to, whether or not it is private or safe.

It is not clear that the Government has understood the need for free choice and an open market. Previously, ministers have fallen back on claiming that data protection laws are sufficient, which they are not, as they rely on consent and free choice when you choose to relinquish some of your privacy. Consent is absent here, as everyone must use the tools in order to access the content.

To legislate this kind of structure into existence through SIs, without any mention in the Bill, is pretty unusual. The structure would be needed, but it is difficult to justify creating it through secondary legislation, which is meant to be about detail, not principle.

We cannot be confident that these problems will be fixed in the SI and by then it will be too late for Parliament to do anything about it. The absence of privacy duties is very worrying, not least because it suggests that the Government has not accepted that anything actually needs to be done.

Would safe Age Verification be enough?

It isn’t clear that even ‘safe’ age verification tools would be enough to prevent free expression harms. Some people won’t want to use these tools, and may not trust them. Ensuring they are safe would reduce the “chilling effect” but it isn’t clear that it will be enough to stop them creating a barrier to people accessing legal content, nor is it clear that publishers really should be given this duty.

Nevertheless, the Government should take its duties seriously, and ensure that privacy and anonymity are duties for the regulator, spelt out in the Bill.

The same applies to the censorship regime, which was added at the last minute.

Inadequate censorship safeguards

The government has not changed its mind about giving wide censorship powers to an administrative body, the BBFC. The BBFC will be allowed to censor any pornographic website that does not provide an Age Verification tool.

Most websites of course won’t implement Age Verification. The BBFC will therefore have a power that could be applied to millions of non-compliant websites. We are told that we should trust the BBFC to apply this power to block only a small number, perhaps 100 major websites. This raises the question: what is the point? Blocking just 100 porn sites would not prevent under 18s from accessing pornography, as there will be plenty of other sites for them to access.

This power would be a hostage to future demands to apply it more widely, to perhaps thousands more websites, in order to “make the Internet safe” for children, which would imply attempting extremely expansive blocking.

Once politicians realise that the vast majority of pornography is neither subject to Age Verification, nor blocked, calls to ramp up the number of blocked websites will be made, and the power will be available to block them. All that will be needed is more money and a change of policy at the BBFC.

On appeals, the Government’s response concedes that an independent body should review them:

“The government accepts, however, that the appeal must be considered by someone independent from the original decision maker and will table appropriate amendments to require the Secretary of State to be satisfied that the appeal will be independent before designating the regulator. [...] Additionally, the Secretary of State may issue guidance to the regulator on this matter” [Digital Economy Bill: Government Response, Clause 17(4) - appeal arrangements]

However, it does not envisage these involving the courts. This would mean that appeals might be dealt with narrowly, and fail to deal with free expression and association impacts adequately.

The Bill is clearly in a mess

The only conclusion we can draw is that the Bill is so far from ready, so lacking in safeguards, that these sections need to be dropped.

At this point, the Bill is meant to be in its final shape, and the Government is proposing its last amendments. There is of course a slim chance that the Opposition could force a vote to insert privacy safeguards. This would be unlikely to be enough, given the stage we are at.

The only sensible course of action is to remove the entire section from the Bill.


[1] “The manner in which age verification technology operates will be fluid given the pace of technological change … the regulator should remain responsible for the production of guidance about the types of arrangements for making pornographic material available that it treats as complying with being “not normally accessible” to persons under the age of 18 … The government will table amendments so that the guidance must be laid before Parliament, subject to the affirmative procedure for first exercise of the power and the negative procedure thereafter.”

“We are planning to underpin these changes with the introduction of a clause which will give the Secretary of State power to publish guidance, which the regulator must have regard to, as to how it exercises its functions, including the guidance it produces.” [Digital Economy Bill: Government Response, Clauses 15(3), 21(9) and 22(7) - the regulator’s guidance]



March 06, 2017 | Jim Killock

Why the IPO needs to change the criminal offence for online copyright infringement

The IPO says no change is needed to their proposed criminal offence for online copyright infringement, punishable by ten-year sentences.

The IPO has responded to your letters to the minister Jo Johnson MP about the new ten-year sentences for online copyright infringement.

In the Bill, ten-year sentences are available where online publication of a copyright work means that a “loss” has occurred (including not paying licence fees) or a “risk of loss” is created.

We do not think the IPO have adequately explained why they cannot or should not introduce a threshold for criminality.

The IPO says:

It is important to note that the criminal offences apply to making material available to others, not to those just downloading material to their computers. Anyone seeking to enforce their rights for the downloading of material would be unlikely to refer to this legislation.

Ten year sentences would only be applied in the most serious of criminal circumstances. It is highly unlikely that small, unintentional infringement would be caught by this offence. [our emphasis]

As we have said, publication without a licence is often an intentional act, where people either know or ought to know that they are infringing copyright. The question is whether these usually minor offences are worthy of criminal sanctions.

Examples could include:

  • Using copyright pictures from other websites, such as images of politicians or famous buildings, on a personal blog or social media

  • Using images of musicians or actors found on news websites, for instance from award events, on a blog or social media

  • Sharing files (which includes uploading as well as downloading) via Bittorrent at small scale

In each case, a licence has not been paid, the user should understand they are infringing copyright, and they are causing further risks that other people might reuse or re-share the images or files.

In the case of file sharing, it is only ever detected when files are “uploaded” (and shared back to the copyright owner or their agent).

These acts appear to be criminal under the proposed offence. We understand that they are unlikely to be prosecuted, let alone attract a prison sentence, but the question remains as to why these minor acts should be criminalised, rather than being subject to civil claims.

On the risk of copyright trolling, the IPO says:

The risk of an increase of ‘trolling’ is considered to be low but the government will periodically review and respond to any concerns.

We may never hear about many threats sent privately. Gathering evidence of harm will be extremely difficult except in the most egregious examples of letters sent in their thousands.

The proposed offence creates new opportunities for trolls, yet there is a simple way to remove this risk: introduce thresholds. The statement says that:

It would not be practical for the government to set a specific level of loss or gain at which infringement becomes a criminal offence. This is because the circumstances of each infringement needs to be taken into account.

Our suggestions are not for "specific" levels of loss or gain, such as “acts causing under £200 of damages”.

Our proposal is to set a threshold of "commercial scale loss", and to revise "risk of loss" to "serious risk of commercial scale loss". These are flexible rather than “specific”, so the government’s objection does not make sense to us.

If the losses are small, and the risks are minor, why should “circumstances” mean that an act should be criminal?

Our changes would give the public, lawyers and courts a clear indication that minor acts of file sharing or unlicensed online publication would be unlikely to meet the thresholds of "serious risk" or "commercial scale" losses.

This would protect people who receive threatening letters - whether sent in bulk or privately, under the radar.

It is true that some minor acts of copyright infringement can be regarded as criminal today. The current offence criminalises “prejudicial effect”, which we agree is insufficiently narrow. The IPO argues that it has tried to narrow this by focusing on the intention of the infringer.

However, the proposed changes do not solve the original problem of criminalising ordinary internet users. Introducing the fuzzier “risk of loss” actually makes it more likely that grannies and teenagers will end up facing threats of criminal charges, perhaps agreeing to admit guilt, or simply paying up when faced with threats.

This change is small and sensible, and we ask the government to look at this again.

You can email the minister Jo Johnson MP to tell him to change it before it is too late.



February 09, 2017 | Jim Killock

Ten years’ jail for file sharers—the government’s gift to copyright trolls

Why does the government want to encourage legal threats to grannies and put file sharing teens in prison for a decade?

Ten years’ jail for filesharing: or in fact any minor copyright infringement where there is a “loss by not getting what one might get” or a “risk” of further infringement.

Clause 27 of the Digital Economy Bill will mean that more or less any wrongful use where somebody hasn’t paid a licence fee (think of memes) is a crime. Causing “risk” to the copyright holder means that, almost by definition, ordinary file sharing is a criminal rather than a civil infringement.

Is the government really intending to threaten teenagers with prison?

Why has the Digital Economy Bill been left with such a stupid legal change? Both the government and the Intellectual Property Office said they just wanted to bring online infringement into line with “real world” fake DVD offences. They were worried about the difficulties with charging people who run websites that help people download copyright works.

However, that isn’t how the offence is drawn up: and the government has now been told in Parliament twice that they are both criminalising minor infringements and helping copyright trolls. Copyright trolls, we should remember, specialise in threats concerning file sharing of niche pornographic works in order to frighten and embarrass people into payment, often incorrectly, and to our knowledge have never taken anyone to court in the UK.

The answers have been startlingly bad. Kevin Brennan stated, for Labour:

The Open Rights Group has expressed concern about the Government’s insistence that there needs to be “reason to believe” that infringement will cause loss or “the risk of loss”. Its fear is that that phrase, “the risk of loss”, could capture quite a wide range of behaviour, perhaps beyond the scope of what the Government say they intend. In particular, its concern is the extent to which that phrase will capture file sharing.

Copyright trolls get their profits when a certain number of people are scared enough to respond to those notifications and pay up. Frequently these accusations are incorrect, misleading and sent to account holders who did not sanction any such further file sharing. However, as I understand it, sending that kind of speculative threat to consumers is, unfortunately, perfectly legal. Some are concerned that if the Bill retains the concept of risk of loss, it could aid the trolls by enabling them to argue with more credibility that account holders may face criminal charges and a 10-year prison sentence.

Matt Hancock gave a non-answer:

We recognise that the maximum sentence of 10 years, even if only for the most serious cases, must be carefully targeted. Consequently, clause 26 also makes changes to the existing offence of online copyright infringement to make it clearer when that offence is committed and who should be considered liable. The amendments speak to some of those points.

The concept of prejudicial effect in the existing legislation will be replaced with a requirement that the infringer intends to make a monetary gain for themselves or knows or has reason to believe their actions will expose the rights holder to a loss or risk of loss in money. I will come to the debate around definition of that in more detail.

The point of this clarification is to act as a safeguard to ensure that the increased maximum penalty is applied only to serious criminals who deserve it and will not apply to those who share material accidently or without knowledge of the consequences.

In the Lords, Labour suggested returning to the current definition of “prejudicial effect”, which (as Matt Hancock says) suffers the same problem of being very wide and catching people it should not.

The government have failed to give any serious answers. The Opposition parties, Labour and the Liberal Democrats, should be able to see that an egregious mistake is being made, and they have the ability to force a change.

The problem is really easily fixed. The government simply need to put in thresholds to ensure that the offence applies only where significant damage is caused or serious risk created. We have an amendment prepared and published.

Why does the government want to help copyright trolls bully grannies and criminalise file sharers whose actions may be idiotic, but hardly criminal?

The government needs to fix this before it becomes law and abuse of copyright ensues.



February 08, 2017 | Jim Killock

Just how much censorship will the DEBill lead to?

How could the power to block pornographic websites lead to massive censorship, when the BBFC thinks it wants to censor “just” a few hundred sites?

Officials wrote to the New Statesman yesterday to complain about Myles Jackman’s characterisation of the Digital Economy Bill as leading to an attempt to classify everything on the Internet. (They perhaps hadn’t understood the satire.)

However, the fact of the matter is that the DE Bill gives the BBFC (the regulator, TBC) the power to block any pornographic website that doesn’t use age verification tools. It can even block websites that publish pornography that doesn’t fit their guidelines of taste and acceptability - which are significantly narrower than what is legal, and certainly narrower than what is viewed as acceptable by US websites.

A single video of “watersports”, or of whipping that produces marks, for instance, would be enough for the BBFC to ban a website for every UK adult.

The question is, how many sites does the regulator want to block, and how many can it block?

Parliament has been told that the regulator wants to block just a few, major websites, maybe 50 or 100, as an “incentive” to implement age checks. However, that’s not what Clause 23 says. The “Age-verification regulator’s power to direct internet service providers to block access to material” just says that any site that fits the criteria can be blocked by an administrative request.

What could possibly go wrong?

Imagine, not implausibly, that some time after the Act is in operation, one of the MPs who pushed for this power goes and sees how it is working. This MP tries a few searches, and finds to their surprise that it is still possible to find websites that are neither asking for age checks nor blocked.

While the first page or two of results under the new policy would find major porn sites that are checking, or else are blocked, the results on page three and four would lead to sites that have the same kinds of material available to anyone.

In short, what happens when MPs realise this policy is nearly useless?

They will, of course, ask for more to be done. You could write the Daily Mail headlines months in advance: BBFC lets kids watch porn.

MPs will ask why the BBFC isn’t blocking more websites. The answer will come back that it would be possible, with more funding, to classify and block more sites, with the powers the BBFC has been given already. While individual review of millions of sites would be very expensive, maybe it is worth paying for the first five or ten thousand sites to be checked. (And if that doesn’t work, why not use machines to produce the lists?)

And then, it is just a matter of putting more cash the BBFC’s way, and they can block more and more sites, to “make the Internet safe”.

That’s the point we are making. The power in the Digital Economy Bill given to the BBFC will create a mechanism to block literally millions of websites; the only real restraint is the amount of cash that MPs are willing to pour into the organisation.

What could possibly go wrong?



February 07, 2017 | Slavka Bielikova

Government says privacy safeguards are not “necessary” in Digital Economy Bill

The Government still doesn’t consider privacy safeguards necessary in the Digital Economy Bill and they see court orders for website blocking as excessively burdensome.

The House of Lords debated age verification for online pornography last week as the Committee stage of the Digital Economy Bill went ahead.

Peers tabled a considerable number of amendments to improve the flawed Part 3 of the Bill, which covers online pornography. In their recent report, the Committee on the Constitution said they are worried about whether proper parliamentary scrutiny can be delivered given the lack of detail on the face of the Bill. Shortly after the start of the debate it became obvious that their concerns were justified.

Lords debated various aspects of age verification at length; however, the issues of appeal processes for website blocking by Internet service providers and privacy safeguards for data collected for age-verification purposes will have to be resolved at a later stage.

In our view, if the Government is not prepared to make changes to the Bill to safeguard privacy, the opposition parties should be ready to force the issue to a vote.

Appeals process for ISP blocking

Labour and Lib Dem Lords jointly introduced an amendment that would implement a court order process into the blocking of websites by Internet service providers. The proposal got a lot of traction during the debate. Several Peers disagreed with the use of court orders, citing the costs and the undue burden they would place on the system.

The court order process is currently implemented for the blocking of websites that provide access to content that infringes copyright. However, the Government is not keen on using it for age verification. Lord Ashton, the Government Minister for Culture, Media and Sport, noted that even the copyright court order process “is not without issues”. He also stressed that the power to instruct ISPs to block websites carrying adult content would be used “sparingly”. The Government is trying to encourage compliance by the industry and therefore they find it more appropriate that ISP blocking is carried out by direction from the regulator.

The Bill doesn’t express any of these policy nuances mentioned by the Government. According to Clause 23 on ISP blocks, the age-verification regulator can give a notice to ISPs to block non-complying websites. There is no threshold set out in the clause that would suggest this power will be used sparingly. Without such a threshold, the age-verification regulator has an unlimited power to give out notices and is merely trusted by the Government not to use the full potential of the power.

The Government failed to address the remaining lack of legal structure that would secure transparency for website blocking by ISPs. Court orders would provide independent oversight for this policy. Neither the method of oversight nor the enforcement of blocking has been specified on the face of the Bill.

For now, the general public can find solace in knowing that the Government is aware that blocking all social media sites is a ridiculous plan. Lord Ashton said that the Government “don’t want to get to the situation where we close down the whole of Twitter, which would make us one of two countries in the world to have done that”.

Privacy protections and anonymity

Labour Peers Baroness Jones and Lord Stevenson, together with Lord Paddick (Lib Dem), introduced an amendment that would ensure that age-verification systems have high privacy and data protection safeguards.

The amendment goes beyond basic compliance with data protection regulations. It would deliver anonymity for age-verification system users and make it impossible to identify users across different websites. This approach could encourage people’s trust in age-verification systems and reassure them that they can safely access legal material. By securing anonymity, people’s right to freedom of expression would be less adversely impacted. Not all the problems go away: people may still not trust the tools, but fears can at least be reduced, and the worst calamities of data leaks may be avoided.

People subjected to age verification should be able to choose which age-verification system they prefer and trust. It is necessary that the Bill sets up provisions for “user choice” to assure a functioning market. Without this, a single age-verification provider could corner the market by offering a low-cost solution with inadequate privacy protections.

The amendment received wide support in the Lords.

Despite the wide-ranging support from Lib Dem, Labour and cross-bench Lords, the Government found this amendment “unnecessary”. Lord Ashton referred to the guidance published by the age-verification regulator that will outline types of arrangement that will be treated as compliant with the age-verification regulator’s requirements. Since the arrangements for data retention and protection will be made in the guidance, the Government asked Lord Paddick to withdraw the amendment.

Guidance to be published by the age-verification regulator drew fire in the Delegated Powers and Regulatory Reform Committee’s Report published in December 2016. In their criticism, the Committee made it clear that they find it unsatisfactory that none of the age-verification regulator’s guidelines have been published or approved by Parliament. Lord Ashton did not tackle these concerns during the Committee sitting.

The issue of privacy safeguards is very likely to come up again at the Report stage. Lord Paddick was not convinced by the Government’s answer and promised to bring this issue up at the next stage. The Government also promised to respond to the Delegated Powers and Regulatory Reform Committee’s Report before the next stage of the Bill’s passage.

Given the wide support in the Lords to put privacy safeguards on the face of the Bill, Labour and Lib Dem Lords have an opportunity to change the Government’s stance. Together they can press the Government to address privacy concerns.

The Government was unprepared to discuss crucial parts of Part 3. Age verification for online pornography is proving to be more complex and demanding than the Government anticipated, and they lack an adequate strategy. The Report stage of the Bill (22 February) could offer some answers to the questions raised during last week’s Committee sittings, but Labour and the Lib Dems need to be prepared to push for votes on crucial amendments to get the Government to address privacy and free expression concerns.



January 20, 2017 | Javier Ruiz

Lords Committee slams data sharing powers in Digital Economy Bill

The Delegated Powers and Regulatory Reform Committee of the House of Lords has made some very critical recommendations about the data sharing proposals in the Digital Economy Bill.

In a report published today the Committee asks for the “almost untrammeled” powers given to Ministers in the Bill to be severely curtailed, and for all Codes of Practice associated with these data sharing powers to be laid before Parliament in draft for full approval before coming into force.

The Committee “consider it inappropriate” for Ministers to have the powers to define lists of specified persons and non-specific purposes related to public service provision, fraud or debt. Instead, they argue that those given the powers to share data and the purposes for which it is used should be on the face of the Bill, with Ministers only able to make very limited additions based on a clear necessity.

We can see that the Government will resist such a move, as that level of flexibility appears central to their approach to data sharing. If they plan to ignore these recommendations, the Cabinet Office will need to include much stronger safeguards on the face of the Bill about the criteria and processes for inclusion in the data gateways.

The report also raises concerns with the onward disclosure of shared data, which is subject to very broad exemptions for the purposes of crime, anti-social behaviour or legal proceedings.

The Committee starkly sets out that data shared under these powers for benign social services could be used to bring criminal proceedings against the same individuals without restriction. This was always a red line during the open policy-making pre-legislative discussions in which ORG participated. ORG has proposed various amendments to narrow down these further reuses of data, but we may have to revisit our proposals to tighten them up further.

We particularly welcome the Committee’s recommendations made on the Codes of Practice. The Government has so far refused to put key safeguards on the use of the powers on the face of the Bill, leaving these to the Codes. The Committee is under no doubt that the Codes are “legislative” in nature, despite the arguments by the government that these are not legally enforceable.

The report demands that the Codes be laid in draft before Parliament for discussion and affirmative approval, and not just presented for filing in the statute book. The Committee concedes that further modifications could be made by the negative procedure. Clarity on the full legal status of the Codes is critical, and we can only hope the Government will heed these recommendations, which chime with those of many others including ORG.

The Committee ask for various so-called “Henry VIII powers” peppered throughout the Bill to be narrowed down. These kinds of powers add a provision to a Bill which enables the Government to repeal or amend it after it has become an Act of Parliament, and are an anachronism meant to be used sparingly for very narrow purposes. The Committee finds that some of these powers could be useful here to stop data sharing and narrow down future provisions, but, as they are written, they could be used to expand the powers in the Bill without any accountability.

The report also tackles a fairly technical but potentially important point that ORG and others engaged in their process had missed so far: the so-called “dehybridisation clauses”. A Hybrid Instrument is a piece of legislation that disproportionately affects a particular group of people within a class. The clauses in the Bill simply state that this should be disregarded. This can be important due to an obscure provision in the House of Lords that gives those who are specially and directly affected by Hybrid Instruments

the opportunity to present their arguments against the SI [statutory instrument] to the House of Lords Hybrid Instruments Committee and then, possibly, to a select committee charged with reporting on its merits and recommending whether or not the SI should be approved by both Houses of Parliament. The hybrid instrument procedure is unique to the House of Lords and the process must be completed before the SI can be approved by both Houses.

We can see why the Government would want to remove this provision to speed up legislation, but it seems unfair and potentially abusive to simply decree that what may be a hybrid instrument should not be treated as such, thus denying those affected their right to make their case.



January 16, 2017 | Ed Johnson-Williams

Let's save 'backdoor' for the real thing

The Guardian reported on Friday last week that WhatsApp - owned by Facebook - has a “backdoor” that “allows snooping on encrypted messages”. The report was based on research by Tobias Boelter, published in April 2016. The Guardian has since changed the word "backdoor" in its article to "vulnerability" or "security vulnerability".

A few days before the Guardian article was published, the journalist contacted ORG for a quote. She couldn’t discuss the details of the alleged security flaw, so we gave a generic quote about the importance of transparency from companies that offer end-to-end encryption and the dangers to encryption within the Investigatory Powers Act.

The vulnerability that was reported theoretically works like this (a simplified code sketch follows the list). Say Ed is texting his dad on WhatsApp.

  1. Ed texts his dad on WhatsApp and his dad texts back - all good, happy families.
  2. Then Ed texts his dad again but his dad’s phone is off. Ed's message is still on Ed’s phone waiting to be sent.
  3. WhatsApp or somebody else with access to WhatsApp's servers registers Ed’s dad’s mobile number with WhatsApp on a different phone. This could be done by stealing Ed’s dad's SIM card or using vulnerabilities in the mobile phone network to re-route SMS confirmations.
  4. Ed's WhatsApp app now sees the number that used to be linked to his dad’s phone is active again and automatically re-sends the message.
  5. The new phone receives the message that Ed intended to send to his dad. The message never reaches Ed’s dad’s phone.
  6. Depending on whether a non-default setting is enabled, Ed may receive a notification saying that his dad’s security code has changed because he reinstalled WhatsApp or switched phones.
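As a rough illustration of steps 2 to 5, here is a simplified, hypothetical model of the re-send behaviour. All names are invented for this sketch; it is not WhatsApp's actual code, which is not public.

```python
# Hypothetical, simplified model of the re-send behaviour described above.
def encrypt(plaintext, key):
    return f"enc[{key}]({plaintext})"        # stand-in for real encryption

class Server:
    def __init__(self):
        self.online = set()                  # contacts whose phones are on
        self.inbox = {}

    def deliver(self, contact, ciphertext):
        if contact not in self.online:
            return False                     # phone off: delivery fails (step 2)
        self.inbox.setdefault(contact, []).append(ciphertext)
        return True

class Client:
    def __init__(self, notify_on_key_change=False):
        self.keys = {}                       # contact -> last-seen key
        self.pending = []                    # (contact, plaintext) not yet delivered
        self.notify = notify_on_key_change   # the non-default setting in step 6

    def send(self, contact, plaintext, server):
        if not server.deliver(contact, encrypt(plaintext, self.keys[contact])):
            self.pending.append((contact, plaintext))

    def on_key_change(self, contact, new_key, server):
        if self.notify:
            print(f"Security code for {contact} has changed")
        self.keys[contact] = new_key
        # Steps 4 and 5: pending messages are re-encrypted to the NEW key
        # and re-sent automatically, without asking the sender first.
        for queued in list(self.pending):
            if queued[0] == contact:
                server.deliver(contact, encrypt(queued[1], new_key))
                self.pending.remove(queued)

server = Server()
ed = Client()
ed.keys["dad"] = "dads-key"
ed.send("dad", "hello", server)                  # dad's phone is off: queued
server.online.add("dad")                         # number re-registered (step 3)
ed.on_key_change("dad", "attacker-key", server)  # queued message re-sent silently
print(server.inbox["dad"])                       # ['enc[attacker-key](hello)']
```

The usability trade-off lives in `on_key_change`: re-sending automatically means no message is lost when someone genuinely switches phones, at the cost of trusting that the new key belongs to the same person.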

This means that somebody collaborating with WhatsApp could theoretically read a small number of messages. This is very unlikely though and would be very easy to detect. This is not a backdoor that WhatsApp can use for routine access to users’ messages. And unless an app forces you to verify encryption keys with someone before you can send and receive messages with them, and also whenever they change their phone, this vulnerability is going to be present.
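Verifying keys in this sense usually means both people comparing a short code derived from their public keys over a separate channel, for example in person. Here is a hypothetical sketch of the idea (Signal's actual "safety numbers" are derived differently):

```python
# Hypothetical sketch of key verification: both users derive the same short
# code from the two public keys and compare it out-of-band. If someone has
# swapped in a new key, the codes no longer match.
import hashlib

def safety_number(pub_a: bytes, pub_b: bytes) -> str:
    combined = b"".join(sorted([pub_a, pub_b]))   # order-independent
    digest = hashlib.sha256(combined).hexdigest()
    return " ".join(digest[i:i + 5] for i in range(0, 30, 5))

print(safety_number(b"eds-public-key", b"dads-public-key"))
```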

WhatsApp have made an intentional decision about usability. It means that - in the example given above - if Ed’s dad’s phone was off because it was broken, Ed’s dad could put his SIM card into a new phone and still receive the messages without anyone having to change anything.

It would be incredibly difficult for WhatsApp to use the vulnerability to read messages this way at scale without gaining a terrible reputation for not delivering messages. Lots of people would receive a notification saying that the security key of many of their intended recipients had changed. Messages would go missing. The risk to the company of actively tampering with someone's message stream is very high and would be very complicated to get right. And if you’re worried about law enforcement, they have other ways (such as hacking the phone) to target an individual WhatsApp user’s messages that would be cheaper, quicker, and more difficult for the target to detect.

Lots of people recommend Signal as an alternative to WhatsApp. Signal is a highly respected encrypted messaging app which is preferable to WhatsApp for many reasons. Unlike WhatsApp, Signal does not collect data about users and share that data with Facebook. Facebook’s business model is to collect as much data about people as possible to help sell advertising. And unlike WhatsApp, Signal’s code is open-source meaning it’s possible to verify that it’s working properly. Some people find Signal more difficult to use than WhatsApp.

But Signal are planning to use the same behaviour as WhatsApp that was reported as a backdoor in an attempt to make their app easier for people to use. As Matthew Green, Assistant Professor at Johns Hopkins University, said on Twitter in response to the Guardian’s article, “I wish we could put the word "backdoor" in a glass case and only bring it out when something is really deserving.”

It is a struggle to get people to use secure messaging tools. Facebook and WhatsApp’s business model leaves much to be desired and Signal does a lot more to respect the privacy of its users. But WhatsApp have been successful in getting millions of people to encrypt the contents of their messages end-to-end.

The UK’s Investigatory Powers Act gives the Government powers to serve companies with Technical Capability Notices for the “removal of electronic protection applied by a relevant operator” and to force them to carry out hacking and intercept data.

There are big fights ahead on encryption and we have to remain vigilant to those. Let’s save the word “backdoor” for the real thing.

Update: I fixed point 3 to say that if Ed's dad's SIM card were stolen, it could be used to re-register Ed's dad's WhatsApp account on a different phone. It used to say if 'Ed's SIM card' were stolen.
