February 07, 2017 | Slavka Bielikova

Government says privacy safeguards are not “necessary” in Digital Economy Bill

The Government still doesn’t consider privacy safeguards necessary in the Digital Economy Bill, and they see court orders for website blocking as excessively burdensome.

The House of Lords debated age verification for online pornography last week as the Committee stage of the Digital Economy Bill went ahead.

Peers tabled a considerable number of amendments to improve the flawed Part 3 of the Bill, which covers online pornography. In their recent report, the Committee on the Constitution said that they are worried about whether proper parliamentary scrutiny can be delivered, given the lack of detail on the face of the Bill. Shortly after the start of the debate it became obvious that their concerns were justified.

Lords debated various aspects of age verification at length; however, issues around appeal processes for website blocking by Internet service providers, and privacy safeguards for data collected for age-verification purposes, will have to be resolved at a later stage.

In our view, if the Government is not prepared to make changes to the Bill to safeguard privacy, the opposition parties should be ready to force the issue to a vote.

Appeals process for ISP blocking

Labour and Lib Dem Lords jointly introduced an amendment that would implement a court order process into the blocking of websites by Internet service providers. The proposal got a lot of traction during the debate. Several Peers disagreed with the use of court orders, arguing about the costs and the undue burden that it would place on the system.

The court order process is currently implemented for the blocking of websites that provide access to content that infringes copyright. However, the Government is not keen on using it for age verification. Lord Ashton, the Government Minister for Culture, Media and Sport, noted that even the copyright court order process “is not without issues”. He also stressed that the power to instruct ISPs to block websites carrying adult content would be used “sparingly”. The Government is trying to encourage compliance by the industry and therefore they find it more appropriate that ISP blocking is carried out by direction from the regulator.

The Bill doesn’t express any of these policy nuances mentioned by the Government. According to Clause 23 on ISP blocks, the age-verification regulator can give a notice to ISPs to block non-complying websites. There is no threshold set out in the clause to suggest this power will be used sparingly. Without such a threshold, the age-verification regulator has unlimited power to issue notices and is merely trusted by the Government not to use the full potential of that power.

The Government failed to address the remaining lack of legal structure that would secure transparency for website blocking by ISPs. Court orders would provide independent oversight for this policy. Neither the method of oversight, nor enforcement of blocking have been specified on the face of the Bill.

For now, the general public can find solace in knowing that the Government is aware that blocking entire social media sites is a ridiculous plan. Lord Ashton said that the Government “don’t want to get to the situation where we close down the whole of Twitter, which would make us one of two countries in the world to have done that”.

Privacy protections and anonymity

Labour Peers Baroness Jones and Lord Stevenson, together with Lord Paddick (Lib Dem), introduced an amendment that would ensure that age-verification systems have high privacy and data protection safeguards.

The amendment goes beyond basic compliance with data protection regulations. It would deliver anonymity for age-verification system users and make it impossible to identify users across different websites. This approach could encourage people’s trust in age-verification systems and reassure them that they can safely access legal material. By securing anonymity, people’s right to freedom of expression would be less adversely affected. Not all the problems go away: people may still not trust the tools, but fears can at least be reduced, and the worst calamities of data leaks may be avoided.

People subjected to age verification should be able to choose which age-verification system they prefer and trust. It is necessary that the Bill sets up provisions for “user choice” to assure a functioning market. Without this, a single age-verification provider could corner the market by offering a low-cost solution with inadequate privacy protections.

The amendment received wide support in the Lords.

Despite the wide-ranging support from Lib Dem, Labour and cross-bench Lords, the Government found this amendment “unnecessary”. Lord Ashton referred to guidance to be published by the age-verification regulator that will outline the types of arrangement treated as compliant with its requirements. Since the arrangements for data retention and protection will be made in the guidance, the Government asked Lord Paddick to withdraw the amendment.

Guidance to be published by the age-verification regulator drew fire in the Delegated Powers and Regulatory Reform Committee’s Report published in December 2016. In their criticism, the Committee made it clear that they find it unsatisfactory that none of the age-verification regulator’s guidelines have been published or approved by Parliament. Lord Ashton did not tackle these concerns during the Committee sitting.

The issue of privacy safeguards is very likely to come up again at the Report stage. Lord Paddick was not convinced by the Government’s answer and promised to bring this issue up at the next stage. The Government also promised to respond to the Delegated Powers and Regulatory Reform Committee’s Report before the next stage of the Bill’s passage.

Given the wide support in the Lords to put privacy safeguards on the face of the Bill, Labour and Lib Dem Lords have an opportunity to change the Government’s stance. Together they can press the Government to address privacy concerns.

The Government was unprepared to discuss crucial parts of Part 3. Age verification for online pornography is proving to be more complex and demanding than the Government anticipated and they lack an adequate strategy. The Report stage of the Bill (22 February) could offer some answers to the questions raised during last week’s Committee sittings, but Labour and Lib Dems need to be prepared to push for votes on crucial amendments to get the Government to address privacy and free expression concerns.



January 20, 2017 | Javier Ruiz

Lords Committee slams data sharing powers in Digital Economy Bill

The Delegated Powers and Regulatory Reform Committee of the House of Lords has made some very critical recommendations about the data sharing proposals in the Digital Economy Bill.

In a report published today the Committee asks for the “almost untrammeled” powers given to Ministers in the Bill to be severely curtailed, and for all Codes of Practice associated with these data sharing powers to be laid before Parliament in draft for full approval before coming into force.

The Committee “consider it inappropriate” for Ministers to have the powers to define lists of specified persons and non-specific purposes related to public service provision, fraud or debt. Instead, they argue that those given the powers to share data and the purposes for which it is used should be on the face of the Bill, with Ministers only able to make very limited additions based on a clear necessity.

We can see that the Government will resist such a move, as that level of flexibility appears central to their approach to data sharing. If they plan to ignore these recommendations, the Cabinet Office will need to include much stronger safeguards on the face of the Bill about the criteria and processes for inclusion in the data gateways.

The report also raises concerns with the onward disclosure of shared data, which is subject to very broad exemptions for the purposes of crime, anti-social behaviour or legal proceedings.

The Committee starkly sets out that data shared under these powers for benign social services could be used to bring criminal proceedings against the same individuals without restriction. This was always a red line during the open policy-making, pre-legislative discussions in which ORG participated. ORG has proposed various amendments to narrow down these further reuses of data, but we may have to revisit our proposals to tighten them up further.

We particularly welcome the Committee’s recommendations made on the Codes of Practice. The Government has so far refused to put key safeguards on the use of the powers on the face of the Bill, leaving these to the Codes. The Committee is under no doubt that the Codes are “legislative” in nature, despite the arguments by the government that these are not legally enforceable.

The report demands that the Codes are laid in draft form in front of Parliament for discussion and affirmative approval, and not just presented for filing in the statute book. They concede that further modifications could be made by negative procedure. Clarity on the full legal status of the Codes is critical, and we can only hope the Government will heed these recommendations, which chime with those of many others including ORG.

The Committee ask for various so-called “Henry VIII powers” peppered throughout the Bill to be narrowed down. These kinds of powers add a provision to a Bill which enables the Government to repeal or amend it after it has become an Act of Parliament, and are an anachronism meant to be used sparingly for very narrow purposes. The Committee finds that some of these powers could be useful here to stop data sharing and narrow down future provisions, but as written they could be used to expand the powers in the Bill without any accountability.

The report also tackles a fairly technical but potentially important point that ORG and others engaged in the process had missed so far: the so-called “dehybridisation clauses”. A Hybrid Instrument is a piece of legislation that disproportionately affects a particular group of people within a class. The clauses in the Bill simply state that this should be disregarded. This can be important due to an obscure provision in the House of Lords that gives those who are specially and directly affected by Hybrid Instruments

the opportunity to present their arguments against the SI [statutory instrument] to the House of Lords Hybrid Instruments Committee and then, possibly, to a select committee charged with reporting on its merits and recommending whether or not the SI should be approved by both Houses of Parliament. The hybrid instrument procedure is unique to the House of Lords and the process must be completed before the SI can be approved by both Houses.

We can see why the Government would want to remove this provision to speed up legislation, but it seems unfair and potentially abusive to simply decree that what may be a hybrid instrument should not be treated as such, thus denying those affected their right to make their case.


January 16, 2017 | Ed Johnson-Williams

Let's save 'backdoor' for the real thing

The Guardian reported on Friday last week that WhatsApp - owned by Facebook - has a “backdoor” that “allows snooping on encrypted messages”. The report was based on research by Tobias Boelter, published in April 2016. The Guardian has since changed the word "backdoor" in its article to "vulnerability" or "security vulnerability".

A few days before the Guardian article was published, the journalist contacted ORG for a quote. She couldn’t discuss the details of the alleged security flaw so we gave a generic quote about the importance of transparency from companies that offer end-to-end encryption and the dangers to encryption within the Investigatory Powers Act.

The vulnerability that was reported theoretically works like this. Say Ed is texting his dad on WhatsApp.

  1. Ed texts his dad on WhatsApp and his dad texts back - all good, happy families.
  2. Then Ed texts his dad again but his dad’s phone is off. Ed's message is still on Ed’s phone waiting to be sent.
  3. WhatsApp or somebody else with access to WhatsApp's servers registers Ed’s dad’s mobile number with WhatsApp on a different phone. This could be done by stealing Ed’s dad's SIM card or using vulnerabilities in the mobile phone network to re-route SMS confirmations.
  4. Ed's WhatsApp app now sees that the number that used to be linked to his dad’s phone is active again and automatically re-sends the message.
  5. The new phone receives the message that Ed intended to send to his dad. The message never reaches Ed’s dad’s phone.
  6. Depending on whether a non-default setting is enabled, Ed may receive a notification saying that his dad’s security code has changed because he reinstalled WhatsApp or switched phones.
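The steps above can be sketched in code. This is a minimal illustrative model, not WhatsApp's actual implementation; all class and method names here are hypothetical. It shows the design choice at the heart of the story: when a contact's identity key changes, does the client silently re-encrypt and re-send queued messages (WhatsApp's default), or hold them and warn the user?

```python
# Hypothetical sketch of a client's key-change policy for queued messages.
# Not real WhatsApp code; names and structure are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Contact:
    number: str
    key: str  # identity key the sender currently trusts for this number

@dataclass
class Client:
    contacts: dict = field(default_factory=dict)
    outbox: list = field(default_factory=list)        # (number, text) queued, undelivered
    notifications: list = field(default_factory=list)
    block_on_key_change: bool = False  # the non-default setting in step 6

    def queue(self, number, text):
        # Step 2: recipient's phone is off, message waits on the sender's phone
        self.outbox.append((number, text))

    def on_key_change(self, number, new_key):
        """Step 3: the server reports that `number` re-registered with a new key."""
        self.contacts[number].key = new_key
        if self.block_on_key_change:
            # Safer policy: warn and hold messages until the user re-verifies
            self.notifications.append(f"Security code for {number} changed")
            return []
        # Default policy (steps 4-5): re-encrypt to the new key and re-send,
        # so whoever now holds the number receives the queued messages
        sent = [(n, t, new_key) for (n, t) in self.outbox if n == number]
        self.outbox = [(n, t) for (n, t) in self.outbox if n != number]
        return sent

ed = Client()
ed.contacts["dad"] = Contact("dad", "key-A")
ed.queue("dad", "hello")                       # dad's phone is off
delivered = ed.on_key_change("dad", "key-B")   # attacker re-registers the number
# With the default setting, the queued message is delivered under key-B.
```

The trade-off is visible in `block_on_key_change`: the default favours usability (a dad who swaps his SIM into a new phone still gets his messages), at the cost of the narrow interception window described above.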

This means that somebody collaborating with WhatsApp could theoretically read a small number of messages. This is very unlikely though and would be very easy to detect. This is not a backdoor that WhatsApp can use for routine access to users’ messages. And unless an app forces you to verify encryption keys with someone before you can send and receive messages with them, and also whenever they change their phone, then this vulnerability is going to be present.

WhatsApp have made an intentional decision about usability. It means that - in the example given above - if Ed’s dad’s phone was off because it was broken, Ed’s dad could put his SIM card into a new phone and still receive the messages without anyone having to change anything.

It would be incredibly difficult for WhatsApp to use the vulnerability to read messages this way at scale without gaining a terrible reputation for not delivering messages. Lots of people would receive a notification saying that the security key of many of their intended recipients had changed. Messages would go missing. The risk to the company of actively tampering with someone's message stream is very high and would be very complicated to get right. And if you’re worried about law enforcement, they have other ways (such as hacking the phone) to target an individual WhatsApp user’s messages that would be cheaper, quicker, and more difficult for the target to detect.

Lots of people recommend Signal as an alternative to WhatsApp. Signal is a highly respected encrypted messaging app which is preferable to WhatsApp for many reasons. Unlike WhatsApp, Signal does not collect data about users and share that data with Facebook. Facebook’s business model is to collect as much data about people as possible to help sell advertising. And unlike WhatsApp, Signal’s code is open-source meaning it’s possible to verify that it’s working properly. Some people find Signal more difficult to use than WhatsApp.

But Signal are planning to adopt the same behaviour that was reported as a backdoor in WhatsApp, in an attempt to make their app easier for people to use. As Matthew Green, Assistant Professor at Johns Hopkins University, said on Twitter in response to the Guardian’s article, “I wish we could put the word "backdoor" in a glass case and only bring it out when something is really deserving.”

It is a struggle to get people to use secure messaging tools. Facebook and WhatsApp’s business model leaves much to be desired and Signal does a lot more to respect the privacy of its users. But WhatsApp have been successful in getting millions of people to encrypt the contents of their messages end-to-end.

The UK’s Investigatory Powers Act has powers in it for the Government to serve companies with Technical Capability Notices for the “removal of electronic protection applied by a relevant operator” to force them to carry out hacking and intercept data for the Government.

There are big fights ahead on encryption and we have to remain vigilant to those. Let’s save the word “backdoor” for the real thing.

Update: I fixed point 3 to say that if Ed's dad's SIM card were stolen, it could be used to re-register Ed's dad's WhatsApp account on a different phone. It previously said 'Ed's SIM card'.


December 21, 2016 | Javier Ruiz

EU Court slams UK data retention surveillance regime

Here’s our quick overview of what the CJEU has told the UK and Sweden they must do to fix requirements for data retention.

The full judgment can be read here.

Generalised Data Retention

The CJEU has repeated arguments, made previously in the Digital Rights Ireland case, to rule that generalised data retention is disproportionate and unlawful.

The UK case did not ask about general data retention. In the original judgment that triggered this CJEU case, the UK High Court argued that general retention was acceptable as long as the safeguards were strong:

 “70. In oral argument Ms Rose modified her stance on point (i). She accepted that the CJEU cannot have meant that CSPs can only lawfully be required to retain the communications data of “suspects or persons whose data would contribute to the prevention, detection or prosecution of serious criminal offences”. Such a restriction would be wholly impracticable. Rather the Court must be understood to have held that a general retention regime is unlawful unless it is accompanied by an access regime which has sufficiently stringent safeguards to protect citizens’ rights set out in Articles 7 and 8 of the Charter.” (from the High Court judgment)

Unfortunately for the UK government, ORG and PI were there to argue the opposite, alongside the joined Swedish case brought by Tele2 Sverige AB, a telecoms company challenging the compatibility of generalised data retention orders in that country.

The CJEU has made it clear that generalised data retention is not acceptable:

 103 Further, while the effectiveness of the fight against serious crime, in particular organised crime and terrorism, may depend to a great extent on the use of modern investigation techniques, such an objective of general interest, however fundamental it may be, cannot in itself justify that national legislation providing for the general and indiscriminate retention of all traffic and location data should be considered to be necessary for the purposes of that fight (see, by analogy, in relation to Directive 2006/24, the Digital Rights judgment, paragraph 51).

 Retention must be restricted somehow to a section of the public more likely to be of use to investigations, possibly by geography:

 111 As regard the setting of limits on such a measure with respect to the public and the situations that may potentially be affected, the national legislation must be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences, and to contribute in one way or another to fighting serious crime or to preventing a serious risk to public security. Such limits may be set by using a geographical criterion where the competent national authorities consider, on the basis of objective evidence, that there exists, in one or more geographical areas, a high risk of preparation for or commission of such offences.

 Summed up in the ruling:

 1. Article 15(1) of Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), as amended by Directive 2009/136/EC of the European Parliament and of the Council of 25 November 2009, read in the light of Articles 7, 8 and 11 and Article 52(1) of the Charter of Fundamental Rights of the European Union, must be interpreted as precluding national legislation which, for the purpose of fighting crime, provides for general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication.

This will come as a shocker to the UK government, which could be forgiven for safely assuming that at least the basic principles of retention would be accepted by the CJEU, given the opinion of the Advocate General and the views of UK courts.

The UK pioneered population-level data retention and drove the adoption of the original EU Data Retention Directive after the London bombings in 2005. It will now be forced to rethink its approach.

Access only allowed for serious crime:

The Court accepts that some data retention can be necessary and acceptable, as it had previously said in the Digital Rights Ireland case, but only for very limited purposes defined in the e-privacy directive. Within this narrower retention regime, access should be even more restricted.

The CJEU fully supports the ruling by the UK High Court, which triggered the case, that only serious crime is an acceptable purpose for accessing retained data.

The case hinges on the interpretation of Article 15 of the EU e-privacy Directive 2002/58, which sets out limitations to the confidentiality of communications. The UK government had argued that the purposes for which retention was acceptable were not restricted by those included in this article, but instead should cover the broader set of purposes in Article 13 of the Data Protection Directive 95/46 (now replaced by the GDPR):

 (e) an important economic or financial interest of a Member State or of the European Union, including monetary, budgetary and taxation matters;

(f) a monitoring, inspection or regulatory function connected, even occasionally, with the exercise of official authority in cases referred to in (c), (d) and (e);

(g) the protection of the data subject or of the rights and freedoms of others.

The CJEU rejected this point saying that the list in Art 15 is a narrow closed list of the allowed purposes that allow for data to be retained:

 90 It must, in that regard, be observed that the first sentence of Article 15(1) of Directive 2002/58 provides that the objectives pursued by the legislative measures that it covers, which derogate from the principle of confidentiality of communications and related traffic data, must be ‘to safeguard national security — that is, State security — defence, public security, and the prevention, investigation, detection and prosecution of criminal offences or of unauthorised use of the electronic communication system’, or one of the other objectives specified in Article 13(1) of Directive 95/46, to which the first sentence of Article 15(1) of Directive 2002/58 refers (see, to that effect, judgment of 29 January 2008, Promusicae, C‑275/06, EU:C:2008:54, paragraph 53). That list of objectives is exhaustive, as is apparent from the second sentence of Article 15(1) of Directive 2002/58, which states that the legislative measures must be justified on ‘the grounds laid down’ in the first sentence of Article 15(1) of that directive. Accordingly, the Member States cannot adopt such measures for purposes other than those listed in that latter provision.

Furthermore, the CJEU says that even in the area of fighting crime, laws should be proportionate and access must be narrowed:

 115 As regards objectives that are capable of justifying national legislation that derogates from the principle of confidentiality of electronic communications, it must be borne in mind that, since, as stated in paragraphs 90 and 102 of this judgment, the list of objectives set out in the first sentence of Article 15(1) of Directive 2002/58 is exhaustive, access to the retained data must correspond, genuinely and strictly, to one of those objectives. Further, since the objective pursued by that legislation must be proportionate to the seriousness of the interference in fundamental rights that that access entails, it follows that, in the area of prevention, investigation, detection and prosecution of criminal offences, only the objective of fighting serious crime is capable of justifying such access to the retained data.

The new leaked e-privacy Regulation maintains a similar list in its Article 11(1) so this ruling should stand:

 Union or Member State law may restrict by way of a legislative measure the scope of the obligations and rights provided for in Articles 5, 6, 7, and 8 of this Regulation when such a restriction respects the essence of the fundamental rights and is a necessary, appropriate and proportionate measure in a democratic society to safeguard national security (i.e. State security), defence, public security, and the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, or of unauthorised use of electronic communications systems. Any legislative measure referred to in paragraph 1 shall be in accordance with the Charter of Fundamental Rights of the European Union, in particular with Articles 7, 8, 10 and 52 thereof.

The IPA contains a much broader set of purposes for access to communications data by some 48 public authorities that include NHS trusts and the Gambling Commission. It is very hard to see how this can be squared with the ruling.

Prior review and authorisation by a court or independent administrative body

 The CJEU has also fully endorsed the UK High Court ruling that required independent authorisation for access to retained data:

 120 In order to ensure, in practice, that those conditions are fully respected, it is essential that access of the competent national authorities to retained data should, as a general rule, except in cases of validly established urgency, be subject to a prior review carried out either by a court or by an independent administrative body, and that the decision of that court or body should be made following a reasoned request by those authorities submitted, inter alia, within the framework of procedures for the prevention, detection or prosecution of crime.

This is a blow to the UK legal system, where authorisation is performed by a 'Designated Senior Officer', who is part of the same organisation that requests the data.

Other issues raised in the judgment:

The CJEU judgment also raises a few other issues that were not explicitly raised by the UK Court of Appeal. However, they will be very important for any future legislation in this area.

Freedom of expression

The Court reiterates the points previously made in the Digital Rights Ireland case that data retention engages not just privacy but also freedom of expression, “one of the essential foundations of a pluralist, democratic society”.

93 Accordingly, the importance both of the right to privacy, guaranteed in Article 7 of the Charter, and of the right to protection of personal data, guaranteed in Article 8 of the Charter, as derived from the Court’s case-law (see, to that effect, judgment of 6 October 2015, Schrems, C‑362/14, EU:C:2015:650, paragraph 39 and the case-law cited), must be taken into consideration in interpreting Article 15(1) of Directive 2002/58. The same is true of the right to freedom of expression in the light of the particular importance accorded to that freedom in any democratic society. That fundamental right, guaranteed in Article 11 of the Charter, constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which, under Article 2 TEU, the Union is founded.

This is important because it could make it harder to justify the blanket retention of Internet Connection Records, which could be deemed a 'reading list'. Measures that make ordinary citizens refrain from accessing materials or expressing opinions online could well impinge on the “essence of the right”. This would move the argument away from safeguards on access to the records and towards the broader direct impact of the measures, in a way that an analysis purely focused on individual privacy may not.

Notification of those whose data is accessed

Open Rights Group and other human rights groups have long argued that people whose data is accessed should be notified, once doing so will no longer impact on investigations. Our calls have always been rejected on the grounds that investigations can go cold and be revived later, and that notification would give too much information to suspects.

The CJEU has, almost unprompted, taken the opportunity to remind national courts that this is indeed a basic component of the legal framework around surveillance:

121 Likewise, the competent national authorities to whom access to the retained data has been granted must notify the persons affected, under the applicable national procedures, as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities. That notification is, in fact, necessary to enable the persons affected to exercise, inter alia, their right to a legal remedy, expressly provided for in Article 15(2) of Directive 2002/58, read together with Article 22 of Directive 95/46, where their rights have been infringed.

This would shake the secretive UK surveillance regime to its core, almost more than introducing independent authorisation. It might be feasible to maintain the current black-box model through secret court orders, or by extending the role - and resources - of the Judicial Commissioners in the IPA; but having to notify discarded suspects would be a crack through which light may reach the darker corners of the current regime.

Given that there are over half a million requests a year for communications data, notification was perceived as introducing a huge administrative burden. It would also give visibility and raise social awareness of the extent of surveillance.

Only suspects' data can be accessed

In addition to rejecting generalised retention and narrowing down access to serious crime with independent authorisation, the CJEU has further established that as a rule only the data of people suspected of direct involvement in those crimes can be accessed. Accessing other people’s data must be an exception and also based on specific evidence of how this may help investigations.

119 Accordingly, and since general access to all retained data, regardless of whether there is any link, at least indirect, with the intended purpose, cannot be regarded as limited to what is strictly necessary, the national legislation concerned must be based on objective criteria in order to define the circumstances and conditions under which the competent national authorities are to be granted access to the data of subscribers or registered users. In that regard, access can, as a general rule, be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime (see, by analogy, ECtHR, 4 December 2015, Zakharov v. Russia, CE:ECHR:2015:1204JUD004714306, § 260). However, in particular situations, where for example vital national security, defence or public security interests are threatened by terrorist activities, access to the data of other persons might also be granted where there is objective evidence from which it can be deduced that that data might, in a specific case, make an effective contribution to combating such activities.

The IPA contains powers for the bulk acquisition of communications data by the Security and Intelligence Agencies, which had been in place through secretive interpretations of previous legislation. MI5 has been getting a copy of all of the country’s phone calls, texts and possibly other data for decades. Clearly, this would not fit the criteria set out by the CJEU and we expect these practices to be challenged in court.

Retained data must be kept in the EU

This was a point raised in the original UK ruling and unsurprisingly it was ratified by the CJEU. It is worth repeating as a reminder of the dire consequences that leaving the EU data protection regime, including data retention, would have for the UK digital economy.

122 With respect to the rules relating to the security and protection of data retained by providers of electronic communications services, it must be noted that Article 15(1) of Directive 2002/58 does not allow Member States to derogate from Article 4(1) and Article 4(1a) of that directive. Those provisions require those providers to take appropriate technical and organisational measures to ensure the effective protection of retained data against risks of misuse and against any unlawful access to that data. Given the quantity of retained data, the sensitivity of that data and the risk of unlawful access to it, the providers of electronic communications services must, in order to ensure the full integrity and confidentiality of that data, guarantee a particularly high level of protection and security by means of appropriate technical and organisational measures. In particular, the national legislation must make provision for the data to be retained within the European Union and for the irreversible destruction of the data at the end of the data retention period (see, by analogy, in relation to Directive 2006/24, the Digital Rights judgment, paragraphs 66 to 68).

If you want to support our work in future cases, and help to ensure that this ruling is enforced, please join ORG today.

[Read more]

December 20, 2016 | Jim Killock

Is the government misleading the Lords about blocking Twitter?

Last week we reported that the UK government expect the BBFC to ask social media providers, such as Twitter, to block the use of their service by accounts that are associated with porn sites that fail to verify the age of their users.

The Bill is even worse than we illustrated. The definition of a “pornographic website” in Clause 15 (2) is simply a site that operates on a “commercial basis”. This could catch any site—including Twitter, Reddit, Tumblr—where pornography can be found. The practical limit would therefore be down purely to the discretion of the regulator, the BBFC, as to the kinds of commercial sites they wanted to force to use Age Verification. However, the BBFC does not seem to want to require Twitter or Reddit to apply age verification—at least, not yet.

However, we also got one part wrong last week. In relation to Twitter, Reddit and other websites where porn sites might promote their content, the Bill contains a power to notify these “ancillary services” but has no specific power to enforce the notifications.

In other words, they expect Twitter, Google, Facebook, Tumblr and other companies to voluntarily block accounts within the UK, without a specific legal basis for their action.

This would create a toxic situation for these companies. If they fail to “act” on the “notifications”, these services will leave themselves open to the accusation that they are failing to protect children, or actively “supplying” pornography to minors.

On the other hand, if they act on these notices, they will rightly be accused by ourselves and those that are censored of acting in an unaccountable, arbitrary manner. They will not have been legally obliged to act by a court; similar content will remain unblocked; and there will be no clear remedy for someone who wished to contest a “notification”. Liability for the blocks would remain with the company, rather than the BBFC.

The government has not been clear with the Lords that this highly unclear situation is the likely result of notifications to Twitter—rather than account blocks, as they have suggested.

There are very good reasons not to block accounts after a mere notification. For instance, in this case, although sites can contest a classification at the BBFC, and an internal appeals process will exist, there is no external appeal available, other than embarking on an expensive judicial review. It is not clear that a classification as pornography should automatically lead to action by ancillary services, not least because a site that complies will make exactly the same content available. To be clear, the bill does not aim to remove pornography from Twitter, Reddit or search engines.

Why then has the government drafted a bill with this power to notify “ancillary services”, but no method to enforce it? The reason appears to be that payment providers in particular have a long-standing agreement amongst themselves that they will halt payments when notified that someone is taking payments for unlawful activity. Large online ad networks have a similar process for accepting notifications.

There is therefore no need to create enforcement mechanisms for these two kinds of “ancillary providers”. (There are pitfalls with their approach—it can lead to censorship and unwarranted damage to businesses—but let us leave that debate aside for now.)

It seems clear that, when the bill was written, there was no expectation that “ancillary providers” would include Twitter, Yahoo or Google, so no enforcement power was created.

The government, in their haste, has agreed with the BBFC that they should be able to notify Twitter, Google, Yahoo and other platforms. They have agreed that BBFC need not take on a role of enforcement through court orders.

The key point is that the Lords are being misled by the government as things stand. Neither the BBFC nor the government has explored with Parliamentarians what the consequences of expanding the notion of “ancillary providers” are.

The Lords need to be told that this change means that:

  1. the notices are unenforceable against Internet platforms;
  2. they will lead to public disputes with the companies;
  3. they make the BBFC’s decisions relating to ancillary providers highly unaccountable, as legal responsibility for account blocks rests with the platforms.

It appears that the BBFC do not wish to be cast in the role of “national censor”. They believe that their role is one of classification, rather than enforcement. However, the fact that they also wish to directly block websites via ISPs rather flies in the face of their self-perception, as censorship is most clearly what they will be engaging in. Their self-perception is also not a reason to pass the legal buck onto Internet platforms who have no role in deciding whether a site fails to meet regulatory requirements.

This mess is the result of rushing to legislate without understanding the problems involved. The obvious thing to do is to limit the impact of the “ancillary services” approach by narrowing the definition to exclude all but payment providers and ad networks. The alternative—to create enforcement powers against a range of organisations—would need to establish full accountability for the duties imposed on ancillary providers in a court, something that the BBFC seems to wish to avoid.

Or of course, the government could try to roll back its mistaken approach entirely, and give up on censorship as a punishment: that would be the right thing to do. Please sign our petition if you agree.

[Read more] (3 comments)

December 16, 2016 | Javier Ruiz

ORG's first take on the leaked e-Privacy Regulations

The European Commission's proposed e-Privacy Regulations have leaked. We take a first look at what's in there.


The leaked e-Privacy Regulation (ePR) brings many improved protections to our communications data, which are now extended to communications devices and internet services, not just traditional telecom providers. At the same time this modernisation has brought other fundamental changes that could have less welcome consequences.

Here we focus on the basic changes to electronic communications. Most other analyses of the leaked ePR will probably focus on cookies and the impact on online advertising, and rightly so, as this is really important. We don’t have space for a proper take on both here, but in the coming months we will also engage with those other areas: cookies, marketing, nuisance calls, as well as the enforcement aspects.

One point we have to stress is that the new ePR explicitly allows national legislation for the interception of communications as long as this is compliant with human rights.

It is important to remember that this is a leak of a European Commission version of the Regulation, which will then have to be amended and fought over by the European Parliament and the Council of Member States, so the final legislation could be different in many areas. There will also be a concerted lobbying campaign from industry to change the parts of this leak that they don’t like.

Whatever happens with Brexit, this Regulation will have an impact in the UK given the current commitment to keep UK data laws compatible with the EU in order to facilitate data flows, e-commerce and services.

Confidentiality of electronic communications

The leaked Regulation is concerned with the confidentiality of 'electronic communications data' meaning both the content and metadata of electronic communications.

The ePR establishes a general principle that nobody can interfere with or monitor electronic communications and that metadata shall be “erased or made anonymous as soon as the communication has taken place”.

The ePR also sets out several cases where metadata can be retained and used for lawful processing by providers, mainly around typical needs to provide security, quality of service, billing and access to emergency services. This is similar to the provisions in the previous Regulation.

There are some differences when it comes to other uses, such as analysing users’ data for commercial purposes. The ePR allows these activities on the basis of consent for specified purposes, provided these could not be achieved with anonymous data.

It also establishes that where there is a “high risk to rights and freedoms” the provider must perform a data protection impact assessment and consult with the ICO, in reference to part of the General Data Protection Regulation (GDPR) that was recently passed by the EU.

Consent in general is strengthened and brought in line with the GDPR. It is also explicit that consent for the use of electronic metadata must be user-friendly and separate from general T&Cs, and there is a ban on making services conditional on giving data access.

This is good, but it may not be enough. Communications data in the mobile phone age gives insights into our most intimate personal details, and we believe that impact assessments should be compulsory in all cases, as even the best consent system can be bent.


One very important change is that location data has disappeared as a separate category in the new ePR and it is now explicitly described as communications data. In the current Regulation, there are stricter conditions for consent to reuse location data when compared with other types of metadata, and it restricts its use to value-added services and not marketing. That was at the time when mobiles were coming into the mainstream and policymakers saw a high risk involved in knowing where you are at any time.

In the new version, location is just another piece of metadata. Location analytics has become mainstream and restricting the use of such data for mobile phone providers while Google and Apple get it from the handset didn’t really work. At the same time the potential value of collecting location data from mobile phones - whoever gets it, how and to what level of detail - is huge and continues to have high privacy risks.

Removing the different regimes for location and what used to be called “traffic data” is more consistent and will avoid complex debates on what is traffic and what is location. But it is still unclear whether this will be sufficient protection given the high interest in location analytics among industry.


The leaked ePR contains stronger provisions on the protection of data stored in devices and the extraction of data which should bring some real changes to the way the whole tech industry operates. There are even restrictions on using the processing power of end-users’ devices that could see blockchain technologies requiring some clear consent. There could be some issues with the implementation in some computer environments as it appears to be conceptualised around mobile devices run by corporates.

There are also detailed provisions on the tracking of devices, for example over public wifi in shopping malls or transport networks, where large notices must be displayed. The UK ICO’s guidance on Wifi Location Analytics goes further, however, asking for the hashing of personal identifiers, which makes it more difficult to identify individuals in a dataset.
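To illustrate the kind of pseudonymisation the ICO guidance points towards, a wifi analytics operator might salt and hash each device’s MAC address before storing it. This is only a sketch of the general technique, not code from the guidance; the function and parameter names are ours. Note that hashing alone is weak protection here: the MAC address space is small enough to brute-force, so the salt must be kept secret and rotated.

```python
import hashlib
import secrets


def pseudonymise_mac(mac: str, salt: bytes) -> str:
    """Return a salted SHA-256 hash of a device MAC address.

    Normalises case and separators first, so the same device
    always produces the same identifier within one salt period.
    """
    normalised = mac.lower().replace("-", ":").encode("utf-8")
    return hashlib.sha256(salt + normalised).hexdigest()


# A fresh salt per collection period limits long-term tracking:
# hashes from different periods cannot be linked to each other.
salt = secrets.token_bytes(16)

h1 = pseudonymise_mac("AA:BB:CC:DD:EE:01", salt)
h2 = pseudonymise_mac("aa-bb-cc-dd-ee-01", salt)  # same device, different formatting
print(h1 == h2)  # True: normalisation makes the hashes comparable
```

Rotating the salt is the design choice that matters most: with a fixed salt, the scheme degrades into a stable pseudonymous profile of each visitor, which is exactly what the confidentiality rules are trying to prevent.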

It is very good that the recitals clarify that machine to machine communications of the kind involved in the internet of things and the coming 5G wave of hyper-connectivity are explicitly covered.


The new rules give companies more leeway in how they use our data, while simultaneously tightening the rules on consent in alignment with broader data protection law. The new ePR seems particularly good for traditional telcos, which not only see their internet rivals (WhatsApp, FaceTime and the like) brought within the rules, but are also the main beneficiaries of these changes to electronic communications data.

The Commission is unapologetic about wanting to create a data market around the reuse of communications data with consent, in recital 23. Interestingly this is exactly the big pitch from Telefonica around reinventing itself as a data company and giving their customers more control, also followed to a lesser extent by Vodafone.

One area that we will be looking into is the use of anonymisation to process communications data, so far the preferred modus operandi of telcos, who are only now starting to move towards a consent model. The new Regulation’s requirement to delete or anonymise data “when no longer needed” appears clearer than the previous formulation, but we will see in practice whether it stops companies building pseudonymous profiles of their users.

[Read more]

December 16, 2016 | Jim Killock

How it works: website blocking in the Digital Economy Bill

We realised that what will and will not be blocked under the Digital Economy Bill is becoming increasingly hard to understand. So here is a handy guide.

Blocking takes two forms after the Lords debate.


Firstly, websites can be blocked if they either don’t use Age Verification or supply pornography that the new national censor, the BBFC, deems “non-conventional”. In addition, if they use an Age Verification technology that seems inadequate, such as credit cards, this could lead to a block, although we believe this would be less likely, as it would seem a very harsh response.

Ancillary services

Secondly, “ancillary services” is now clarified to include Twitter or other platforms where an account is used to promote a pornographic website. Here, a block could only be applied if the BBFC has decided to sanction the website for non-compliance. This would mean it could block an account from a website that publishes “non-conventional” pornography, or one that doesn’t provide Age Verification, or only uses credit card verification. However, other similar accounts from sites that had not been reviewed cannot be blocked under this power.

Blocked Twitter feeds would not need to be displaying pornography; they might just provide links.

As a further example of where this might go, we have also included DNS results in the table. Provision of DNS results for a pornographic website could easily be included in the expansive concept of an “ancillary service”.

BBFC classification and blocking will be selective

To complicate matters further, blocking can only take place if the BBFC has decided to classify a website. So the whole process is limited by their capacity to review hundreds of thousands of pornographic websites. Furthermore, the BBFC cannot block “non-commercial” websites.

This incredibly complicated picture of course risks being perceived as extremely arbitrary. That is the inevitable result of pursuing censorship as a sanction for regulatory non-compliance, rather than limiting it to clearly illegal and harmful material.

Don’t forget to sign our petition against these proposals. 

Our summary of what will be blocked

| Type of pornographic related content | Type of age verification | Can it be blocked | Will it be blocked in the UK |
|---|---|---|---|
| Major website with US style legal pornography (1) | Credit card or none | | |
| Website with BBFC-compliant content (2) | UK approved (4) | | |
| Website with BBFC-compliant content | Credit card only | | |
| Non-commercial site with US style legal content | | | |
| Major website with US style legal pornography | UK approved | | |
| First thousand websites by market share, reviewed by BBFC (3) | Credit card or none | | |
| Next three million websites by market share, not reviewed by BBFC | Credit card or none | | |
| Twitter feed for BBFC approved commercial website | | | |
| Twitter feed for a website deemed non-compliant by the BBFC | | | |
| Twitter feed for the millions of websites not classified by BBFC | | | |
| Non-commercial Twitter feed | | | |
| DNS result locating website | | | |
1 This table uses “US style legal content” as a shorthand for content that may not be legal in the UK, or that is legal but not approved by the BBFC.

2 BBFC-compliant means approved by the BBFC, a more restrictive concept than legal in the UK.

3 Or whatever number of websites the BBFC feels able to classify. We assume they will aim to cover market share, so 1,000 websites seems a reasonable number to target.

4 By “UK approved” age verification we mean systems that meet BBFC requirements. These are currently undefined, other than that they must verify age. Privacy and interoperability requirements are absent from the bill.

[Read more] (8 comments)

December 13, 2016 | Jim Killock

MPs leave it to House of Lords to sort out porn cock up

Plans, outlined in the Digital Economy Bill, to make the Internet safe for children are in a worse state than when the Bill was first published this Autumn. It’s now down to peers to sort them out as the Bill has its second reading in the House of Lords.

Although Labour raised the issue of privacy, nothing was changed, so there are still no privacy duties in the Bill. However, the Commons did find time to add powers for the regulator to block legal websites, through a poorly worded amendment from the government.

The Lords therefore have three issues to resolve: will age verification be safe? Will it lead to widespread censorship of legal content? And how will it make both age verification and website blocking safe and fair?

These aren’t easy questions and they ought to have been dealt with well before this bill reached Parliament. The privacy risks of data breaches and tracking of people around the Internet simply have to be addressed. We believe that privacy duties have to be written onto the face of the bill.

On website blocking, it’s clear that MPs are misleading themselves. The objective of website blocking appears to be to restrict access to websites that are not verifying the age of their users. However, the truth is that this is completely beyond the reach of the regulator.

The regulator, the BBFC, almost certainly have no intention of blocking more than a small fraction of the pornographic sites available. This would make pornography in general only slightly less accessible to someone under 18, as they will still be able to reach millions of other sites. It will however restrict access to specific, relatively popular sites. These sites could be aimed at specific communities or identities, which would be particularly harmful.

It is clear that website blocking is not likely to be a safety measure, but a punishment directed at non-compliant websites. Of course it will also punish the users of these websites. It is not clear to us that this approach is necessary or proportionate.

All this ought to make it clear that Age Verification isn’t a particularly wise policy.

The very least that needs to be done is for the regulator to apply a proportionality test to the blocking of any given website. This can also take into account issues relating to which ISPs might do the blocking, for instance so that ISPs that lack the capability to block are not asked to do so, or at the very least, not without compensation.

Another concern that the BBFC itself raised is whether its own classification standards are imposed on websites, or the standard of what is legal to view in the UK. Understandably, the BBFC will seek to sanction websites that are publishing material that it does not view as publishable – or classifiable – in the UK. However, this needs to be the legal standard, rather than the BBFC’s view of what is legal or acceptable.

For this reason, there has to be a simple external appeals mechanism before any sanction is applied, and this too is missing.

Censorship, however you look at it, is a drastic step. While “improvements” can be made that might limit some particularly awful practices from developing, the vast majority of what gets blocked will be legal material that any adult has a right to access. This paradox – the censorship of legal content – won’t disappear just because the process is improved.

The only justification could be that there is a serious and widespread harm emerging that can be addressed; and while it is completely valid to discourage teenagers from accessing this content, it is far less clear that the proposed measures will work, or that alternative approaches, such as filtering for specific users, could not work instead. And articulating a valid social goal is still a long way from a rigorous demonstration of harm.

We will hear the first signs of whether the Lords will resolve any of these issues today. The debate starts late this afternoon and can be watched online.

For more details you can read our briefing. If you want to help with the campaign, please sign the petition.

[Read more] (4 comments)
