Submission to Universal Periodic Review

Human rights and technology in the UK

“Do not treat cyberspace as if it belongs to you… You should not imagine for an instant that you can resist the growing force of the tide now flowing for transparency, open information and the free exchange of ideas.”

Foreign Secretary William Hague, the London Conference on Cyberspacei

1 The UK Government has, within the last month, made firm commitments to the principles of freedom of expression and privacy online through its London Conference on Cyberspaceii. However, a number of domestic policies made in the past four years have undermined these ideals. Taken together, they demonstrate a failure to adequately consider the impact that digital policies can have on human rights.

2 The problems in digital policy to date include a failure to undertake full human rights impact assessments and a failure to consider the importance of proper due process and the rule of law.

3 Open Rights Groupiii, a digital rights advocacy organisation, welcomes the opportunity to contribute to the review. In our submission we aim to provide examples of a general failure by the UK Government to build respect for and defence of human rights into its digital policy making. We hope that through reviews such as this, the UK Government will be encouraged to make its policy-making practice match its rhetoric regarding human rights online.

Digital Economy Act

4 In 2010 the then Labour Government passed the Digital Economy Act. It fails to respect human rights in three ways. First, it poses a privacy risk to users who are identified as suspected copyright infringers. Second, it contains (currently latent) provisions for technical measures to be taken against those identified as repeat copyright infringers. Third, it has provisions for the blocking of websites.

5 The UN Special Rapporteur for Freedom of Expression Frank La Rue, in his report earlier in 2011 on freedom of expression and the Internet, singles out the UK’s Digital Economy Act for concern. Regarding ‘graduated response’ schemes, in which technical measures are proposed against alleged infringers, La Rue says he is:

“…alarmed by proposals to disconnect users from Internet access if they violate intellectual property rights. This also includes legislation based on the concept of “graduated response”, which imposes a series of penalties on copyright infringers that could lead to suspension of Internet service, such as the so-called “three-strikes-law” in France and the Digital Economy Act 2010 of the United Kingdom.”iv

6 The Act represents a failure to properly consider human rights and to weigh them against other competing policy objectives. The failure to interrogate the impact of the Act on human rights was compounded by the Government’s failure to analyse the evidence base supporting the Act’s measures. In a recent reply to a Freedom of Information request, the Department for Culture, Media and Sport, who now hold the relevant brief, admitted they hold no evidence of the effects of copyright infringement or the efficacy of different strategies for tackling it. This makes a full and proper consideration of the proportionality of the policy impossible.

Technical measures against users

7 The Digital Economy Act 2010 sets out that “technical measures” against users can include limiting the speed or other capacity of a connection, preventing a subscriber from using the service to gain access to particular material, suspending the service provided to a subscriber, or limiting the service provided to a subscriber in another way.v Disconnection, suspension or interference with citizens’ communications is a clear restriction on their ability to share and receive information, a disproportionate response to the problem of copyright infringement, and a clear failure to respect human rights.

Website blocking for copyright infringement

8 The Digital Economy Act also contains provisions for the blocking of websites that are, or are likely to be, used in connection with copyright infringement. Ofcom recently concluded that the provisions are unworkable. We consider blocking to be a risk to freedom of expression due to the dangers of over-blocking and the likely follow-on demands for more intrusive action against circumvention measures that are often vital tools for the protection of rights online (such as VPNs, for example).

Privacy concerns

9 The Act places requirements on ISPs to keep records of those accused of copyright infringement. Our concern is that the Digital Economy Act will lead to large lists of very sensitive information being compiled, stored and disclosed to third parties, at a time when there are serious concerns about the efficacy of data protection oversight and enforcement.vii This is especially problematic given concerns about the reliability of the IP address evidence used against alleged infringers.viii

10 Recommendations: First, repeal sections 3-16 of the Digital Economy Act, covering private surveillance and technical measures. Second, repeal sections 17 and 18, as recommended by Ofcom, to ensure website blocking is no longer a part of the Act.

Proposals for Internet censorship

11 There are a number of plans currently being developed by the UK Government for various forms of online censorship.ix These go beyond the four types of content outlined by Frank La Rue in his October report,x and in most cases they also do not adhere to the cumulative ‘three-part test’ set out by the Rapporteur in his report from earlier in 2011.xi

12 For example, MPs are currently discussing proposals to filter ‘adult’ material by default.xii Similarly, the Home Office’s Prevent counter-terrorism strategy suggests filtering is necessary ‘across the public estate’.xiii The strategy proposes that terrorist websites be blocked in schools, libraries, academic institutions and government buildings, and that ISPs be encouraged to block these sites as well. Critically, little or no account is given of the need for a court order or any legal process. There are suggestions that content falling into vaguely defined categories such as ‘violent’ should be blocked.

14 Interventions that reach beyond giving parents tools to manage their own households’ access take responsibility for what people are allowed to see and do online out of their own hands. Furthermore, default filtering and blocking are likely to endanger children where they persuade parents that they do not need to take responsibility themselves.

Proposals with no basis in law and that involve inadequate legal due process pose clear risks to freedom of expression.

15 Recommendation: Government must not mandate censorship beyond the four types of content outlined by Frank La Rue. Government should not interfere with ISPs’ duty to provide citizens with the freedom to access information. The Government should concentrate on providing parents with the tools to manage their own children’s access.

Self-regulation and due process

16 Related to our concerns about censorship online, we are concerned about the current interest in self-regulation for policies affecting the free flow of information. Discussions are ongoing regarding a number of proposals to intervene in the flow of information online that potentially ignore due process, including new proposals for voluntary website blocking schemes, discussions about possible requirements for search engines to alter search results,xiv and new relationships between the creative industries, police and payment services to cut off financial support for sites allegedly involved in infringement.xv

17 There are obvious risks when the relationships involved in a ‘self-regulation’ scheme affect basic freedoms and rights, especially where those interests are not reflected in the process. We are concerned that there is a lack of transparency, certainty and due process in the developing relationships affecting the flow of information online in the UK. Handing too much unchecked power to businesses to determine what information people are allowed to access carries clear dangers.

18 Furthermore, discussions are currently hosted by Nominet regarding the process governing the suspension of domains. A Nominet ‘issue group’ is formulating a form of ‘self-regulation’, raising concerns about due process and the application of justice.xvi Private policing powers, such as those discussed by Nominet, risk giving law enforcement extra-judicial seizure powers without sufficient protections.

19 Recommendation: The Government should make sure that any self-regulatory proposals do not undermine protections for human rights through circumvention of proper due process and the rule of law. Domain suspensions should require court orders.

Copyright flexibility and freedom of expression

20 The rules that copyright sets for the use, or reuse, of copyrighted works also affect the fulfilment of human rights. Where the rules are too strict, and unduly restrict legitimate activities, then those rules have an impact on freedom of expression.

An exception for parody

21 An exception to copyright for parody and pastiche is available under Directive 2001/29/EC.xvii However, that exception has not been implemented in UK law. This has led to a number of examples of copyright being used to inhibit legitimate criticism through take-downs of allegedly infringing content. The most recent example is the campaign created by Greenpeace, which used Volkswagen’s Star Wars-themed adverts to criticise the company’s history of lobbying. Greenpeace’s video was removed from YouTube as it gained momentum, damaging their ability to spread their message.xviii

22 A lack of a parody exception leaves such videos vulnerable to take-downs when they are made available to the public. There is clear uncertainty about the legitimacy of parodies that is inhibiting legitimate expression. As such, the lack of such an exception represents a threat to freedom of expression. The UK Government is currently considering whether to introduce such an exception following an independent review of IP and growth.xix

Exceptions for disabilities

23 Similarly, there is a continued absence of formal exceptions to copyright to ensure those who are blind or have other disabilities do not face additional problems accessing copyrighted material. This causes obvious impediments to the right to freely distribute and access information.

24 Recommendations: The UK Government should follow the recommendations of the Hargreaves Review in full, including a new exception for parody in UK law, and also support moves to ensure that copyright is not a barrier to accessibility of information for those who are blind or have other disabilities by pushing for new exceptions.

Privacy in the UK
RIPA: private interception

25 The Regulation of Investigatory Powers Act,xx in our view, fails to regulate the private interception of communications effectively. Meanwhile, private interception has become an increasingly common phenomenon in the UK.

26 Communications providers, such as Internet Service Providers, check the ‘packets’ of their customers’ communications to understand or gain information about the traffic, using what is called “Deep Packet Inspection” (DPI). In the UK, such interception is governed by RIPA, which allows it in limited circumstances, for instance to ensure that services can function,xxi or where the consent of the parties to the communication has been given.xxii

27 Nevertheless, service providers have sought to intercept communications for purposes which go beyond merely keeping networks functioning and which do not involve the consent of one or both parties to the communication.

28 Examples include Phormxxiii, an advertising system trialled in the UK, and TalkTalk’s HomeSafe, which scans users’ traffic to locate web pages containing malware and potentially offensive content.xxiv The justification for such interception is dubious, yet TalkTalk’s interception is not governed by any regulator.

29 Unlike other aspects of privacy – such as data protection, which governs data collected by private bodies, or public interception and surveillance, overseen by the Interception Commissioner and the Surveillance Commissioner – private interception has no regulator giving advice and guidance to private companies.

30 Additionally, the police and the public prosecutor, who might intervene in cases of criminal private interception, are unlikely to see the harms as great enough to make them a major priority for investigation or prosecution. These gaps and the lack of enforcement of privacy rights attracted the attention of the EU Commission in the wake of the Phorm case.xxv

31 The government recently agreed to extend the regulation of “unintentional interception” to the duties of the Interception Commissioner.xxvi However, this is a very narrow change that would not increase regulation of clearly intentional projects such as Phorm or TalkTalk’s HomeSafe.xxvii

32 Our recommendation is to close the regulatory gap, in order to make sure this expanding area is properly regulated. Ideally, the role would fall to a single Privacy Commissioner, rather than adding to the confusing and potentially overlapping set of relatively obscure commissioners that govern different aspects of privacy.

Data retention

33 The UK requires telecoms companies to retain subscriber ‘traffic data’ beyond its business use for its potential use in criminal investigations. This is a requirement of European law, opposed at the time by human rights groups,xxviii and since rejected in many EU states on constitutional and human rights grounds.xxix The UK has a particularly long retention period of 12 months. Retaining the data of all users as a precaution, most of whom will be innocent, should be seen as a form of mass surveillance that does not meet the requirements of a fundamental right to privacy.

34 Recommendation: remove data retention requirements.

Cookies and online profiling

35 Many Internet advertising companies profile web users by tracking their visits to websites through “cookies”. This profiling and use of cookies takes place without explicit, prior consent, and is an intrusion into the right to privacy. Users’ privacy is protected by the e-Privacy Directive, Article 5(3) of which creates an expectation of prior consent, in the view of the EU’s Article 29 Working Party, for instance.xxxi The UK has not yet decided how to implement consent, but has rejected the idea of prior consent, largely because it might affect businesses.

36 Recommendation: the UK should require users to give prior consent before they are profiled.

Data Protection Act

37 The UK’s Data Protection Act protects users’ right to privacy in circumstances where they might choose to share some of their personal information with private companies or organisations. However, it is widely acknowledged to lack many basic protections: poor definitions of personal data, an erroneous notion of ‘implied consent’, weak enforcement and inspection powers (powers of entry), a notion of damage limited to that which is tangible rather than including risk and distress, and no realistic means for citizens to individually or collectively seek redress.xxxii

38 Furthermore, the government is seeking wider use and re-use of personal information, potentially increasing the risks through projects such as the MiData initiative.xxxiii

39 Recommendations: create a much stronger data protection regime to enforce people’s right to privacy, specifically: use the EU definition of personal data, remove the notion of implied consent, create a general power for the Information Commissioner to inspect data controllers without their permission, widen the definition of damage to include ‘moral’ damage such as distress or risk, create a means for citizens to seek collective redress after data breaches, and create mandatory breach notifications.

vii See for example ‘Are the ICO fit for purpose?’, Alexander Hanff, Privacy International, 1 February 2011.
viii For more on our concerns about privacy and the Digital Economy Act, see ORG’s submissions to the judicial review of the Digital Economy Act.


xi Specifically in Chapter 3, paragraph 24, which states that ‘any limitation to the right to freedom of expression must pass the following three-part, cumulative test’.
xiv For more, see ORG’s writing on copyright enforcement and self-regulation.


xxi RIPA, Section 3 (3) (b) “for purposes connected with the provision or operation of that service”

xxii RIPA Section 3 (1)
xxiv See the technical and legal analyses of TalkTalk’s HomeSafe.

xxv See European Commission press release IP/09/570.


xxvii See our consultation response.
xxxii See the specific data protection amendments we have suggested.