Digital Economy Bill Briefing to House of Lords Committee Stage

Open Rights Group (ORG) is the UK’s only digital campaigning organisation working to protect the rights to privacy and free speech online. With over 3,000 active supporters, we are a grassroots organisation with local groups across the UK.

Digital technology has transformed the way we live and opened up limitless new ways to communicate, connect, share and learn across the world. But for all the benefits, technological developments have created new threats to our human rights. We raise awareness of these threats and challenge them through public campaigns, legal actions, policy interventions and tech projects.

ORG has concerns about the following areas of the Digital Economy Bill (DE Bill):

  • Age verification (Part 3)
  • Web-blocking (Part 3)
  • Online copyright infringement (Part 4)
  • Data sharing (Part 5)
  • Internet filtering (new clause)

1. Age Verification (Part 3 of the Bill)

1.1 Fundamental problems with Age Verification

We do not believe that Age Verification (AV) is the appropriate policy response and recommend that it be completely removed from the Bill, as it is:

  • Unlikely to be ubiquitous (vast majority of websites will not implement it).
  • Unlikely to reduce availability of pornography to minors.
  • Likely to dissuade adults from accessing legal pornography.
  • Likely to be circumvented via technical measures such as Virtual Private Networks (VPN), allowing UK adults and under 18s to ‘jump the fence’.
  • Risky in its own terms for both individual privacy and wider cybersecurity.

Targeted solutions focused on children’s access to devices and screens are more appropriate and more likely to be effective.

1.2 Privacy and AV

Despite our fundamental misgivings about this policy, we feel it is only responsible to make clear recommendations that could reduce the privacy risks of the AV scheme. These may not address the chilling effect of AV on free expression, but without clear privacy obligations, data leaks and tracking could have disastrous consequences.

Since July, there have been two major hacks on porn websites affecting more than 400 million people (xHamster, 380,000 people,[1] and FriendFinder, 412 million accounts[2]). People’s personal details, including their email addresses and usernames, have been traded on the dark web. Another public exposé of people’s names, addresses and phone numbers, in the Ashley Madison case,[3] reportedly resulted in two suicides.[4]

The age verification proposals in the Bill could create databases similar to those that have already been hacked. The proposals demand that people give up part of their privacy and trust age verification systems with personal information that could be connected to some of their most sensitive activity online.

The age-verification regulator would be able to make sites verify age and to issue penalties, but it has no specific duties to protect people’s privacy or security, or to defend against the cyber security risks that may emerge[5] from the age verification systems.

The UN Special Rapporteur, David Kaye, made the same points in his letter[6] to the Permanent Mission of the UK to the United Nations Office. The Rapporteur is especially “concerned at the imposition of the age-verification mechanism which has implications for the right to privacy without imposing conditions for the storage of such data.”

In his opinion, it is likely that government and private companies will be able to share data on citizens’ viewing habits between departments without citizens’ consent. Data collection and retention practices described in the Bill do not comply with the minimal requirements under Article 19(3) of the International Covenant on Civil and Political Rights.[7]

The Delegated Powers and Regulatory Reform Committee[8] highlighted further issues in its report on the Bill. The Committee found it unsatisfactory that the identity of the age-verification regulator is not on the face of the Bill. The Committee also criticised the lack of parliamentary approval for guidelines on financial penalties and for the definitions of making pornographic material available.

Following the Committee’s criticism, the Committee on the Constitution[9] questioned whether the House would be able to scrutinise the Bill if key terms and concepts are omitted and left to the age-verification regulator.

The Information Commissioner’s Office also raised serious privacy concerns in their oral evidence to the Commons, finding it “important that data minimisation is at the heart of any solution”.[10]

1.3 Recommendations:

Insert privacy safeguards on the face of the Bill.

Under human rights law, the Digital Economy Bill’s impact on users’ privacy should be properly spelled out (“in accordance with the law”).[11] Some of the privacy and free expression[12] concerns mentioned above could be reduced if the age-verification regulator had specific duties to ensure that:

  • Age verification systems pose a low security risk.
  • Age verification systems do not create wider security risks for third parties, for instance to credit card systems.
  • Users’ data is kept anonymous. Age verification systems should not disclose the identity of individuals when they verify their age to persons making pornography available on the internet.
  • Users of adult websites are able to choose which age verification system they want to use, rather than being given only one prescribed method by the website.
  • Users of adult websites have clarity about liability for data breaches and about what personal data is at risk.
  • Age verification methods are easy to operate and cheap to implement for website owners.

We have expanded our recommendations to include an obligation for AV systems to let end users choose their preferred AV provider. This is the model the Government has chosen for its own Verify system, where end users decide which company they trust with sensitive information.

Choice enables competition and helps build trust. Without choice, the system would almost certainly default to a single, dominant AV provider, exacerbating the risks of abuse.
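To illustrate the kind of architecture these recommendations point towards, the sketch below (in Python, with hypothetical names throughout) shows how a site could accept a signed “over 18” attestation from whichever approved provider the user chooses, without ever learning who the user is. It is an illustrative sketch under stated assumptions, not a description of any existing AV product or of the Bill’s requirements: the provider registry, token format and use of a shared-secret HMAC are simplifications, and a real deployment would more likely use public-key signatures under a regulator-approved code of practice.

```python
# Illustrative sketch only: an age-verification (AV) token that asserts
# "over 18" without carrying any identity attributes. All names and the
# token format are hypothetical.
import hmac, hashlib, json, time, base64

# Site-side registry of AV providers the user may choose between.
# Each entry holds only a verification key, never user identities.
APPROVED_PROVIDERS = {
    "av-provider-a.example": b"demo-shared-secret-a",
    "av-provider-b.example": b"demo-shared-secret-b",
}

def issue_token(provider_id: str, key: bytes) -> str:
    """Provider side: issue a short-lived 'over 18' attestation.

    The payload carries no name, address or account identifier, so the
    receiving site cannot link it to a person (data minimisation).
    """
    payload = {"over_18": True, "provider": provider_id,
               "expires": int(time.time()) + 300}
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(key, body, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(body).decode() + "." + sig

def verify_token(token: str) -> bool:
    """Site side: accept a token from any approved provider the user chose."""
    body_b64, sig = token.rsplit(".", 1)
    body = base64.urlsafe_b64decode(body_b64)
    payload = json.loads(body)
    key = APPROVED_PROVIDERS.get(payload.get("provider"))
    if key is None:                         # unknown / unapproved provider
        return False
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False                        # signature invalid
    return payload.get("over_18") is True and payload["expires"] > time.time()

if __name__ == "__main__":
    token = issue_token("av-provider-a.example", b"demo-shared-secret-a")
    print(verify_token(token))  # True: age asserted, identity never disclosed
```

The point of the sketch is the shape of the data: the site learns only that an approved provider has attested that the user is over 18. The user’s identity never reaches the site, and how the provider verified them stays between the user and the provider they chose.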

ORG’s proposed amendment:[13]

23b Approval of Age Verification Providers

Clause 23, page 26, line 31, at the end insert—

(1) Age verification providers must be approved by the Age Verification Regulator.

(2) In this section an “age verification provider” means a person who appears to the Age Verification Regulator to provide, in the course of a business, a service used by a person to ensure that pornographic material is not normally accessible by persons under the age of 18.

(3) The Age Verification Regulator must publish a Code of Practice to be approved by the Secretary of State and laid before Parliament.

(4) The Code will include provisions to ensure that Age Verification Providers,

(a) perform a Data Protection Impact Assessment and make this publicly available,

(b) take full and appropriate measures to ensure the accuracy, security and confidentiality of the data of their users,

(c) minimise the processing of personal information to that which is necessary for the purposes of Age Verification,

(d) do not disclose the identity of individuals verifying their age to persons making pornography available on the internet,

(e) take full and appropriate measures to ensure that their services do not enable persons making pornography available on the internet to identify users of their sites or services across differing sites or services,

(f) do not create security risks for third parties or adversely impact security systems or cyber security,

(h) comply with a set standard of accuracy in verifying the age of users;

(5) The Code will include provisions to ensure that publishers of pornographic material take full and appropriate measures to allow their users to choose the Age Verification Provider of their preference.

(6) Age Verification Providers and publishers of pornographic material must comply with the Code of Practice.

(7) To the extent that a term of a contract purports to prevent or restrict the doing of any act required to comply with the Code, that term is unenforceable.

2. Web-blocking (Part 3 of the Bill)

The Government’s plans to enforce age verification on adult websites by blocking them have raised concerns that legal adult content will be censored. Nearly 20,000 people have signed an ORG petition expressing their disagreement with these plans.[14]

The proposals to block non-complying adult websites were hastily drafted and inserted at the last moment during the Commons debates. Their possible impacts are not yet fully understood and have not been properly assessed.

2.1 Route of appeal

The lack of appropriate appeal routes against website blocks was identified by the Delegated Powers and Regulatory Reform Committee. The Committee recommended including a statutory right of appeal on the face of the Bill. It considered it inappropriate that appeals against the enforcement actions available to the regulator (such as blocking) are not required to be made to an independent body.

The Committee’s opinion was echoed in the letter by the Special Rapporteur, who stressed that “any legislation restricting the right to freedom of expression and the right to privacy […] must be undertaken by a body which is independent of any political, commercial or unwarranted influences […].”[15]

2.2 Costs

Website blocking will impose costs for the technical deployment and maintenance of censorship systems. Internet service providers (ISPs) are particularly worried about the lack of consultation and the disregard for these costs.[16] It would be unreasonable to demand that ISPs be financially responsible for them. In some circumstances, the costs could be prohibitive, as not all ISPs have the means to implement blocking.

2.3 Proportionality

The impact of blocking a non-complying adult website needs to be assessed in each case. It might be disproportionate to block websites that have a very low number of UK visitors. A particular ISP might only supply businesses, so an order asking it to apply blocks may be disproportionate. Blocking a particular website may not be proportionate if it would deprive a specific minority of their only way of expressing their sexuality. An assessment of the harms needs to be made in each case.[17]

2.4 Recommendations:

Given the serious step towards censorship involved in these proposals, we can only recommend that the new Clause 23[18] introduced by the Government should not stand part of the Bill.

3. Online copyright infringement

The Government has presented its plans to raise the maximum penalty for online copyright infringement to 10 years as simply a matter of parity with the offline world. We are concerned, however, that the definition of the infringement is too broad and will catch large numbers of Internet users.

Currently, Clause 27 states that criminal liability is to be determined by “causing loss” and “risk of loss” to the owner of the copyright, with loss defined as merely failing to pay a licence fee. In practice, this would mean that an image trivially used to accompany an amateur blog post, without a licence fee being paid, would cause the owner of the image’s copyright a loss. Posting the image on the blog would also expose the copyright owner to a risk of loss, because the blog post can be shared on the Internet by a vast number of people.

Ordinary people engaged in domestic “filesharing” on a noncommercial basis could find themselves facing long jail sentences.

Furthermore, the raised penalties will likely be used for speculative invoicing by predatory law firms. There are a number of companies, often referred to as “copyright trolls”, that look for evidence of copyright infringement online in order to send threatening letters asking for payment. The process of detection is vulnerable to error.[19]

The alleged infringements often relate to niche pornography, as this is not something most people would want to discuss publicly, whether they are innocent or not. See Golden Eye[20] and TCYK LLP[21] for examples.

Clause 27 aids these companies by empowering them to threaten any online infringer with much stronger criminal sanctions, including ten years in prison. Even as a general comment within a letter, this could have a powerful persuasive effect on innocent people, pushing them to pay the sums demanded in the threatening letters.

3.1 Recommendation

Clause 27 can be improved by adding thresholds of seriousness to “risk of loss” and “causing loss”.

This change will ensure that individuals infringing copyright are normally dealt with through the civil courts and civil copyright action. It will help deliver the “expected outcome”[22] of the Clause, that is, to criminally prosecute only commercial copyright infringers.

ORG’s proposed copyright amendment:

27 Offences: infringing copyright and making available right

Clause 27, page 28, line 18, leave out “loss” and insert “commercial scale loss”

Clause 27, page 28, line 20, leave out “risk of loss” and insert “serious risk of causing commercial scale loss”

Explanatory note: This amendment would introduce thresholds for criminal liability to avoid prosecution of non-commercial infringers. The Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) defines criminal infringement as infringement “on a commercial scale”.[23]

Kevin Brennan MP used this amendment as a probing amendment in the House of Commons Committee.[24] The responsible minister, Matt Hancock, rejected it on the basis that an accidental filesharer “is not expected to be caught by this offence”. The Minister refused to provide any meaningful safeguards and preferred to rely on a future court’s interpretation of the clause, which is not enough.

4. Data sharing (Part 5)

Part 5 contains a broad range of new powers for public bodies to disclose information – also known as data sharing – mainly among each other but in some cases with the private sector.

ORG has been involved in extensive discussions about these measures. Part 5 of the Bill has serious deficiencies that need to be amended.

4.1 The Bill gives ministers too much power without enough scrutiny

The Delegated Powers and Regulatory Reform Committee[25] “consider it inappropriate” for Ministers to have the powers to define lists of specified persons and non-specific purposes related to public service provision, fraud or debt. Instead, they argue that those given the powers to share data, and the purposes for which data is used, should be on the face of the Bill, with Ministers only able to make very limited additions based on a clear necessity. The Committee also asks for the various “Henry VIII powers” peppered throughout the Bill to be narrowed down.

We support these proposals and expect that appropriate amendments will be laid in the days before the House discusses the Bill.

4.2 Lack of sufficient privacy safeguards

The power to share data for public services correctly attempts to restrict the scope of sharing to government activities that improve citizens’ well-being. However, the safeguards in the Bill are certainly not strong enough. There are similar concerns for chapters 3 and 4 on fraud and debt.

The Information Commissioner, Elizabeth Denham, raised similar concerns in oral evidence to the Commons, finding that safeguards were “scattered throughout some of the draft codes of practice, but not in the Bill”, and making clear that “right now there is no consistency across all the codes of practice for those kinds of safeguards. I believe that some improvements are needed to the Bill”.[26]

This could lead to abuses or risks for individuals as data may end up being used for purposes very different from those originally intended in the power. For example, onwards disclosure is not allowed unless “required or permitted by any enactment” (s 33(2)(a) page 31 line 19), which can be very broad and impossible to foresee.

We therefore recommend that the Bill is fixed by including key privacy and transparency safeguards on the face of the Bill, including restricting onward disclosure to what is necessary.

Proposed amendments:

Clause 30

LORD COLLINS OF HIGHBURY, LORD STEVENSON OF BALMACARA, BARONESS JANKE, LORD CLEMENT-JONES

Page 30, line 8, at end insert—

“( ) Information disclosed from one specified person to another specified person should be used for the purposes of a specific objective only.

( ) Where the information is to be used for purposes other than the specified objective, additional approval must be provided.”

Clause 33

LORD COLLINS OF HIGHBURY LORD STEVENSON OF BALMACARA LORD CLEMENT-JONES BARONESS JANKE

Page 32, line 15, at end insert—
“( ) In addition, in determining whether to make regulations under section 30 or 31, the appropriate national authority must ensure that—
(a) the sharing of information authorised by the regulations is limited to what is strictly necessary to fulfil one of the conditions or purposes falling within subsection (2),
(b) the conduct authorised by the regulations to achieve the specified objective is proportionate to what is sought to be achieved by that conduct,
(c) a Privacy Impact Assessment compliant with the relevant Code of Practice of the Information Commissioner’s Office has taken place and been made publicly available,
(d) the proposed measures have been subject to public consultation for a minimum of 12 weeks, and responses have been given conscientious consideration.
( ) As soon as is reasonably practicable after the end of three years beginning with the day on which the regulations come into force, the relevant Minister must review the operation of the regulations for the purpose of deciding whether they should be amended or repealed.
( ) Before carrying out the review, the relevant Minister must publish the criteria by reference to which that decision will be made.
( ) In carrying out the review, the relevant Minister must consult the Information Commissioner, open the review to public consultation for a minimum of 12 weeks and demonstrate that responses have been given conscientious consideration.”

Clause 34

LORD COLLINS OF HIGHBURY LORD STEVENSON OF BALMACARA BARONESS JANKE LORD CLEMENT-JONES

Page 33, line 25, leave out “or permitted”

Page 33, line 33, leave out “made” and insert “necessary”

Page 33, line 34, leave out “made” and insert “necessary”

Page 33, line 37, leave out “made” and insert “necessary”

After Clause 35

LORD COLLINS OF HIGHBURY LORD STEVENSON OF BALMACARA LORD CLEMENT-JONES BARONESS JANKE

Insert the following new Clause—
“Public register of information disclosures
(1) No disclosure of information by a public authority under Part 5 shall be lawful unless detailed by an entry in a public register.
(2) Each entry in the register must contain, or include information on—
(a) the uniform resource locator of the entry,
(b) the purpose of the disclosure,
(c) the specific information to be disclosed,
(d) the data controllers and data processors involved in the sharing of the information,
(e) any exchange of letters between the data controllers on the disclosure,
(f) any other information deemed relevant.
(3) In this section, “uniform resource locator” means a standardised naming convention for entries made in a public register.”
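To make concrete the data that subsection (2) of this proposed new clause would require each register entry to hold, here is a minimal illustrative sketch in Python. The class name, field names and example values are hypothetical; only the categories of information (a) to (f) come from the amendment text, and nothing here describes an existing government register.

```python
# Illustrative sketch of the fields a public register entry would need to
# carry under subsection (2) of the proposed new clause. Names and example
# values are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RegisterEntry:
    url: str                          # (a) uniform resource locator of the entry
    purpose: str                      # (b) purpose of the disclosure
    information_disclosed: str        # (c) the specific information to be disclosed
    data_controllers: List[str]       # (d) data controllers involved in the sharing
    data_processors: List[str]        # (d) data processors involved in the sharing
    correspondence: List[str] = field(default_factory=list)  # (e) exchange of letters
    other_relevant_information: str = ""                     # (f) anything else deemed relevant

# Hypothetical example: an entry published before any disclosure takes place.
entry = RegisterEntry(
    url="https://registers.example/disclosures/2017-001",
    purpose="Improving delivery of a public service to eligible households",
    information_disclosed="Name, address and eligibility status",
    data_controllers=["Department A", "Local Authority B"],
    data_processors=["Contractor C"],
)
print(entry.url, entry.purpose)
```

Publishing entries in this form before any disclosure occurs would let the public, and the Information Commissioner, see what is being shared, by whom and for what purpose.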

Clause 42

LORD CLEMENT-JONES

Page 43, line 2, at the end insert—
“( ) In addition, in determining whether to make regulations under section 41, the appropriate national authority must ensure that—
(a) the sharing of information authorised by the regulations is minimised to what is strictly necessary on grounds falling within subsections (2) and (3),
(b) the conduct authorised by the regulations to achieve the “specified objective” is proportionate to what is sought to be achieved by that conduct,
(c) a Privacy Impact Assessment compliant with the relevant code of practice of the Information Commissioner’s Office has taken place and been made publicly available,
(d) the proposed measures have been subject to public consultation for a minimum of 12 weeks, and responses have been given conscientious consideration.”

Clause 43

LORD CLEMENT-JONES BARONESS JANKE

Page 43, line 20, leave out “or permitted”

Page 43, line 27, leave out “made” and insert “necessary”

Page 43, line 28, leave out “made” and insert “necessary”

Page 43, line 31, leave out “made” and insert “necessary”

Page 43, line 33, leave out “made” and insert “necessary”

Clause 51

LORD CLEMENT-JONES BARONESS JANKE

Page 51, line 18, leave out “or permitted”

Page 51, line 25, leave out “made” and insert “necessary”

Page 51, line 26, leave out “made” and insert “necessary”

Page 51, line 29, leave out “made” and insert “necessary”

After Clause 69

LORD CLEMENT-JONES

Insert the following new Clause—

Provisions that apply to the processing of personal data

(1) This section relates to this Part and to the processing of personal data defined by section 1 of the Data Protection Act 1998 (basic interpretative provisions).

(2) Where the Information Commissioner is of the view that the processing of personal data which has been shared under the provisions of this Part contravenes Article 8 of the European Convention on Human Rights, the Commissioner may serve an enforcement notice (see section 40 of the Data Protection Act 1998) specifying that fact.

(3) Provisions of this Part do not allow personal data which has been shared to be processed as follows—

(a) data matching of personal data in order to identify any data subject who can be excluded from any benefit;

(b) profiling using personal data in order to target any data subject who can be excluded from any benefit;

(c) facilitating a disclosure of a “bulk personal dataset” (whether directly or indirectly) to an “intelligence service” (as described in the Investigatory Powers Act 2016).

(4) Any data sharing arrangement (as required by Chapter 14 of the Data Sharing Code of Practice produced by the Information Commissioner) that applies to the disclosure of personal data from a data controller to any “third party” (as defined in section 70(1) of the Data Protection Act 1998) must contain the proposed or estimated benefits associated with the data sharing before any disclosure of personal data occurs, and the data sharing arrangement must describe how these benefits are to be measured or assessed.

(5) The Information Commissioner, with respect to an assessment of whether any data sharing arrangement subject to subsection (4) is beneficial, can require the production of—

(a) key performance indicators which demonstrate that the benefits associated with any data sharing are being realised by the data sharing;

(b) the costs associated with the data sharing arrangements;

(c) the number of data subjects involved; and

(d) any other information that the Information Commissioner considers reasonable in order to make an informed and independent assessment of the benefits of data sharing.

(6) If the benefits associated with data sharing are not being realised, the Information Commissioner can require the sharing to cease by serving an enforcement notice (see section 40 of the Data Protection Act 1998).

(7) With respect to any data sharing arrangement subject to subsection (4), the provision in section 40(8) of the Data Protection Act 1998 shall be read as if the words “If by reason of special circumstances” were replaced by “If for any reason”.”

4.3 Codes of practice to be legally enforceable and fully approved by Parliament

Some added safeguards are contained in the codes of practice. This is not adequate, among other reasons, because the codes’ legal status is unclear. The main code itself states:

“11. The contents of this Code are not legally binding, though the provisions of the Bill require that you have regard to the Code when making use of these powers. The Code does not itself impose additional legal obligations on parties seeking to make use of the powers, nor is it an authoritative statement of the law.”

The Delegated Powers and Regulatory Reform Committee is under a different impression:

“35. We do not accept (…) that the code may only “possibly” be legislative in nature. In our view, there is no doubt that it is legislative because persons to whom the code applies must have regard to it.”

The Committee goes on to demand that all Codes of Practice associated with these data sharing powers be laid before Parliament in draft for full approval before coming into force.

Clearly this uncertainty is untenable. We therefore recommend that the Bill is amended to clarify that the codes are legally binding and must be fully approved by Parliament. We also recommend that the codes are subjected to ample public consultation.

Proposed amendments

Clause 30

LORD COLLINS OF HIGHBURY, LORD STEVENSON OF BALMACARA, LORD CLEMENT-JONES, BARONESS JANKE

Page 30, line 25, leave out “had regard to” and insert “complied with”

Clause 31

LORD COLLINS OF HIGHBURY, LORD STEVENSON OF BALMACARA, BARONESS JANKE, LORD CLEMENT-JONES

Page 31, line 36, leave out “had regard to” and insert “complied with”

Clause 36

LORD COLLINS OF HIGHBURY LORD STEVENSON OF BALMACARA BARONESS JANKE LORD CLEMENT-JONES

Page 35, line 4, leave out “have regard to” and insert “comply with”

LORD CLEMENT-JONES

Page 35, line 7, leave out subsection (4) and insert— “(4) As soon as reasonably practicable after issuing or reissuing the code of practice, the relevant Minister must arrange for a copy of it to be laid before, and approved by, a resolution of both Houses of Parliament.”

LORD COLLINS OF HIGHBURY LORD STEVENSON OF BALMACARA BARONESS JANKE LORD CLEMENT-JONES

Page 35, line 15, at end insert— “( ) the public, for a minimum of 12 weeks, and”

Page 35, line 16, at end insert— “and the relevant Minister must demonstrate that responses have been given conscientious consideration.”

Clause 41

LORD CLEMENT-JONES

Page 41, line 41, leave out “had regard to” and insert “complied with”

Clause 45

LORD CLEMENT-JONES BARONESS JANKE

Page 44, line 38, leave out “have regard to” and insert “comply with”

LORD STEVENSON OF BALMACARA

Page 44, line 40, at end insert—
“(3A) A specified person is required to ensure that he or she complies with the code of practice in respect of any action taken in connection with a debt listed in section 41(3).”

Page 44, line 40, at end insert—
“( ) Any person capable of being a specified person in regulations made under section 41(4) is required to follow the code of practice in respect of any action taken in connection with a debt listed in section 41(3).”

LORD CLEMENT-JONES BARONESS JANKE

Page 44, line 41, leave out subsection (4) and insert—
“(4) Before issuing or reissuing the code of practice, the relevant Minister must arrange for a draft to be laid before, and approved by a resolution of, both Houses of Parliament.”

Page 44, line 42, at end insert—
“(4A) The code of practice must be subjected to public consultation for a minimum of 12 weeks, and the relevant Minister must demonstrate that responses have been given conscientious consideration.”

Clause 49

LORD CLEMENT-JONES BARONESS JANKE

Page 49, line 32, leave out “had regard to” and insert “complied with”

Page 49, line 43, at the end insert—
“( ) In determining whether to make regulations under subsection (5), the appropriate national authority must ensure that—
(a) the sharing of information authorised by the regulations is minimised to what is strictly necessary on grounds falling within subsections (2) and (3),
(b) the conduct authorised by the regulations to achieve the “specified objective” is proportionate to what is sought to be achieved by that conduct,
(c) a Privacy Impact Assessment compliant with the relevant Code of Practice of the Information Commissioner’s Office has taken place and been made publicly available,
(d) the proposed measures have been subject to public consultation for a minimum of 12 weeks, and responses have been given conscientious consideration.”

Clause 53

LORD CLEMENT-JONES

Page 52, line 41, leave out “have regard to” and insert “comply with”

Page 53, line 1, leave out subsection (4) and insert—
“(4) Before issuing or reissuing the code of practice, the relevant Minister must arrange for a draft to be laid before, and approved by a resolution of, both Houses of Parliament.”

Page 53, line 2, at end insert—
“( ) The code of practice must be subjected to public consultation for a minimum of 12 weeks, and the relevant Minister must demonstrate that responses have been given conscientious consideration.”

Clause 57

LORD CLEMENT-JONES

Page 57, line 34, leave out “has regard to” and insert “comply with”

4.4 Unconstrained sharing of bulk civil registration data

Chapter 2 provides for the sharing of civil registration data – births, deaths and marriages – for any public body’s functions, without restrictions other than those expressly provided in other legislation. Ministers have presented this chapter as a way of improving electronic government transactions by avoiding the need for paper certificates to be circulated, which is indeed a good thing. However, we have been told unequivocally that the power is intended for bulk data sharing of the full civil register across government.

The case for a power to share civil registration data in bulk has not been made; it appears to be more about convenience for administrators than a clear social purpose. There are few safeguards on how the power is used, and the purposes for which data can be shared wholesale are broad.

We recommend that this power is removed from the Bill. Alternatively, the Bill should contain a consent-based power, where citizens can request the sharing of their individual electronic records in order to improve e-government.

Proposed amendments:

Clause 39

BARONESS BYFORD

Baroness Byford gives notice of her intention to oppose the Question that Clause 39 stand part of the Bill.

LORD CLEMENT-JONES

Page 38, line 23, leave out from “that” to end of line 26 and insert—
“(a) the authority or civil registration official to whom it is disclosed (the “recipient”) requires the information to enable the recipient to exercise one or more of the recipient’s functions, and
(b) the data subjects whose information is being disclosed have given valid consent under data protection legislation.”

Page 40, line 7, leave out “have regard to” and insert “comply with”

Page 40, line 18, after “before” insert “, and approved by a resolution of both Houses of,”

5. Internet filtering

The amendment proposed by Lord Ashton of Hyde to insert a new clause after Clause 84, enabling Internet Service Providers to implement filters in accordance with their terms and conditions, is a clear breach of EU net neutrality regulations.

This would put OFCOM in the difficult position of enforcing contradictory legislation. We cannot see how this amendment can be approved and recommend its removal.


[1] http://www.theinquirer.net/inquirer/news/2478759/xhamster-hack-380-000-accounts-exposed-in-porn-site-breach

[2] https://www.theguardian.com/technology/2016/nov/14/adult-friend-finder-and-penthouse-hacked-in-largest-personal-data-breach-on-record

[3] https://www.wired.com/2015/08/happened-hackers-posted-stolen-ashley-madison-data/

[4] http://www.bbc.co.uk/news/technology-34044506

[5] Evidence to the Public Bill Committee on the Digital Economy Bill submitted by Alec Muffett outlining security risks of proposed age verification systems. http://www.publications.parliament.uk/pa/cm201617/cmpublic/digitaleconomy/memo/DEB39.htm

[6] Letter by the UN Special Rapporteur: http://www.ohchr.org/Documents/Issues/Opinion/Legislation/UK_DigitalEconomyBill_OLGBR1.2017.pdf

[7] http://www.ohchr.org/EN/ProfessionalInterest/Pages/CCPR.aspx

[8] https://www.publications.parliament.uk/pa/ld201617/ldselect/lddelreg/89/89.pdf

[9] http://www.publications.parliament.uk/pa/ld201617/ldselect/ldconst/96/96.pdf

[10] https://hansard.parliament.uk/Commons/2016-10-13/debates/bd1cfe5f-c1b6-4431-ac85-2fdccbaed442/DigitalEconomyBill(ThirdSitting)

[11] https://en.wikipedia.org/wiki/Article_8_of_the_European_Convention_on_Human_Rights

[12] https://www.openrightsgroup.org/blog/2016/a-database-of-the-uks-porn-habits-what-could-possibly-go-wrong

[13] https://wiki.openrightsgroup.org/wiki/Age_Verification_amendment

[14] https://www.openrightsgroup.org/campaigns/digital-economy-bill-hub/stop-uk-censorship-of-legal-content

[15] Letter by the UN Special Rapporteur: http://www.ohchr.org/Documents/Issues/Opinion/Legislation/UK_DigitalEconomyBill_OLGBR1.2017.pdf

[16] Internet Service Providers’ Association: “We are concerned and disappointed it has gone down this path … this change in direction has been agreed without any consultation, with no assessment of costs nor is there any certainty that it will comply with judicial rulings on interference with fundamental rights.” http://www.ispreview.co.uk/index.php/2016/11/uk-isps-say-new-website-blocking-powers-lack-thought-consultation.html

[17] More detailed analysis of the web blocking proposals can be found here: https://www.openrightsgroup.org/ourwork/reports/digital-economy-bill-briefing-to-house-of-commons-at-report-stage

[18] http://www.publications.parliament.uk/pa/bills/lbill/2016-2017/0080/lbill_2016-20170080_en_4.htm#pt3-l1g23

[19] For instance, the account holder may not be the infringer. The infringer might be a family member or someone else using the Internet connection. Wifi connections are not always secured by password, allowing neighbours to use the connection. Other errors include incorrect logging of Internet address connections to customers, as these vary over time.

[20] ISP customers were accused of illegally downloading a porn movie and the company demanded they pay £700.

[21] An 83-year-old pensioner was accused by TCYK LLP of illegally downloading a movie. The company demanded that the pensioner pay £600 to settle the case.

[22] Matt Hancock stated that “A person who accidentally shares a single file without the appropriate licence, particularly when the copyright owner cannot demonstrate any loss or risk of loss, is not expected to be caught by this offence.” https://goo.gl/7OCDWJ

[23] https://wiki.openrightsgroup.org/wiki/Online_Copyright_Infringement_amendment#27_Offences:_infringing_copyright_and_making_available_right

[24] https://hansard.parliament.uk/commons/2016-10-25/debates/b550c185-3b53-4f46-bade-16a4e426befb/DigitalEconomyBill(SeventhSitting)

[25] http://www.publications.parliament.uk/pa/ld201617/ldselect/lddelreg/95/95.pdf

[26] https://hansard.parliament.uk/Commons/2016-10-13/debates/bd1cfe5f-c1b6-4431-ac85-2fdccbaed442/DigitalEconomyBill(ThirdSitting)