Digital Economy Bill Briefing to House of Lords Report Stage

This briefing is also available in PDF format.

Open Rights Group (ORG) is the UK’s only digital campaigning organisation working to protect the rights to privacy and free speech online. With over 3,000 active supporters, we are a grassroots organisation with local groups across the UK.

Digital technology has transformed the way we live and opened up limitless new ways to communicate, connect, share and learn. But for all the benefits, technological developments have created new threats to our human rights. We raise awareness of these threats and challenge them through public campaigns, legal actions, policy interventions and tech projects.

ORG has concerns about the following areas of the Digital Economy Bill (DE Bill):

  • Age verification (Part 3)

  • Online copyright infringement (Part 4)

  • Data sharing (Part 5)

1. Age verification

We do not believe that the amendments proposed[1] for the Report stage on Age Verification (AV) are the appropriate policy response. The Government tabled amendments to introduce three pieces of secondary legislation at a later date. This shows that the age-verification policy is still inadequate and has not been properly thought through.

We recommend Part 3 is completely removed from the statute as it is not capable of achieving the policy objective of preventing children and young people from accessing pornography.

1.1 AV guidance on AV providers

The Bill needs to be clear that it can regulate the age verification providers, and not just websites.

The Government have introduced several amendments following recommendations made in the report of the Delegated Powers and Regulatory Reform Committee[2]. The amendments supposedly respond to criticism of the lack of detail in the guidance to be published by the AV regulator.

The new proposed clause says that a statutory instrument will provide guidance for “types of arrangements for making pornographic material available that the regulator will treat as complying”[3].

The Government expects this statutory guidance to be able to regulate age checking tools and privacy requirements. However, age checking tools do not get a mention in the Bill and neither do providers of AV tools. The Bill and the proposed new clauses attempt to regulate websites that provide adult material to the public on a commercial basis, but not the providers of AV tools who will have considerable power in the proposed system.

The criticisms made by the Committee still stand. There is a lack of detail in the guidance that will be published by the AV regulator, and it is not clear that age-verification providers can be regulated by the AV regulator if they are not mentioned in the Bill.

Unregulated AV tool providers will not have incentives to incorporate appropriate privacy standards for age verification and may wish to check customers’ age in all kinds of novel and potentially intrusive ways.

The Bill and the proposed amendments do not include any requirements for privacy standards for porn websites, despite the promise the Government made[4] during the Committee stage debate. The complete absence of privacy requirements in the Bill and the secondary legislation will expose people to hacks similar to those suffered by Ashley Madison, FriendFinder and xHamster, which affected millions of people[5].

1.2 Consent and free choice

The statutory guidance must ensure that users can choose their own tools.

Previously, ministers claimed that data protection laws are sufficient to protect people’s privacy. Since the Data Protection Act 1998 relies on explicit consent that is freely given for the sharing of data, these laws are unlikely to be sufficient. People will have no real choice in giving their consent, because everyone must use AV tools in order to access adult content. Freely given consent is absent here.

The guidance will only regulate adult websites. This means the websites will be the ones choosing the age checking tools, and they will choose tools that benefit themselves – for example, because they are the cheapest, the best for tracking people, or simply the tool everyone else has signed up to.

Such an arrangement reinforces an environment where an age-verification tool could establish a monopoly. Whether a tool is unsafe or lacks privacy protections will not be the decisive factor when it is chosen by a website; rather, cost or the number of established users will decide.

It is necessary that the statutory guidance sets standards for AV tool providers to minimise the negative impact on people if these tools are hacked or their personal information is reused. High privacy standards must be set in the guidance for AV providers because of the possibility of an AV tool monopoly on the market and the lack of freely given consent.

The Government has stated that data protection law is sufficient to guarantee user privacy. However, the data protection approach assumes that people choose to share their data freely: users share data according to the needs of the person and the service, and the service protects the data as it is used. With age verification, users are obliged to use the tools despite high risks, so they should be given the opportunity to ensure that the level of data shared is as minimal as possible. This requires additional safeguards.

1.3 Inadequate censorship safeguards

Appeals against enforcement actions by the AV regulator should be made to a court.

Despite the Committee’s recommendations, the Government decided not to make the appeals process take place within the UK court system. The Bill will allow the British Board of Film Classification (BBFC) to censor any pornographic website that does not provide an AV tool. This arrangement will create situations where people’s freedom of expression is affected.

According to the Government’s new proposals, an independent body (not the BBFC) will handle appeals. This independent body has not been specified, and it is not clear what the appeals process will look like. Unlike an unspecified independent body, courts have very clear obligations to protect freedom of expression, and case law could establish the boundaries. We believe this would be harder with a more closed system of appeals.

1.4 Unrestrained censorship 

The BBFC’s power to censor pornographic websites that do not apply age verification is said to be a ‘punishment’ for non-compliance. It has been reported that perhaps 100 popular websites would be targeted for non-compliance, meaning a very low level of website blocking.

However, the blocking orders can be applied to any pornographic website that is notified by the BBFC. There are tens of thousands of websites that could be legally blocked under the power. Without some means to constrain website blocking, the BBFC will be subject to political pressure to extend the lists of blocked websites from a handful to thousands. This is a hostage to fortune. 

1.5 Recommendations

We recommend that Parliament:

  • Inserts privacy safeguards on the face of the Bill

  • Includes age-verification tool providers in the regulatory guidance to be published by the age-verification regulator

  • Specifies in the guidance that people can choose their preferred age-verification tool

  • Makes the appeals process against enforcement actions by the age-verification regulator statutory

  • Places thresholds to limit the extent of website blocking by the BBFC


2. Online copyright infringement

Thresholds of seriousness should be set for online copyright infringement.

The Government has presented its plans to raise the maximum penalty for online copyright infringement to 10 years as simply a matter of parity with the offline world.

We previously raised our concerns that the definition of the infringement is too broad and will catch large numbers of Internet users.

In the Bill, ten year sentences are available where online publication of a copyright work means that a “loss” has occurred (including not paying licence fees) or a “risk of loss” is created. The Intellectual Property Office’s response[6] to our concerns stated that “it would not be practical for the government to set a specific level of loss or gain at which infringement becomes a criminal offence.”

Their response does not adequately explain why they cannot or should not introduce a threshold for criminality. The question remains whether these usually minor offences warrant criminal sanctions. They could include:

  • Using copyright pictures from other websites, such as images of politicians or famous buildings, on a personal blog or social media

  • Using images of musicians or actors found on news websites, for instance from award events, on a blog or social media

  • Sharing files (which includes uploading as well as downloading) via Bittorrent at small scale 

Minor infringements appear to be criminal under the proposed offence. It is our understanding that they are unlikely to be prosecuted, let alone sentenced. However, it is unclear why these minor acts should be criminalised rather than being subject to civil action.

The proposed offence creates new opportunities for copyright trolls; there is a simple way to remove this risk, which is to introduce thresholds. 

Copyright troll companies specialise in threats concerning the file sharing of (usually niche) pornographic works in order to frighten and embarrass people into payment, often targeting them incorrectly. The current wording of the offence aids these companies by empowering them to threaten any online infringer with much stronger criminal sanctions – ten years in prison. Even as a general comment within a letter, this could powerfully persuade innocent people that they should pay the sums demanded.

Our proposal is to set a threshold of “commercial scale loss”, and to revise “risk of loss” to “serious risk of commercial scale loss”. These thresholds are flexible rather than “specific”, so the Government’s objection that a threshold would not be practical does not hold up.

Thresholds of seriousness would give the public, lawyers and courts a clear indication that minor acts of file sharing or unlicensed online publication would be unlikely to meet the “serious risk” or “commercial scale” thresholds.

As it currently stands, the clause on offences for online copyright infringement appears to be incompatible with both Council of Europe and EU law. The clause fails to meet the tests of foreseeability and proportionality. 

“In particular, the legal basis for a conviction has to be sufficiently clear and its scope must be foreseeable.”[7]

The wording of the offence is not sufficiently clear and precise for people to understand the conditions under which causing loss, or a risk of loss, to someone becomes criminal. As a result of the changes in the Digital Economy Bill, many low impact copyright infringements under the Copyright, Designs and Patents Act[8] could be subject to either criminal or civil action. Nevertheless, the Government seeks to assure people that low level infringements would not be subject to criminal charges. To meet the ‘foreseeability’ test, some kind of statutory threshold needs to be placed in the Bill.

For a judgment on the offence to be proportionate, the gravity of the offence has to be taken into account. There needs to be a clear distinction between commercial scale infringement and small scale infringement[9]. This is not possible without thresholds of seriousness.

2.1 Recommendations

We recommend that Parliament considers introducing thresholds of seriousness for online copyright infringement based on “commercial scale loss” and “serious risk of commercial scale loss”. This is a small and sensible change.


3. Data sharing

We welcome the Government’s amendments:

  • limiting powers of Ministers by narrowing down objectives for data sharing,

  • creating a closer link with functions of public authorities,

  • limiting definitions of “specified persons” able to share data for public service delivery and fraud and debt purposes,

  • making codes of practice statutory,

  • removing powers to repeal and amend the Bill without prior scrutiny. 

All of these changes are welcome and needed. However, there are several outstanding issues in Part 5 of the DE Bill.

3.1 Unconstrained sharing of bulk civil registration data

Chapter 2 on civil registration is excessively intrusive and should be removed from the Bill.

Ministers have presented this chapter as a way of improving electronic government transactions by avoiding the need for paper certificates to be circulated, but it appears to be more about convenience for administrators instead of a clear social purpose. 

The Bill does not include a clause requiring consent from the data subject to share their civil registration data (births, deaths and marriages). Chapter 2 provides for the sharing of civil registration data for any public body’s functions, without restrictions. The power is intended for bulk sharing of the full civil register across government, but it has not been sufficiently justified by the Government.


We recommend that this power should be removed from the Bill. Alternatively the Bill should contain a consent based power, where citizens can request the sharing of electronic individual records in order to improve e-government.

3.2 Review for all powers under Part 5

The Government tabled an amendment limiting the powers of Ministers to repeal or amend the fraud and debt powers after their review.

All the powers in Part 5 would benefit from reviews. Applying reviews to all powers, and limiting the powers to amend and repeal, will increase the transparency of data sharing and prevent unjustified onward disclosure of data to other public authorities.


The Bill should include limited powers to amend or repeal all sections of Part 5 after a review. 

3.3 Exclusion of punitive objectives

We consider it essential that the powers in Part 5 regarding disclosure of data for debt (Chapter 3) and civil registration (Chapter 2) purposes, and the sharing of data with electricity and gas suppliers (Chapter 1), are only used for the benefit of an individual or a household.

This requirement is already in place for public service delivery (Chapter 1) and it prevents data sharing from supporting any punitive objectives of public authorities. For example, some public authorities carry out enforcement as one of their functions. In such cases, some of their objectives for data sharing could be of a punitive nature. 


We recommend the Bill clearly states that the relevant powers in Part 5 are only to benefit individuals, not to punish them.



[3] New Clause by Lord Ashton of Hyde – After Clause 24 “Guidance to be published by age-verification regulator” 1(b)

[4] Lord Ashton of Hyde stated: “However, we agree that we must ensure that it is built into the age verification process in a meaningful way[…].”

[5] xHamster – 380,000 accounts affected; FriendFinder – 412 million accounts affected




[9] As per Case C-275/06 Promusicae v Telefonica, commercial scale would mean “particularly serious cases such as […] offences committed with a view to making a profit” [AG 119]. Whilst this would include infringing websites and initial uploaders injecting content into P2P (file-sharing) networks to direct file sharers to these websites for financial gain (through advertising, donations, VIP access for faster downloads etc.), it would exclude actions performed by file sharers for personal and not-for-profit purposes (EDPS 2012, 9–10).