Blog


September 20, 2018 | Javier Ruiz

Machine learning and the right to explanation in GDPR

One of the rights in GDPR is the right to explanation. Here we take a look at some of the debates about the right and how it can be implemented.

This blogpost is a small section of a much larger research report Debates, awareness, and projects about GDPR and data protection. The report complements the launch of the Digital Rights Finder tool delivered by Projects by IF and Open Rights Group. We highlight some of the most interesting and important debates around GDPR (General Data Protection Regulation).


There is some concern about the practical feasibility of implementing the right to explanation in GDPR in the context of complex data processing such as big data, artificial intelligence and machine learning. (See this section of the report for more on debates about the existence of the right to explanation.)

Lilian Edwards and Michael Veale argue that a right to an explanation is not the remedy to harms caused to people by algorithmic decisions. They also argue that the narrowly-defined right to explanation in GDPR of “meaningful information about the logic of processing” is not compatible with how modern machine learning technologies are being developed.

The problems to tackle here are discrimination and fairness. Machine learning systems are designed to discriminate, in the sense of drawing distinctions between cases, but some forms of discrimination are socially unacceptable and the systems need to be restrained. The general obligation of fairness in data protection provides the basis for requiring some level of insight into the functioning of algorithms, particularly in profiling.

One of Edwards and Veale’s proposals is to partially decouple transparency from accountability and redress, rather than treating it as a necessary first step. They argue that people trying to tackle data protection issues want an action, not an explanation. The actual value of an explanation is not to relieve or redress the emotional or economic damage suffered, but to help people understand why something happened and to ensure a mistake doesn’t happen again.

Within this more limited sense, problems remain in defining transparency in the context of algorithmic accountability. For example, providing the source code of algorithms may not be sufficient and may create other problems in terms of privacy disclosures and the gaming of technical systems. They argue that an auditing approach, which looks at the external inputs and outputs of a decision process rather than at its inner workings, could be more successful: “explaining black boxes without opening them”.
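As a rough, purely illustrative sketch of what such an input/output audit could look like, the Python below probes a hypothetical scoring function (standing in for the real, opaque system) with a synthetic population and compares approval rates across groups. Nothing here comes from Edwards and Veale’s paper; it only shows the pattern of auditing a black box from the outside.

    import random

    def score_applicant(applicant):
        # Hypothetical stand-in for the opaque decision system under audit.
        return 1 if applicant["income"] > 30000 and applicant["postcode"] != "X1" else 0

    def audit_outcome_rates(applicants, protected_attr):
        # Compare approval rates across groups without inspecting the model.
        rates = {}
        for group in set(a[protected_attr] for a in applicants):
            members = [a for a in applicants if a[protected_attr] == group]
            rates[group] = sum(score_applicant(a) for a in members) / len(members)
        return rates

    # Synthetic probe population, invented purely for illustration.
    population = [
        {"income": random.randint(15000, 60000),
         "postcode": random.choice(["X1", "Y2"]),
         "group": random.choice(["A", "B"])}
        for _ in range(1000)
    ]

    print(audit_outcome_rates(population, "group"))

A large gap between the reported rates would be a prompt for further investigation, all without ever opening the model itself.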

The authors see the right to explanation as providing some grounds for explanations about specific decisions. They present two types of algorithmic explanation that could be provided: model-centric explanations (MCEs) and subject-centric explanations (SCEs), which seem broadly aligned with explanations about systems and explanations about decisions respectively.

SCEs are seen as the best way to provide some remedy, although with severe constraints if the data is just too complex. Their proposal is to break down the full model and focus on particular issues through pedagogical explanations of a particular query, “which could be real or could be fictitious or exploratory”. These explanations will necessarily involve trade-offs with accuracy in order to reduce complexity.
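One way to picture a pedagogical, subject-centric explanation in code: perturb a single (possibly fictitious) query and ask the black box how often its decision flips as each input is varied on its own, accepting a loss of fidelity in exchange for an account a person can actually read. The black_box function and all the numbers below are invented assumptions, not anything from the paper.

    import random

    def black_box(income, age):
        # Hypothetical opaque model; stands in for the system being explained.
        return 1 if (income * 0.7 + age * 120) > 30000 else 0

    def explain_decision(income, age, n_samples=500):
        # Report how often the decision flips when each input is varied on its own.
        baseline = black_box(income, age)
        flips = {"income": 0, "age": 0}
        for _ in range(n_samples):
            if black_box(income * random.uniform(0.5, 1.5), age) != baseline:
                flips["income"] += 1
            if black_box(income, age + random.randint(-15, 15)) != baseline:
                flips["age"] += 1
        return baseline, {k: round(v / n_samples, 2) for k, v in flips.items()}

    # The query can be real, fictitious or exploratory.
    decision, sensitivity = explain_decision(income=28000, age=40)
    print("decision:", decision, "sensitivity:", sensitivity)

For this particular query the sketch would report that only income ever changes the outcome, which is the kind of reduced, query-specific account a subject-centric explanation aims for.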

Their main concern seems to be to avoid creating a “transparency fallacy” where, similarly to the “consent fallacy”, people are given an illusion of control that does not exist, instead of being offered practical remedies to stop harmful data practices.

There is growing interest in explanation of technical decision making systems in the field of human-computer interaction design. Practitioners in this field criticise efforts to open the black box in terms of mathematically interpretable models as removed from cognitive science and the actual needs of people. Alternative approaches would be to allow users to explore the system’s behaviour freely through interactive explanations. This is quite similar to the proposals by Edwards and Veale.

A complementary approach has been put forward by Andrew Selbst and Solon Barocas, who argue that the increasing calls for explainability of automated decision making systems rely on an intuitive approach that will not work with machine learning. ML is both inscrutable and non-intuitive. Inscrutability is the black box problem, the inability to understand the inner cogs of a model; non-intuitiveness means being unable to grasp the rules the model follows, even if we were able to open the box. Accountability requires not only knowledge of the process, but also whether it is justified, or fair.

Selbst and Barocas argue that lawyers and scholars asking for explanations will be disappointed because intuition cannot deal with the truly novel insights produced through machine learning that associate data in patterns that completely escape human logic and imagination.

Their alternative proposal is to focus accountability on the processes around ML models, not the models themselves. Policies and documentation of intent and design choices should be made available, some by default, such as impact assessment, and others in the context of a complaint or regulatory action. This approach chimes with the main thrust of GDPR, which puts accountability at the fore.

In summary, the right to an explanation as defined in GDPR may be harder than expected to implement. This does not invalidate the basic premise that individuals have a right to know what is being done with their data, but – particularly with novel machine learning techniques – it means that we need to look beyond simple calls for transparency.

[Read more]


September 11, 2018 | Ed Johnson-Williams

Helping IoT developers to assess ethics, privacy, and social impact

GDPR has brought a renewed focus to assessing the privacy risks of using personal data. How can we also assess the ethical and social impacts of how organisations use data?

GDPR (General Data Protection Regulation) introduces a mandatory Data Protection Impact Assessment for processing that is likely to result in a high risk to individuals. This is to help organisations to identify and minimise the data protection risks of a project. But there are other consequences to collecting and using personal data beyond privacy and data protection considerations. We should also be thinking about the ethical and societal outcomes of what we do with data. Open Rights Group (ORG) is exploring these issues as part of the VIRT-EU consortium alongside the London School of Economics, Uppsala University, Polytechnic University of Turin, and Copenhagen Institute for Interaction Design.

The project is researching Internet of Things (IoT) development and development culture. It is also creating tools and frameworks to help foster ethical thinking among IoT developers. One of these tools will be the Privacy, Ethical and Social Impact Assessment (PESIA), which augments and interacts with the Data Protection Impact Assessment from GDPR. The PESIA is being developed predominantly by Alessandro Mantelero at the Polytechnic University of Turin with the help of ORG. It will be a voluntary, self-assessment tool to help organisations who collect and process personal data to assess the wide variety of risks and repercussions related to how they use data.

While Privacy Impact Assessments and Data Protection Impact Assessments look primarily at issues around privacy, the PESIA extends beyond that by including ethical dimensions and societal consequences of using data. Privacy- and data protection-focused assessments seldom address issues like the possibility of discrimination affecting individuals and groups due to decisions made using big data. This is despite Recital 75 of GDPR, which highlights “social disadvantage” and “discrimination” as some of the potential consequences of data processing. These considerations will be integrated within the PESIA.

The PESIA emphasises a public, participatory approach where a product or service engages its users and people it affects in the process of identifying the issues around relevant privacy, ethical, and social impacts. This contrasts with Privacy Impact Assessments and Data Protection Impact Assessments which, for the most part, are carried out internally and are not necessarily easily accessible for users and customers.

We hope that the PESIA will help developers to integrate ethical and social values into their work and the devices and services they create. Previous attempts to create a Social Impact Assessment have highlighted the significant investment of time and other resources that can be required for organisations to assess the impact of their products and services. For this reason, the PESIA aims to be easy to implement by people within the organisation itself – at least for the early stages of the analysis. Some involvement by experts may sometimes be necessary to assess social consequences of data processing.

The PESIA will not be a box-ticking exercise where organisations assess privacy-, ethical-, and social impact-related issues without fully engaging in the consequences of their work. Instead, our goal in the development of PESIA is that this tool will help IoT developers to develop a clearer understanding of the ways in which their use of data has an impact on society and what the ethical implications of their work are. Being a voluntary self-assessment, PESIA leaves the developers to decide how to address the issues that are raised in the process of thinking through these implications.

Because ethical and social values can vary significantly between countries and cultures, one of the most challenging aspects of developing the PESIA is defining the ethical and social values that the assessment uses. This is quite a significant shift from some aspects of conventional data protection assessments where technological considerations such as data security can be much more easily generalised across countries and cultures.

The VIRT-EU project is looking to address this challenge by separating out the PESIA into layers. The foundational layer presents the common values of international human rights charters and other law including the EU Charter of Fundamental Rights. It also brings in the case law of decisions regarding data processing made by data protection authorities, the European Data Protection Board (previously the Article 29 Data Protection Working Party), the European Court of Justice and the European Court of Human Rights. This layer should directly allow us to secure common ground across different contexts. The PESIA assessment will provide short cases and examples to help developers (who usually do not have a legal background) to understand these questions.

The second layer deals with the diversity of ethical and social values across Europe. The project is undertaking a comparative analysis of the decisions of different national data protection authorities on similar cases to identify values that may have informed and led to different outcomes in those cases. Specific values will not necessarily be obvious in those legal decisions and we do not aim to be able to describe or define national values.

The third and most specific layer involves what sorts of values are enacted by IoT developer communities. Our colleagues at the London School of Economics and at the IT University of Copenhagen are trying to learn from IoT developers through active participation in those communities. In doing so, they hope to gain an understanding of their work practices and the ethical and social values they enact in the course of technology development.

We will be developing the PESIA assessment over the next year and then trialling it with IoT developers. We hope these trial sessions will provide feedback about the content and process of the assessment. If you are a UK-based IoT developer and are interested in participating in these trial sessions, please contact ORG researchers at ed@openrightsgroup.org and we will contact you closer to the time.

[Read more]


September 04, 2018 | Mike Morel

We need copyright reform that does not threaten free expression

The controversial Copyright Directive is fast approaching another pivotal vote on 12 September. For the third time in half a year MEPs will decide whether Article 13 - or something even worse - will usher in a new era, where all content is approved or rejected by automated gatekeepers.

Seen through an economic lens, the Directive’s journey is a battle between rights holders and tech giants. Yet a growing chorus of ordinary citizens, Internet luminaries, human rights organisations and creatives have rightly expanded the debate to encompass the missing human dimension.

Open Rights Group opposes Article 13 - or any new amendments proposing similar ideas - because it poses a real threat to the fundamental right to free speech online.

Free speech defenders claimed a victory over industry lobbyists this summer when MEPs rejected plans to fast-track the Directive and a lasting triumph is now in reach. UK residents are in an especially strong position to make a difference because many of their MEPs remain undecided. Unlike some other EU states, voting patterns aren’t falling strictly on party lines in the UK.

This time new amendments will be added, and the underlying principles of Article 13 will again face a vote. They include:

Changes to Internet platform liability

If Internet platforms become directly liable for user content, they will become de facto copyright enforcers. This will leave them little choice but to introduce general monitoring of all user content with automated filters. Companies are not fit to police free speech. To avoid penalties they will err on the side of caution and over-block user content.

The implicit or explicit introduction of upload filters

Everything we know about automated filters shows they struggle to comprehend context. Yet identifying the vital legal exceptions to copyright that enable research, commentary, creative works, parody and more requires a firm grasp of context. An algorithm’s poor judgement will cause innocent speech to be routinely blocked along with copyright violations.

The introduction of general monitoring

General monitoring of all user content is a step backwards for a free and open Internet. It is also technically infeasible to monitor every form of content covered in Article 13's extraordinarily wide mandate which includes text, audio, video, images and software.

Outspoken Article 13 cheerleader Axel Voss MEP said “Censorship machines is not what we intend to implement and no one in the European Parliament wants that.” Yet automated upload filters are censorship machines in all but name. This is what happens when copyright reform is pursued with little consideration for human rights.

The proposals within Article 13 would change the way that the Internet works, from free and creative sharing, to one where anything can be removed without warning, by computers. This is far too high a price to pay for copyright enforcement. We need a copyright reform which does not sacrifice fundamental human and digital rights.

If you’re a UK resident, please contact your MEP before they vote Wednesday 12 September. You can use our tool to reach them:

https://action.openrightsgroup.org/robocopyright-returns

If you live outside the UK, try this website:

https://saveyourinternet.eu

[Read more]


August 21, 2018 | Javier Ruiz

The right of access in GDPR: What are the debates?

There are many debates about the right of access in GDPR, but most of them are yet to be discussed widely outside of academia. Here we discuss some of those debates, including around subject access requests and privacy protections.

This blogpost is a small section of a much larger research report Debates, awareness, and projects about GDPR and data protection. The report complements the launch of the Digital Rights Finder tool delivered by Projects by IF and Open Rights Group. Here we highlight some of the most interesting and important debates around GDPR (General Data Protection Regulation).


The right of access to personal information, codified in Article 15 of GDPR, is one of the key elements of the European data protection framework, enabling the exercise of other rights and providing for the fundamental “right to data”. This right is set to expand, as GDPR removes several practical barriers to its exercise, such as the payment of fees. This new situation presents some potential challenges, particularly for organisations implementing automated tools to support the exercise of the right.

Third parties

The use of third parties for subject access requests (SARs) is fairly common, for example solicitors acting on behalf of their clients. The removal of the associated fees in GDPR will almost certainly trigger a huge increase in the use of bulk third party SARs as part of a growing contestation of data practices.

Many of the tools we discuss in the report facilitate SARs in ways that do not require the third party to make the request, for example by creating templates or pre-populated emails that are sent by the data subject from their own email client. In these cases there is no real third party, although the facilitators will bear some responsibility if the texts they provide are inaccurate or somehow cause the request to fail.
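As a sketch of that pattern, the facilitator might do nothing more than generate a pre-filled mailto: link which the data subject opens in their own email client, so the request itself never passes through a third party. The address and wording below are illustrative assumptions, not a vetted legal template.

    from urllib.parse import quote

    # Illustrative SAR wording only; a real service would want vetted text.
    SAR_TEMPLATE = """Dear Data Protection Officer,

    Under Article 15 of the GDPR, please provide a copy of all personal data
    you hold about me, together with the information listed in Article 15(1).

    Yours faithfully,
    {name}
    """

    def build_sar_link(controller_email: str, subject_name: str) -> str:
        # Build a mailto: link the data subject can open in their own email client.
        subject = "Subject access request under Article 15 GDPR"
        body = SAR_TEMPLATE.format(name=subject_name)
        return f"mailto:{controller_email}?subject={quote(subject)}&body={quote(body)}"

    print(build_sar_link("dpo@example.com", "Jane Doe"))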

In other cases, the intermediary will communicate directly with the organisation holding the data. This is perfectly admissible, and the ICO has provided guidance on the matter that essentially says it is the responsibility of the third party to demonstrate they are entitled to act on behalf of the data subject. The ICO also says that if a data controller is concerned that the subject may not fully understand the implications of sharing the data with a third party, it can send the data to the subject directly.

Organisations carrying out SARs will need to ensure they document their entitlement and that the people on whose behalf they act are fully aware of the implications, including what these organisations may want to do with the data. In some cases these third parties will want to analyse the responses for research or other purposes, or the SARs may be part of some broader complaint or legal action. This will create a new set of data protection obligations for the third party.

SARs involving children require particular care, as in principle the child should be the recipient of the response if he/she is mature enough – which can be complicated to assess.

Coordination of subject access requests

There are many projects that attempt to coordinate subject access requests targeting a company or a sector. There are concerns among some privacy activists that this could be used by some data controllers to reject the requests as excessive or manifestly unfounded, or attempt to charge a fee.

In principle each request should be considered independently, and the organisation will have to demonstrate their grounds for rejection. Batch requests are fairly common and should not be considered excessive.

The debate centres on whether a company can try to reject a coordinated SAR campaign as unfounded if they can argue that the individuals are using the SARs as a punitive tool or for reasons unrelated to data protection, for example in order to reverse engineer a database.

Recital 63 GDPR states that the right of access is there “in order to be aware of, and verify, the lawfulness of the processing”, which could be understood in fairly narrow terms of data protection. However, Art 15 GDPR simply states that individuals have the right to obtain information without any consideration as to the purposes. Given that recitals are not legally binding, it seems that there are no strong grounds for such rejection, but national courts may take a different view.

Repeated requests by the same person are a different matter, and may be considered excessive more easily if not enough time has passed or it is unlikely that enough has changed to deserve another request.

Privacy protections that hinder access

One potential pitfall and controversial topic around the right of access is the extent to which privacy protecting practices may hinder the exercise of this and other data rights. Companies increasingly use pseudonymisation, data compartmentalisation and other technical measures that can make it harder to exploit the data if there were any security breaches. In some cases not even company employees can fully access the data and link it to a named individual.

These practices generally fall under the rubric of privacy – or data protection – by design, which is part of GDPR and something that normally is perceived in positive terms. The problems arise when the person trying to access the data is not a hostile third party, but the person whose data was processed in the first place.

Michael Veale, Reuben Binns and Jef Ausloos have argued that these privacy by design techniques focus exclusively on protecting confidentiality and the identification of individuals, but the data is still potentially re-identifiable by third parties with enough capabilities. At the same time the added difficulties in identifying specific individuals make it very difficult to exercise data subject rights, such as access, erasure and objection.

The authors document their own research with two case studies. In one case involving wifi data collected by TfL in the London Underground and used to track movements, subject access requests could not work because the data had been masked using cryptographic techniques. However, it has been demonstrated that location traces are so unique that re-identification is very easy.
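To see why a subject access request can fail against this kind of masking, here is a toy sketch (not TfL’s actual scheme, just an assumed keyed-hash design) in which device identifiers are replaced with a keyed hash and the key is then discarded. The controller can no longer answer “which rows relate to this MAC address?”, yet each pseudonym still carries a complete, potentially re-identifiable location trace.

    import hmac, hashlib, secrets

    key = secrets.token_bytes(32)

    def pseudonymise(mac: str) -> str:
        # Keyed hash: a stable pseudonym while the key exists, irreversible without it.
        return hmac.new(key, mac.encode(), hashlib.sha256).hexdigest()[:16]

    observations = [
        (pseudonymise("AA:BB:CC:DD:EE:01"), "Oxford Circus", "08:01"),
        (pseudonymise("AA:BB:CC:DD:EE:01"), "Victoria", "08:23"),
        (pseudonymise("AA:BB:CC:DD:EE:02"), "King's Cross", "08:05"),
    ]

    del key  # Key discarded: the mapping from MAC address to pseudonym is gone.

    # A subject access request quoting "AA:BB:CC:DD:EE:01" now has no direct answer,
    # yet the first pseudonym's rows still form a unique journey pattern.
    for pseudonym, station, time in observations:
        print(pseudonym, station, time)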

Michael Veale, Reuben Binns and Jef Ausloos also attempted to obtain recordings from Siri, the personal assistant provided in Apple products. Apple said it was unable to provide the recordings it holds because these cannot be linked to the individual: the recordings are stored under separate identifiers and there is no mechanism for retrieving them by user. The authors made various proposals for how information systems could be engineered to improve rights while preserving privacy, and how to manage the trade-offs involved.

A similar case has been documented by Paul-Olivier Dehaye, who asked Facebook for certain information held about him and whose request was rejected because the data in question was kept in backup storage and was not accessible.

The above are only some of the possible points of contention around the right of access to personal data. Once GDPR sinks in we will have a better understanding of whether these issues become widespread obstacles to the exercise of the right, or whether new issues appear.

[Read more]


August 03, 2018 | Matthew Rice

What is at stake with the immigration exemption legal challenge?

The immigration exemption in the Data Protection Act 2018 will remove key data protection rights for everyone in the United Kingdom. Some may think that because we’re talking immigration we’re talking about asylum seekers, refugees, or immigrants. But they’d be wrong. The law is so broad, and touches on so many parts of our deeply interconnected lives, that it is likely to affect everyone’s rights.

From May to July 2018, Open Rights Group and the3million, the grassroots organisation for EU citizens living in the United Kingdom, successfully crowdfunded for a legal challenge against the immigration exemption in the Data Protection Act 2018. That is thanks to everyone who donated, shared, tweeted and wrote articles in support of the challenge. As the groups move towards the next stage of the challenge, it is important to reiterate what we are challenging and why.

The immigration exemption in the Data Protection Act 2018, under Schedule 2, Part 1, para 4 would remove fundamental data subject rights if the data controller thinks the disclosure of that data would “prejudice” “the maintenance of effective immigration control”, or “the investigation or detection of activities that would undermine the maintenance of effective immigration control”.

In a simple sense, if someone contacts an organisation to get a copy of their personal data held about them, that organisation doesn’t have to provide it if, they argue, it would “prejudice” “effective immigration control”.

But this is about way more than access to data.

Not only does it remove important rights for individuals, it also removes important responsibilities that bodies owe everyone. And further, it removes restrictions from sharing of data between bodies; a shadowy, opaque, pernicious problem.

Some of the characterisation of the challenge is that it is solely about access to data, or subject access rights to give its Data Protection title. Although that is an important part of it, it is about more than knowing whether you have been treated fairly and lawfully. The challenge is about people being treated fairly and lawfully, period.

We are all granted the fundamental right to data protection under Article 8 of the Charter of Fundamental Rights of the European Union. This right is explicit about access to data, and also about the right to rectify that data. Under the General Data Protection Regulation (GDPR) we also have rights:

  • to restrict processing;

  • to object to the processing;

  • to erasure; and

  • to be provided information about what our data will be used for when it is collected from us.

Each of these rights, either explicitly, or practically would be exempted if a data controller decided disclosure would “prejudice” “effective immigration control”.

The challenge from ORG and the3million is for the removal of the immigration exemption in its entirety. The exemption is a blunt instrument, affecting the rights of potentially millions of people: EU citizens, non-EU citizens, even British nationals, because of the clear-as-mud term “immigration control” and because the power to exercise the exemption is handed to all data controllers.

The term “immigration control” isn’t defined. Does it mean any criminal act such as overstaying a visa? Maybe. Does it mean routine checking up of a person’s immigration status? Perhaps. Will the exemption be applied against the routine or the exceptional? Unclear, and political promises mean nothing when the letter of law is this vague.

This imprecision does not help when you consider who is going to have the power to apply the exemption. The exemption is not limited to UK Border Force, the Passport Office or Immigration Enforcement. It is any data controller. That could mean private contractors like G4S and Serco that are running immigration detention centres. But it could also mean the NHS, your local authority, your school, your landlord and, until recently, banks and building societies. These could all potentially rely on or operate some part of the immigration exemption.

On top of the lack of precision in “immigration control” and the non-exhaustive list of who could apply the exemption comes the lifting of restrictions and responsibilities around data sharing.

Previously, immigration data sharing initiatives involving the education sector and the NHS have been criticised for betraying the trust of individuals accessing these services. Under the immigration exemption, the responsibility to confirm data sharing is removed from controllers if confirmation would “prejudice” “effective immigration control”. No more acknowledgement of these data sharing initiatives between the NHS or schools and the immigration enforcement system.

Each of these data controllers owes obligations to us as data subjects when they process our data. They need to process our data fairly, lawfully, and transparently. They also need to make sure it is adequate and relevant. They must process only what is necessary for the purpose they collected the information for. They even have to ensure appropriate security of the personal data. All of this is a responsibility they have to us as controllers of our data. All of these responsibilities can be set aside if respecting them would “prejudice” “effective immigration control”.

Due to the imprecise nature of the term, the range of bodies who might apply it, the freewheeling data sharing and the lifting of pesky responsibilities to protect people’s data, the exemption could affect far more people than just those going through an immigration process. You could be a British national married to an EU or non-EU citizen, for instance. Your data, tied up in your marriage, could be shared with others without your knowledge, because to allow you to know would “prejudice” “effective immigration control”. Or maybe your child is in school, or maybe you’re just travelling and your data is being used to profile you against other travellers. The exemption doesn’t draw a narrow box; it is a big carve-out.

Everyone is entitled to the protection of their personal data. Everyone’s right is threatened by the broad immigration exemption.

Open Rights Group and the3million are challenging all of the exemption, for everyone.

[Read more]


July 05, 2018 | Mike Morel

MEPs hold off Article 13's Censorship Machine

In a major triumph for free speech over digital censorship, MEPs voted today to REJECT fast tracking the EU Copyright Directive, which contains the controversial Article 13.

The odds were steep, but thanks to everyone who contacted their representatives, MEPs got the message that Article 13 and its automated “upload filters” would be a catastrophe for free expression online.

In the final days, support for Article 13 collapsed, as pressure from real people like you convinced MEPs that they needed a rethink.

Today’s victory is a rude awakening to industry lobbyists who expected Article 13 to pass quietly under the radar. We can expect a fierce battle in the coming months as the Copyright Directive returns to the EU Parliament for further debate.

Today’s vote preserves our ability to speak, create, and express ourselves freely on the Internet platforms we use every day. Instead of rolling over and putting computers in charge of policing what we say and do, we’ve bought ourselves some time to foster public debate about the wisdom, or lack thereof, behind automated censorship.

Today’s battle is won but the filter wars against free speech continue. We need your help, because the proposal is not dead yet. We will be fighting it again in September. If you haven’t already, join thousands of ORG members who make our work possible. Become a member today!

[Read more]


June 26, 2018 | Ed Johnson-Williams

A new GDPR digital service: the crowdsourced ideas

ORG supporters sent in some great ideas for a new digital service about rights under GDPR. We take a look at some of the best ones.

A few months ago we put out a call for ideas for a new digital service that would help people use their rights under General Data Protection Regulation (GDPR).

People sent in some great ideas. They were really varied and we could have gone with many of them. Some were brilliant, but were outside of the scope of the project or too complex for us to do justice to. We didn't want to let good ideas go to waste so we thought we'd tell you about a few of them here. Hopefully someone will spot a project they want to build!

Currently, we're working with Projects by IF on a tool that aims to make it easier for consumers to understand and exercise their rights under GDPR, starting by making privacy policies easier to understand. We're focusing first on businesses in the fintech sector, as this is an area of innovative and complex data practices, but we plan to expand that over time. This work is funded by a grant from the Information Commissioner's Office (ICO).

Here are some of the ideas supporters sent in.

Young person's data cleaner

This idea was to help teenagers to understand and then 'clean' their digital identity using the right to erasure under GDPR. People would see visualisations of information that pulled in data from key online accounts. They would then be able to use different websites' facilities for requesting that data relating to them is deleted – possibly by using links that jump straight to the relevant functionality on a website.

A consumer compensation tool

This tool idea would make it easier for people to claim compensation after a data breach. It would use a form that asked for all the information required to make a claim and would then submit the claim. Ideally, an organisation would take note that lots of people were making a claim and could handle all the claims en masse in a consistent manner. Users could say that they only wanted a nominal amount of compensation or that they wanted to donate the compensation to a rights-based cause such as Open Rights Group or to the site itself.

GDPR benchmark

This idea from William Heath and others was to build a tool which helps people easily to make a Subject Access Request (SAR) to an organisation, and also let them give feedback about how good or bad their experience was. The website would present aggregates of the ratings of the quality of data that an organisation typically sends back to people.

GDPR Quiz

This idea also came from William Heath and others. It was to create a social media quiz that helps people better understand their rights and learn how to use them. It would link to good resources and would frame everything in positive language, reassuring people and encouraging them to use their rights in a constructive way.

Open Rights Group recently launched a quiz in response to this great suggestion, so take a look!

[Read more]


June 13, 2018 | Alex Haydock

Victory for Open Rights Group in Supreme Court web blocking challenge

The Supreme Court ruled today that trade mark holders must bear the cost of blocking websites which sell counterfeit versions of their goods, rather than passing those costs on to Internet service providers.

[Image: Jim, Alex and Myles at the Supreme Court]

This decision comes in the case of Cartier v BT & Others, in which the jeweller Cartier sought a court order requiring ISPs to block websites which sold goods infringing their trade marks. ORG has been intervening in the case, with help from solicitor David Allen Green.

Lower courts had already ruled in this case that trade mark holders such as Cartier have the right to request blocking injunctions against websites in the same way as copyright holders are already able to. The courts decided that general powers of injunction were sufficient for trade mark holders to request such blocks, even though copyright holders have a specific power granted by the Copyright, Designs and Patents Act 1988.

The question for the Supreme Court today was whether a trade mark holder who obtained a blocking injunction against infringing sites would be required to indemnify ISPs against the costs of complying with those injunctions.

If trade mark holders were able to demand blocks which ISPs were required to pay for, this could open the door to large-scale blocking of many kinds of websites. ORG was concerned that the costs would be passed on by ISPs to their customers, and that increasingly trivial blocks could be requested with little justification, even where blocking would not be economically proportionate.

Today, the Supreme Court ruled unanimously that the rights-holders would be required to reimburse the ISPs for reasonable costs of complying with blocking orders.

The Court based their judgment on a few factors.

  • Firstly, that there is a general principle in law that an innocent party is entitled to reimbursement of their reasonable costs when complying with orders of this kind.

  • Secondly, the Court rejected the suggestion that it was fair for ISPs to contribute to the costs of enforcement because they benefit financially from the content which is available on the internet, including content which infringes intellectual property rights.

Although ISPs have been complying with orders to block copyright-infringing sites for a number of years, the issue of who bears the cost of implementing such blocking has not been challenged until today.

We expect that, in future, ISPs may wish to use this ruling to argue that they should be reimbursed for the cost of complying with copyright injunctions as well as trade mark injunctions.

This is a very welcome judgment for ORG. ISPs are now administering lists of around 2,500 domain blocking injunctions. ORG’s Blocked! tool is currently tracking over 1,000 sites which have been blocked using these orders. If rights-holders are now required to bear costs, then we should see better administration of the blocks by the ISPs themselves.

[Read more]