Press releases

NEWS RELEASE Complainants call on ICO to take action against adtech sector

The ICO has responded to a complaint brought by Jim Killock and Dr Michael Veale about Europe’s €12 billion real-time bidding adtech industry. Killock and Veale are now calling on the ICO to take action against companies that are processing data unlawfully.

The ICO has agreed in substance with the complainants’ points about the insecurity of adtech data sharing. In particular, the ICO states that:

“Processing of non-special category data is taking place unlawfully at the point of collection”

“[The ICO has] little confidence that the risks associated with RTB have been fully assessed and mitigated”

“Individuals have no guarantees about the security of their personal data within the ecosystem”

However the ICO is proceeding very cautiously and slowly, and not insisting on immediate changes, despite the massive scale of the data breach.

Jim Killock said:

“The ICO’s conclusions are strong and very welcome but we are worried about the slow pace of action and investigation. The ICO has confirmed massive illegality on the part of the adtech industry. They should be insisting on remedies, and fast.”

Dr Michael Veale said:

“The ICO has clearly indicated that the sector operates outside the law, and that there is no evidence the industry will correct itself voluntarily. As long as it continues to do so, it undermines the operation and the credibility of the GDPR in all other sectors. Action, not words, will make a difference—and the ICO needs to act now.”

Ravi Naik, solicitor for the complainants and for Dr Johnny Ryan’s simultaneous complaint before the Irish DPC, said:

“Between the ICO’s report and the actions of the DPC, there can no longer be any question; AdTech cannot comply with the GDPR. We welcome the ICO’s findings and look forward to the Commissioner taking concrete steps to prevent further violations of individual rights. It is time for action.”

For more information and interviews, contact 07749 785 932.

Notes to Editors

The ICO Report is available here:

The ICO concludes:

Overall, in the ICO’s view the adtech industry appears immature in its understanding of data protection requirements. Whilst the automated delivery of ad impressions is here to stay, we have general, systemic concerns around the level of compliance of RTB:

  • Processing of non-special category data is taking place unlawfully at the point of collection due to the perception that legitimate interests can be used for placing and/or reading a cookie or other technology (rather than obtaining the consent PECR requires).
  • Any processing of special category data is taking place unlawfully as explicit consent is not being collected (and no other condition applies). In general, processing such data requires more protection as it brings an increased potential for harm to individuals.
  • Even if an argument could be made for reliance on legitimate interests, participants within the ecosystem are unable to demonstrate that they have properly carried out the legitimate interests tests and implemented appropriate safeguards.
  • There appears to be a lack of understanding of, and potentially compliance with, the DPIA requirements of data protection law more broadly (and specifically as regards the ICO’s Article 35(4) list). We therefore have little confidence that the risks associated with RTB have been fully assessed and mitigated.
  • Privacy information provided to individuals lacks clarity whilst also being overly complex. The TCF and Authorized Buyers frameworks are insufficient to ensure transparency and fair processing of the personal data in question and therefore also insufficient to provide for free and informed consent, with attendant implications for PECR compliance.
  • The profiles created about individuals are extremely detailed and are repeatedly shared among hundreds of organisations for any one bid request, all without the individuals’ knowledge.
  • Thousands of organisations are processing billions of bid requests in the UK each week with (at best) inconsistent application of adequate technical and organisational measures to secure the data in transit and at rest, and with little or no consideration as to the requirements of data protection law about international transfers of personal data.
  • There are similar inconsistencies about the application of data minimisation and retention controls.
  • Individuals have no guarantees about the security of their personal data within the ecosystem.

The FixAdTech campaign website includes the complaints and details of other complaints made across the EU.

The complaints are being made by Dr Gemma Galdon Clavell (Eticas Foundation) and Diego Fanjul (Finch), David Korteweg (Bits of Freedom), Dr Jef Ausloos (University of Amsterdam), Pierre Dewitte (University of Leuven), Jose Belo (Exigo Luxembourg), Katarzyna Szymielewicz, President of the Panoptykon Foundation, Jim Killock, Executive Director of the Open Rights Group, Dr Michael Veale of University College London, and Dr Johnny Ryan of Brave, the private web browser. The complainants in Ireland and in the UK have instructed Ravi Naik, Partner at ITN Solicitors.


NEWS RELEASE Age Verification delay is an opportunity to fix privacy in porn block

The Open Rights Group has responded to reports that government plans to force online porn companies to verify the age of users have been delayed again. According to reports by Sky journalists, DCMS had failed to notify the European Commission about the measures in the Digital Economy Act.

Executive Director Jim Killock said:

“While it’s very embarrassing to delay age verification for the third time, this is an opportunity for the Government to address the many problems that this ill-thought-through policy poses.

“Age verification providers have warned that they are not ready; the BBFC’s standard to protect data has been shown to be ineffective.

“The Government needs to use this delay to introduce legislation that will ensure the privacy and security of online users is protected.”

Last week, Open Rights Group published a report into the BBFC’s Age-verification Certificate Standard, which outlines measures for AV providers to demonstrate that they will keep users’ data safe. ORG’s report shows that the Scheme provides little assurance to the 20 million adults that are estimated to watch porn in the UK. You can read the key criticisms of the report here:

Since the report, one age verification provider, the 18+ App, has declined to use the scheme so that it can monetise its product through what it calls “digital wallets”.

For more information, contact Pam Cowburn, 07749 785 932.


ORG report: BBFC age verification standard is pointless, misleading and potentially dangerous

  • From July 15, people in the UK are expected to prove they are 18 if they want to watch porn online.
  • Open Rights Group report warns that voluntary BBFC Age-verification Certificate Standard gives consumers little privacy protection as it is vague, imprecise and largely a ‘tick box’ exercise.
  • ORG believes consumers do not know enough about age verification scheme to make informed and safe choices.
  • ORG's report is available here:

With just one month until age verification for online pornography is launched in the UK, the Open Rights Group has warned that the Government is failing to protect the privacy and security of adults who watch pornography online.

Open Rights Group has analysed the BBFC’s Age-verification Certificate Standard, which outlines measures for AV providers to demonstrate that they will keep users’ data safe. ORG’s report shows that the Scheme provides little assurance to the 20 million adults that are estimated to watch porn in the UK.

Executive Director Jim Killock said:

“On July 15, millions of Internet users in the UK will have to make a decision about which age verification providers they trust with data about their personal pornography habits and preferences.

“Due to the sensitive nature of age verification data, there needs to be a higher standard of protection than the baseline which is offered by data protection legislation.

“The BBFC’s standard is supposed to deliver this. However, it is a voluntary standard, which offers little information about the level of data protection being offered and provides no means of redress if companies fail to live up to it. Its requirements are vague and a ‘tick box’ exercise. This renders it pointless, misleading and potentially dangerous as advice to consumers seeking safe products.”

ORG’s key criticisms of the BBFC standard:

  • The Standard is voluntary, which means that age verification providers are under no obligation to apply it. 
  • There are no penalties for AV providers who sign up to the standard and then fail to meet its requirements. 
  • The Standard is very broadly drafted and there are not enough specific rules for providers to follow. Instead, providers must state they have considered problems and choose their own way to deal with them.
  • Those providers that meet the Standard will have an identifier mark. However, because of the vague criteria and wording within the standard, consumers will have little idea about the level of data protection being applied. 
  • Age verification providers have not been given enough time to apply the Standard, which was only published in April.

Privacy timebomb
Porn companies will have to apply age verification to UK users from July 15, 2019. As far as Open Rights Group is aware, there has been no government advertising to make the millions of UK porn users aware that the law has changed and there appears to be very little public awareness of the scheme.

A YouGov poll from March 2019 showed that 74% of the British public are unaware that age verification is being introduced.

Killock added:
“Age verification will affect millions of people in the UK, yet the Government has done little to advertise this change, nor offered advice to consumers about what they need to do to keep their sensitive data safe.

“A DCMS impact assessment outlined that this scheme could put UK citizens at risk of fraud and blackmail, which could have a devastating impact on individuals. We urge the Government to delay age verification until there are proper mechanisms in place to protect privacy."

Protecting under 18s
The requirement to verify the age of porn users aims to prevent under 18s from accessing pornographic content. However, it only applies to companies that provide pornographic content on a commercial basis. This means that young people will still be able to access pornography on free sites, through file sharing or on social media platforms, such as Twitter. A DCMS impact assessment of the scheme stated that it created, “a risk that both adults and children may be pushed towards ToR where they could be exposed to illegal activities and more extreme material."

For more information, please contact 07749 785 932.

Notes to Editors
The BBFC’s Age-verification Certificate Standard was published in April 2019:
ORG's analysis of the standard is here:


Amazon enforcement action forces ink cartridge sellers to close

Patent-trolling techniques deployed by printer manufacturing giant Epson have escalated in severity this week with small ink cartridge resellers being informed that due to excessive numbers of takedown notices, their Amazon marketplace accounts have been indefinitely suspended. Resultant loss of sales is expected to lead to business closures.

Epson has long used online platform content removal procedures as a form of privatised patent enforcement. Strict application of notice-and-takedown policies allows Epson to hide behind Amazon and rely on the mere existence of its patents to silence competition. Affected resellers have no opportunity to challenge the removal of their marketplace listings or assert their right to post content. This damages online free speech and unfairly restricts independent small business activity.

Other manufacturers such as Canon have behaved similarly in this market. Affected small businesses have in some cases been forced to lay off employees or close entirely.

Open Rights Group (ORG) calls on Amazon to reinstate and protect these seller accounts. Amy Shepherd, Legal and Policy Officer, said:

“Clumsy takedowns at Amazon are damaging British businesses. Amazon’s decision to suspend seller accounts is disappointing, as it takes Epson’s word that British cartridge resellers are at fault. Epson is using patents to bully small businesses, but if it really believes the patents stand up it should instead take the import companies to court.”

Adrian Meakin, Director of The Ink Squid Ltd, said:

“After 10 years in business we, along with many other long-established sellers, are being forced out of the compatible ink cartridge market by Epson’s actions. Consumers want the option to use third-party cartridges and this freedom of choice is now being taken away.”


Note to editors:

Further information from Open Rights Group on this issue is available at


Regulating Online Political Advertising Needs Data Use Transparency

Electoral Commission Director of Regulation Louise Edwards called for new laws requiring online adverts to show clearly who has paid for them.

Pascal Crowe, Data and Democracy Project Officer at Open Rights Group responded saying:

“Effectively regulating online political advertising needs to go beyond campaign spending and require greater transparency over parties’ use of personal data. The Information Commissioner’s Office must be involved.

“Transparency is critical. Political actors using online advertising need to be forced to report with more specificity on their sources of personal data and how their targeting works.”


Open Rights Group media enquiries: 0207 0961079





“Dangerous and irresponsible” age verification goes ahead

Government announces 15 July 2019 launch date for dangerous and irresponsible age verification scheme, without a compulsory privacy scheme.

Reacting to the government’s announcement on age verification for adult content, Jim Killock, Executive Director of Open Rights Group, said:

“The government needs to compel companies to enforce privacy standards. The idea that they are ‘optional’ is dangerous and irresponsible.

“Having some age verification that is good and other systems that are bad is unfair and a scammer’s paradise – of the government’s own making.

“Data leaks could be disastrous. And they will be the government’s own fault.

“The government needs to shape up and legislate for privacy before their own policy results in people being outed, careers destroyed or suicides being provoked.”


ICO Age Appropriate Design Code of Conduct: tread lightly

The ICO Age Appropriate Design Code of Conduct is now available for consultation

Reacting to the ICO Code of Conduct, Matthew Rice of Open Rights Group said:

"It is welcome to see the ICO lay out strong “high privacy” defaults, including switching off geolocation and services that rely on profiling, and restricting nudge techniques, among others. This gives children the opportunity to access and use online services without being tracked and having their personal data monetised as soon as they land on the site.

"However, the ICO must tread lightly when it comes to requesting verification of a child’s age. There is a risk that an interpretation of the code will increase the spread of age verification technologies which, if implemented badly, could increase data collection about children or inadvertently restrict access to services for children who don’t have identity documents or sufficient parental support. The ICO must place strong restrictions on the use of the data collected, and give children the opportunity to enjoy the freedom to access information and the communication potential that the best of these services provide.”

More information

Contact +442070961079


The ICO Code is available at: 

The consultation ends on 31 May 2019


DCMS publishes White Paper on Internet Safety and Online Harm

The Department for Digital, Culture, Media and Sport (DCMS) today published its long-awaited White Paper on Online Harms.

The White Paper focuses heavily on the duties of social media platforms to police user-generated content. It proposes imposing a duty of care on platforms to protect users, particularly children and young people, from harm, with compliance overseen by a regulator.

DCMS extends the scope of the duty of care to include "tools or services which allow, enable or facilitate users to share or discover UGC or interact with each other online". This is a broad-reaching definition and includes search tools, app stores and messaging services.

Jim Killock, Executive Director of Open Rights Group, said:

“The government’s proposals would create state regulation of the speech of millions of British citizens.

“We have to expect that the duty of care will end up widely drawn, with serious implications for legal content that is deemed potentially risky, whether it really is or not.

“The government refused to create a state regulator for the press because it didn’t want to be seen to be controlling free expression. We are sceptical that state regulation is the right approach.

“The government is using Internet regulation as a blunt tool to try and fix complex societal problems. Its proposals lack an assessment of the risk to free expression and omit any explanation as to how it would be protected.”

The White Paper forms part of DCMS’s Digital Charter policy. A twelve-week consultation period will now commence.



See ORG’s fuller view of the Duty of Care at

The DCMS White Paper is available at:
It follows the department’s Internet Safety Strategy Green Paper, published in October 2017.


European Parliament approves controversial Copyright Directive

Tuesday 26 March 2019 - Today the European Union’s Copyright Directive was approved in a decisive vote by the European Parliament in Strasbourg.

Reacting to the vote in the European Parliament, Jim Killock, executive director of Open Rights Group said:

“This is a very bad day for free expression, for copyright and democracy. Millions of people have voiced their opposition and thousands have protested against it. If copyright filters go ahead, large numbers of mistaken takedowns will impact what we do and say online. The EU Parliament has made a serious error of judgement here. We will go on doing everything we can to stop this being the disaster it promises to be.”



About Open Rights Group:

Open Rights Group is a non-profit company limited by guarantee, registered in England and Wales no. 05581537. We challenge mass government surveillance and protect free expression and the right to privacy online. We campaign, lobby, talk to the media, go to court — whatever it takes to build and support a movement for freedom in the digital age.
Contact Details:

Jim Killock
Executive Director
Mobile / Signal is +44 7894498127


BBFC fails to explain its porn privacy scheme at meeting 300m from its office

Following a roundtable meeting with Age Verification providers, the Open Rights Group (ORG) has demanded that the British Board of Film Classification (BBFC) meet with ORG to discuss their privacy scheme, which BBFC hope will protect some visitors to pornographic websites.

Pornographic websites will soon have to verify the age of their website visitors, under the Digital Economy Act 2017. Late last year, BBFC added a voluntary privacy scheme to ensure that at least some of the age verification systems would be privacy protective. The Act, however, contains no powers which could make the scheme compulsory.

BBFC have to date failed to explain whether their Age Verification privacy scheme will be subject to public consultation, or which company is being used to run the scheme.

BBFC did not attend a roundtable with Age Verification providers and privacy experts organised by the Open Rights Group this morning, three hundred metres from BBFC’s offices.

Jim Killock, Executive Director of Open Rights Group said:

“In our meeting today, it was clear that private consultation had begun and a company had been appointed to run the privacy certification scheme.

“But good privacy schemes are written in public, with consultation and as much input as possible. Writing a scheme like this in the dark is very risky, as very important risks might be missed.

“It is also clear that the privacy scheme needs to be compulsory. All Age Verification products need to be in the scheme, not just a few, if users’ data is going to be protected.

“Data leaks of sexual habits could lead to relationships being impacted, individuals being outed and even suicides. The BBFC's privacy scheme matters.

“We were disappointed that the BBFC did not attend our roundtable, which was held three hundred metres from their office.”

More information

Jim Killock, Executive Director Open Rights Group

Mobile / Signal is +44 7894498127
