Press releases

Data Protection Bill to be welcomed but still needs work

ORG responds to the Government's statement of intent on a Data Protection Bill

Javier Ruiz – policy director at digital rights campaign organisation Open Rights Group – said,

“We welcome the Government’s intention to bring European data protection laws into UK law. It will strengthen everyone’s ability to control what data can be collected about them and how it can be used.

These laws could be fundamentally altered after Brexit. The Government must explain how these data protection rights will be guaranteed after the UK has left the EU.

We are disappointed that UK Ministers are not taking up the option in EU law to allow consumer privacy groups to lodge independent data protection complaints as they can currently do under consumer rights laws.

Citizens face increasingly complex data ecosystems. It is almost impossible for the average person to know which organisations hold their personal data. Enabling privacy groups to take independent action will ensure consumers’ rights are properly enforced.”

Real people care about security

Open Rights Group responds to Amber Rudd's comments on encryption today

Responding to Amber Rudd’s comments today suggesting that “real people” don’t expect security in their communications, Jim Killock – executive director of UK digital rights campaign Open Rights Group – said:

"The suggestion that real people do not care about the security of their communications is dangerous and misleading. Some people want privacy from corporations, abusive partners or employers. Others may be worried about confidential information, or be working in countries with a record of human rights abuses. It is not the Home Secretary’s place to tell the public that they do not need end-to-end encryption.

Amber Rudd must be absolutely clear on what co-operation she expects from Internet companies. She is causing immense confusion because at the moment she sounds like she is asking for the impossible. She must give the public a good idea of the risks she wants to place them under.

If WhatsApp turn off or compromise encryption, you can expect criminals to use something else. The people who will suffer are law-abiding citizens who want privacy and security."

For more information, contact

Brexit trade agreements “fragile” after CJEU opinion on Passenger Name Record data

The Court of Justice of the European Union has issued an Opinion that an agreement over the transfer of Passenger Name Record data between Canada and the EU, “may not be concluded in its current form because several of its provisions are incompatible with the fundamental rights recognised by the EU”.

Any future agreement between the EU and UK would similarly be open to challenge if the UK’s laws do not uphold the privacy of EU citizens. The Opinion reinforces arguments that privacy and data protection rights in the UK could be put under intense scrutiny, if the agreement covers transfers of personal data, which are fundamental for most communications and commerce.

Executive Director of Open Rights Group, Jim Killock responded:
“This decision has massive implications for Brexit. The EU courts have rejected an agreement that failed to protect fundamental rights, including the rights to privacy and protection of personal data.

“Any future trade agreement between the UK and EU would be subject to the same stringent requirements. Given the UK’s mass surveillance laws and indiscriminate data retention, any trade agreement for digital, communications and even banking and insurance businesses, could look very fragile indeed.”

Current UK arrangements to collect and use PNR data are also likely to need improved safeguards, along the lines the court requires for the EU-Canada agreement.

Notes to Editors
The CJEU have explained their decision in a press release which notes:

“the Court considers that the agreement should:
• determine in a more clear and precise manner certain of the PNR data to be transferred;

• provide that the models and criteria used for the automated processing of PNR data will be specific, reliable and non-discriminatory;

• provide that the databases used will be limited to those used by Canada in relation to the fight against terrorism and serious transnational crime;

• provide that PNR data may be disclosed by the Canadian authorities to the government authorities of a non-EU country only if there is an agreement between the European Union and that country equivalent to the envisaged agreement or a decision of the European Commission in that field;

• provide for a right to individual notification for air passengers in the event of use of PNR data concerning them during their stay in Canada and after their departure from that country, and in the event of disclosure of that data to other authorities or to individuals;

• guarantee that the oversight of the rules relating to the protection of air passengers with regard to the processing of their PNR data is carried out by an independent supervisory authority.

“Since the interferences which the envisaged agreement entails are not all limited to what is strictly necessary and are therefore not entirely justified, the Court concludes that the envisaged agreement may not be concluded in its current form.”

Open Rights Group is a member of European Digital Rights (EDRi), which have also issued a statement.

Age verification plans put web users' privacy at risk

Open Rights Group has responded to the announcement that the Government has initiated plans requiring age verification for users of pornographic websites.

Executive Director Jim Killock said:

“Age verification could lead to porn companies building databases of the UK's porn habits, which could be vulnerable to Ashley Madison style hacks.

“The Government has repeatedly refused to ensure that there is a legal duty for age verification providers to protect the privacy of web users.

“There is also nothing to ensure a free and fair market for age verification. We are concerned that the porn company MindGeek will become the Facebook of age verification, dominating the UK market. They would then decide what privacy risks or profiling take place for the vast majority of UK citizens.

“Age verification risks failure as it attempts to fix a social problem with technology. In their recent manifestos, all three main political parties called for compulsory sex and relationship education in schools. Sex education would genuinely protect young people, as it would give them information and context.”

For more information, email

ORG response to Queen's speech 2017

Open Rights Group has responded to today's Queen’s Speech.

Executive Director Jim Killock said:

“We need to ensure that Internet companies have as much incentive to fully protect free speech as they do to remove illegal content.

“We would hope that a Digital Charter’s regulatory framework will include independent or judicial oversight of material that is taken down by Internet companies. This will help to ensure that we do not simply place the free speech of UK citizens in the hands of private companies without any safeguards.

“We also hope that the reference to a world class regime for protecting our personal data will mean that the Government is committed to delivering the General Data Protection Regulation in full. We hope that the Government will give privacy organisations like Open Rights Group two important rights that are optional in GDPR. These are: to start enforcement cases without needing to be instructed by the individuals affected, and to be able to help people sue companies for privacy damages. These are vital powers that will help to improve data protection in the UK.

“We are surprised that the Queen’s Speech does not make any reference to plans to ensure that requests for communications data by the police and other bodies are independently authorised. The Court of Justice of the European Union ruled on this before Christmas and we know that the Home Office has put out a tender for businesses to help develop a new “independent communications data authorising body”. Yet the Government has still not been up front with Parliament about this important development.”

For more information, contact

Conservative plans for Internet clampdown are a distraction

Open Rights Group has responded to Theresa May’s post-election hints that she will continue with Conservative plans for Internet clampdowns.

Executive Director Jim Killock said:

“To push on with these extreme proposals for Internet clampdowns would appear to be a distraction from the current political situation and from effective measures against terror.

“The Government already has extensive surveillance powers. Conservative proposals for automated censorship of the Internet would see decisions about what British citizens can see online being placed in the hands of computer algorithms, with judgments ultimately made by private companies rather than courts. Home Office plans to force companies to weaken the security of their communications products could put all of us at a greater risk of crime.

“Both of these proposals could result in terrorists and extremists switching to platforms and services that are more difficult for our law enforcement and intelligence agencies to monitor.

“Given that the priority for all MPs is how the UK will negotiate Brexit, it will be especially hard to give the time and thought necessary to scrutinise these proposals.

“It could be tempting to push ahead in order to restore some of Theresa May’s image as a tough leader. This should be resisted. With such a fragile majority, greater consensus will be needed to pass new laws.

“We hope that this will mean our parliamentarians will reject reactionary policy-making and look for long-term, effective solutions that directly address the complex causes of terrorism.”

For more information, email

FOI response reveals porn company's proposals for UK to block millions of porn sites

A Freedom of Information request to the DCMS has revealed that porn company MindGeek suggested that the BBFC should potentially block millions of porn sites if they didn’t comply with Age Verification requirements outlined in the Digital Economy Act.

MindGeek, who are also developing Age Verification technology, said that the Government’s plans to prevent children from seeing pornography would not be effective unless millions of sites could be blocked.

Notes made by the company and sent to the DCMS state:

“A greylist of 4M URLs already exists from Sky, but let’s assume that’s actually much smaller as these URLs will, I suspect, be page-level blocks, not TLDs. The regulator should contact them all within that 12 months, explaining that if they do not demonstrate they are AV ready by the enforcement date then they will be enforced against.

“On the enforcement date, all sites on the greylist turn black or white depending upon what they have demonstrated to the regulator.”

MindGeek could stand to gain commercially if competitor websites are blocked from UK visitors, or if the industry takes up their Age Verification product.

Executive Director of Open Rights Group, Jim Killock said:

“There is nothing in the Act to stop the BBFC from blocking 4.6 million pornographic websites. The only constraint is cash.

“This leaves the BBFC wide open to pressure for mass website blocking without any need for a change in the law.”

When giving evidence to the Public Bill Committee, David Austin, chief executive of the British Board of Film Classification, implied that only tens of sites would be targeted:

“We would start with the top 50 and work our way through those, but we would not stop there. We would look to get new data every quarter, for example. As you say, sites will come in and out of popularity. We will keep up to date and focus on those most popular sites for children.”

Notes to Editors

The Digital Economy Act 2017 obliges porn sites to verify the age of their users. Sites that fail to do so could be fined or blocked by Internet Service Providers.

Responses to ORG’s FOI requests are here.

For further information, contact

Rights groups demand more transparency over Facebook’s ‘insights’ into young users

Facebook told marketers it can detect teens feeling ‘insecure’ and ‘worthless’. The data could be used to market products based on mood and to manipulate users.

Facebook should immediately release all documents describing how it collected and analyzed psychological information about its youngest users, some as young as 14, and college students, Public Citizen and a coalition of 25 groups said in a letter to the corporation today.

The groups are concerned about how this information might have been used or may be used in the future by marketers and others to take advantage of young people’s emotions, all without users’ knowledge. Marketing companies and Facebook have secretly moved to tap into teens’ emotions and developmental vulnerabilities strictly for profit, the letter says. The groups want to know how the data was used, when it was used, how many users were impacted and the names of the companies that received the data. 

“What began as a way for college students to keep in touch has morphed into a platform for brand-saturated marketing and psychological manipulation,” said Kristen Strader, campaign coordinator for Public Citizen’s Commercial Alert campaign. “It is incumbent upon Facebook as a cultural leader to protect, not exploit, the privacy of young people, especially when their vulnerable emotions are involved.”

According to The Australian newspaper, Facebook presented research to one of its advertisers that shows it collects sensitive data regarding young users’ emotions and “mood shifts.” The research detailed how Facebook can analyse sensitive user data in real time to determine how young users are communicating emotion, and at which points during the week they are doing so, the letter continued. Facebook’s research was conducted without users’ knowledge, which raises ethical concerns.

“Because Facebook plays such a powerful role in the lives of teens, it must adopt a policy that respects and protects them,” said Dr. Kathryn Montgomery, professor of communication at American University and a consultant to the Center for Digital Democracy. “This should include not only strong safeguards for its advertising and data practices, but also clear limits on the kinds of research it conducts for marketing purposes. Under no circumstances should marketers be using emotional states, stress levels, biometric information or other highly sensitive data to target users. And this should apply to both young people and adults.” 

Jim Killock, Executive Director of UK-based digital rights campaigners Open Rights Group, explained why they had signed on:

“We need more transparency about supposed research projects that are used to create valuable insights, which can be sold to the highest bidder. This is exploiting children and young people, who may not be aware of how Facebook are using and selling their data.”

The public, its users and elected officials have a right to know how pervasive this research was, who was affected and how the company will ensure it does not occur again, the groups said. The only way to fully address those concerns is to publicly release the internal document and related materials, accompanied by a more detailed explanation from Facebook of what was intended, what happened and the company’s actual practices, the letter says.

Read the letter

For more information, contact

Selective, secret consultations have no place in open Government

Yesterday, Open Rights Group received a leaked copy of the Government's draft technical capability notices (TCNs) regulation.

This is a ‘targeted consultation’ and has not been publicised to the tech industry or public. The Secretary of State is in fact not under any obligation to consult the public, but instead must consult only a small selection of organisations listed in Section 253 (6) of the Investigatory Powers Act 2016.

Executive Director Jim Killock said:

“These powers could be directed at companies like WhatsApp to limit their encryption. The regulations would make the demands that Amber Rudd made to attack end-to-end encryption a reality. But if the powers are exercised, this will be done in secret.

“The public has a right to know about government powers that could put their privacy and security at risk.

“There needs to be transparency about how such measures are judged to be reasonable, the risks that are imposed on users and companies, and how companies can challenge government demands that are unreasonable.

“Businesses and the public need to know they aren’t being put at risk. Sometimes, surveillance capabilities may be justified and safe: but at other times, they might put many more people – who are not suspected of any crime – at risk.

“Selective, secret consultations have no place in open Government.”

Technical capability notices (TCNs)

TCNs can be used to order companies with over 10,000 UK users to adapt their technology to enable intercept and metadata collection. While this power already existed under the Investigatory Powers Act, the regulation provides much more detail about what companies could be compelled to do if they are served with a TCN.

Potentially, these notices could be used to compel companies to introduce backdoors to end-to-end encryption, or put in place other security weaknesses, with little accountability.

The regulations state that companies could be forced to ‘modify’ their products in order to comply with Government demands.

The powers would also limit the ability of companies to develop stronger security and encryption. They could be forced to run future development plans past the Government.

Under the IP Act, TCNs may be challenged on technical grounds, to an Advisory Board. They are also approved by Judicial Commissioners. However, the criteria for making a sound judgement of risk to all parties are not set out in the Act, nor the draft regulations; nor is there a clear route of appeal.

Notes to Editors

The consultation lasted four weeks, concluding on 19 May, with responses to:

The consultation process is outlined in Section 253 of the IP Act 2016.

Law Commission's 'shoddy and confused' proposals would threaten free speech in UK

Digital rights campaigners Open Rights Group have criticised a consultation report by the Law Commission, which calls for a new Espionage Act that could see journalists and whistleblowers face up to 14 years in prison for handling and publishing official data.

In their own report, ORG condemns the proposals as, “shoddy and confused, contradictory, poorly researched, and ill-informed on public interest issues”.

Executive Director Jim Killock said: “From MPs’ expenses to the Snowden leaks, investigative journalism is vital for exposing wrongdoing and holding the Government to account. If the Law Commission’s proposals had been in place ten years ago, the journalists behind these stories could be in prison right now.

“The Law Commission needs to ditch these proposals and come up with concrete evidence as to why reform is needed. Modernising the Official Secrets Act should not be at the expense of investigative journalism.”

ORG’s main criticisms are as follows:

Threat to free speech: The proposals could see editors and journalists threatened with up to 14 years in prison just for handling official data. If they publish, they can be charged if it can be shown that they were aware that damage might be caused – and they would not be allowed a public interest defence. This would pose a massive threat to free speech in the UK.

Lack of evidence: There is a lack of evidence about why reform is needed, other than that existing Official Secrets Acts are old. ORG believes that the Law Commission needs to provide more evidence to justify its proposals.

Omissions: The report fails to discuss the role and effects of the Internet, or its implications in the flow of data through espionage. ORG believes that the review took place as a result of the Snowden leaks but there is no mention of this case nor other significant cases, such as Wikileaks, the Spycatcher affair or the ABC trial.

Poor consultation process: The Law Commission claimed to have consulted with human rights organisations, including ORG, but this was not the case. After quietly announcing the consultation report through a Telegraph opinion piece, the Commission extended the deadline for submissions by a month until today, May 3.

Notes to Editor

Over 23,000 people have signed an ORG petition calling for the Law Commission to drop proposals to criminalise whistleblowers and journalists: 

ORG’s full submission is available here.

Last week, the UK fell two places in the World Press Freedom Index, compiled by Reporters Without Borders, who cited the Law Commission’s proposals as ‘alarming’.

Contact:
