March 05, 2018 | Jim Killock

The Data Protection Bill's Immigration Exemption must go

The Data Protection Bill is meant to be a major advance for privacy – unless you are trying to exercise your legal right to live in the UK. Jim Killock, Executive Director, writes about why data protection is vital.

The government has introduced a sweeping “immigration exemption” in Schedule 2, Paragraph 4. The exemption will remove your right to data protection if it is likely to prejudice “effective immigration control” or the “investigation or detection of activities that would undermine the maintenance of effective immigration control”. What it won’t do is ensure effective immigration control.

This immigration exemption will ensure that the Government will not need to face up to its mistakes. Currently, according to the Government’s Chief Inspector of Borders and Immigration, mistakes and administrative errors are involved in 1 out of 10 immigration cases.

What’s it like to be one of those 1 in 10? You can ask any one of the hundred EU citizens, living in the UK entirely legally, who were sent letters demanding they leave or risk deportation in August last year.

Or speak to Mustafa Ali Baig. He corrected an error in an old tax return, and the Home Office marked him down as an undesirable and gave him 14 days to leave the UK. Mustafa had paid the correct amount of tax and has no criminal record; he is being punished for being honest.

Mustafa is able to challenge these Home Office mistakes because he had the right to check the data the Government held on him, and he is taking them to court. He was able to find out why the Government considered him undesirable. If the Government has its way, and Parliament makes this exemption law, that won’t be an option in the future.

The Government’s creation of a so-called ‘hostile environment’ for immigrants exacerbates all these concerns. From your patient records to your local food bank, the sheer amount of data that will now be subject to these checks is unfathomable. The Government has already set out plans to check through 70 million bank accounts to make sure the account holders are in the UK legally, and to freeze the funds if not. The Government’s historic incompetence, combined with a new inability for people to fact-check it, means that mistakes could take years to sort out, causing untold damage and misery.

This exemption removes the obligation on the Government to process personal information fairly and transparently and would severely undermine important rights for millions of people living in the UK. 

The Data Protection Bill is supposed to protect your data, but this immigration exemption ensures it will do anything but. It is vital that MPs take action, and ensure Schedule 2, Paragraph 4 never receives Royal Assent.

[Read more]

February 20, 2018 | Cory Doctorow

Unleashing ORG in the courts

No one said protecting the digital world was going to be easy (or cheap)!

The history of internet regulation is littered with foolish, dangerous proposals made by ignorant and/or self-serving lawmakers who treated the digital world as a kind of unimportant sideshow, regulating the net as though it were a mere pornography distribution system, or a video-on-demand service, or a jihadi recruiting tool.

That lack of gravitas and joined-up thinking leads policymakers into traps, where mass surveillance arrives by means of copyright proposals, where censorship creeps in through anti-terrorism laws, where every human right we have is undermined by ill-starred plans to fight terrorism, or bullying, or nebulously defined moral turpitude.

Since its founding in 2005, ORG has led the fight to join up the dots between all these issues, organising its supporters across the UK to put pressure on firms and officials to treat the internet with the gravitas befitting the subject, and to tread lightly when determining the course of the electronic nervous system of the 21st century.

Pressure only goes so far. Even as the idea that breaking the digital world also breaks the "real world" has gained currency in wider society, Britain's policy debt of foolish rules continues to accrue interest, snaring innocents in legal jeopardy and teeing up legal action whose precedents will redound on all of us for the rest of our lives and even the lives of our children.

That's why ORG is increasingly involved in courtroom battles alongside our fights in the public sphere. We've brought actions over web-blocking, the Snooper's Charter and mass surveillance, arguing before the highest courts in the UK and the EU, and we've got more challenges pending over the government's plan to assemble a giant database of the nation's porn-viewing habits through a misbegotten and ineffective "age-verification scheme."

These challenges don't come cheap: they require sustained legal attention from the most skilled legal minds, backed by huge support teams of legal assistants. There are plenty of groups that do this work, but none fills the niche that ORG occupies. ORG is unique in bringing both legal and technical expertise to these fights, able to join up the thinking of our allies from human rights, privacy, anti-censorship and other campaigning organisations, helping build coalitions that cut across these disciplines, finding common cause and surfacing the underlying issues in all the good works our allies do.

That's why we're seeking to sign up 500 new members: these sustaining donors will provide us with the sound financial basis on which to retain and support legal experts to represent us in these fights, and speak for long-term, sustainable internet policies that do justice to that one wire that delivers free speech, a free press, free association, access to civic and political life, to education and employment, to family and creativity.

If you're not already a member, join ORG today!

[Read more]

February 15, 2018 | Jim Killock

Election results

We are pleased to announce that Christi Scarborough, Brian Parkinson and Tom de Grunwald have been elected to the Open Rights Group Board of Directors. They are elected by our members for a three-year term.

The full results are in the linked documents (report, results, details). Thanks also go to the other candidates, Oliver Spall and Mary Black, whose participation meant that we had a contested election with a set of very talented and interesting candidates.

We would also like to give our thanks to Ben Laurie, Maria Farrell and Milena Popova, who are stepping down.

We would also like to thank Electoral Reform Services for conducting the election smoothly and professionally.

Our Board has nine members, elected or selected in three groups. At the end of 2018, our Advisory Council and former advisors will be asked to elect three further board members; and in late 2019 an open recruitment process will select three more.



[Read more]

February 13, 2018 | Jim Killock

Even extremist takedowns require accountability

Can extremist material be identified at 99.99% certainty as Amber Rudd claims today? And how does she intend to ensure that there is legal accountability for content removal?

The Government is very keen to ensure that extremist material is removed from private platforms like Facebook, Twitter and YouTube. It has urged the companies to use machine learning and algorithmic identification, and threatened fines for failing to remove content swiftly.

Today Amber Rudd claims to have developed a tool to identify extremist content, based on a database of known material. Such tools can have a role to play in identifying unwanted material, but we need to understand that there are some important caveats to what these tools are doing, with implications about how they are used, particularly around accountability. We list these below.

Before we proceed, we should also recognise that this is often about computers (bots) posting vast volumes of material to a very small audience. Amber Rudd’s new machine may then clean some of it up. It is in many ways a propaganda battle: extremists claim to be internet savvy and exaggerate their impact, while our own government claims it is going to clean up the internet. Both sides benefit from the apparent conflict.

The real-world impact of all this activity may not be as great as is claimed. We should be given much more information about what exactly is being posted and removed. For instance, UK police remove over 100,000 pieces of extremist content by notice to companies, yet we currently get only this headline figure. We know nothing more about these takedowns. The material might never have been viewed, except by the police, or it might have been very influential.

The result of the government’s campaign to remove extremist material may be to push extremists towards more private or censor-proof platforms. That may impair the authorities’ ability to surveil criminals and to remove material in the future. We may regret chasing extremists off major platforms, where their activities are in full view and can easily be used to identify actors and activity.

Whatever the wisdom of proceeding down this path, we need to be worried about the unwanted consequences of machine takedowns. Firstly, we are pushing companies to be the judges of what is legal and illegal. Secondly, all systems make mistakes and require accountability for them; mistakes need to be minimised, but also rectified.

Here is our list of questions that need to be resolved.

1 What really is the accuracy of this system?

Small error rates translate into very large numbers of errors at scale. We see this with more general internet filters in the UK, where our project regularly uncovers and reports errors.

How are the accuracy rates determined? Is there any external review of its decisions?

The government appears to recognise that the technology has limitations. In order to claim a high accuracy rate, they say that at least 6% of extremist video content has to be missed. On large platforms, that would leave a great deal of material needing human review. The government’s own tool shows the limitations of its prior demands that technology “solve” this problem.
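
To see why these figures deserve scrutiny, some back-of-the-envelope arithmetic helps. In the sketch below, only the 99.99% accuracy claim and the 6% miss rate come from the figures above; the upload volumes are invented purely for illustration.

```python
# Illustrative base-rate arithmetic: assumed volumes, claimed error rates.
daily_uploads = 400_000_000     # hypothetical posts per day on a large platform
extremist_uploads = 10_000      # hypothetical genuinely extremist posts per day

false_positive_rate = 0.0001    # "99.99% accuracy" read as 1 error in 10,000
false_negative_rate = 0.06      # "at least 6% of extremist video content" missed

wrongly_flagged = (daily_uploads - extremist_uploads) * false_positive_rate
missed = extremist_uploads * false_negative_rate

print(f"Legitimate posts wrongly flagged per day: {wrongly_flagged:,.0f}")  # ~40,000
print(f"Extremist posts missed per day: {missed:,.0f}")                     # 600
```

Even on these charitable assumptions, tens of thousands of legitimate posts would be wrongly flagged every day, simply because almost everything uploaded is legitimate. This is why the accountability questions below matter.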

Islamic extremists are operating rather like spammers when they post their material. Just like spammers, their techniques change to avoid filtering. The system will need constant updating to keep a given level of accuracy.

2 Machines are not determining meaning

Machines can only attempt to pattern match, with the assumption that content and form imply purpose and meaning. This explains how errors can occur, particularly in missing new material.
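
To make this concrete, here is a minimal sketch of fingerprint matching against a database of known material. The Home Office has not published its tool’s design, so this is only an illustration of the general technique, with invented 64-bit fingerprints.

```python
# Minimal sketch of hash-based matching against a database of known material.
# An illustration of the general technique, not the actual government tool.

def hamming_distance(a: int, b: int) -> int:
    # Count the bits on which two fingerprints differ.
    return bin(a ^ b).count("1")

# Hypothetical database: fingerprints of material previously judged extremist.
known_material = {
    0x9F3A61B24C8D07E5: "known_video_A",
    0x1D44A0FF37C29B16: "known_video_B",
}

def match(candidate_fingerprint: int, threshold: int = 10) -> str | None:
    # A candidate "matches" if it is within `threshold` bits of a known item.
    # Note what is absent: any assessment of context, purpose or meaning.
    # A news report reusing the same footage would match just as readily.
    for fingerprint, label in known_material.items():
        if hamming_distance(candidate_fingerprint, fingerprint) <= threshold:
            return label
    return None
```

Nothing in such a pipeline understands the material; it only measures similarity to things humans have already labelled.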

3 Context is everything

The same content can, in different circumstances, be legal or illegal. The law defines extremist material as promoting or glorifying terrorism. This is a vague concept. The same underlying material, with small changes, can become news, satire or commentary. Machines cannot easily determine the difference.

4 The learning is only as good as the underlying material

The underlying database is used to train machines to pattern match. Therefore the quality of the initial database is very important. It is unclear how the material in the database has been deemed illegal, but it is likely that these are police determinations rather than legal ones, meaning that inaccuracies or biases in police assumptions will be repeated in any machine learning.

5 Machines are making no legal judgment

The machines are not making a legal determination. This means a company’s decision to act on what the machine says is made without clear knowledge of the material’s legality. At the very least, if material is “machine determined” to be illegal, the poster, and users who attempt to see the material, need to be told that a machine determination has been made.

6 Humans and courts need to be able to review complaints

Anyone who posts material must be able to get human review, and recourse to courts if necessary.

7 Whose decision is this exactly?

The government wants small companies to use the database to identify and remove material. If material is incorrectly removed, and perhaps appealed, who is responsible for reviewing the mistake?

It may be too complicated for a small company. Since it is the database product making the mistake, the designers need to act to correct it so that it is less likely to be repeated elsewhere.

If the government want people to use their tool, there is a strong case that the government should review mistakes and ensure that there is an independent appeals process.

8 How do we know about errors?

Any takedown system tends towards overzealous takedowns. We hope the identification system is built for accuracy and prefers to miss material rather than remove the wrong things; however, errors will often go unreported. There are strong incentives for legitimate posters of news, commentary or satire to simply accept the removal of their content. To complain about a takedown would take serious nerve, given that you risk being flagged as a terrorist sympathiser, or perhaps having to enter formal legal proceedings.

We need a much stronger conversation about the accountability of these systems. So far, in every context, this is a question the government has ignored. If this is a fight for the rule of law and against tyranny, then we must not create arbitrary, unaccountable, extra-legal censorship systems.


[Read more]

January 10, 2018 | Slavka Bielikova

Peers have a chance to make the UK one of the safest places to be online. They should take it.

Open Rights Group is calling on Peers from across the political spectrum to improve consumer rights as the Data Protection Bill reaches the last day of the Report stage debate in the House of Lords this afternoon.

Do you remember that time when Uber didn’t tell us that the data of 57 million of their users had been exposed? Or that time when Equifax failed to protect the data of 400,000 people in the UK? Or those two Yahoo hacks that breached more than one billion accounts? Oh, and that time when TalkTalk was fined £400,000 for inadequately protecting 156,959 customer accounts?

I could go on. These are just a fraction of the data breaches that have caused leaks of people’s data. Every time you provide your name, date of birth, home address or details for an online payment to a company you do so based on trust that they will keep your data safe. But increasingly, companies fail their customers.

Currently, the Government’s Data Protection Bill will give citizens the power to instruct a select group of not for profit bodies to represent them in complaints to the data protection authority or the judiciary. This is required of the Government - Article 80(1) is a mandatory provision in the EU’s General Data Protection Regulation (GDPR).

But what happens when customers don’t realise they have been a victim of a hack that is a direct result of weak data protection? Or worse, what happens when products and services used by children get hacked and their parents are not aware?

There have been, and will continue to be, cases where consumers are unaware that they have been the victim of a hack, or don’t want their identity connected to a particular incident, such as the hack of Ashley Madison, a dating website specialising in extramarital affairs. These complaints could be dealt with if the Government agreed to implement Article 80(2) of the GDPR (reflected in amendment 175A, supported by Labour’s Lord Stevenson and Lord Kennedy, Lib Dem Lord Clement-Jones and crossbench Peer Baroness Kidron). The amendment would give select not for profit bodies the option to raise those complaints without having an affected member of the public instruct them.
The amendment also explicitly recognises the right of adults to seek collective redress on behalf of children who are the victims of data breaches. Additionally, it will allow individuals who have been affected by data breaches to bring collective redress actions on behalf of everyone else who has been similarly affected.

The Government has been refusing to implement these additional protections, claiming that Article 80(1) will provide enough protection. This is simply not true. Articles 80(1) and 80(2) provide consumer protections in different scenarios. By not implementing the enhanced protections, the Government is consciously leaving obstacles to collective redress in place for more vulnerable groups such as children and the elderly.

The idea of collective redress has been around for a while for other consumer issues related to finance or competition. Consumer groups such as Which?, Citizens Advice, the Federation of Small Businesses and the Consumer Council for Northern Ireland have the right to present “super-complaints” on behalf of consumers without being instructed by them.

The time has come to see lack of data protection as a consumer issue which is as important as unfair financial arrangements and bad competition practices.

It has become near impossible for consumers to obtain services and products without providing their data to companies. At the same time, the evidence (see the data breaches above) shows that companies have not always been able to protect consumers’ data. The Information Commissioner’s Office and the Deputy Counsel to the Joint Committee on Human Rights are both in favour of implementing Article 80(2).

The Government cannot disregard that data protection is inevitably linked to consumer protection. Data protection is about rights - the right for the public to hold private and public bodies that collect and process their data to account. This is what drives better practice to make our data more secure.

Implementing the amendment allowing for collective redress will give Peers a chance to help make the UK one of the safest places in the world to be online. They should take it.

[Read more]

January 04, 2018 | Javier Ruiz

Alphonso knows what you watched last summer

Technology startup Alphonso has caused widespread concern by using smartphone microphones to monitor the TV and media habits of games and app users.


The New York Times has published a story about a company called Alphonso that has developed technology that uses smartphone microphones to identify the TV programmes and films playing in the background. Alphonso claims not to record any conversations, but simply to listen to and encode samples of media for matching against its database. The company combines the collected data with identifiers and uses it for advertising targeting, audience measurement and other purposes. The technology is embedded in over one thousand apps and games, but the company refuses to disclose the exact list.
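
Automated content recognition of this kind typically works by reducing short microphone samples to compact acoustic fingerprints and matching them against a server-side database of known programmes. The sketch below illustrates the general technique (spectrogram peak hashing, as popularised by Shazam-style systems); it is our illustration, with invented parameters, not Alphonso’s disclosed design.

```python
# Rough sketch of audio content recognition via spectrogram peak hashing.
# Illustrative only: sample rate, frame size and thresholds are invented.
import numpy as np

def fingerprint(samples: np.ndarray, rate: int = 8000) -> set[int]:
    # Split the clip into 100 ms frames and keep only the dominant frequency
    # bin of each frame: a crude acoustic signature, not the audio itself.
    frame = rate // 10
    peaks = []
    for i in range(0, len(samples) - frame, frame):
        spectrum = np.abs(np.fft.rfft(samples[i:i + frame]))
        peaks.append(int(spectrum.argmax()))
    # Hash consecutive peak pairs so matching tolerates noise and time offsets.
    return {hash((a, b)) & 0xFFFFFFFF for a, b in zip(peaks, peaks[1:])}

def recognised(mic_clip: np.ndarray, known_show: set[int],
               overlap: float = 0.3) -> bool:
    # A clip is "recognised" if enough of its fingerprint overlaps the stored
    # fingerprint of a known TV programme or film.
    clip_fp = fingerprint(mic_clip)
    return len(clip_fp & known_show) / max(len(clip_fp), 1) >= overlap
```

A fingerprint of this kind retains which frequencies dominated, not the speech itself, which is how a company can plausibly claim it does not “record conversations” while still tracking what is playing nearby.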

Alphonso argues that users have willingly given their consent to this form of spying on their media consumption and can opt out at any time. They argue that their behaviour is consistent with US laws and regulations.

Even if Alphonso is not breaking any laws here or in the US, there is a systemic problem with the growing intrusion of these types of technologies, which monitor ambient sounds in private spaces without sufficient public debate. Apps are sneaking this kind of surveillance in, using privacy mechanisms that clearly cannot cope. This is despite the apps displaying a widget asking for permission to use the microphone to detect TV content, which would appear to be the “clear affirmative action” for consent required by law. Something is not working, and app platforms and regulators need to take action.


In addition to the unethical exploitation of users’ ignorance or lack of initiative - a bit like tobacco companies - there could be some specific breaches of privacy. The developers are clearly following the letter of the law in the US, obtaining consent and providing an opt-out, but in Europe they could face more trouble, particularly after May, when the General Data Protection Regulation (GDPR) comes into force.

One of the newer requirements on consent under GDPR is that it must be as easy to withdraw as it was to give in the first place. Alphonso has a web page with information on how to opt out through the privacy settings of devices, and this information is copied into at least some of the apps’ privacy policies, buried under tons of legalese. This may not be good enough. Besides, once consent is revoked, companies will need to erase any data obtained if there is no other legitimate justification to keep it. It is far from clear this is happening now, or will be in May.

There is also a need for complete clarity on who is collecting the data and who is responsible for handling any consent and its revocation. At present the roles of app developers, Apple, Google and Alphonso are blurred.

We have been asked whether individuals can take legal action. We think that under the current regime in the UK this may be difficult, because the bar is quite high and the companies involved are covering the basic ground. GDPR will make it easier to launch consumer complaints and legal action. The new law will also explicitly allow claims for non-material damage, which is already possible in limited circumstances, including for revealing “political opinions, religion or philosophical beliefs”. Alphonso is recording the equivalent of a reading list of audiovisual media, and might be able to generate such information.

Many of these games are aimed at children. Under GDPR, all processing of children’s data is seen as entailing a risk and will need extra care. Whether children can give consent themselves or must get it from their parents or guardians will depend on their age. In all cases, information aimed at children will need to be displayed in language they can understand. Some of the Alphonso games we checked have an age rating of 4+.

Consumer organisations have presented complaints in the past over similar issues in internet-connected toys, and we think that Alphonso and the developers involved should be investigated by the Information Commissioner.


[Read more]

December 11, 2017 | Caitlin Bishop

Battle lines have been drawn over the Data Protection Bill

Battle lines have been drawn by the Information Commissioner’s Office and the Joint Committee on Human Rights in the debate over the Government’s Data Protection Bill.

Open Rights Group has delivered briefings to Peers on its core campaigning points, including:

  • raising concerns about the hazardous immigration exemption, alongside the3million, the campaigning organisation representing EU citizens living in the United Kingdom;

  • calling for not for profit bodies to be given the power to represent data subjects without being instructed by an affected member of the public, as provided for by Article 80(2) of the GDPR.

Both topics are also covered in the briefings from the Information Commissioner’s Office and in the Deputy Counsel’s note to the Joint Committee on Human Rights. These arrive as the Report Stage of the Data Protection Bill begins on Monday 11 December, continues on Wednesday 13 December, and finishes in early 2018, on 10 January.

The Information Commissioner’s Office (ICO) is the independent body that enforces UK data protection law. Its views on the proposed Bill give us practical insight into its effects, and should be considered carefully by the Government.

The Joint Committee on Human Rights is a committee made up of members from both the House of Commons and the House of Lords. Its role includes scrutinising every Government Bill for compatibility with human rights. The Deputy Counsel’s note referred to here is from the lawyer who provides the committee with specialist legal advice on the human rights implications a Government bill may raise.

Immigration Exemptions need to go

The Government have introduced an exemption into the Data Protection Bill that would remove the rights of individuals subject to an immigration procedure to discover what personal data companies and public authorities hold on them.

The exemption, if allowed to pass, would set aside fundamental rights, such as individuals’ access to personal data about them, the right to erasure, and the right to rectification, among others. With mistakes commonplace in immigration procedures, it is vital that the law retains the power for individuals to hold to account those who collect and process personal data in immigration procedures.

The Information Commissioner’s Office shares Open Rights Group’s concerns about the exemptions, which in effect remove accountability:

“The majority of data protection complaints to the Information Commissioner about the Home Office relate to requests for access to personal data to UK Visas and Immigration….If the exemption is applied, individuals will not be able to access their personal data to identify any actual inaccuracies and it will mean that the system lacks transparency and is fundamentally unfair.”

The exemption found in Schedule 2 Part 4 of the Bill is much broader than just data held by the Home Office, covering any organisation processing information that is used in relation to immigration controls. The current immigration regime extends the responsibility to control immigration to schools, GPs, hospitals, landlords, employers and even the DVLA.

The Government maintained that the exemption “emphatically does not set aside the whole of the GDPR”.  

Open Rights Group argues it emphatically does.

The note from the Deputy Counsel suggested that the Committee should consider “why this exemption is ‘necessary in a democratic society’”, which is one of the legal tests for whether an interference with a fundamental human right amounts to a violation.

The counsel’s note also raises concerns about the discriminatory nature of the exemption, based on the nationality of individuals, such as the 3 million EU citizens currently living in the United Kingdom. The potential scope for discrimination is why Open Rights Group worked with the3million to raise shared concerns with Peers.

Representation of data subjects

Open Rights Group have been campaigning since the Bill arrived in the House of Lords for the power to be given to not for profit bodies to represent data subjects without having an affected member of the public instruct them.

The Government do not want to incorporate the optional power. The Government suggested in the debate during Committee stage that the public were already capable of exercising their powers, citing a recent case brought by 5,000 data subjects. The Government failed to mention that the claim was actually brought by the former Executive Director of the consumer rights organisation Which?. This is hardly the spontaneous popular organisation of members of the public that the Government’s comments in debate would lead people to believe.

The Information Commissioner’s Office agrees with Open Rights Group on the need for 80(2) to be adopted:

“...there are circumstances where data subjects may not necessarily be aware of what data about them is held by organisations, and more importantly what is being done with it. In such instances data subjects could not be expected to know whether and how they could exercise their rights under data protection law…. This point is of particular importance where young and vulnerable data subjects are involved - these groups are less likely to have the means and capability to exercise their rights on their own behalf.”

This support for 80(2) from the Commissioner is welcome. The note from the Deputy Counsel to the Joint Committee on Human Rights also raises representation of data subjects, suggesting that “the Government’s omission of 80(2) may diminish the protection of privacy rights”.

With these briefings from the ICO and the Deputy Counsel to the Joint Committee on Human Rights, battle lines already drawn by civil society have now been deepened and fortified. The Government can no longer continue to dismiss these concerns out of hand.


[Read more]

November 30, 2017 | Jim Killock

Home Office concedes independent authorisation

This is a major victory for ORG, although one with dangers. The government has conceded that independent authorisation is necessary for communications data requests, but has refused to budge on retained data and is pushing ahead with the “Request Filter”.

Adding independent authorisation for communications data requests will make the police more effective, as corruption and abuse will be harder. It should improve operational effectiveness, even if less data is used during investigations, and trust in the police should improve.

Nevertheless, the government has disregarded many key elements of the judgment:

  • It isn't going to reduce the amount of data retained

  • It won't notify people whose data is used during investigations

  • It won't keep data within the EU: instead it will continue to transfer it, presumably to the USA in particular

  • The Home Office has opted for a ‘six month sentence’ definition of ‘serious crime’ rather than the Lords’ definition of crimes capable of sentences of at least one year.

These are clear evasions and abrogations of the judgment. The mission of the Home Office is to uphold the rule of law. By failing to do what the courts tell them, the Home Office is undermining the very essence of the rule of law.

If the Home Office won't do what the highest courts tell it to do, why should anybody else? By picking and choosing the laws they are willing to care about, they are playing with fire.

The Home Office thinks it is playing a long game, hoping that courts will adjust their views over time, and that we will all get used to privacy being an increasingly theoretical idea. The truth is that privacy becomes a more necessary principle every day in the surveillance economy. We are all in need of greater privacy, so we will find ourselves valuing it more.

Nevertheless, the Home Office were always going to find it hardest to concede changes to data retention. We had the right to expect something, even as window dressing, so making no changes at all is pretty audacious. But not in a good way.

ORG, Liberty, BBW, Privacy International and English PEN met Home Office officials today, at precisely the point that the draft changes were released. That perhaps did not aid matters, as it can only be interpreted as a ploy to keep us away from journalists and kill the story.

The Home Office staff who were there did make a very good point about communications data: they said that without it, they would have to rely on more intrusive surveillance techniques.

Quite so, and exactly right. All the NGOs present at the meeting were entirely ready to see suspects placed under targeted surveillance measures, rather than keeping tabs on the population at large through retained communications data.

The world has trade-offs, and we would suggest that this is a good one.

One final point that the Home Office decided to ignore was the need to notify people whose data has been accessed. The Home Office claimed that this is done in limited circumstances, so no change is needed.

This is another missed opportunity to improve police performance. Notification has the potential to reduce police abuse and help people spot rotten apples, as the victim will find out when someone is pursuing a campaign of harassment against them or their community. Independent authorisation will help, of course, but may not always spot the abuse that an individual would understand to be unfair.

Safeguards are suggested for a reason. They are not simply a nicety to satisfy civil liberties campaigners: they are needed to avoid abuse and thereby make the police a better, more trusted, less corruptible, and more effective organisation.

There was one final surprise. The Code of Practice covers the operation of the “Request Filter”. Yet again we are told that this police search engine is a privacy safeguard. We will now run through the code in fine detail to see if any such safeguards are there. At first glance, there are not.

If the Home Office genuinely believe the Request Filter is a benign tool, they must rewrite this section to make abundantly clear that it is not a mini version of X-Keyscore (the NSA/GCHQ tool for trawling their databases of people linked to their email and web visits) and does not operate as a facility to link and search the vast quantities of retained and collected communications data.
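
To illustrate the concern, here is a toy sketch of what “linking and searching” retained communications data can mean in practice. Every dataset, field and query below is invented; nothing about the Filter’s real design is public.

```python
# Toy illustration only: invented datasets standing in for records that
# providers are required to retain about everyone, not just suspects.

retained_subscribers = {"07700900123": "A. Example, 1 High St"}
retained_web_logs = [("07700900123", "2017-11-30T21:14", "example-forum.org")]
retained_locations = [("07700900123", "2017-11-30T21:10", "cell_4521")]

def filter_query(site: str, cell: str) -> list[str]:
    # "Who visited this website shortly after being near this location?"
    # Joining independently retained datasets turns them into a search
    # engine over the whole population's records.
    visited = {num for num, _, s in retained_web_logs if s == site}
    nearby = {num for num, _, c in retained_locations if c == cell}
    return [retained_subscribers[n] for n in visited & nearby]

print(filter_query("example-forum.org", "cell_4521"))
```

Whether the real Filter permits queries like this is exactly what the Code of Practice needs to rule out explicitly.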

[Read more]
