October 10, 2017 | Matthew Rice

Time to make Data Protection work for consumers

The test for data protection fulfilling its purpose is whether it is improving consumer rights. Open Rights Group are calling for a specific improvement in consumer rights as the Data Protection Bill reaches its second reading debate in the House of Lords on Tuesday.

Data Protection. Two words often followed by discussions about “business compliance”. Plenty of business-to-business conferences out there are making good money off the spectre of data protection. It doesn’t need to be this way. In fact it shouldn’t. Data protection is about rights - the right for the public to hold the private and public bodies that collect and process their data to account.

Currently, the Government’s Data Protection Bill will give citizens the power to instruct a select group of not for profit bodies to represent them in complaints to the data protection authority or the judiciary. This is required of the Government because Article 80(1) is a mandatory provision of the General Data Protection Regulation which, as the UK is a member state of the European Union, is directly applicable in the United Kingdom.

However, when given the option to further strengthen consumer rights the Government decided against it. Article 80(2), an optional power in the GDPR, would give those select not for profit bodies the option to take those same complaints without having an affected member of the public instruct them. 

Open Rights Group and other members of civil society are calling for 80(2) to be adopted.

This would improve consumer rights in two ways. First, it would protect the most vulnerable members of society, such as children and the elderly. Second, it would move data protection to the same status as other consumer rights frameworks, like competition or finance.

It would finally acknowledge something that should have been acknowledged a long time ago: data protection is a consumer right, and it should be given the same powers that have proven so important in other areas of consumer rights.


When it comes to standing up for your rights, it is often those that need it most that are least able. In data protection, there have been examples of the profoundly negative effect data sharing can have on the elderly, while research has shown that websites targeted at children either do a terrible job of explaining data collection and processing, or no job at all.

The story of Olive Cooke is a sad one. The 92-year-old poppy seller, who took her own life, was said to have been “distressed and overwhelmed” by the huge number of requests for donations she was receiving from charities, according to a 2016 report from the Fundraising Standards Board.

The report found that nearly a fifth of the 99 charities sampled had passed on Olive’s details to others, and that most of those had “assumed” permission to share based on the fact that Olive had not proactively opted out of data sharing. The report concluded that there were “inadequate opportunities for the recipient to opt out” of the mailings and data sharing, which collectively created a situation that was “almost uncontrollable”.

The Global Privacy Enforcement Network, a coalition of data protection authorities from around the world, released a report in 2015 showing the level of disregard for data protection standards demonstrated by websites aimed at children. Half of the sampled sites shared personal information with third parties, but only one in three had effective controls in place to limit the collection of personal information. Similarly few of the sites provided an accessible means of permanently deleting the personal information held.

In both of these examples enforcement either came too late, or not at all. One reason is that there was no way for a group to take up the cause on behalf of either Olive or the children. While the Information Commissioner’s Office in the United Kingdom made public comments on the children’s websites research, there has been no evidence of proceedings or follow-up taking place.

If a power similar to Article 80(2) were in place, a not for profit body like Open Rights Group could take up the enforcement against these bad data protection practices. If done appropriately, the exercise of this power would improve the experience of elderly individuals such as Olive Cooke or millions of children online.


The idea that a not for profit body could take up an independent complaint against bad consumer practices is not novel. Traditional consumer rights areas such as competition and finance have similar powers for a select group of bodies. Some of these powers have led to significant developments in the consumer landscape. It is time data protection was recognised as another area of consumer rights, one that is growing in importance, and given the same enforcement mechanisms as the others.

Consumer rights group Which? is capable of taking a private enforcement action in civil courts against traders for infringements of consumer protection legislation. There is no need to find an affected individual to instruct Which? to take on the enforcement; instead, it need merely show that consumers have suffered a loss.

Also, in the financial sector, Which?, Citizens Advice, the Federation of Small Businesses and the Consumer Council for Northern Ireland have the power to present “super-complaints” to the Financial Conduct Authority. It was this form of complaint, exercised by Citizens Advice, that played an important role in tackling the mis-selling of Payment Protection Insurance.

The right for independent bodies to take complaints without first finding an affected consumer already exists in the wider consumer landscape. It has been wielded with discretion and has shown itself to be a valuable addition to the consumer rights framework. For data protection to be a modern consumer right, it is only logical that the same accountability frameworks are brought in. Implementing Article 80(2) is the way to achieve this.


The Government’s vision for the Data Protection Bill is to make the UK the safest place to live and do business online. It recognises the increasing volumes of personal data, and the increasing need to protect it. The vision even recognises that data losses can have distressing repercussions for individuals, and that victims can lose trust. All of this sounds hopeful, as though the Government has identified the issues that data protection can help solve.

If the Government truly wants to achieve its vision, the small addition to improve the accountability framework for consumer rights in data protection will be a big step to take it there.


[Read more]

August 31, 2017 | Pam Cowburn

Nominations for ORG's Advisory Council are open

Are you an expert in digital issues, civil liberties or campaigning?

ORG is recruiting people to our Advisory Council. This is made up of tech, legal, campaigning and political experts who share our values and campaign goals for digital rights and a free and open Internet. As an Advisory Council member, you will be asked to give ORG advice about a range of issues. Past and present Advisory Council members also help us elect one third of our Board.

This year, we are looking for people with the following policy expertise and skills:

• Privacy experts, in data protection, surveillance laws and digital privacy

• Free expression experts, as we prepare for government attempts to control online speech

• People with a legal background

• Experts in the Brexit process

• People with a strong background in copyright reform

• Campaigners, including people with grassroots and local organising experience

• People with experience in FOI, Subject Access Requests, media work

• Journalists and investigative journalists

• People with senior political contacts in the Labour, Lib Dem and Conservative parties

We particularly welcome nominations from or proposing women and people of colour.

This is your chance to help ORG be the most expert and forward-thinking digital civil liberties organisation in the UK. Send nominations by Friday 29 September 2017.


[Read more]

August 09, 2017 | Alec Muffett

The British Public & its Freedom to Tinker with Data

This is a guest blog by Alec Muffett, a security researcher and member of ORG’s Board of Directors.

Britain - after a somewhat shaky start - has recognised its proud tradition of leadership in cryptographic research and development.

From Alan Turing's success at breaking the Enigma cipher at Bletchley Park, and Tommy Flowers' "Colossus" (also at Bletchley) built to break the Lorenz cipher, to early and secret research by Clifford Cocks into what later became known as "Public Key Encryption", to GCHQ's vast deployment of technology to enable mass-surveillance of undersea cable communications — whatever one's opinion of the fruits of the work, Britain is recognised as a world leader in the field of cryptography.

And one of the great truths of cryptography is: cryptography only improves when people are trying to break it. From academics to crossword-puzzle fans, cryptography does not evolve unless people are permitted to attack its means, methods and mechanisms.

This brings us to the recently announced "Data Protection Bill", in which you will find a well-intentioned paragraph: (our emphasis)

Create a new offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data. Offenders who knowingly handle or process such data will also be guilty of an offence. The maximum penalty would be an unlimited fine. (page 10)

This speaks to the matter of "data anonymisation" (and the reverse processes of re-identification or de-anonymisation) where the intention is that some database — for instance a local hospital admissions list — could be stripped of patient names and yet still usefully processed/shared to research the prevalences of disease in a population.

Done improperly, this can go wrong:

…leading to failures where the "anonymity" can be defeated by combining several data sources, or by attacking the data set analytically, in order to return some semblance of the original data.
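To make this concrete, the failure mode described above can be sketched as a toy "linkage attack" in a few lines of Python. Everything here (the records, field names and both datasets) is invented for illustration; real attacks work the same way at scale, joining an "anonymised" dataset to a public auxiliary one on quasi-identifiers such as postcode, birth year and sex.

```python
# Toy illustration of a "linkage" re-identification attack.
# All records and names below are fictional.

# An "anonymised" hospital dataset: names stripped, but quasi-identifiers remain.
anonymised_admissions = [
    {"postcode": "BS1 4ND", "birth_year": 1923, "sex": "F", "diagnosis": "hypertension"},
    {"postcode": "M1 1AE",  "birth_year": 1960, "sex": "M", "diagnosis": "diabetes"},
]

# A second, public dataset (e.g. an electoral register) that still carries names.
public_register = [
    {"name": "A. Brown", "postcode": "BS1 4ND", "birth_year": 1923, "sex": "F"},
    {"name": "J. Smith", "postcode": "M1 1AE",  "birth_year": 1960, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(anonymised, auxiliary):
    """Link the two datasets wherever the quasi-identifiers match a unique person."""
    matches = []
    for record in anonymised:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in auxiliary
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match defeats the "anonymity"
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

# Each tuple links a named person back to a supposedly anonymous medical record.
print(reidentify(anonymised_admissions, public_register))
```

No cryptanalytic skill is needed: the "anonymisation" fails simply because the remaining fields are unique enough to act as a fingerprint. Prodding for exactly this kind of weakness is what a blanket ban on re-identification would chill.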

Ban it?

So it might sound like a good idea to ban re-identification, yes?

Well, no; the techniques of data anonymisation are mostly a form of "code book" cryptography, and (as above) if it's not legal to prod, poke, and try to break the mechanisms of anonymisation, then anonymisation, like cryptography, will not improve.

Therefore: banning re-identification will harm all of our individual security; it should be explicitly legal for anyone — the professionals, the crossword-puzzlers — to "have a go" at re-identification of data. Certainly it should be illegal for anyone to exploit or share the fruits of any successful re-identification — as is currently suggested — but the act of re-identification itself should not be prevented nor chilled in any way.

To swap metaphors: if you drive a car in the UK then it will have been crash-tested by experts in order to determine how safe it is; but that is not sufficient. We do not rely upon experts to crash them once, declare them safe, and then ban members of the public from crashing their cars. Instead, much of our learning and standards in car safety are from analysing actual, real-world incidents.

Similarly: anonymisation is hard to do correctly, and the failures in how people and organisations have deployed it will only be evident if the many eyes of the general public are permitted to dig into the flaws that may have arisen from one example to the next. It will not be sufficient, as this bill announcement continues, for "…the important role of journalists and whistleblowers […to…] be protected by exemptions."

Everyone has a stake in the collective security of our information, and we — the public — are the code-breakers who should be able to research, and hold to account, any instances of diverse and shoddy anonymisation that may be foisted upon us. Therefore this bill proposal must be amended and the freedom of the public to attempt re-identification must not be abridged.

 — Alec Muffett, security researcher & member of the Board of Directors, ORG

Further reading 

[Read more]

August 01, 2017 | Ed Johnson-Williams

Sorry Amber Rudd, real people do value their security

It’s not for the home secretary to tell the public they don’t need encryption

Amber Rudd has been out doing the media rounds this morning (£) talking about the issues end-to-end encryption poses to law enforcement. One comment in particular caught our eye:

“Real people often prefer ease of use and a multitude of features to perfect, unbreakable security. Who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family?”

This is a little like saying: "Who uses a car because it has airbags and seatbelts, rather than because it’s a convenient way to get around?"

The Home Office strategy here may be to persuade internet companies to take action by telling them that ordinary people don’t care about security. This would be dangerous and misleading.

Clearly, real people (who, then, are Rudd’s not-real people?) do value security in their communications, just as they value safety in their cars. Security is not – or at least does not have to be – the opposite of usability.

For many people, good security makes a service usable and useful. Some people want privacy from corporations, abusive partners or employers. Others may be worried about confidential information, sensitive medical conversations, or be working in countries with a record of human rights abuses.

Whatever the reasons people want secure communications, it is not for the Home Secretary to tell the public that they don’t have any real need for end-to-end encryption.

While Rudd seems to be saying she does not want encryption to be “removed” or bypassed, there are other things she might be looking for. It is possible that she wants the internet companies to assist the police with “computer network exploitation” – that’s hacking people’s devices.

It could mean providing communications data about users which could include data such as: "This user uses this device, often these IP addresses, this version of their operating system with these known vulnerabilities, talks to these people at these times, is online now, is using this IP address, is likely at this address and has visited these websites this many times."

Alternatively, Rudd might mean pushing out compromised app updates with end-to-end encryption disabled.

However, it is likely to be police rather than security services asking for this help. While targeted hacking does provide an investigative option that avoids blanket communications surveillance, it would be risky for the police to have these powers. Training and oversight, after all, are not as thorough or exacting as in the security services.

What is completely lacking is any serious attempt to tell the public what the Home Office wants internet companies to do to make people’s end-to-end communications accessible.

We should be told what risks the public would be exposed to if the companies were to agree to the Home Office’s private requests. Have these risks been properly weighed up and scrutinised? What safeguards and oversight would there be?

One risk is that users may start to distrust tech companies and the apps, operating systems and devices that they make. When security vulnerabilities are identified, firms push out updates to users. Keeping devices and apps up-to-date is one of the most important ways of keeping them secure. But if people are unsure whether they can trust pending updates, will they keep their devices up-to-date?

It would be incredibly damaging to UK security if large numbers of people were dissuaded from doing so. A prime example is the WannaCry ransomware attack that paralysed parts of the NHS in May. It spread through old Windows computers that hadn’t been updated, forcing doctors to cancel thousands of appointments.

The government must spell out its plans in clear, precise legislation and subject that legislation to full parliamentary scrutiny, and it should bring security and usability experts into a public debate about these questions.

Measures that deeply affect everybody’s privacy, freedom of expression, and access to information must not be decided behind closed doors.

[Read more] (2 comments)

June 21, 2017 | Jim Killock

Queen’s speech 2017—threats to privacy and free speech

First analyses of the Queen’s Speech are focussing on what isn’t included, as a weakened Conservative Government appears to have dropped a number of its manifesto commitments. But there are several worrying things for digital rights. One welcome development could be data protection legislation, to fill in the options in the GDPR.

There are references to a review of counter-terrorism and a Commission for Countering Extremism, which will include Internet-related policies. Although details are lacking, these may contain threats to privacy and free speech. The government has opted for a “Digital Charter”, which isn’t a Bill, but something else.

Here are the key areas that will affect digital rights:

Digital Charter

This isn’t a Bill, but some kind of policy intervention, backed up by “regulation”. This could be the system of fines for social media companies previously mentioned, but this is not explained.

The Digital Charter appears to address both unwanted and illegal content or activity online, and the protection of vulnerable people. The work of CTIRU and the IWF are mentioned as examples of work to remove illegal or extremist content.

At this point, it is hard to know exactly what harms will emerge, but pushing enforcement into the hands of private companies is problematic. It means that decisions never involve courts and are not fully transparent and legally accountable.

Counterterrorism review

There will be a review of counterterrorism powers. The review includes “working with online companies to reduce and restrict the availability of extremist material online”.

This appears to be a watered down version of the Conservative manifesto commitment to give greater responsibility for companies to take down extremist material from their platforms. Already Google and Facebook have issued public statements about how they intend to improve the removal of extremist material from their platforms.

Commission for Countering Extremism

A Commission will look at the topic of countering extremism, likely including on the Internet.

This appears to be a measure to generate ideas and thinking, which could be a positive approach if it involves considering different approaches rather than pressing ahead with policies in order to be seen to be doing something. The quality of the Commission will therefore depend on its ability to take a wide range of evidence and assimilate it impartially; it faces a significant challenge in ensuring that fundamental rights are respected in any policies it suggests.

Data Protection Bill

A new Data Protection Bill, “will fulfil a manifesto commitment to ensure the UK has a data protection regime that is fit for the 21st century”. This will replace the Data Protection Act 1998, which is in any case being removed as the result of the new General Data Protection Regulation passed by the European Parliament last year. Regulations apply directly, so the GDPR does not need to be ‘implemented’ in UK law before Brexit.

We welcome that (at least parts of) the GDPR will be implemented in primary legislation with a full debate in Parliament. It is not clear whether the text of the GDPR will be brought into this Bill, or whether the Bill will merely supplement it.

This appears to be a bill to implement at least some of the ‘derogations’ (options) in the GDPR, plus the new rules for law enforcement agencies, which came in with the law enforcement-related Directive and have to be applied by EU member states.

The bulk of the important rights are in the GDPR, and cannot be tampered with before Brexit. We welcome the chance to debate the choices, and especially to press for the right of privacy groups to bring complaints directly.

Missing: sex and relationships education

There is no mention of the introduction of compulsory sex and relationship education in schools, which was a manifesto commitment for all the main parties, Labour, Lib Dem and Conservative. As there appeared to be a consensus on this issue, it is not clear why this seems to have been dropped.

Encryption is also not mentioned, but that’s because the powers will be brought in through a statutory instrument enabling Technical Capability Notices.

Help us win new rights and fight off censorship

There’s lots to do. Please help us fight proposals for privatised and unaccountable censorship, and to establish rights for privacy groups to complain directly about data protection breaches. Join ORG for £6/month so we can defend your rights.


[Read more] (1 comments)

June 13, 2017 | Ed Johnson-Williams

UK and France propose automated censorship of online content

Theresa May and Emmanuel Macron's plans to make Internet companies liable for 'extremist' content on their platforms are fraught with challenges. They entail automated censorship, risking the removal of unobjectionable content and harming everyone's right to free expression.

The Government announced this morning that Theresa May and the French President Emmanuel Macron will talk today about making tech companies legally liable if they “fail to remove unacceptable content”. The UK and France would work with tech companies “to develop tools to identify and remove harmful material automatically”.

No one would deny that extremists use mainstream Internet platforms to share content that incites people to hate others and, in some cases, to commit violent acts. Tech companies may well have a role in helping the authorities challenge such propaganda but attempting to close it down is not as straightforward or consequence-free as politicians would like us to believe.

First things first, how would this work? It almost certainly entails the use of algorithms and machine learning to censor content. With this sort of automated takedown process, the companies instruct the algorithms to behave in certain ways. Given the economic and reputational incentives on the companies to avoid fines, it seems highly likely that the companies will go down the route of using hair-trigger, error-prone algorithms that will end up removing unobjectionable content.
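As a minimal sketch of why this goes wrong, consider a deliberately naive filter in Python. This is not any real platform's system; the blocklist and posts are invented. A filter with no understanding of context flags a news report condemning an attack just as readily as propaganda promoting one:

```python
# Deliberately naive, hair-trigger content filter: flag any post containing
# a blocklisted word, with no understanding of context. The blocklist and
# example posts are invented for this sketch.

BLOCKLIST = {"terrorist", "attack", "extremist"}

def should_remove(post: str) -> bool:
    """Return True if the post contains any blocklisted term, ignoring context."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return bool(words & BLOCKLIST)

news_report = "Police have condemned the terrorist attack and appealed for witnesses."
propaganda = "We urge our followers to attack civilians."
unrelated = "We had a lovely picnic by the river."

print(should_remove(news_report))  # True: legal news reporting gets censored
print(should_remove(propaganda))   # True
print(should_remove(unrelated))    # False
```

Real systems use machine learning classifiers rather than a literal blocklist, but the underlying dynamic is the same: when the penalty falls on under-removal and there is no cost to over-removal, thresholds get tuned toward false positives.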

May and Macron’s proposal is to identify and remove new extremist content. It is unclear whose rules they want Internet companies to enforce. The Facebook Files showed Facebook's own policies are to delete a lot of legal but potentially objectionable content, often in a seemingly arbitrary way. Alternatively, if the companies are to enforce UK and French laws on hate speech and so on, that will probably be a lot less censorious than May and Macron are hoping for.

The history of automated content takedown suggests that removing extremist content without removing harmless content will be an enormous challenge. The mistakes made by YouTube’s Content ID system, which automates takedowns of allegedly copyright-infringing content, are well-documented.

Context is king when it comes to judging content. Will these automated systems really be able to tell the difference between posts that criticise terrorism while using video of terrorists and posts promoting terrorism that use the same video?

There are some who will say this is a small price to pay if it stops the spread of extremist propaganda, but it will lead to a framework for censorship that can be used against anything perceived as harmful. All of this might simply result in extremists moving to other platforms to promote their material. But will they actually be less able to communicate?

Questions abound. What incentives will the companies have to get it right? Will there be any safeguards? If so, how transparent will those safeguards be? Will the companies be fined for censoring legal content as well as failing to censor illegal content?

And what about the global picture? Internet companies like Facebook, Twitter and YouTube have a global reach. Will they be expected to create a system that can be used by any national government – even those with poor human rights records? It’s unclear whether May and Macron have thought through whether they are happy for Internet platforms to become an arm of every state they operate in.

All this of course is in the context of Theresa May entering a new Parliament with a very fragile majority. She will be careful only to bring legislation to Parliament that she is confident of getting through. Opposition in Parliament to these plans is far from guaranteed. In April the Labour MP Yvette Cooper recommended fines for tech companies in a report she headed up on the Home Affairs select committee.

ORG will challenge these proposals both inside and outside Parliament. If you'd like to support our work you can do so by joining ORG. It's £6 a month and we'll send you a copy of our fantastic new book when you join.

[Read more] (3 comments)

June 05, 2017 | Jim Killock

Our response to the London and Manchester Attacks

Some of you will know that ORG for many years had our offices in Borough. Until summer 2015, it was a daily occurrence for us to walk and eat in the places where Saturday’s appalling events took place.

As Londoners, we are relieved that we do not know anyone who has been directly affected. It is also genuinely shocking, as it was for some of us during the 2005 bombings, to have personal connections with the places involved in brutal terrorist killings. It is a reminder of the personal trauma that is also being felt by our friends and colleagues in Manchester. Many of us feel very exposed in the face of terrorism and violence.

As individuals, it is also natural to ask whether our own views can withstand this kind of onslaught. Is it right to resist or question measures that the government wishes to pursue, which it claims could improve security, or could at least reassure people that everything possible is being done? Is it selfish, or unrealistic, to argue against potential protections when people are seeking to ensure that, as Theresa May put it, “enough is enough”?

However, many people in London and Manchester will not wish these events to be exploited and used to usher in policies that are ill-thought out, illiberal or otherwise seek to exploit the situation. This is not a denial of the vulnerability that we feel, but a desire to ensure that terrorism does not win. These attacks so often occur in cities with very liberal and open outlooks, where there is little or no expectation of political violence, and toleration is a normal way of being.

London and Manchester are both cities with big creative and tech sectors, with many people very aware of what the Internet does, its benefits and also the dangers of attempts to control, censor and surveil. If the government uses these events to pursue policies that are ineffective, meaningless or dangerous, then many of those who feel a personal investment in seeing our communities protected, may quickly feel that these events are being exploited rather than dealt with maturely.

Calls for an end to tolerance of extremism are perhaps even more ill-judged. It is hard to imagine that the public sector has been tolerating extremism, except in relatively isolated examples. These statements could easily lead to over-reactions and quite divisive policy. For instance, the controversial Prevent programme, backed up by legislative anti-extremist quasi-policing duties across many parts of the public sector, could ramp up, leading to serious misjudgements.

It seems particularly harsh to accuse Muslim communities of tolerating extremist views without also recognising the claims that the Manchester attacker had been reported as potentially dangerous by members of his own community, and without articulating that extremists wish to create divisions between us. Whatever changes may be needed, it would also be wise to recognise that the government too may have had its failings.

We will be looking very carefully at Theresa May’s proposals for online censorship and attempts to limit the security of ordinary users of Internet services. To be clear, we are not saying that there are no measures that could ever be taken. There are already, quite rightly, laws about what is illegal, and duties on companies to act when they are instructed. Companies also do a great deal well beyond their legal duties, because they do not want any association with any kind of criminality.

However, what we have heard so far from the government does not give us confidence that its proposals will be necessary, proportionate, and legally accountable. This is what the Conservative manifesto has to say on page 79:

We will put a responsibility on industry not to direct users – even unintentionally – to hate speech, pornography, or other sources of harm. We will make clear the responsibility of platforms to enable the reporting of inappropriate, bullying, harmful or illegal content, with take-down on a comply-or-explain basis.
We will continue to push the internet companies to deliver on their commitments to develop technical tools to identify and remove terrorist propaganda, to help smaller companies build their capabilities and to provide support for civil society organisations to promote alternative and counter-narratives.
… In addition, we do not believe that there should be a safe space for terrorists to be able to communicate online and will work to prevent them from having this capability. (ORG wiki)

We—and we hope you—will want to know: will the proposals work? Will they create new risks or adverse effects? Who will hold the police or companies to account for their decisions, and how? So far, what we have heard does not give us much confidence that we will receive satisfactory answers.

Theresa May’s speech had the feel of electioneering rather than a common-sense, values and evidence based approach. That is simply not being sufficiently serious and respectful about what has happened.


[Read more] (7 comments)

June 04, 2017 | Jim Killock

The London Attacks

Open Rights Group condemns the appalling attack at London Bridge; this is not only a violent assault on individual lives but an attack against the freedom and security we enjoy in the UK.

It is disappointing that in the aftermath of this attack, the Government’s response appears to focus on the regulation of the Internet and encryption.

This could be a very risky approach. If successful, Theresa May could push these vile networks into even darker corners of the web, where they will be even harder to observe.

But we should not be distracted: the Internet and companies like Facebook are not a cause of this hatred and violence, but tools that can be abused. While governments and companies should take sensible measures to stop abuse, attempting to control the Internet is not the simple solution that Theresa May is claiming.

Real solutions—as we were forced to state only two weeks ago—will require attempts to address the actual causes of extremism. For instance, both Jeremy Corbyn and Theresa May have drawn attention to the importance of finding solutions to the drivers of terrorism in countries including Syria, Iraq and Libya.

Debating controls on the Internet risks distracting from these very hard and vital questions.


[Read more] (4 comments)