Blog


November 30, 2017 | Jim Killock

Home Office concedes independent authorisation

This is a major victory for ORG, although one with dangers. The government has conceded that independent authorisation is necessary for communications data requests, but has refused to budge on retained data and is pushing ahead with the “Request Filter”.

Adding independent authorisation for communications data requests will make the police more effective, as corruption and abuse will be harder. Even if less data is used during investigations, operational effectiveness should improve, and so should trust in the police.

Nevertheless, the government has disregarded many key elements of the judgment:

  • It isn't going to reduce the amount of data retained

  • It won't notify people whose data is used during investigations

  • It won't keep data within the EU; instead it will continue to transfer it abroad, presumably to the USA in particular

  • The Home Office has opted for a ‘six month sentence’ definition of ‘serious crime’, rather than the Lords’ definition of crimes carrying sentences of at least one year.

These are clear evasions and abrogations of the judgment. The mission of the Home Office is to uphold the rule of law. By failing to do what the courts tell it, the Home Office is undermining the very essence of the rule of law.

If the Home Office won't do what the highest courts tell it to do, why should anybody else? By picking and choosing which laws it is willing to respect, it is playing with fire.

The Home Office thinks it is playing a long game, hoping that courts will adjust their views over time, and that we will all get used to privacy being an increasingly theoretical idea. The truth is that, in the surveillance economy, privacy becomes a more necessary principle every day. We all need greater privacy, and so will find ourselves valuing it more.

Nevertheless, the Home Office were always going to find it hardest to concede changes to data retention. We had the right to expect something, even as window dressing, so making no changes at all is pretty audacious. But not in a good way.

ORG, Liberty, BBW, Privacy International and English PEN met Home Office officials today, at precisely the point that the draft changes were released. That timing did not aid matters: it can only be interpreted as a ploy to keep us away from journalists and kill the story.

The Home Office staff who were there did make one very good point about communications data: without it, they said, investigators would have to rely on more intrusive surveillance techniques.

Quite so, and exactly right. All the NGOs present at the meeting were entirely ready to see suspects placed under targeted surveillance measures, rather than having tabs kept on the population at large through retained communications data.

The world has trade-offs, and we would suggest that this is a good one.

One final point that the Home Office decided to ignore was the need to notify people whose data has been accessed. The Home Office claimed that this is done in limited circumstances, so no change is needed.

This is another missed opportunity to improve police performance. Notification has the potential to reduce police abuse and to help people spot rotten apples, as the victim will find out when someone is pursuing a campaign of harassment against them or their community. Independent authorisation will help, of course, but may not always spot the abuse that an individual will understand to be unfair.

Safeguards are suggested for a reason. They are not simply a nicety to satisfy civil liberties campaigners: they are needed to avoid abuse and thereby make the police a better, more trusted, less corruptible, and more effective organisation.

There was one final surprise. The Code of Practice covers the operation of the “Request Filter”. Yet again we are told that this police search engine is a privacy safeguard. We will now go through the Code in fine detail to see if any such safeguards are there. At first glance, they are not.

If the Home Office genuinely believe the Request Filter is a benign tool, they must rewrite this section to make it abundantly clear that it is not a mini version of X-Keyscore (the NSA/GCHQ tool for trawling databases of people linked to their email and web visits) and does not operate as a facility to link and search the vast quantities of retained and collected communications data.
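To make the concern concrete, here is a minimal sketch, entirely of our own invention, of what a facility that “links and searches” retained communications data could amount to: one query answered from every provider’s retained dataset at once. Nothing in the Code of Practice publishes such an interface, and the datasets and field names below are hypothetical.

```python
# Hypothetical sketch of "linking and searching" retained communications
# data across providers. Datasets and field names are invented; this is
# our illustration of the concern, not a published interface.

retained_data = {
    "provider_a": [
        {"subscriber": "alice", "ip": "203.0.113.7", "site": "example.org"},
    ],
    "provider_b": [
        {"subscriber": "bob", "ip": "198.51.100.23", "site": "example.org"},
    ],
}

def filter_query(datasets, predicate):
    """Answer a single request from every provider's dataset at once."""
    return [
        record
        for records in datasets.values()
        for record in records
        if predicate(record)
    ]

# One request returns everyone, across all providers, who visited a site:
visitors = filter_query(retained_data, lambda r: r["site"] == "example.org")
print(sorted(r["subscriber"] for r in visitors))  # ['alice', 'bob']
```

A safeguard worth the name would have to constrain exactly this kind of population-wide querying, and we will be looking for such constraints in the Code.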





October 24, 2017 | Slavka Bielikova

Epson delete competing eBay ink listings citing patent claims

Compatible ink cartridges are being removed by Epson under eBay's Verified Rights Owner programme.

Epson have contacted a number of resellers on eBay warning them to remove their listings. They have followed up by directly removing the items from eBay, as a member of the “VeRO” programme.

Epson enjoy a special trusted status on eBay under the VeRO programme. VeRO allows rightsholders to remove listings that they “believe may infringe on their intellectual property rights”. eBay don’t appear to require actual proof of infringement, for example a court decision, but accept Epson’s word as a trusted company.

The policy for patent takedowns only applies in the EU. This means that Epson-compatible cartridges in the US are not affected.

Epson are alleging that certain compatible ink cartridges infringe their patents GB2433473 and amendment GB2465293. The alleged infringement concerns the alignment of chip contacts on their cartridges. So far, we know Epson have issued takedowns against compatible cartridges T16 XL; T18 XL; T24 XL; T26 XL; T27 XL; T29 XL; T33 XL plus T0715 XL; T0797 XL; T0807 XL.

We are concerned that eBay is giving protection to only one party in this dispute. Rightsholders can easily claim infringement but resellers appear to be unable to assert the legality of their products and listings.

This is both unusual and unfair. Indeed, eBay have said as much in relation to US patent claims:

“eBay has a policy to quickly remove listings when a NOCI [Notice of Claimed Infringement] provides a court order, but eBay rarely removes listings based on mere allegations of infringement. eBay has two reasons for this policy. First, eBay believes that removing listings based on allegations of infringement would be unfair to buyers and the accused sellers. Such a policy, in eBay’s view, would give too much power to unscrupulous patent holders. The second reason eBay has adopted its policy is because it lacks the expertise to construe the patent infringement claims submitted to it and cannot assess the claims when it never possess[es] the products.”

It is also concerning that Epson opted to act against resellers and did not contact the manufacturers first. If Epson believe that their patents are genuinely being infringed, it would be more efficient and just to take direct legal action to prevent the import or manufacture of these products at source.

At this stage, we cannot know if there is any merit to Epson’s claim that these compatible cartridges infringe their patents, but using patents in this way would undermine the legal regime that protects the production of compatible products, including components such as ink cartridges. That would be extremely bad for consumers.

We have contacted the IPO for clarification of eBay’s duties and responsibilities relating to patent claims. We would like to speak to both eBay and Epson about this. In the meantime, if you have been affected by takedowns relating to Epson-compatible ink cartridges and patent claims, please get in touch with us by emailing policy.monitoring@openrightsgroup.org.

 



October 10, 2017 | Matthew Rice

Time to make Data Protection work for consumers

The test for data protection fulfilling its purpose is whether it is improving consumer rights. Open Rights Group are calling for a specific improvement in consumer rights as the Data Protection Bill reaches its second reading debate in the House of Lords on Tuesday.

Data Protection. Two words often followed by discussions about “business compliance”. Plenty of business-to-business conferences out there are making good money off the spectre of data protection. It doesn’t need to be this way. In fact, it shouldn’t. Data protection is about rights: the right of the public to hold to account the private and public bodies that collect and process their data.

Currently, the Government’s Data Protection Bill will give citizens the power to instruct a select group of not-for-profit bodies to represent them in complaints to the data protection authority or the judiciary. This is required of the Government because Article 80(1) is a mandatory provision of the General Data Protection Regulation, which is directly applicable in the United Kingdom as an EU member state.

However, when given the option to strengthen consumer rights further, the Government decided against it. Article 80(2), an optional power in the GDPR, would give those select not-for-profit bodies the ability to bring those same complaints without an affected member of the public instructing them.

Open Rights Group and other members of civil society are calling for 80(2) to be adopted.

This would improve consumer rights in two ways. First, it would protect the most vulnerable members of society, such as children and the elderly. Second, it would give data protection the same status as other consumer rights frameworks, such as those for competition or finance.

It would finally acknowledge something that should have been acknowledged long ago: data protection is a consumer right, and it should carry the same powers that have proven so important in other areas of consumer rights.

PROTECTING THE VULNERABLE

When it comes to standing up for your rights, it is often those who need it most who are least able. In data protection, there have been cases where data sharing has had a profoundly negative effect on the elderly, while research has shown that websites targeted at children either do a terrible job of explaining data collection and processing, or no job at all.

The story of Olive Cooke is a sad one. The 92-year-old poppy seller took her own life; according to a 2016 report from the Fundraising Standards Board, she was said to be “distressed and overwhelmed” by the huge number of requests for donations she was receiving from charities.

The report found that nearly a fifth of the 99 charities sampled had passed on Olive’s details to others, and that most of those had “assumed” permission to share based on the fact that Olive had not proactively opted out of data sharing. The report concluded that there were “inadequate opportunities for the recipient to opt out” of the mailings and data sharing, which collectively created a situation that was “almost uncontrollable”.

The Global Privacy Enforcement Network, a coalition of data protection authorities from around the world, released a report in 2015 showing the level of disregard for data protection standards demonstrated by websites aimed at children. Half of the sampled sites shared personal information with third parties, but only one in three had effective controls in place to limit the collection of personal information. A similarly low number of the websites provided an accessible means for permanent deletion of the personal information they held.

In both of these examples, enforcement either came too late or not at all. One reason is that there was no way for a group to take up the cause for either Olive or the children. While the Information Commissioner’s Office in the United Kingdom did make public comments on the children’s websites research, there has been no evidence of proceedings or follow-up taking place.

If a power similar to Article 80(2) were in place, a not-for-profit body like Open Rights Group could take up enforcement against these bad data protection practices. Exercised appropriately, this power would improve the lot of elderly individuals such as Olive Cooke and of millions of children online.

DATA PROTECTION AS A CONSUMER RIGHT

The idea that a not-for-profit body could bring an independent complaint against bad consumer practices is not novel. Traditional consumer rights areas such as competition and finance have similar powers for a select group of bodies, and some of those powers have led to significant developments in the consumer landscape. It is time data protection was recognised as another area of consumer rights, one growing in importance, and given the same enforcement mechanisms as the others.

Consumer rights group Which? is able to take private enforcement action in the civil courts against traders for infringements of consumer protection legislation. There is no need to find an affected individual to instruct Which? to take on the enforcement; it need merely show that consumers have suffered a loss.

In the financial sector, Which?, Citizens Advice, the Federation of Small Businesses and the Consumer Council for Northern Ireland have the power to present “super-complaints” to the Financial Conduct Authority. It was this form of complaint, exercised by Citizens Advice, that played an important role in tackling the mis-selling of Payment Protection Insurance.

Rights for independent bodies to bring complaints without finding an affected consumer thus already exist in the wider consumer landscape. They have been wielded with discretion and have proven a valuable addition to the consumer rights framework. For data protection to be a modern consumer right, it is only logical that the same accountability frameworks are brought in. Implementing Article 80(2) is the way to achieve this.

GOVERNMENT’S VISION

The Government’s vision for the Data Protection Bill is to make the UK the safest place to live and do business online. It recognises the increasing volumes of personal data, and notes an increasing need to protect it. The vision even recognises that data losses can have distressing repercussions for individuals, and that victims can lose trust. All of this sounds hopeful, as though the Government has identified the issues that data protection can help solve.

If the Government truly wants to achieve its vision, this small addition to the accountability framework for consumer rights in data protection would be a big step towards getting there.

 



August 31, 2017 | Pam Cowburn

Nominations for ORG's Advisory Council are open

Are you an expert in digital issues, civil liberties or campaigning?

ORG is recruiting people to our Advisory Council. This is made up of tech, legal, campaigning and political experts who share our values and campaign goals for digital rights and a free and open Internet. As an Advisory Council member, you will be asked to give ORG advice about a range of issues. Past and present Advisory Council members also help us elect one third of our Board.

This year, we are looking for people with the following policy expertise and skills:

• Privacy experts in data protection, surveillance law and digital privacy

• Free expression experts, as we prepare for government attempts to control online speech

• People with a legal background

• Experts in the Brexit process

• People with a strong background in copyright reform

• Campaigners, including people with grassroots and local organising experience

• People with experience in FOI, Subject Access Requests and media work

• Journalists and investigative journalists

• People with senior political contacts in the Labour, Lib Dem and Conservative parties


We particularly welcome nominations from, or proposing, women and people of colour.

This is your chance to help ORG be the most expert and forward thinking digital civil liberties organisation in the UK. Send nominations to nominations@openrightsgroup.org by Friday 29 September 2017.

 



August 09, 2017 | Alec Muffett

The British Public & its Freedom to Tinker with Data

This is a guest blog by Alec Muffett, a security researcher and member of ORG’s Board of Directors.

Britain - after a somewhat shaky start - has recognised its proud tradition of leadership in cryptographic research and development.

From Alan Turing's success at breaking the Enigma cipher at Bletchley Park, and Tommy Flowers' "Colossus" (also there) to break the Lorenz cipher, to early and secret research into what later became known as "Public Key Encryption" by Clifford Cocks, to GCHQ's vast deployment of technology to enable mass-surveillance of undersea cable communications — whatever one's opinion of the fruits of the work, Britain is recognised as a world leader in the field of cryptography.

And one of the great truths of cryptography is: cryptography only improves when people are trying to break it. From academics to crossword-puzzle fans, cryptography does not evolve unless people are permitted to attack its means, methods and mechanisms.

This brings us to the recently announced "Data Protection Bill", in which you will find a well-intentioned paragraph:

Create a new offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data. Offenders who knowingly handle or process such data will also be guilty of an offence. The maximum penalty would be an unlimited fine.

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/635900/2017-08-07_DP_Bill_-_Statement_of_Intent.pdf (page 10)

This speaks to the matter of "data anonymisation" (and the reverse processes of re-identification or de-anonymisation), where the intention is that some database — for instance a local hospital admissions list — could be stripped of patient names and yet still usefully be processed or shared to research the prevalence of disease in a population.

Done improperly, this can go wrong, leading to failures where the "anonymity" can be defeated by combining several data sources, or by attacking the data set analytically, in order to return some semblance of the original data.
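As a deliberately small illustration of the "combining several data sources" failure, here is a toy linkage attack; every name, field and record below is invented. A "de-identified" admissions list is joined to a public register on shared quasi-identifiers (postcode, birth date and sex), and the names fall straight out.

```python
# Toy linkage attack: re-identify a "de-identified" record by joining it
# to a public dataset on shared quasi-identifiers. All data is invented.

anonymised_admissions = [
    # patient names stripped, but quasi-identifiers retained
    {"postcode": "BS1 4DJ", "birth_date": "1948-03-14", "sex": "F",
     "diagnosis": "hypertension"},
]

public_register = [
    # e.g. an electoral roll or marketing list, with names attached
    {"name": "A. Smith", "postcode": "BS1 4DJ",
     "birth_date": "1948-03-14", "sex": "F"},
    {"name": "B. Jones", "postcode": "SW1A 2AA",
     "birth_date": "1980-01-01", "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_date", "sex")

def link(record, register):
    """Return register entries matching the record on every quasi-identifier."""
    return [person for person in register
            if all(person[k] == record[k] for k in QUASI_IDENTIFIERS)]

for admission in anonymised_admissions:
    for match in link(admission, public_register):
        # the "anonymous" medical record is now tied back to a name
        print(match["name"], "->", admission["diagnosis"])
```

Stripping names does nothing about the quasi-identifiers that make a record unique, which is precisely the weakness that real-world re-identification research keeps demonstrating.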

Ban it?

So it might sound like a good idea to ban re-identification, yes?

Well, no; the techniques of data anonymisation are mostly a form of "code book" cryptography, and (as above) if it's not legal to prod, poke, and try to break the mechanisms of anonymisation, then anonymisation, like cryptography, will not improve.
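To see the "code book" analogy, consider this minimal sketch of pseudonymisation (our own invented example): each name is replaced by a random token, and the mapping table is the code book. Whoever holds, leaks or reconstructs the table can reverse the pseudonyms; and even without it, the fact that one person always maps to the same token lets an attacker accumulate a linkable profile. These are exactly the weaknesses that only attempted breaking reveals.

```python
# Minimal sketch of pseudonymisation as "code book" substitution.
# The mapping table *is* the code book: anyone who obtains or rebuilds
# it can reverse every pseudonym. Invented example data.
import secrets

code_book = {}  # secret mapping: real name -> pseudonym

def pseudonymise(name: str) -> str:
    """Replace a name with a stable random token (a code-book entry)."""
    if name not in code_book:
        code_book[name] = secrets.token_hex(4)
    return code_book[name]

records = [("A. Smith", "hypertension"),
           ("B. Jones", "asthma"),
           ("A. Smith", "diabetes")]

released = [(pseudonymise(name), diagnosis) for name, diagnosis in records]
print(released)  # the same person gets the same token, so records still link
```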

Therefore: banning re-identification will harm all of our individual security; it should be explicitly legal for anyone — the professionals, the crossword-puzzlers — to "have a go" at re-identification of data. Certainly it should be illegal for anyone to exploit or share the fruits of any successful re-identification — as is currently suggested — but the act of re-identification itself should not be prevented or chilled in any way.

To swap metaphors: if you drive a car in the UK then it will have been crash-tested by experts in order to determine how safe it is; but that is not sufficient. We do not rely upon experts to crash them once, declare them safe, and then ban members of the public from crashing their cars. Instead, much of our learning and standards in car safety are from analysing actual, real-world incidents.

Similarly: anonymisation is hard to do correctly, and the failures in how people and organisations have deployed it will only be evident if the many eyes of the general public are permitted to dig into the flaws that may have arisen from one example to the next. It will not be sufficient, as this bill announcement continues, for "…the important role of journalists and whistleblowers […to…] be protected by exemptions."

Everyone has a stake in the collective security of our information, and we — the public — are the code-breakers who should be able to research, and hold to account, any instances of diverse and shoddy anonymisation that may be foisted upon us. Therefore this bill proposal must be amended and the freedom of the public to attempt re-identification must not be abridged.

 — Alec Muffett, security researcher & member of the Board of Directors, ORG

Further reading

https://en.wikipedia.org/wiki/Bombe
https://en.wikipedia.org/wiki/Tommy_Flowers
https://en.wikipedia.org/wiki/Clifford_Cocks
https://en.wikipedia.org/wiki/Tempora
https://en.wikipedia.org/wiki/Cryptanalysis
https://en.wikipedia.org/wiki/De-anonymization
https://en.wikipedia.org/wiki/Data_Re-Identification
https://en.wikipedia.org/wiki/Euro_NCAP
http://www.theregister.co.uk/2017/08/07/data_protection_bill_draft/ 



August 01, 2017 | Ed Johnson-Williams

Sorry Amber Rudd, real people do value their security

It’s not for the home secretary to tell the public they don’t need encryption

Amber Rudd has been out doing the media rounds this morning (£) talking about the issues end-to-end encryption poses to law enforcement. One comment in particular caught our eye:

“Real people often prefer ease of use and a multitude of features to perfect, unbreakable security. Who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family?”

This is a little like saying: "Who uses a car because it has airbags and seatbelts, rather than because it’s a convenient way to get around?"

The Home Office strategy here may be to persuade internet companies to take action by telling them that ordinary people don’t care about security. This would be dangerous and misleading.

Clearly, real people (who are Rudd’s “not real” people?) do value security in their communications, just as they value safety in their cars. Security is not – or at least does not have to be – the opposite of usability.

For many people, good security makes a service usable and useful. Some people want privacy from corporations, abusive partners or employers. Others may be worried about confidential information, sensitive medical conversations, or be working in countries with a record of human rights abuses.

Whatever the reasons people want secure communications, it is not for the Home Secretary to tell the public that they don’t have any real need for end-to-end encryption.

While Rudd seems to be saying she does not want encryption to be “removed” or bypassed, there are other things she might be looking for. It is possible that she wants the internet companies to assist the police with “computer network exploitation” – that’s hacking people’s devices.

It could mean providing communications data about users, which could include information such as: "This user uses this device, often these IP addresses, this version of their operating system with these known vulnerabilities, talks to these people at these times, is online now, is using this IP address, is likely at this address and has visited these websites this many times."
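Structured as data, and purely as our own invented illustration (no real schema or interface is implied), that kind of record might look like this:

```python
# Invented illustration of the kind of communications-data record
# described in the quote above. Field names and values are hypothetical.
communications_data = {
    "user": "example-user",
    "devices": ["handset-1234"],
    "os": {"name": "ExampleOS", "version": "7.0",
           "known_vulnerabilities": ["<published CVEs for this build>"]},
    "frequent_ips": ["198.51.100.23", "203.0.113.7"],
    "online_now": True,
    "current_ip": "203.0.113.7",
    "likely_address": "<subscriber address on file>",
    "contacts": [{"peer": "example-peer",
                  "times": ["2017-08-01T09:30:00Z"]}],
    "site_visit_counts": {"example.org": 42},
}
```

None of this requires breaking encryption, which is exactly why metadata of this kind is so revealing on its own.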

Alternatively, Rudd might mean pushing out compromised app updates with end-to-end encryption disabled.

However, it is likely to be the police rather than the security services asking for this help. While targeted hacking does provide an investigative option that avoids blanket communications surveillance, it would be risky for the police to have these powers: their training and oversight are not as thorough or exacting as in the security services.

What is completely lacking is any serious attempt to tell the public what the Home Office wants internet companies to do to make people’s end-to-end communications accessible.

We should be told what risks the public would be exposed to if the companies were to agree to the Home Office’s private requests. Have these risks been properly weighed up and scrutinised? What safeguards and oversight would there be?

One risk is that users may start to distrust tech companies and the apps, operating systems and devices that they make. When security vulnerabilities are identified, firms push out updates to users. Keeping devices and apps up-to-date is one of the most important ways of keeping them secure. But if people are unsure whether they can trust pending updates, will they keep their devices up-to-date?

It would be incredibly damaging to UK security if large numbers of people were dissuaded from doing so. A prime example is the WannaCry ransomware attack that paralysed parts of the NHS in May. It spread through old Windows computers that hadn’t been updated, forcing doctors to cancel thousands of appointments.

The government must spell out its plans in clear, precise legislation, subject that legislation to full parliamentary scrutiny, and bring security and usability experts into a public debate about these questions.

Measures that deeply affect everybody’s privacy, freedom of expression, and access to information must not be decided behind closed doors.



June 21, 2017 | Jim Killock

Queen’s speech 2017—threats to privacy and free speech

First analyses of the Queen’s Speech are focussing on what isn’t included, as a weakened Conservative Government appears to have dropped a number of its manifesto commitments. But there are several worrying things for digital rights. One welcome development could be data protection legislation to fill in the options the GDPR leaves to member states.

There are references to a review of counter-terrorism and a Commission for Countering Extremism, which will include Internet-related policies. Although details are lacking, these may contain threats to privacy and free speech. The government has also opted for a “Digital Charter”, which isn’t a Bill, but something else.

Here are the key areas that will affect digital rights:

Digital Charter

This isn’t a Bill, but some kind of policy intervention, backed up by “regulation”. This could be the system of fines for social media companies previously mentioned, but this is not explained.

The Digital Charter appears to address both unwanted and illegal content or activity online, and the protection of vulnerable people. The work of the CTIRU and the IWF is mentioned as an example of efforts to remove illegal or extremist content.

At this point, it is hard to know exactly what harms will emerge, but pushing enforcement into the hands of private companies is problematic. It means that decisions never involve courts and are not fully transparent and legally accountable.

Counterterrorism review

There will be a review of counterterrorism powers. The review includes “working with online companies to reduce and restrict the availability of extremist material online”.

This appears to be a watered-down version of the Conservative manifesto commitment to give companies greater responsibility for taking down extremist material from their platforms. Google and Facebook have already issued public statements about how they intend to improve the removal of extremist material from their platforms.

Commission for Countering Extremism

A Commission will look at the topic of countering extremism, likely including on the Internet.

This appears to be a measure to generate ideas and thinking, which could be positive if it involves considering different approaches rather than pressing ahead with policies in order to be seen to be doing something. The quality of the Commission will therefore depend on its ability to take a wide range of evidence and assimilate it impartially; it faces a significant challenge in ensuring that fundamental rights are respected within any policies it suggests.

Data Protection Bill

A new Data Protection Bill “will fulfil a manifesto commitment to ensure the UK has a data protection regime that is fit for the 21st century”. This will replace the Data Protection Act 1998, which is in any case being superseded as a result of the new General Data Protection Regulation passed by the European Parliament last year. Regulations apply directly, so the GDPR does not need to be ‘implemented’ in UK law before Brexit.

We welcome the fact that (at least parts of) the GDPR will be implemented in primary legislation with a full debate in Parliament. It is not clear whether the text of the GDPR will be brought into this Bill, or whether the Bill will merely supplement it.

This appears to be a bill that will at least implement some of the ‘derogations’ (options) in the GDPR, plus the new rules for law enforcement agencies, which came in with the new law enforcement-related Directive and have to be applied by EU member states.

The bulk of the important rights are in the GDPR, and cannot be tampered with before Brexit. We welcome the chance to debate the choices, and especially to press for the right of privacy groups to bring complaints directly.

Missing: sex and relationships education

There is no mention of the introduction of compulsory sex and relationship education in schools, which was a manifesto commitment for all the main parties: Labour, the Lib Dems and the Conservatives. As there appeared to be a consensus on this issue, it is not clear why it has been dropped.

Encryption is also not mentioned, but that’s because the powers will be brought in through a statutory instrument enabling Technical Capability Notices.

Help us win new rights and fight off censorship

There’s lots to do. Please help us fight proposals for privatised and unaccountable censorship, and to establish rights for privacy groups to complain directly about data protection breaches. Join ORG for £6/month so we can defend your rights.

 



June 13, 2017 | Ed Johnson-Williams

UK and France propose automated censorship of online content

Theresa May and Emmanuel Macron's plans to make Internet companies liable for 'extremist' content on their platforms are fraught with challenges. They entail automated censorship, risking the removal of unobjectionable content and harming everyone's right to free expression.

The Government announced this morning that Theresa May and the French President Emmanuel Macron will talk today about making tech companies legally liable if they “fail to remove unacceptable content”. The UK and France would work with tech companies “to develop tools to identify and remove harmful material automatically”.

No one would deny that extremists use mainstream Internet platforms to share content that incites people to hate others and, in some cases, to commit violent acts. Tech companies may well have a role in helping the authorities challenge such propaganda but attempting to close it down is not as straightforward or consequence-free as politicians would like us to believe.

First things first, how would this work? It almost certainly entails the use of algorithms and machine learning to censor content. With this sort of automated takedown process, the companies instruct the algorithms to behave in certain ways. Given the economic and reputational incentives on the companies to avoid fines, it seems highly likely that the companies will go down the route of using hair-trigger, error-prone algorithms that will end up removing unobjectionable content.
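To see why a hair-trigger system over-removes, here is a deliberately crude sketch of our own (the keyword list and threshold are invented; real systems use machine-learning classifiers, but the structural problem of a threshold with no access to context is the same):

```python
# Deliberately crude content filter, illustrating why hair-trigger,
# context-blind takedowns over-remove. Keywords and threshold invented.

FLAGGED_TERMS = {"attack", "bomb", "jihad", "martyr"}
TAKEDOWN_THRESHOLD = 1  # "hair-trigger": a single matching word is enough

def should_remove(post: str) -> bool:
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & FLAGGED_TERMS) >= TAKEDOWN_THRESHOLD

posts = [
    "Join the glorious attack, become a martyr",        # propaganda
    "News: police foil bomb plot in city centre",       # journalism
    "This video condemns the attack and its ideology",  # counter-speech
]
for post in posts:
    print(should_remove(post), "<-", post)
# All three come back True: the filter sees keywords, never context.
```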

May and Macron’s proposal is to identify and remove new extremist content. It is unclear whose rules they want Internet companies to enforce. The Facebook Files showed Facebook's own policies are to delete a lot of legal but potentially objectionable content, often in a seemingly arbitrary way. Alternatively, if the companies are to enforce UK and French laws on hate speech and so on, that will probably be a lot less censorious than May and Macron are hoping for.

The history of automated content takedown suggests that removing extremist content without removing harmless content will be an enormous challenge. The mistakes made by YouTube’s ContentID system, which automates takedowns of allegedly copyright-infringing content, are well documented.

Context is king when it comes to judging content. Will these automated systems really be able to tell the difference between posts that criticise terrorism while using video of terrorists and posts promoting terrorism that use the same video?

There are some who will say this is a small price to pay if it stops the spread of extremist propaganda, but it will create a framework for censorship that can be used against anything perceived as harmful. All of this might simply result in extremists moving to other platforms to promote their material. But will they actually be less able to communicate?

Questions abound. What incentives will the companies have to get it right? Will there be any safeguards? If so, how transparent will those safeguards be? Will the companies be fined for censoring legal content as well as failing to censor illegal content?

And what about the global picture? Internet companies like Facebook, Twitter and YouTube have a global reach. Will they be expected to create a system that can be used by any national government – even those with poor human rights records? It’s unclear whether May and Macron have thought through whether they are happy for Internet platforms to become an arm of every state in which they operate.

All of this, of course, is in the context of Theresa May entering a new Parliament with a very fragile majority. She will be careful to bring to Parliament only legislation that she is confident of getting through. Opposition in Parliament to these plans is far from guaranteed: in April, the Labour MP Yvette Cooper recommended fines for tech companies in a Home Affairs Select Committee report that she headed up.

ORG will challenge these proposals both inside and outside Parliament. If you'd like to support our work you can do so by joining ORG. It's £6 a month and we'll send you a copy of our fantastic new book when you join.
