Blog


December 11, 2017 | Caitlin Bishop

Battle lines have been drawn over the Data Protection bill

Battle lines have been drawn by the Information Commissioner’s Office and the Joint Committee on Human Rights in the debate over the Government’s Data Protection Bill.

Open Rights Group have delivered briefings to Peers on their core campaigning points, including:

  • raising concerns about the hazardous immigration exemptions, alongside the3million, the campaigning organisation representing EU citizens living in the United Kingdom; and

  • calling for not for profit bodies to be given the power to represent data subjects under Article 80(2) of the GDPR.

Both topics are also included in the briefings from the Information Commissioner’s Office and the Deputy Counsel’s note to the Joint Committee on Human Rights. These arrive as Report Stage on the Data Protection Bill begins on Monday 11 December, continues on Wednesday 13 December, and finishes on 10 January in early 2018.

The Information Commissioner’s Office (ICO) is the independent body that enforces UK data protection law. Its views on the proposed Bill give us practical insight into the Bill’s likely effects and should be considered carefully by the Government.

The Joint Committee on Human Rights is a committee made up of representatives from both the House of Commons and the House of Lords. Its role includes scrutinising every Government Bill for its compatibility with human rights. The Deputy Counsel’s note referred to here is from the lawyer who provides the committee with specialist legal advice on the human rights implications a Government bill may raise.

Immigration Exemptions need to go

The Government have introduced an exemption into the Data Protection Bill that would remove the rights of individuals subject to an immigration procedure to discover what personal data companies and public authorities hold on them.

The exemption, if allowed to pass, would set aside fundamental rights such as individuals’ access to personal data about them, the right to erasure, and the right to rectification, among others. With mistakes commonplace in immigration procedures, it is vital that the law retains the power for individuals to hold to account those who collect and process personal data in immigration procedures.

The Information Commissioner’s Office shares Open Rights Group’s concerns about the exemptions, which in effect remove accountability:

“The majority of data protection complaints to the Information Commissioner about the Home Office relate to requests for access to personal data to UK Visas and Immigration….If the exemption is applied, individuals will not be able to access their personal data to identify any actual inaccuracies and it will mean that the system lacks transparency and is fundamentally unfair.”

The exemption found in Schedule 2 Part 4 of the Bill is much broader than just data held by the Home Office, covering any organisation processing information that is used in relation to immigration controls. The current immigration regime extends the responsibility to control immigration to schools, GPs, hospitals, landlords, employers and even the DVLA.

The Government maintained that the exemption “emphatically does not set aside the whole of the GDPR”.  

Open Rights Group argues it emphatically does.

The note from the Deputy Counsel suggested to the Committee that they should consider “why this exemption is ‘necessary in a democratic society’”, which is one of the legal tests for whether an interference with a fundamental human right is actually a violation.

The counsel’s note also raises concerns about the discriminatory nature of the exemption, based on the nationality of individuals, such as the 3 million EU citizens currently living in the United Kingdom. The potential scope for discrimination is why Open Rights Group worked with the3million to raise shared concerns with Peers.

Representation of data subjects

Open Rights Group have been campaigning, since the Bill arrived in the House of Lords, for not for profit bodies to be given the power to represent data subjects without an affected member of the public instructing them.

The Government do not want to incorporate the optional power. They suggested in the debate during Committee stage that the public were already capable of exercising their powers, citing a recent case brought by 5,000 data subjects. They failed to mention that the claim was actually brought by the former Executive Director of the consumer rights organisation Which?. This is not the spontaneous popular organisation of members of the public that the Government’s comments in debate would lead people to believe.

The Information Commissioner’s Office agrees with Open Rights Group on the need for Article 80(2) of the GDPR to be adopted:

“...there are circumstances where data subjects may not necessarily be aware of what data about them is held by organisations, and more importantly what is being done with it. In such instances data subjects could not be expected to know whether and how they could exercise their rights under data protection law…. This point is of particular importance where young and vulnerable data subjects are involved - these groups are less likely to have the means and capability to exercise their rights on their own behalf.”

This support for 80(2) from the Commissioner is welcome. The note from the Deputy Counsel to the Joint Committee on Human Rights also raises the representation of data subjects, suggesting that “the Government’s omission of 80(2) may diminish the protection of privacy rights”.

With these briefings from the ICO and the Deputy Counsel to the Joint Committee on Human Rights, battle lines already drawn by civil society have now been deepened and fortified. The Government can no longer continue to dismiss these concerns out of hand.

 

[Read more]


November 30, 2017 | Jim Killock

Home Office concedes independent authorisation

This is a major victory for ORG, although one with dangers. The government has conceded that independent authorisation is necessary for communications data requests, but has refused to budge on retained data and is pushing ahead with the “Request Filter”.

Adding independent authorisation for communications data requests will make the police more effective, as corruption and abuse will be harder. It will improve operational effectiveness, even if less data is used during investigations, and trust in the police should improve.

Nevertheless, the government has disregarded many key elements of the judgment:

  • It isn't going to reduce the amount of data retained

  • It won't notify people whose data is used during investigations

  • It won't keep data within the EU; instead it will continue to transfer it, presumably specifically to the USA

  • The Home Office has opted for a ‘six month sentence’ definition of ‘serious crime’ rather than the Lords’ definition of crimes capable of sentences of at least one year.

These are clear evasions and abrogations of the judgment. The mission of the Home Office is to uphold the rule of law. By failing to do what the courts tell them, the Home Office is undermining the very essence of the rule of law.

If the Home Office won't do what the highest courts tell it to do, why should anybody else? By picking and choosing the laws they are willing to care about, they are playing with fire.

The Home Office thinks it is playing a long game, hoping that courts will adjust their views over time, and that we will all get used to privacy being an increasingly theoretical idea. The truth is that, in the surveillance economy, privacy becomes a more necessary principle every day. We are all in need of greater privacy, so will find ourselves valuing it more.

Nevertheless, the Home Office were always going to find it hardest to concede changes to data retention. We had the right to expect something, even as window dressing, so making no changes at all is pretty audacious. But not in a good way.

ORG, Liberty, BBW, Privacy International and English PEN met Home Office officials today, at precisely the point that the draft changes were released. That timing perhaps did not aid matters, as it can only be interpreted as a ploy to keep us away from journalists and kill the story.

The Home Office staff who were there did make a very good point about communications data: they said that without communications data, they would have to rely on more intrusive surveillance techniques.

Quite so, and exactly right. All the NGOs present at the meeting were entirely ready to see suspects placed under targeted surveillance measures, rather than having tabs kept on the population at large through retained communications data.

The world has trade-offs, and we would suggest that this is a good one.

One final point that the Home Office decided to ignore was the need to notify people whose data has been accessed. The Home Office claimed that this is done in limited circumstances, so no change is needed.

This is another missed opportunity to improve police performance. Notification has the potential to reduce police abuse and help people spot rotten apples, as the victim will find out when someone is pursuing a campaign of harassment against them or their community. Independent authorisation will help of course, but may not always spot the abuse that an individual will understand to be unfair.

Safeguards are suggested for a reason. They are not simply a nicety to satisfy civil liberties campaigners: they are needed to avoid abuse and thereby make the police a better, more trusted, less corruptible, and more effective organisation.

There was one final surprise. The Code of Practice covers the operation of the “Request Filter”. Yet again we are told that this police search engine is a privacy safeguard. We will now run through the Code in fine detail to see if any such safeguards are there. At first glance, there are not.

If the Home Office genuinely believe the Request Filter is a benign tool, they must rewrite this section to make it abundantly clear that it is not a mini version of X-Keyscore (the NSA/GCHQ’s tool to trawl their databases of people linked to their email and web visits) and does not operate as a facility to link and search the vast quantities of retained and collected communications data.



[Read more]


October 24, 2017 | Slavka Bielikova

Epson delete competing eBay ink listings citing patent claims

Compatible ink cartridges are being removed by Epson under eBay's Verified Rights Owner programme.

Epson have contacted a number of resellers on eBay warning them to remove their listings. They have followed up by directly removing the items from eBay, as members of the “VeRO” programme.

Epson enjoy a special trusted status on eBay under the VeRO programme. VeRO allows rightsholders to remove listings that they “believe may infringe on their intellectual property rights”. eBay do not appear to require actual proof of infringement, for example a court decision, but accept Epson’s word as a trusted company.

The policy for patent takedowns only applies in the EU. This means that Epson compatible cartridges in the US are not affected.

Epson are alleging that certain compatible ink cartridges infringe their patents GB2433473 and amendment GB2465293. The alleged infringement concerns the alignment of chip contacts on their cartridges. So far, we know Epson have issued takedowns against compatible cartridges T16 XL; T18 XL; T24 XL; T26 XL; T27 XL; T29 XL; T33 XL plus T0715 XL; T0797 XL; T0807 XL.

We are concerned that eBay is giving protection to only one party in this dispute. Rightsholders can easily claim infringement but resellers appear to be unable to assert the legality of their products and listings.

This is both unusual and unfair. Indeed, eBay have said as much in relation to US patent claims:

“eBay has a policy to quickly remove listings when a NOCI [Notice of Claimed Infringement] provides a court order, but eBay rarely removes listings based on mere allegations of infringement. eBay has two reasons for this policy. First, eBay believes that removing listings based on allegations of infringement would be unfair to buyers and the accused sellers. Such a policy, in eBay’s view, would give too much power to unscrupulous patent holders. The second reason eBay has adopted its policy is because it lacks the expertise to construe the patent infringement claims submitted to it and cannot assess the claims when it never possess[es] the products.”

It is also concerning that Epson opted to act against resellers and did not contact the manufacturers first. If Epson believe that their patents are genuinely being infringed then it would be more efficient and just to take direct legal action in order to prevent import or manufacture of these products at source.

At this stage, we cannot know if there is any merit to Epson’s claim that these compatible cartridges infringe their patents, but using patents in this way would undermine the legal regime that protects the production of compatible products, including components such as ink cartridges. That would be extremely bad for consumers.

We have contacted the IPO for clarification of eBay’s duties and responsibilities relating to patent claims. We would like to speak to both eBay and Epson about this. Meantime, if you have been affected by takedowns relating to Epson compatible ink cartridges and patent claims, please get in touch with us by emailing policy.monitoring@openrightsgroup.org.

 

[Read more]


October 10, 2017 | Matthew Rice

Time to make Data Protection work for consumers

The test for data protection fulfilling its purpose is whether it is improving consumer rights. Open Rights Group are calling for a specific improvement in consumer rights as the Data Protection Bill reaches its second reading debate in the House of Lords on Tuesday.

Data Protection. Two words often followed by discussions about “business compliance”. Plenty of business-to-business conferences out there are making good money off the spectre of data protection. It doesn’t need to be this way. In fact it shouldn’t. Data protection is about rights - the right for the public to hold to account the private and public bodies that collect and process their data.

Currently, the Government’s Data Protection Bill will give citizens the power to instruct a select group of not for profit bodies to represent them in complaints to the data protection authority or the judiciary. This is required of the Government because Article 80(1) is a mandatory provision in the General Data Protection Regulation which, as the UK is a member state of the European Union, is directly applicable in the United Kingdom.

However, when given the option to further strengthen consumer rights the Government decided against it. Article 80(2), an optional power in the GDPR, would give those select not for profit bodies the option to take those same complaints without having an affected member of the public instruct them. 

Open Rights Group and other members of civil society are calling for 80(2) to be adopted.

This would improve consumer rights in two ways. Firstly, it would protect the most vulnerable members of society such as children and the elderly. Secondly, it would give data protection the same status as other consumer rights frameworks such as competition or finance.

It would finally acknowledge something that should have been acknowledged a long time ago: data protection is a consumer right, and as such it should be backed by the same powers that have proven so important in other areas of consumer rights.

PROTECTING THE VULNERABLE

When it comes to standing up for your rights, it is often those that need it most that are least able. In the case of data protection there have been cases of the profoundly negative effect data sharing has had on the elderly, while research has shown how websites targeted at children are either doing a terrible job of explaining data collection and processing, or no job at all.

The story of Olive Cooke is a sad one. The 92-year-old poppy seller, who took her own life, was said to have been “distressed and overwhelmed” by the huge number of requests for donations from charities she was receiving, according to a 2016 report from the Fundraising Standards Board.

The report found that nearly a fifth of the 99 charities sampled had passed on Olive’s details to others, and that most of those had “assumed” permission to share based on the fact that Olive had not proactively opted out of data sharing. The report concluded that there were “inadequate opportunities for the recipient to opt out” of the mailings and data sharing, which collectively created a situation that was “almost uncontrollable”.

The Global Privacy Enforcement Network, a coalition of data protection authorities from around the world, released a report in 2015 showing the level of disregard for data protection standards demonstrated by websites aimed at children. This included half of the sampled sites sharing personal information with third parties, but only one in three of them providing effective controls to limit the collection of personal information. A similarly low number of the websites provided an accessible means for the permanent deletion of the personal information they held.

In both of these examples, enforcement either came too late or not at all. One reason is that there was no way for a group to take up the cause for either Olive or the children. While the Information Commissioner’s Office in the United Kingdom did make public comments on the children’s websites research, there has been no evidence of proceedings or follow-up taking place.

If a power similar to Article 80(2) were in place, a not for profit body like Open Rights Group could take up the enforcement against these bad data protection practices. If done appropriately, the exercise of this power would improve the experience of elderly individuals such as Olive Cooke or millions of children online.

DATA PROTECTION AS A CONSUMER RIGHT

The idea that a not for profit body could take up an independent complaint against bad consumer practices is not novel. Traditional areas of consumer rights such as competition and finance have similar powers for a select group of bodies, and some of these powers have led to significant developments in the consumer landscape. It is time data protection was recognised as another area of consumer rights, one that is growing in importance, and given the same enforcement mechanisms as the others.

Consumer rights group Which? is capable of taking private enforcement action in the civil courts against traders for infringements of consumer protection legislation. There is no need to find an affected individual to instruct Which? to take on the enforcement; instead, it need merely show that consumers have suffered a loss.

In the financial sector, Which?, Citizens Advice, the Federation of Small Businesses and the Consumer Council for Northern Ireland have the power to present “super-complaints” to the Financial Conduct Authority. It was this form of complaint, exercised by Citizens Advice, that played an important role in tackling the mis-selling of Payment Protection Insurance.

The right for independent bodies to bring complaints without first finding an affected consumer already exists in the wider consumer landscape. These powers have been wielded with discretion and have proved a valuable addition to the consumer rights framework. For data protection to be a modern consumer right, it is only logical that the same accountability frameworks are brought in. Implementing Article 80(2) is the way to achieve this.

GOVERNMENT’S VISION

The Government’s vision for the Data Protection Bill is to make the UK the safest place to live and do business online. It recognises the increasing volumes of personal data and notes an increasing need to protect it. The vision even recognises that data losses can have distressing repercussions for individuals, and that victims can lose trust. All of this sounds hopeful, as though the Government has identified the issues that data protection can help solve.

If the Government truly wants to achieve its vision, this small addition to improve the accountability framework for consumer rights in data protection would be a big step towards getting there.

 

[Read more]


August 31, 2017 | Pam Cowburn

Nominations for ORG's Advisory Council are open

Are you an expert in digital issues, civil liberties or campaigning?

ORG is recruiting people to our Advisory Council. This is made up of tech, legal, campaigning and political experts who share our values and campaign goals for digital rights and a free and open Internet. As an Advisory Council member, you will be asked to give ORG advice about a range of issues. Past and present Advisory Council members also help us elect one third of our Board.

This year, we are looking for people with the following policy expertise and skills:

• Privacy experts in data protection, surveillance laws and digital privacy

• Free expression experts, as we prepare for government attempts to control online speech

• People with a legal background

• Experts in the Brexit process

• People with a strong background in copyright reform

• Campaigners, including people with grassroots and local organising experience

• People with experience in FOI, Subject Access Requests and media work

• Journalists and investigative journalists

• People with senior political contacts in the Labour, Lib Dem and Conservative parties


We particularly welcome nominations from, or proposing, women and people of colour.

This is your chance to help ORG be the most expert and forward thinking digital civil liberties organisation in the UK. Send nominations to nominations@openrightsgroup.org by Friday 29 September 2017.

 

[Read more]


August 09, 2017 | Alec Muffett

The British Public & its Freedom to Tinker with Data

This is a guest blog by Alec Muffett, a security researcher and member of ORG’s Board of Directors.

Britain - after a somewhat shaky start - has recognised its proud tradition of leadership in cryptographic research and development.

From Alan Turing's success at breaking the Enigma cipher at Bletchley Park, and Tommy Flowers' "Colossus" (also there) to break the Lorenz cipher, to early and secret research into what later became known as "Public Key Encryption" by Clifford Cocks, to GCHQ's vast deployment of technology to enable mass-surveillance of undersea cable communications — whatever one's opinion of the fruits of the work, Britain is recognised as a world leader in the field of cryptography.

And one of the great truths of cryptography is: cryptography only improves when people are trying to break it. From academics to crossword-puzzle fans, cryptography does not evolve unless people are permitted to attack its means, methods and mechanisms.

This brings us to the recently announced "Data Protection Bill", in which you will find a well-intentioned paragraph: (our emphasis)

Create a new offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data. Offenders who knowingly handle or process such data will also be guilty of an offence. The maximum penalty would be an unlimited fine.

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/635900/2017-08-07_DP_Bill_-_Statement_of_Intent.pdf (page 10)

This speaks to the matter of "data anonymisation" (and the reverse processes of re-identification or de-anonymisation), where the intention is that some database — for instance a local hospital admissions list — could be stripped of patient names and yet still be usefully processed or shared to research the prevalence of disease in a population.

Done improperly, this can go wrong:

…leading to failures where the "anonymity" can be defeated by combining several data sources, or by attacking the data set analytically, in order to return some semblance of the original data.
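
To make that failure mode concrete, here is a minimal, hypothetical Python sketch of a "linkage attack". Every record, name and field in it is invented for illustration, but the mechanism it shows (joining a name-stripped dataset to a second, public dataset on shared quasi-identifiers such as postcode, birth year and sex) is the standard way such "anonymity" gets defeated in practice.

```python
# Toy linkage attack: re-identify a name-stripped dataset by joining it to a
# public register on shared quasi-identifiers. All data here is invented.

# "Anonymised" hospital admissions: names removed, quasi-identifiers kept.
admissions = [
    {"postcode": "BS1 4ND", "birth_year": 1948, "sex": "F", "diagnosis": "diabetes"},
    {"postcode": "SW1A 2AA", "birth_year": 1983, "sex": "M", "diagnosis": "asthma"},
]

# A separate, publicly available list that still carries names alongside
# the same quasi-identifiers (think of an electoral-roll-style dataset).
public_register = [
    {"name": "A. Example", "postcode": "BS1 4ND", "birth_year": 1948, "sex": "F"},
    {"name": "B. Sample", "postcode": "SW1A 2AA", "birth_year": 1983, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")


def reidentify(records, register):
    """Yield (name, diagnosis) where the quasi-identifiers match exactly one register entry."""
    for record in records:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        matches = [entry for entry in register
                   if tuple(entry[q] for q in QUASI_IDENTIFIERS) == key]
        if len(matches) == 1:  # a unique match defeats the "anonymisation"
            yield matches[0]["name"], record["diagnosis"]


for name, diagnosis in reidentify(admissions, public_register):
    print(f"{name} -> {diagnosis}")
```

Run against these toy records, it prints each person's name next to their supposedly anonymised diagnosis. Defences such as aggregation or k-anonymity work precisely by ensuring that no combination of quasi-identifiers matches a single person, and the only way to know whether a published dataset actually achieves that is to let people attempt attacks like this one.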

Ban it?

So it might sound like a good idea to ban re-identification, yes?

Well, no; the techniques of data anonymisation are mostly a form of "code book" cryptography, and (as above) if it's not legal to prod, poke, and try to break the mechanisms of anonymisation, then anonymisation, like cryptography, will not improve.

Therefore: banning re-identification will harm all of our individual security; it should be explicitly legal for anyone — the professionals, the crossword-puzzlers — to "have a go" at re-identification of data. Certainly it should be illegal for anyone to exploit or share the fruits of any successful re-identification — as is currently suggested — but the act of re-identification itself should not be prevented nor chilled in any way.

To swap metaphors: if you drive a car in the UK then it will have been crash-tested by experts in order to determine how safe it is; but that is not sufficient. We do not rely upon experts to crash them once, declare them safe, and then ban members of the public from crashing their cars. Instead, much of our learning and standards in car safety are from analysing actual, real-world incidents.

Similarly: anonymisation is hard to do correctly, and the failures in how people and organisations have deployed it will only be evident if the many eyes of the general public are permitted to dig into the flaws that may have arisen from one example to the next. It will not be sufficient, as this bill announcement continues, for "…the important role of journalists and whistleblowers […to…] be protected by exemptions."

Everyone has a stake in the collective security of our information, and we — the public — are the code-breakers who should be able to research, and hold to account, any instances of diverse and shoddy anonymisation that may be foisted upon us. Therefore this bill proposal must be amended and the freedom of the public to attempt re-identification must not be abridged.

 — Alec Muffett, security researcher & member of the Board of Directors, ORG

Further reading

https://en.wikipedia.org/wiki/Bombe
https://en.wikipedia.org/wiki/Tommy_Flowers
https://en.wikipedia.org/wiki/Clifford_Cocks
https://en.wikipedia.org/wiki/Tempora
https://en.wikipedia.org/wiki/Cryptanalysis
https://en.wikipedia.org/wiki/De-anonymization
https://en.wikipedia.org/wiki/Data_Re-Identification
https://en.wikipedia.org/wiki/Euro_NCAP
http://www.theregister.co.uk/2017/08/07/data_protection_bill_draft/ 

[Read more]


August 01, 2017 | Ed Johnson-Williams

Sorry Amber Rudd, real people do value their security

It’s not for the home secretary to tell the public they don’t need encryption

Amber Rudd has been out doing the media rounds this morning (£) talking about the issues end-to-end encryption poses to law enforcement. One comment in particular caught our eye:

“Real people often prefer ease of use and a multitude of features to perfect, unbreakable security. Who uses WhatsApp because it is end-to-end encrypted, rather than because it is an incredibly user-friendly and cheap way of staying in touch with friends and family?”

This is a little like saying: "Who uses a car because it has airbags and seatbelts, rather than because it’s a convenient way to get around?"

The Home Office strategy here may be to persuade internet companies to take action by telling them that ordinary people don’t care about security. This would be dangerous and misleading.

Clearly, real people (who are Rudd’s “not real” people?) do value security in their communications, just as they value safety in their cars. Security is not – or at least does not have to be – the opposite of usability.

For many people, good security makes a service usable and useful. Some people want privacy from corporations, abusive partners or employers. Others may be worried about confidential information, sensitive medical conversations, or be working in countries with a record of human rights abuses.

Whatever the reasons people want secure communications, it is not for the Home Secretary to tell the public that they don’t have any real need for end-to-end encryption.

While Rudd seems to be saying she does not want encryption to be “removed” or bypassed, there are other things she might be looking for. It is possible that she wants the internet companies to assist the police with “computer network exploitation” – that’s hacking people’s devices.

It could mean providing communications data about users which could include data such as: "This user uses this device, often these IP addresses, this version of their operating system with these known vulnerabilities, talks to these people at these times, is online now, is using this IP address, is likely at this address and has visited these websites this many times."

Alternatively, Rudd might mean pushing out compromised app updates with end-to-end encryption disabled.

However, it is likely to be the police rather than the security services asking for this help. While targeted hacking does provide an investigative option that avoids blanket communications surveillance, it would be risky for the police to have these powers. Training and oversight, after all, are not as thorough or exacting as in the security services.

What is completely lacking is any serious attempt to tell the public what the Home Office wants internet companies to do to make people’s end-to-end communications accessible.

We should be told what risks the public would be exposed to if the companies were to agree to the Home Office’s private requests. Have these risks been properly weighed up and scrutinised? What safeguards and oversight would there be?

One risk is that users may start to distrust tech companies and the apps, operating systems and devices that they make. When security vulnerabilities are identified, firms push out updates to users. Keeping devices and apps up-to-date is one of the most important ways of keeping them secure. But if people are unsure whether they can trust pending updates, will they keep their devices up-to-date?

It would be incredibly damaging to UK security if large numbers of people were dissuaded from doing so. A prime example is the WannaCry ransomware attack that paralysed parts of the NHS in May. It spread through old Windows computers that hadn’t been updated, forcing doctors to cancel thousands of appointments.

The government must spell out its plans in clear, precise legislation and subject that legislation to full parliamentary scrutiny, and it should bring security and usability experts into a public debate about these questions.

Measures that deeply affect everybody’s privacy, freedom of expression, and access to information must not be decided behind closed doors.

[Read more] (2 comments)


June 21, 2017 | Jim Killock

Queen’s speech 2017—threats to privacy and free speech

First analyses of the Queen’s Speech are focussing on what isn’t included, as a weakened Conservative Government appears to have dropped a number of its manifesto commitments, but there are several worrying things for digital rights. One welcome development could be data protection legislation, to fill in the options left open by the GDPR.

There are references to a review of Counter-terrorism and a Commission for Countering Extremism, which will include Internet-related policies. Although details are lacking, these may contain threats to privacy and free speech. The government has opted for a “Digital Charter”, which isn’t a Bill, but something else.

Here are the key areas that will affect digital rights:

Digital Charter

This isn’t a Bill, but some kind of policy intervention, backed up by “regulation”. This could be the system of fines for social media companies previously mentioned, but this is not explained.

The Digital Charter appears to address both unwanted and illegal content or activity online, and the protection of vulnerable people. The work of the CTIRU (Counter Terrorism Internet Referral Unit) and the IWF (Internet Watch Foundation) is mentioned as an example of work to remove illegal or extremist content.

At this point, it is hard to know exactly what harms will emerge, but pushing enforcement into the hands of private companies is problematic. It means that decisions never involve courts and are not fully transparent and legally accountable.

Counterterrorism review

There will be a review of counterterrorism powers. The review includes “working with online companies to reduce and restrict the availability of extremist material online”.

This appears to be a watered-down version of the Conservative manifesto commitment to give companies greater responsibility to take down extremist material from their platforms. Google and Facebook have already issued public statements about how they intend to improve the removal of extremist material from their platforms.

Commission for Countering Extremism

A Commission will look at the topic of countering extremism, likely including on the Internet.

This appears to be a measure to generate ideas and thinking, which could be a positive approach if it involves considering different approaches rather than pressing ahead with policies in order to be seen to be doing something. The quality of the Commission will therefore depend on its ability to take a wide range of evidence and assimilate it impartially; it faces a significant challenge in ensuring that fundamental rights are respected in any policies it suggests.

Data Protection Bill

A new Data Protection Bill, “will fulfil a manifesto commitment to ensure the UK has a data protection regime that is fit for the 21st century”. This will replace the Data Protection Act 1998, which is in any case being removed as the result of the new General Data Protection Regulation passed by the European Parliament last year. Regulations apply directly, so the GDPR does not need to be ‘implemented’ in UK law before Brexit.

We welcome that (at least parts of) the GDPR will be implemented in primary legislation, with a full debate in Parliament. It is not clear whether the text of the GDPR will be brought into this Bill, or whether the Bill will merely supplement it.

This appears to be a bill that will at least implement some of the ‘derogations’ (options) in the GDPR, plus the new rules for law enforcement agencies, which came in with the new law enforcement-related Directive and have to be applied by EU member states.

The bulk of the important rights are in the GDPR, and cannot be tampered with before Brexit. We welcome the chance to debate the choices, and especially to press for the right of privacy groups to bring complaints directly.

Missing: sex and relationships education

There is no mention of the introduction of compulsory sex and relationship education in schools, which was a manifesto commitment for all the main parties, Labour, Lib Dem and Conservative. As there appeared to be a consensus on this issue, it is not clear why this seems to have been dropped.

Encryption is also not mentioned, but that’s because the powers will be brought in through a statutory instrument enabling Technical Capability Notices.

Help us win new rights and fight off censorship

There’s lots to do. Please help us fight proposals for privatised and unaccountable censorship, and to establish rights for privacy groups to complain directly about data protection breaches. Join ORG for £6/month so we can defend your rights.

 

[Read more] (1 comments)