Blog


May 22, 2017 | Jim Killock

Facebook censorship complaints could award government immense control

Facebook censorship complaints run both ways—and we should be careful when politicians press for more controls

The leaked Facebook Files, the social media company’s internal policies for content regulation published by the Guardian, show that, like a relationship status on Facebook, content moderation is complicated.

It is complicated because Facebook is a near-monopoly in the social media market, making them both a power player and a target for regulation. It is complicated because there is an uneasy balance to strike between what is law, what is code, and what is community decency.

It is complicated because Facebook finds itself in a media landscape determined to label them as either a publisher or a platform, when neither is a suitable title. And ultimately, it is complicated because we are talking about human interaction and the regulation of speech at a scale never seen before.

Big player. Big target

Facebook are a monopoly. And that is a big problem. They have almost 2 billion users, operate in almost every country in the world, and hoard the data generated by a community of a size never before seen. The leaks show that even they seem unclear how best to police it.

It could be argued that, as a private company, they can create their terms and conditions as they see fit. But their global domination means that their decisions have a global impact on free speech, and this impact creates obligations to uphold standards of free expression that are not normally expected of a private company.

Operating in so many countries also means that Facebook are an easy target for criticism from many different governments and media, who will blame them for things that go wrong because of their sheer scale, and who see an easy way to impose control by targeting them through the media or regulation. This was most recently seen in the Home Affairs Committee report, where social media companies were accused of behaving irresponsibly in failing to police their platforms.

World Policing in the Community

Facebook’s business model is premised on users being on their site and sharing as much information as possible, so that their personal data can be used to sell highly targeted advertising. Facebook do not want to lose customers who are offended, which means their takedown threshold is offence, a much lower bar than illegality.

Facebook is not unregulated. The company has to comply with court orders when served, but as the leaked files show, making judgements about content that is probably legal but offensive or graphic is much more difficult.

Being a community police force for the world is a deeply complicated position, even more so if your platform is often seen as being the Internet.

Law versus community standards

Facebook will take down material reported to them that is illegal. However, the material highlighted by the Guardian as inappropriate for publication falls into a category of offensiveness, such as graphic material or sick jokes, rather than illegality.

Where people object to illegal material appearing and not being removed fast enough, we should also bear in mind the actual impact. For instance, how widely has it actually circulated? On social media, longevity and contacts are what tend to produce visibility for your content. We suspect a lot of ‘extremist’ postings are not widely seen, as the accounts will be swiftly deleted.

In both cases, there is a serious argument that it is society, not Facebook, generating unwanted material. While Facebook can be targeted to remove it, this won’t stop its existence. At best, it might move off the platform and arrive in a less censored, probably less responsible environment, even one that caters to and encourages bad behaviour. 4chan is a clear example of this: its uncensored message boards attract abuse, sick jokes and the co-ordination of attacks.

Ultimately, behaviour such as abuse, bullying and harassment needs to be dealt with by law enforcement. Only law enforcement can deliver protection, bring prosecutions and work with individuals to correct their behaviour and reduce actual offending. Failing to take serious action against real offenders encourages bad behaviour.

Publisher v Platform

What happens when your idea of bringing the world together suddenly puts you in the position of a publisher? When people are no longer just sharing their holiday pictures, but organising protests, running campaigns, even publishing breaking news?

Some areas of the media have long delighted in the awkward positioning of Facebook as a publisher (subject to editorial controls and speech regulation) and not a platform (a service where users can express ideas that are not representative of the service). It might be worth those media remembering that they too rely on “safe harbour” regulations designed to protect platforms, for all those comments that their readers post below their articles. Placing regulatory burdens that create new legal liabilities for user-generated content would be onerous and likely to limit free expression, which no-one should want to see.

Safe harbour arrangements typically allow user content to be published without liability, and place a duty on platforms to take down material when it is shown to be illegal. Such arrangements are only truly fair when courts are involved. Where an individual, or the police, can notify without a court, platforms are forced to become risk averse. Under the DMCA copyright arrangements, for instance, a user can contest their right for material to be re-published after takedown, but must also agree to be taken to court. All of this places the burden of risk on the defendant rather than the accuser. Only a few of those accused will opt to take the legal risk, whereas normally accusers would be the ones who have to be careful about who they take to court over their content.
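The burden-shifting described above can be sketched as a tiny state machine. This is our own simplified illustration of the DMCA-style notice-and-takedown flow, not statutory language; the function and its names are hypothetical:

```python
# Simplified sketch of a notice-and-takedown flow, showing where the
# burden of legal risk sits at each step.

def takedown_flow(counter_notice_filed: bool, owner_sues: bool) -> str:
    """Return the fate of the disputed content."""
    # Step 1: the owner sends a notice; the platform removes the material
    # immediately, to keep its safe-harbour protection.
    if not counter_notice_filed:
        # Most users stop here: contesting means consenting to be sued.
        return "removed"
    # Step 2: the user counter-notifies, accepting court jurisdiction.
    if owner_sues:
        # Step 3a: only now does a court ever see the dispute.
        return "decided by a court"
    # Step 3b: if the owner does not sue, the platform may restore it.
    return "restored"

print(takedown_flow(counter_notice_filed=False, owner_sues=False))  # removed
print(takedown_flow(counter_notice_filed=True, owner_sues=False))   # restored
print(takedown_flow(counter_notice_filed=True, owner_sues=True))    # decided by a court
```

Note that the default path, taken whenever the user declines the legal risk of a counter-notice, ends in removal without any court involvement.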

Money to burn. People to hire

Facebook have enough money that they should be able to go further in hiring humans to do this job better. They appear to be doing that, and should be trying to involve more human judgement in speech regulation, not less.

Considering the other options on the market, more human involvement would seem the most reasonable approach. Facebook have tried, and failed miserably, to moderate content by algorithm.

However, moderating content across so many different cultures and countries, reportedly leaving human moderators only 10 seconds to decide whether to take down content, is a massive task that will only grow as Facebook expands.

We need to understand that moderation is rules-based, not principle-based. Moderators strictly match against Facebook’s “rules” rather than working from principles about whether something is reasonable. The result is that decisions will often seem arbitrary or just bad. The need for rules rather than principles stems from making judgements at scale, and seems unavoidable.

Algorithms, to be clear, can only make rules-based approaches less likely to be sane, and more likely to miss human, cultural and contextual nuances. Judgement is an exclusively human capability; machine learning only simulates it. When a technologist embodies their or their employer’s view of what’s fair into a technology, any potential for the exercise of discretion is turned from a scale to a step and humanity is quantised. That quantisation of discretion is always in the interest of the person controlling the technology.
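To make the rules-versus-principles point concrete, here is a deliberately crude sketch, entirely our own and not Facebook’s actual rule set, of how literal rule-matching quantises judgement into a hard step:

```python
# A toy rules-based moderator: it matches phrases against a fixed rule
# list, with no room for context, intent or proportionality.

BANNED_PHRASES = ["kill all", "burn down"]  # illustrative rules only

def rules_based_decision(post: str) -> str:
    """Return 'remove' or 'keep' by literal phrase matching."""
    text = post.lower()
    if any(phrase in text for phrase in BANNED_PHRASES):
        return "remove"
    return "keep"

# A genuine threat and an obvious figure of speech get identical treatment:
print(rules_based_decision("I will kill all of you"))                  # remove
print(rules_based_decision("This commute will kill all my patience"))  # remove
# while a rephrased threat slips through untouched:
print(rules_based_decision("Every one of them should be eliminated"))  # keep
```

Whatever the human moderator might think of each post, the rule list has already decided; the scale of discretion has been collapsed to a binary.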

One possible solution to the rigidity of rules-based moderation is to create more formal flexibility, such as appeals mechanisms. However, Facebook are most likely to prefer to deal with exceptional cases as they come to light through public attention, rather than impose costs on themselves.

Manifesto pledges

Any push for more regulation, such as that suggested by the Conservative manifesto, is highly likely to encourage the automation of judgements to reduce costs, and validates this demand being made by every other government. The Conservative pledges here seem to us to be a route straight to the Computer saying No.

Thus, if you are concerned about the seemingly arbitrary, opaque rules that Facebook sets for its content moderators, then you should be doubly concerned by the Conservatives’ manifesto pledge to bring further state regulation to the Internet.

Governments have been the home of opaque and arbitrary rules for years, and the Conservatives, if elected, would deliver an internet where the state creates incentives to remove anything potentially objectionable (anything that could create adverse publicity, perhaps) and decides what level of security citizens should be able to enjoy from the platforms they use every day. That is not the future we want to see.

So we have a private monopoly whose immense power in deciding what content people view is concerning, but also a concerning proposal for too much state involvement in that decision. A situation where you want to see better rules in place, but not rules that turn platforms into publishers; and a problem so vast that just hiring more people would not solve it alone. Like we said, it’s complicated.

What is simple, however, is that Facebook present a great opportunity for media stories, and for complaints followed by power grabs from governments wanting to police the acceptability of speech that they would never dare make illegal. We may regret it if these political openings translate into legislation.


[Read more] (2 comments)


May 16, 2017 | Mike Morel

The UK Government should protect encryption not threaten it

It is difficult to overstate the importance of encryption. A cornerstone of the modern digital economy, we rely on it when we use our digital devices or make transactions online. Physical infrastructure like power stations and transport systems are dependent on it too.

Encryption also strengthens democracy by underpinning digital press freedom. Whistleblowers can’t safely reveal official corruption to journalists without it.

Laws restricting encrypted communications have generally been associated with more authoritarian governments, but lately proposals to circumvent encryption have been creeping into western democracies. Former Prime Minister David Cameron attacked encryption after the Paris attacks in 2015, and Home Secretary Amber Rudd MP recently said that there should be a way around end-to-end encryption on devices like WhatsApp.

As it happens, Amber Rudd already has legislation that claims to give her the power to tell WhatsApp to remove “electronic protection” (read “encryption”). She can issue a technical capability notice (TCN) which instructs commercial software developers to produce work-arounds in their software without outlawing or limiting encryption itself. Just over a week ago, ORG leaked a secret Home Office consultation on the draft TCN regulation, which gives more detail about how this power can be used.

To be clear, this goes way beyond WhatsApp. The Government wants access to all UK telecommunications encompassing a wide variety of services. Any organisation that facilitates communications among 10,000 or more users could be issued a TCN including email account providers, data storage services, games companies, and (they claim) even overseas operators with enough UK users.

The current ransomware outbreak shows how software vulnerabilities used by security agencies can fall into the wrong hands. There is no reason to think backdoors intentionally created for Government access could not be exploited as well. Why start a digital arms race when we may be releasing new weapons to criminals and hostile governments?

The lack of transparency surrounding TCNs is another problem. The regulation makes no mention of oversight or risk assessment mechanisms, and the consultation’s secrecy reduces accountability even further. Sometimes the Government has good reason for secrecy, but this is not one of those times. When digital services are compromised, people must know, because it affects their privacy and security, and everyone has a right to protect themselves.

Business owners should be concerned because their products and customers could be seriously affected, and the process by which they might appeal a TCN is unclear. The only real ground for complaint appears to be “feasibility”, and many things may be ‘feasible’ but a very bad idea.

From securing the economy to underpinning press freedom, the need for strong encryption is vital. We alter it at our own peril, especially if we do so in secret. Tell the Home Office yourself before the secret consultation ends on 19 May.

See ORG’s detailed breakdown of the TCN regulation here.

[Read more] (1 comments)


May 13, 2017 | Jim Killock

NHS ransom shows GCHQ putting us at risk

The NHS ransom shows the problems with GCHQ’s approach to hacking and vulnerabilities, and this must be made clear to MPs who have given them sweeping powers in the IP Act that could result in the same problems recurring in the future.

Here are four points that stand out to us. These issues of oversight relating to hacking capabilities are barely examined in the Investigatory Powers Act, which concentrates oversight and warrantry on the balance to be struck in targeting a particular person or group, rather than the risks surrounding the capabilities being developed.

GCHQ and the NSA knew about the problem years ago

Vulnerabilities, as we know from the Snowden documents, are shared between the NSA and GCHQ, as are the tools built to exploit them. These tools are then used to hack into computer equipment as a stepping stone to other data. These break-ins target all kinds of companies, sites and groups, who may be entirely innocent but are useful to the security agencies for getting closer to their actual targets.

In this case, the exploit, called ETERNALBLUE, was leaked after a break-in at, or leak from, the NSA’s partners this April. It affects Windows file sharing, including out-of-support systems such as Windows XP. It has now been exploited by criminals to ransom organisations still running this software.

While GCHQ cannot be blamed for the NHS’s reliance on out of date software, the decision that the NSA and GCHQ have made in keeping this vulnerability secret, rather than trying to get it fixed, means they have a significant share of the blame for the current NHS ransom.

GCHQ are in charge of hacking us and protecting us from hackers

GCHQ are normally responsible for ‘offensive’ operations, or hacking and breaking into other networks. They also have a ‘defensive’ role, at the National Cyber Security Centre, which is meant to help organisations like the NHS keep their systems safe from these kinds of attack.

GCHQ are therefore forced to trade off their use of secret hacking exploits against the risks these exploits pose to organisations like the NHS.

They have a tremendous conflict of interest, which in ORG’s view, ought to be resolved by moving the UK defensive role out of GCHQ’s hands.

Government also needs to have a robust means of assessing the risks that GCHQ’s use of vulnerabilities might pose to the rest of us. At the moment, ministers can only turn to GCHQ to ask about the risks, and we assume the same is true in practice of oversight bodies and future Surveillance Commissioners. The obvious way to improve this and get more independent advice is to split National Cyber Security Centre from GCHQ.

GCHQ’s National Cyber Security Centre had no back up plan

We also need to condemn the lack of action from the NCSC and others once the exploit was known to be “lost” this April. Some remedial action was taken in the US: Microsoft was informed and created a patch in March, which was not, however, issued freely for older systems until today.

Hoarding vulnerabilities is of course inherently dangerous, but apparently having no adequate US plan, and no UK-wide plan at all, to execute when they are lost is inexcusable. This is especially true given that this vulnerability is obviously capable of being used by self-spreading malware.

GCHQ are not getting the balance between offence and defence right

The bulk of GCHQ’s resources go into offensive capabilities, including hoarding data, analytics and developing hacking methods. There needs to be serious analysis of whether this is really producing the right results. The imbalance is likely to remain while GCHQ, which will always prioritise offence, is in charge of both offence and defence. Offence has also been emphasised by politicians who feel pressure to defend against terrorism, whatever the cost. Defence, such as ensuring critical national infrastructure like the NHS is protected, is the poor relation of offensive capabilities. Perhaps the NHS ransom is the result.


[Read more] (8 comments)


May 08, 2017 | Mike Morel

A brief chance for better UK data protection law

The EU’s General Data Protection Regulation (GDPR) comes into force next year, updating a number of digital rights for UK citizens in the age of Big Data. Individuals stand to gain more control over their information and greater transparency around consent, profiling and automated decision-making.

However, member states have considerable flexibility in how they implement the GDPR. Of the many options within the law, one particularly crucial rule hangs in the balance: Article 80(2).

This rule permits privacy groups like ORG to independently represent the public in complaints about data protection law. Without it, privacy watchdogs like ourselves, Liberty or Privacy International would instead have to rely on individuals to bring a complaint.

But individuals do not always have the knowledge, expertise or time to identify and dispute faults in arcane terms and conditions. If Article 80(2) is brought into UK law, privacy advocates will be free to complain directly to the Information Commissioner when corporations exploit your data.

The good news is there is something we can do about it. The Department of Culture, Media & Sport (DCMS) is currently holding a public consultation on the GDPR. The poor quality of this hurried consultation suggests that Article 80(2) could easily be overlooked and forgotten about. That means we need your help to get it brought into UK law.

Time is short. The brief consultation ends Wednesday May 10. We have until then to make our voices heard. Click here to tell the DCMS to enforce Article 80(2).


[Read more] (1 comments)


May 04, 2017 | Jim Killock

DCMS consultation on data privacy fails to explain why it matters

New data privacy rights under the General Data Protection Regulation depend on a UK consultation that tells readers nothing about its implications

The General Data Protection Regulation (GDPR) sets out many new rights for UK citizens, including better notions of consent, the right to obtain and download your information, and the right to have a company delete it. You can also find out more about profiling and automated decision-making. Big fines are available when companies don’t comply after it comes into force in mid-2018.

However, many of the new rights will depend on enforcement. One of the better ideas in the regulation is to allow privacy groups to represent citizens in complaints, without having to find specific people who have been directly affected. The GDPR requires member states to choose to allow this, or not, in Article 80(2). We of course very much believe this should be legislated for.

There is a consultation being run by DCMS until Wednesday 10 May on all the different options allowed under the GDPR—and there are quite a few.

However, this consultation is another very disappointing piece of work. Shoddy, even: it calls for evidence and views but sets out no background at all, so only experts can practically respond. It merely states:

Theme 9 - Rights and Remedies

Rights and Remedies

The derogations related to Rights and Remedies include articles:

Article 17 - Right to erasure ('right to be forgotten')

Article 22 - Automated individual decision-making, including profiling

Article 26 - Joint controllers

Article 80 - representation of data subjects

Government would welcome your views on the derogations contained in the articles above. Please ensure that you refer to specific articles/derogations.

There is no way that an average reader could understand the implications of this consultation. Just like the recent Home Office consultation on the IP Act Codes of Practice, this means the consultation appears to breach Cabinet Office guidelines, which state that consultations should:

Give enough information to ensure that those consulted understand the issues and can give informed responses.

This consultation provides no background information whatsoever. You wouldn’t begin to understand that they want to know whether you are in favour of privacy organisations being able to make complaints to the ICO under Article 80.

We feel sympathy for the staff at DCMS who have been asked to set out this consultation, and presumably have been prevented from spending time developing background documents due to capacity constraints. This should serve as a warning to us.

Once Brexit kicks in, DCMS staff will need to be able not just to recycle existing policy advice from EU and other organisations on legislation prepared elsewhere, but also to have the expertise to evaluate it and recommend changes. Under the Great Repeal Bill, they may have to advise ministers about things to remove, with little Parliamentary involvement — potentially including aspects of the GDPR of course.

Right now, however, DCMS officials appear to lack the capacity to even produce decent consultation documents for key privacy laws like the GDPR. Ministers should be demanding more resources, or we will start to see serious policy mistakes being made.


[Read more] (1 comments)


May 03, 2017 | Mike Morel

ORG delivers anti-Espionage Act petition to the Law Commission

Today marks the end of the Law Commission’s public consultation on their proposals to create a new Espionage Act that would jail whistleblowers and journalists who handle official data. Open Rights Group gave them exactly what they asked for: the voices of 23,385 members of the public, delivered right to their offices at the Ministry of Justice.

ORG’s petition broadly rejects the Law Commission’s proposals and demands they be dropped. The threat of up to 14 years in prison would have a chilling effect on whistleblowers and the reporters they contact, weakening free speech and the integrity of UK democracy.

Thank you to all the ORG supporters who signed the petition or emailed the Commission: they now know that thousands of citizens refuse to live in a country where journalists and government staffers are afraid to expose corruption.

We urge the Law Commission to take your requests seriously. That would be a huge improvement over the sham “consultation” that barely took place while the initial report was developed. Contrary to the Commission’s statements, they worked closely with government officials and lawyers while organisations like ORG, Liberty and the Guardian were given short shrift.

Whether the Commission’s final recommendations will take the public consultation into account remains to be seen. Meanwhile, ORG supporters have ensured they cannot claim public support for a new Espionage Act.

ORG also submitted a comprehensive report along with the petition detailing concerns about the Commission’s proposals. Highlights include:

  • The Law Commission is not being upfront about their aims. Their proposals are obviously a response to the Snowden leaks, but they do not mention this or other major cases related to the disclosure of official data. It is blatantly disingenuous to overlook such important cases and not consider how the powers in a new Espionage Act could have been used in them.

  • Their proposals go against the very essence of whistleblowing by requiring concerns about corruption or malpractice be reported to an internal ombudsman. Whistleblowers have often tried to raise concerns internally and got nowhere. Whistleblowing is a last resort to expose hidden injustices that are not being dealt with within organisations.

  • Their proposals take away far too many rights from the accused. The Government would only have to show that a defendant was aware of the damage that could be caused by disclosing information, even if no actual damage was caused. So even if journalists exposed wrongdoing, like the MPs’ expenses scandal, they could not use a statutory public interest defence.

  • The proposals threaten free speech. Editors, journalists and whistleblowers would be intimidated by the risk of up to 14 years in prison just for handling data.

  • The UK Government recently enacted the most extreme surveillance law of any democracy, the Investigatory Powers Act. At a time when these powers should be scrutinised, these proposals would criminalise whistleblowers and journalists acting in the public interest.

[Read more] (1 comments)


May 01, 2017 | Jim Killock

Automated censorship is not the answer to extremism

Unbalanced Home Affairs Committee recommendations would threaten free expression

Today’s report by the Home Affairs Select Committee brands social media companies as behaving irresponsibly in failing to remove extremist material.

It takes the view that the job of removing illegal extremist videos and postings is entirely the responsibility of the companies, and does not envisage a role for courts to adjudicate what is in fact legal or not.

This is a complex issue, where the companies have to take responsibility for content on their platforms from many perspectives, including public expectation. There are legitimate concerns.

The approach the committee advocates is, however, extremely unbalanced and could provoke a regime of automated censorship that would impact legal content, including material opposing extremism.

We deal below with two of the recommendations in the report to give some indication of how problematic the report is.

Government should consult on stronger law and system of fines for companies that fail to remove illegal content

Platforms receive reports from people about content; the committee assume this content can be regarded as illegal. Sometimes it may be obvious. However, not every video or graphic will be “obviously” illegal. Who then decides that there is a duty to remove material? Is it the complainant, the platform, or the original publisher? Or an independent third party such as a court?

The comparison with copyright is enlightening here. Copyright owners must identify material and assert their rights: even when automatic content matching is used, a human must assert the owner’s rights to take down a YouTube video. Of course, the video’s author can object. Meanwhile, this system is prone to all kinds of errors.

However, for all its faults, there is a clear line of accountability. The copyright owner is responsible for asserting a breach of copyright; the author is responsible for defending their right to publish; and both accept that a court must decide in the event of a dispute.

With child abuse material, there is a similar expectation that material is reviewed by the IWF who make a decision about the legality or otherwise. It is not up to the public to report directly to companies.

None of this need for accountability and process is reflected in the HASC report, which merely asserts that reports of terrorist content by non-interested persons should create a liability on the platform.

Ultimately, fines for failure to remove content as suggested by the committee could only be reasonable if the reports had been made through a robust process and it was clear that the material was in fact in breach of the law.  

Social media companies that fail to proactively search for and remove illegal material should pay towards costs of the police doing so instead

There is always a case for general taxation that could be used for the police. However, hypothecated resources in cases like this are liable to generate more and more calls for specific “Internet taxes” to deal with problems that can be blamed on companies, even when they have little to do with the activity in reality.

We should ask: is the posting of terrorist content a problem generated by the platforms, or by wider social problems? It is not at all obvious that this problem has been produced by social media companies. It is clear that extremists use these platforms, just as they use transport, mail and phones. It appears to be the visibility of extremists’ activities that attracts attention and blame onto platforms, rather than any objective link between the aims of Twitter and Facebook and those of terrorists.

We might also ask: despite the apparent volumes of content being posted and reposted, how much attention does it really get? This is important to know if we are trying to assess how to deal with the problem.

Proactive searching by companies is something HASC ought to be cautious about. It is inevitably error-prone, and can only lead one way: to over-zealous matching, for fear that content might not be removed. In the case of extremist content, it is perfectly reasonable to assume that content opposing extremism, while quoting or reusing propagandist content, would be identified and removed.
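A toy fingerprint-matcher shows why. Real shared-hash systems are far more sophisticated, but face the same context problem; the excerpt, function names and n-gram approach below are our own illustration:

```python
import hashlib

# Toy proactive filter using hashed word n-grams ("shingles"), a crude
# stand-in for the automated content matching the committee envisages.

def shingles(text: str, n: int = 5):
    """Yield hashes of every n-word window, lightly normalised."""
    words = [w.strip('.,!?";:()') for w in text.lower().split()]
    words = [w for w in words if w]
    for i in range(len(words) - n + 1):
        gram = " ".join(words[i:i + n])
        yield hashlib.sha256(gram.encode("utf-8")).hexdigest()

# Fingerprint a known propaganda excerpt (illustrative wording only):
PROPAGANDA = "join us and fight the unbelievers"
BLOCKLIST = set(shingles(PROPAGANDA))

def flagged(post: str) -> bool:
    """Flag any post sharing a fingerprinted window with the blocklist."""
    return any(s in BLOCKLIST for s in shingles(post))

recruiting = "Join us and fight the unbelievers!"
rebuttal = ('They tell recruits to "join us and fight the unbelievers". '
            "Here is why that is a lie.")

print(flagged(recruiting))  # True: the filter works as intended
print(flagged(rebuttal))    # True: the counter-speech is flagged too
```

The filter cannot tell recruitment from rebuttal, because both contain the same fingerprinted excerpt; only intent, which fingerprints cannot capture, separates them.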

The incentives that HASC propose would lead to censorship of legal material by machines. HASC’s report fails to mention or examine this, assuming instead that technology will provide the answers.


[Read more] (1 comments)


April 05, 2017 | Jim Killock

A privacy disaster waiting to happen—the #DEBill on third reading

Today the Lords have their final debate on the Digital Economy Bill. No substantial changes are planned. This means all of the very severe problems with age verification, censorship and copyright sentencing still exist. Only in Part 5, about data sharing, has the government made significant improvements, although problems remain.

Age Verification: a privacy disaster waiting to happen

Age Verification is fraught, and likely to result in a chilling effect, where adults avoid visiting websites because of fears around the age verification technology. It is unclear that it is addressing a pressing social need; and while children do need support and education, a solution addressed at all adults is the wrong way to attempt it.

However, let us turn to the specifics.

Despite assurances that pornographic publishers will be obliged to use age verification tools that are privacy-friendly, the approach is almost certain to go wrong.

The government has chosen to leave the market to specify and provide the actual tools. They expect websites, rather than users, to choose which age checking product is used.

At this point we should remember that one website operator, MindGeek, controls the majority of the UK porn market. They are also keen to implement age verification, according to the government. The result is that they will choose, and probably own, the dominant age verification product.

While we cannot know exactly what MindGeek would do, we should remember that they will be able to shape the AV product how they like. They could allow users to opt into lots of convenient services, such as saving their porn preferences, getting recommendations, and having their credit card details ready for quick and easy payment.

So long as these services, and the tracking of vast numbers of UK porn users, are notionally voluntary, there is little that could be done to challenge it.

The consequences for privacy are enormous. New risks of tracking people’s sexual preferences will be created, and possibilities of data leaks will abound. It is the government’s decisions that have created this problem, as they failed to impose sufficient safeguards upon the age verification market.

Censorship: how much blocking would you like?

Any commercial pornographic website that doesn’t offer Age Verification can be blocked under the powers in the Digital Economy Bill. This blocking is meant to be a punishment, but the result will be the censorship of legal material.

The BBFC and government have attempted to assure people privately that the numbers of blocks will be low, and based on market share.

However, the power in the Bill is not limited in this way. How much is blocked is purely a policy and financial choice. The door is open for the government to be lobbied to block vast numbers of entirely legal websites. And there are plenty of people who think this would be a wise and necessary step, including MPs.

Copyright: dangerous criminal penalties for online infringement

For whatever reason, the government resisted our suggestion to limit criminal sanctions to “criminal scale” infringements or serious risks of “criminal scale” infringement.

The result is that any intentional infringement is a criminal matter. This is very different to the offline world, where large scale organised activity is required before criminal charges can be brought.

This cannot be proportionate; and it is not sufficiently foreseeable. While minor infringements may never be brought to court, it is impossible to know when something might attract a criminal charge. For individuals, the risk of “copyright trolls” issuing threats, or lawyers giving clients bad advice, can only increase.

Data sharing

The data sharing part of the Bill has undergone significant changes and will leave the House of Lords in an improved state. Following pressure from several civil liberties groups and the Delegated Powers and Regulatory Reform Committee, the Government tabled and passed important amendments on codes of practice and brought forward changes that narrow down definitions of public authorities.

We welcome that the Government specified on the face of the Bill the list of persons who may disclose and receive information, both for public service delivery and for debt and fraud related to the public sector. The process of specifying the persons entitled to participate in data sharing will be more transparent because these powers are no longer left entirely to the Minister.

The Government also amended the Bill to require that a public authority only access data for purposes in line with its functions. The Bill now ties a public authority’s functions and objectives more closely together, creating a more transparent environment in which public authorities are prevented from accessing data for purposes outside the scope of their functions.

The Codes of Practice were made statutory by the Lords: both Houses of Parliament must approve the Statutory Instrument before it becomes law. We repeatedly advocated for this amendment, since most of the safeguards are placed in the Codes of Practice rather than on the face of the Bill. Without statutory footing, the Codes would carry less legal force and the safeguards they contain would not be enforceable.

However, the Government has not addressed the bulk use of civil registration data at all, and they have not changed their stance on reviews for all the powers in Part 5 on data sharing.

Chapter 2 provides for the sharing of civil registration data for any public body’s functions, without restrictions. The power is intended for bulk data sharing of the full civil register across government, but this power hasn’t been sufficiently justified by the Government.

This Chapter leaves several questions without clear answers. We don’t know how these large databases will be stored, or whether they will be encrypted at all. The Government said they have no intention to share the information with private companies, but they did not provide a guarantee that they won’t do so in the future. We still believe this power should be removed from the Bill.

The Bill includes provisions on amending and repealing the chapters on debt and fraud after a review. The provisions will prevent Ministers from broadening these powers or removing safeguards from the Bill.

ORG would have liked to see reviews in place for all the powers under Part 5 of the Bill to increase transparency of data sharing and prevent unjustified onward disclosure of data to other public authorities.

The Bill doesn’t clearly state that the relevant powers in Part 5 should be used for the benefit of individuals and not for punitive purposes. This could leave wiggle room for future changes to the purposes of data collection.

What is ORG going to do now?

ORG will be considering our options, including Judicial Review. These are very serious matters and nobody else will be stepping up to deal with them.

Join today

If you want to help our work, please join today. By joining you will help us beef up our legal team, led by Myles Jackman. We can only win with support from people like you.
