Blog


April 03, 2020 | Jim Killock

The government must explain its approach to mobile contact tracing

Mobile data and contact tracing is a hot topic, as the UK and EU develop projects to provide privacy-protecting means of understanding who is at risk of infection.

The idea is for some 60% of the population to use an app which will look for nearby phones running the same app and record the proximity. This data is then stored centrally. Health officials then add data on people who have tested positive for COVID-19. Finally, people who may be at risk because of their proximity to someone with the virus are alerted and asked to self-isolate.
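
The flow described above can be sketched in a few lines of code. This is a purely illustrative toy, not the actual UK or PEPP-PT design: the class and identifiers are our own invention, and a real system would use rotating anonymous Bluetooth identifiers rather than names.

```python
from collections import defaultdict

class ContactStore:
    """Toy central store of proximity events reported by handsets."""

    def __init__(self):
        # Maps each user's identifier to the set of identifiers seen nearby.
        self.contacts = defaultdict(set)

    def record_proximity(self, a, b):
        # Each handset logs the other's identifier when the apps detect
        # each other nearby.
        self.contacts[a].add(b)
        self.contacts[b].add(a)

    def alert_contacts(self, infected_id):
        # Health officials flag a positive test; everyone recorded as
        # having been near that person is alerted and asked to self-isolate.
        return sorted(self.contacts[infected_id])

store = ContactStore()
store.record_proximity("alice", "bob")
store.record_proximity("bob", "carol")
at_risk = store.alert_contacts("bob")  # ["alice", "carol"]
```

Even this toy makes the central privacy question visible: whoever operates the store can reconstruct everyone's contact graph, which is why the choice between centralised and decentralised matching matters so much.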

This approach is likely to work best later on, when people are out of the full lockdown and meeting people more than they were. It may be a key part of the strategy to move us out of lockdown and for dealing with the disease for some time afterwards. At the current time, during lockdown, it would not be so useful, as people are avoiding risk altogether.

Of course, it will be a huge challenge to persuade perhaps 75% or more of smartphone users (80% of adults have a smartphone) to install such an app, and to keep it running for however long it is needed. And there are limitations: for instance, a window or a wall may protect you while the app produces a false positive for a risky contact. The clinical efficacy of any approach needs to be thoroughly evaluated, or any app will risk making matters worse.

Getting users to install and use an application like this, and to share location information, creates huge privacy and personal risks. It is an enormous ask for people to trust such an app – which explains why both the UK and EU are emphasising privacy in the communications we have heard, although the EU project is much more explicit. It has a website, which explains:

PEPP-PT was explicitly created to adhere to strong European privacy and data protection laws and principles. The idea is to make the technology available to as many countries, managers of infectious disease responses, and developers as quickly and as easily as possible. The technical mechanisms and standards provided by PEPP-PT fully protect privacy and leverage the possibilities and features of digital technology to maximize speed and real-time capability of any national pandemic response.

There are plenty of other questions that arise from this approach. The European project and the UK project share the same goals; the companies, institutions and governments involved must be talking with each other, but there is no sign of any UK involvement on the European project’s website.

The European project has committed to producing its technology in the open, for the world to share, under a Mozilla licence. This is the only sane approach in this crisis: other countries may need this tool. It also builds trust as people can evaluate how the technology works.

We don’t know if the UK will share technology with this project, or if it will develop its own. On the face of it, sharing technology and resources would appear to make sense. This needs clarifying. In any event, the UK should be working to produce open source, freely reusable technology.

We urgently need to know how the projects will work together. This is perhaps the most important question. People do, after all, move across borders; the European project places a strong emphasis on interoperability between national implementations. In the UK, at the Irish border, it would make no sense for systems lacking interoperability to exist in the North and Eire.

Thus the UK and Europe will need to work together. We need to know how they will do this.

We are in a crisis that demands we share resources and technology, but respect the privacy of millions of people as best we can. These values could easily flip – allowing unrestricted sharing of personal data but failing to share technologies.

The government has already made a number of communications mis-steps relating to data, including statements that implied data protection laws do not apply in a health crisis; using aggregate mobile data without explaining why and how this is done; and employing the surveillance company Palantir without explaining its role or stating that it would be kept away from further tasks involving personal data.

These errors may be understandable, but to promote a mobile contact-tracing tool that uses massive amounts of personal location data and relies on voluntary participation, the UK government will have to do much better. PEPP-PT is showing how transparency can be done; while it too is not yet at a point where we understand its full approach, it is at least making a serious effort to establish trust.

We need the UK government to get ahead, as Europe is doing, and explain its approach to this massive, population-wide project, as soon as possible.



March 31, 2020 | Pascal Crowe

Imprints: who’s responsible?

Many proposed electoral reforms are highly contested. One area where there is strong consensus, though, is imprints. Essentially, an imprint enhances political transparency by providing information on an election ad [1]: where the ad has really come from and who pays for it. This is already required by law for non-digital ads.

Platforms, particularly Facebook, have attempted to pre-empt regulation and developed their own standards for the financial transparency of political ads. These efforts, however, have been criticised for lacking teeth and being easy for bad actors to subvert. ORG, for example, presented evidence of banned white nationalist groups subverting Facebook’s rules simply by re-registering the group under a different name. In addition, Facebook’s policy encompasses both ads used in election campaigns and ‘issue ads’, which are used for policy advocacy outside of elections. This differs from what is covered under UK law. Ultimately, changing UK law will provide the best guidance for what is needed, and the sharpest teeth for enforcing that change. In addition, changing the rules around imprints is only a small part of the necessary systemic change to stop citizens’ personal data being abused for political ends.

The government has recently committed to updating the rules for online imprints.

What has been less clear, however, is who exactly is responsible for making this change. What is important for civil society is to know which legislature to lobby, and which power would be used to change the law. The following should hopefully clarify the matter and help civil society allies’ lobbying efforts.

Who has the power to introduce digital imprints?

The process of devolution, begun in the 1990s, has made the landscape of election law more complicated. Powers that previously lay solely with the UK government are now split. In the UK, the power to introduce imprints is either reserved (to the UK government) or devolved (to another legislature).

UK General Elections - Reserved to the UK Government. 

Scottish Parliament – Devolved

Welsh Assembly – Devolved 

Northern Ireland Assembly – Reserved to the UK Government

Each legislature can exercise its power to change an aspect of the law that it has responsibility for.

Where in law is the power to change imprint rules contained? 

Election law is generally only amendable by primary legislation. For imprints, however, there are permissions within primary legislation that allow changes to be made to their requirements via secondary legislation.

  • For UK General Elections and Northern Ireland Assembly Elections, there are provisions in primary legislation which enable the UK Cabinet Office Secretary of State to change the requirements for imprints through secondary legislation. For the law that applies to political parties and non-party campaigners, the relevant provision is in Section 143 of the Political Parties Elections and Referendums Act 2000 (PPERA). For the law that applies to candidates, the relevant provision is in Section 110 of the Representation of the People Act. 
  • For Scottish Parliament Elections, as imprints are a devolved matter, the relevant Scottish Minister has power to change the rules. They can use the provision in Section 143 of the Political Parties Elections and Referendums Act 2000 (PPERA) and also have a separate power to amend the rules for candidates that are set out in the Scottish Parliament Elections Order 2015. 
  • For Welsh Assembly Elections, similarly as imprints are a devolved matter, the relevant Welsh Minister has power to change the rules. They can use the provision in Section 143 of the Political Parties Elections and Referendums Act 2000 (PPERA) and also have a separate power to amend the rules for candidates that are set out in the National Assembly for Wales (Representation of the People) Order 2007. 

Which ministers should we lobby?

This gets complicated quickly. For example, for UK General Elections, the relevant Secretary of State is Michael Gove. However, some responsibility for UK elections has been given to Chloe Smith, the Minister for the Constitution and Devolution. Similarly, the MSP for Constitutional Affairs, Mike Russell, is responsible for electoral reform in Scotland, but has shared some of his portfolio with other Scottish Ministers. By contrast, Julie James, the Welsh Minister for Housing and Local Government, has the portfolio for this in Wales. But the ministers’ briefs can chop and change.

It should be remembered, though, that reform around digital imprints should be the beginning, not the end, of UK electoral reform. Some political actors have, up until now, treated imprints like a magic bullet for the UK’s electoral woes. Whilst imprints are useful tools for financial regulators, researchers, and journalists, they do little to address the underlying issues with the political data economy. Whilst ORG welcomes progress on imprints, we hope it will sharpen scrutiny of the systemic and unlawful use of citizens’ personal data by political parties.

[1] The imprint rules don’t apply to other forms of political material, such as ads to lobby for a change in the law.

 



March 26, 2020 | Jim Killock

Open tech, privacy and Covid-19

From enabling strategies to curb the virus to empowering individuals to connect and work from home, digital technology is playing a vital role in the COVID-19 epidemic.

As the government adopts emergency powers, Open Rights Group (ORG) is on high alert to ensure our digital liberties emerge from this crisis stronger than ever.

Technology and digital rights are critical in the fight against COVID-19

Though some efforts have arguably overreached, the use of personal data has been an indispensable tool in successful measures against COVID-19. The government has the moral duty but also the legal right under data protection law to use personal information to defeat a public health emergency. Importantly, data protection principles such as transparency and lawfulness still apply and the public must be informed about any changes taking place.

We are worried that the government is not being clear about how it wants to use personal data, as we explained last week on Sky News. Transparency will only help the government by heading off conspiracy theories and building the public trust needed to successfully manage a crisis. Our allies at Privacy International are building a public resource to track the global technology response — notably, efforts from the UK are absent.

There are big questions about how the UK government intends to use technology for tracking infection or locating individuals. The focus elsewhere around the world has been on the use of location data from mobile phones. For instance, how will the government work with private companies? Will it compel mobile operators to hand over data? Will it impose new duties on private actors? Is it intending to use existing national security powers and the bulk communications data already in the hands of the intelligence agencies?

Find out more by listening to our live COVID-19 discussion on Friday at 4:30pm.

Working in the open

With collaboration, sharing and freedom of information proving central to stopping COVID-19, the world is learning the importance of open technology like never before. Grassroots efforts like the Open Source COVID19 Medical Supplies Facebook group and Newspeak House's Coronavirus Tech Handbook are turning to Creative Commons open licensing to remove barriers to access and sharing. The rapid development of a free design for a cheap ventilator by an ad-hoc group of British scientists shows that strong intellectual property restrictions are not always necessary to stimulate innovation.

The European Commission's decision to allow free access to copyright protected European medical standards to accelerate the production of masks and other medical equipment illustrates why tools saddled with excessive licensing requirements that prevent innovation or sharing are being left behind.

How you can help

Internet access is now critical to people who are at home, potentially isolated or needing to look after others. We can all take simple steps, such as limiting our use of bandwidth. Even with a high internet penetration rate in the United Kingdom, 7% of households do not have internet access. You could consider setting up a shared and secure Internet access point for your neighbours.

OpenWireless.org has a guide to setting up a secure open channel to safely share your bandwidth with those in need. Your router should have an easy way to set up a separate, password-free “guest network”: we recommend calling it “openwireless.org” so people know how to find out more about why you’ve provided an open access point.
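
If your access point is a Linux machine running NetworkManager, the same idea can be sketched with `nmcli`. This is an illustrative sketch, not the OpenWireless.org guide itself: the interface name `wlan0` and the connection name `guest` are assumptions (check yours with `nmcli device`), and most consumer routers expose the equivalent as a "guest network" toggle in their web interface.

```shell
# Create a separate Wi-Fi connection named after openwireless.org so
# visitors know where to look for more information.
nmcli con add type wifi ifname wlan0 con-name guest \
    autoconnect yes ssid "openwireless.org"

# Run it as an open (password-free) access point; ipv4.method shared
# puts guests on their own NAT'd subnet, away from your home LAN.
nmcli con modify guest 802-11-wireless.mode ap \
    802-11-wireless.band bg ipv4.method shared

nmcli con up guest
```

Note that `ipv4.method shared` gives some isolation from your own devices, but you remain responsible for traffic leaving your connection, so it is worth reading the OpenWireless.org guidance in full first.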

How ORG is changing

We'll be hosting all our upcoming events online, including a special discussion about the privacy implications within the emerging Coronavirus response this Friday at 4:30pm.

Our staff are of course working at home and trying to stay safe – as we hope you are too.

Don’t forget to join our COVID-19 discussion on Friday at 4:30pm. 
 
Thank you for protecting UK digital rights.
Jim



March 24, 2020 | Pascal Crowe

We need political accountability more than ever – and the ICO can lead the way

We are living in an unprecedented historical moment. Although public health is everyone’s first priority, we are seeing a worrying encroachment on civil liberties by the UK government. The Covid-19 Bill gives extraordinary powers to the police and intelligence services. As a result, ORG’s work of defending human rights online is possibly of more consequence than it ever has been before. 

ORG’s Data Subject Access Request research (forthcoming) has shown the extent of personal data collection by political parties in the UK, which predates the current crisis. However, the legal basis for them to collect information about individuals, such as their spending habits, address, and social media activity, is considerably shakier.

To process your personal data, organisations must rely on one or more lawful bases. These are generally limited by the kind of organisation in question. Most people might assume that political parties rely on the informed consent of citizens to process their data: after all, that is how they get elected! Instead, they tend to rely on three lawful bases: legitimate interest, public interest, and substantial public interest. It’s the latter two that ORG takes particular issue with.

The public interest test is invoked when any personal data is processed by a political party. There is a specific provision for what counts as ‘public interest’ for political parties: in this case, when processing personal data is “necessary” for “an activity that supports or promotes democratic engagement”.

However, when processing information such as political opinion or sexuality (known as special category data), there is a higher threshold – the substantial public interest test. For this to be met, data processing must again be “necessary” for “the purposes of the person’s or organisation’s political activities”. In short, this means that if processing an individual's political opinion is not necessary for political parties’ political activities, then it is unlawful.

Both of these tests have over time become known as “exemptions” or “exceptions” for the political parties. However, the parties by and large reject this characterisation. For example, Labour, in its written evidence to the House of Lords’ Democracy and Digital Technologies Committee, stated that “It is not... the case that the current statutory provisions provide an “exception” for political parties.” The Conservatives said much the same.

It is true that the law does not provide carte-blanche “exceptions” for political parties to use all sorts of personal data without consent. But the political parties interpret their lawful bases so broadly that they have been used as exemptions in practice. The key issue for the political parties is that in both lawful bases, the word “necessary” is doing a lot of heavy lifting. Both DPA guidance and the ICO itself have said that “you do not have a lawful basis for processing if there is another reasonable and less intrusive way to achieve the same result”. The processing must be more than useful, standard practice; it must be targeted and proportionate, for a specific purpose.

Our current legal action against the political parties limits what I can say on this specifically. But most reasonable people will understand this: 

  • Attempting to collect data on every registered voter in the UK is not ‘targeted’. 

  • Attempting to get as much personal information as is possible on each of those voters is not ‘proportionate’. 

  • Doing this year round, limited only by a party’s financial or data assets, implies that a ‘specific purpose’ is lacking. 

In addition, I question whether the public would so readily conflate “democratic engagement” with trading and grading personal information. The bulk of this activity goes towards working out which voters are worth spending further resources on to encourage them to vote for the party. This means cutting people out of their operations. Electioneering and democratic engagement are not the same thing.

The claim that mass data collection is “necessary” for democratic engagement or political activities is false. There are several routes by which this can be proved. The simplest: the ICO could clarify the position of the law and offer strict (ideally binding by statute) guidance to political parties about what constitutes necessity here. Indeed, in its recent evidence to the Fairvote APPG, the ICO stated that it thinks political parties have overstepped the line.

They should now go further and offer clear, detailed, firm guidance to political parties about what level of data processing is really necessary for a functioning democracy. 



March 20, 2020 | Jim Killock

In the Coronavirus crisis, privacy will be compromised—but our right to know must not be

At Open Rights Group (ORG), we want the government to succeed in its pursuit of the eradication of Coronavirus COVID-19. We know that means that government will want to use data to understand the impacts of its policies on the progress of the disease, and to anticipate what new measures it needs to take.

Other governments have had great success in doing this; we know a lot about the global response and measures taken elsewhere. Sometimes, however, these efforts have arguably overstepped the mark, for instance monitoring individuals in Israel. There is always a question of proportionality.

The interplay between privacy rights and the use of data here is quite clear: the government has a right and duty to use data in unanticipated ways to defeat the public health emergency. Both UK law and GDPR explain this quite clearly. In essence, data can be used, but data protection duties remain in place, such as data security, minimisation and fair processing.

These laws also require that the public be informed about the changes that are taking place. This is not, in our view, a purely technical duty. The point of transparency, as everyday privacy expectations change, is to ensure that people trust the actions of the government, because the reasons for accessing and using data are clear and can be shown to be necessary and proportionate to the threat.

From the other side of the equation, this prevents alarm, misinterpretation and false rumours.

As the government extends its intelligence about the virus through the use of personal data, the government has no real choice but to be transparent. There are only upsides from transparency: not merely reducing the risks of misunderstanding, but also building public confidence, as it explains the means by which it is getting ahead.

Thus it is striking how little we have heard at this point about the government's plans. Over the last two days, stories about use of mobile phone data have emerged, but not from the government. Rather, journalists have been tipped off and then run stories. 

Alarmingly, the government is still refusing to comment on its partnership with mobile companies. This does not show a government in control of its message, or clear in its purpose.

It is in our view extremely important that the government begins to explain how it wants to use personal data. We are sure it has a good story to tell. But it must not risk chipping away at public confidence in its strategy against COVID-19 by failing to do so.



March 02, 2020 | Javier Ruiz

UK publishes trade objectives for deal with the US: What you need to know

The UK government has set out its plans and priorities for the negotiations of a free trade agreement with the US. Here we explain some of the main issues in the digital trade chapters of these deals and outline our positions.

The White House from Washington, DC / Public domain

The UK government has published its negotiation objectives for the free trade agreement with the US. The document includes Digital Trade measures that could define British regulation of the internet and our online activities for years to come. Our digital rights are being shaped by the economic discussions in this Free Trade Agreement (FTA), with severe impacts on privacy, free expression and association, among other rights.

 The UK government aims to agree provisions “that facilitate the free flow of data, whilst ensuring that the UK’s high standards of personal data protection are maintained, and include provisions to prevent unjustified data localisation requirements.” The document does not detail how this tension will be solved but explains that the UK will “help shape global rules in areas such as data flows, blockchain, driverless cars and quantum technology”, “maximising the UK’s reach in emerging fields like global data flows and Artificial Intelligence (AI)”.

 The government is also committed to promoting “appropriate protections for consumers online and ensure the Government maintains its ability to protect users from emerging online harms.”

The US published its negotiating objectives for a deal with the UK a year ago, with concrete demands and objectives honed through constant negotiation on trade on a variety of fronts. The UK government has admitted that the existing US trade deals will form the starting point for discussions.

Below we outline some of the key elements of the Digital Trade agenda that can be found in those trade agreements signed by the United States, our concerns with these proposals, and our preliminary recommendations.

Democratic oversight

Placing the regulation of the digital sphere under the trade banner creates a fundamental problem of democratic oversight because trade treaties generally give disproportionate power to the executive.

The UK government does not have a published trade negotiation policy. The default role for Parliament over trade deals appears to be the Ponsonby Rule, which sets out that “the power to make treaties is a Prerogative power vested in the Crown”, with the texts presented to Parliament for ratification for a mere 21 days. The problem of having such a short period for ratification is compounded by trade negotiations being clouded in utmost secrecy.

Trade primacy and dispute resolution

Technology regulation – including platform regulation – could be undermined by trade enforcement, which provides companies a very strong tool to sue the UK for any policy that they believe damages their interests. In December 2019, the taxi app Uber started proceedings to sue the Colombian government under the arbitration system in the Colombia-US Trade Promotion Agreement, after a local court found that the company had violated competition rules and ordered it to cease operations.

 Trade is unique in international law in its self-enforcement mechanisms. If a country is found to have created unfair advantages to favour local industries, the other country in the dispute is allowed to bring some form of retaliation to compensate. In most cases this will happen through import tariffs, but not exclusively.

A recent high-profile example is the dispute between the US and the EU over airplane subsidies to Airbus, where the US has been allowed to impose punitive tariffs on $7.5bn of European goods. The twist is that retaliatory tariffs do not have to be applied to the same sector – aviation in this case. The US has slapped a 25% tariff on £1bn of Scotch whisky imports.

Investor State Dispute Settlement (ISDS)

Outside of WTO tribunals – for example, in bilateral trade and investment treaties like the UK-US FTA currently under discussion – there are other problems to face up to: investor-state dispute settlement (ISDS) tribunals. These ISDS tribunals allow companies to sue governments, as in the Uber case discussed above.

ISDS tribunals are widely criticised for being expensive, secretive, and staffed with corporate lawyers, overall favouring companies over governments regulating in the public interest. In March 2019, the Joint Committee on Human Rights of the UK Parliament published a report on international agreements that raised ISDS as a “particular area of concern”. The committee found evidence that the ISDS mechanism discourages government from introducing regulatory measures to protect human rights, labour rights and health.

As a general principle, the regulation of the internet and online platforms should not be restricted in such a manner. The UK in particular will be in a difficult position vis-à-vis the US in enforcing any digital trade dispute. ISDS should not be included in any UK trade deal.

Data flows

What is being proposed?

Digital trade agreements contain clauses that ban any restrictions on the “cross-border transfer of information, including personal information, by electronic means” in the conduct of a “covered” business. This means any US online service could object to any UK law or regulation that stops data from their prospective UK customers being sent to the US.

 The agreements can contain an exception regime, for government measures that are “necessary to achieve a legitimate public policy objective”, as long as these exceptions don’t form an “arbitrary or unjustifiable discrimination or a disguised restriction on trade” and are not “greater than are necessary to achieve the objective”.

Who is driving the free flow of data agenda and why?

These measures are mainly driven by the US to support the unrestricted global data flows towards the internet giants of Silicon Valley that currently dominate the global Internet outside China and Russia.

 Governments of developed countries generally support these measures because they perceive that lack of access to cloud computing would disadvantage their country, and generally nobody wants to be left out of the global digital market.

Why is this a problem?

The US lacks meaningful general privacy protections in the commercial sector, particularly for foreigners outside the general protections of their Constitution. Agreeing to unrestricted data flows to the US would put the UK in a double bind that could jeopardise data flows to and from the EU with severe consequences for businesses and citizens. The official UK position is to seek an adequacy agreement with the EU.

Plenty of legal scholars believe that if the UK and the US enable a completely free flow of data between the two countries in a trade deal, this will likely make it difficult for the UK to simultaneously keep similar arrangements with the EU in continuity with the GDPR regime.

 The exceptions regime for public policy objectives is ill-defined and hard to implement. Modern digital trade deals may also include a generic section on data protection, which sounds good in principle, but in practice can do more harm than good by presenting lower privacy standards as acceptable choices.

What is the solution?

The UK should only sign agreements on a free flow of data with countries that provide good data protection to a similar standard to the domestic regime. One way to do this would be through the application of adequacy decisions modelled on the GDPR regime. Another approach would be to apply a common international standard such as Convention 108 of the Council of Europe.

Data localisation

What is being proposed?

 Data localisation covers several overlapping but distinct issues. The most obvious aspect of data localisation is forcing companies to store the data locally, or in some cases to set up a local outfit. The provisions in the latest digital trade agreements ban countries from forcing companies to “use or locate computing facilities in that Party’s territory as a condition for conducting business” in the country.

The main example of localisation is Russia, where concerns over US interference have led to a path of technological self-sufficiency, to the point that the country is now ready to be cut off from the global Internet. Some data localisation will always be a feature of data protection. Localisation requirements are common worldwide for financial, health or other sensitive data.

Who is driving the anti-localisation agenda and why?

The ban on data localisation has widespread support among the more developed countries. A 2017 US government business survey saw localisation as the main problem for cross border delivery of services online. The ban is opposed by many developing countries and authoritarian governments, although for different reasons.

The Snowden leaks in 2013 marked a turning point: the realisation that any foreign data stored in the US was fair game for surveillance led many countries, e.g. Brazil, to develop localisation policies. Since then, concerns have continued to grow over the economic dominance of US platforms and the overt use of digital technology as an instrument of US foreign policy. For example, new mobile phones from Huawei soon won’t be able to access Google Play apps, and Venezuela’s creative sector is cut off from leading software tools from Adobe.

Data localisation in some developing countries is portrayed as akin to policies to control natural resources such as oil. India, for example, sees the data of its citizens as “a collective resource, a national asset, that the government holds in trust”. Developed nations are in a similar predicament of trying to achieve “Digital Sovereignty”. For example, the EU strategy for data outlines plans for the creation of “common European data spaces”.

 Internet companies also make the technical case that in running global operations it is not efficient to have facilities in every single country, although a level of localisation appears to be efficient. Content delivery networks provide such localisation services for many Internet companies.

Why is a ban on localisation a problem?

This is a thorny issue. The business arguments against localisation have some grounding, but we also have to admit that the Internet is too centralised around large companies and richer regions. The nationalist framing of India and other countries seeking economic justice is, in its current form, hard to fully reconcile with our position that privacy is a fundamental right.

There is a very strong argument in favour of some localisation in the need for regulatory access. The US financial authorities have for many years been blocking localisation requirements for financial data in trade deals, because during a crisis in that fast-moving world they cannot wait even a few hours to access data.

Unfortunately, the line between legitimate regulation and state abuse is sometimes difficult to demarcate. The localisation policies of authoritarian governments have negative consequences for free expression, privacy and democratic values. From this perspective, the ban on the localisation of data and computing facilities could be supported in a tactical manner by digital rights groups. However, the wider arguments about political power and economic justice around digital technologies complicate the picture and make it difficult to lend unqualified support to these measures.

What is the solution?

The basic building block of any policy on localisation is the protection of human rights and freedoms. From this perspective the pursuit of economic justice and social innovation has to fully comply with the right to privacy among others. Regulators should have access to data in order to protect consumers and society.

Our conclusion is that data localisation is too complex an issue to be included in trade deals and should be part of wider international discussions on global digital development and democracy.

Disclosure of source code and algorithms

What is being proposed?

Digital trade treaties ban governments from requiring the disclosure of source code as a condition of import, distribution, sale or use of software or of products containing software. The latest agreements extend this prohibition to “algorithms expressed in that source code”.

The text in the latest US agreements contains a clause allowing a “regulatory body or judicial authority” to demand the source code or algorithm “for a specific investigation, inspection, examination, enforcement action, or judicial proceeding, subject to safeguards against unauthorized disclosure.”

Who is driving this agenda and why?

The main proponent of these restrictions is the US, with the argument that the disclosure of source code and underlying know-how is a form of forced technology transfer. Other developed countries such as Japan and the EU are also fully behind it.

The main source of concern about forced technology transfer seems to be China. The current US–China trade war was officially triggered by US allegations of China’s abusive trade practices in forced technology transfer and intellectual property.

Why is a ban on technology transfer and the disclosure of source code and algorithms a problem?

There is consensus that China does indeed obtain source code and algorithms in this way, but such demands are likely already unlawful and simply require better enforcement. A generalised ban on the disclosure of source code and algorithms will hamper accountability, limit the space for public policy and harm least developed countries.

There are growing concerns about potential unfairness and bias in life-changing decisions made or supported by algorithms, from credit scoring to court sentencing or immigration status. Preventing the disclosure of source code or algorithms would hamper efforts to develop new forms of technological transparency and accountability. The UK GDPR includes a right for individuals, in certain circumstances, to be informed of the logic of the systems making decisions that significantly affect them.

There are many legitimate public policy reasons for a government to require a foreign company to disclose their technology. Regulators may need to examine technical systems, for example in electronic voting, gambling machines, or car emissions. There are particular concerns for policies supporting open source software.

For developing countries, technology transfer is a central aspect of economic development and global socio-economic justice. The Norwegian oil industry and the Taiwanese textile sector were built through forced technology transfers.

While this issue may not seem critical within a US-UK Free Trade Agreement, given that both countries enjoy comparable levels of technological development, its inclusion in the deal would facilitate the propagation of this idea into other deals.

What’s the solution?

Technology transfer is a necessary element of global development and fighting climate change. The UK government should work to develop fair transfer mechanisms that enable accountability and prevent abuse, instead of promoting a blanket ban on algorithmic disclosure.

Summary

  • The negotiation of UK FTAs should be conducted with the highest levels of democratic participation, including a greater role for parliament than currently afforded in the law.
  • UK FTAs should not include ISDS mechanisms that would allow foreign companies to sue to stop legitimate regulations to protect the public.
  • The UK should only sign agreements on a free flow of data with countries that provide good data protection to a similar standard to the domestic regime.
  • Data localisation is too complex an issue to be included in trade deals and should be part of wider international discussions on global digital development and democracy.
  • The UK government should work to develop criteria for fair technology transfer mechanisms that enable accountability and prevent abuse, instead of promoting a blanket ban on algorithmic disclosure.

 

[Read more]


February 19, 2020 | Francis Davey

Data Protection and Brexit

At the moment, the General Data Protection Regulation (or GDPR) is an important piece of legislation protecting personal data, but it is European, not UK law. There is therefore a certain amount of concern about what might happen to data protection in the UK in the future, but there also seems to be some confusion about what is happening now.

The government's original plan was that, at the point of leaving the EU, all existing EU legislation which took effect in the UK directly (such as the General Data Protection Regulation) would become by default UK legislation. That is what section 3 of the European Union (Withdrawal) Act 2018 would have done. As soon as this happened, the pithily named Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 would immediately step in and perform an extensive series of edits to what would then be renamed the "UK GDPR".

Almost all of these edits are essentially a global replacement of terms like "European Union" with "United Kingdom". The GDPR is complicated enough that a simple search and replace would not work and so the edits have more heavy lifting to do. But what you have to imagine is something that looks exactly like the GDPR if the EU contained only the UK and nothing else.

One confusion that seems to be spreading around is that these changes have already happened, because they were all timed to occur on "exit day", but that isn't correct. Paragraph 1 of Schedule 5 of the European Union (Withdrawal Agreement) Act 2020 postpones all existing "exit day" dates to "implementation day", which is probably the end of the year, though there has been sufficient chaos in recent years that predicting anything is hard. Instead, the agreement signed between the UK and EU treats the UK (for data protection purposes at least) as being part of the EU.

That means that right now, data protection law is almost but not quite unchanged in the UK. But since the UK is not in fact in the EU, there may be some small differences.

Then what happens? If nothing else is changed by the government, at the end of the year the UK will transition to the new UK GDPR. Internally everything should remain approximately as it is now, but since the UK will not be in the EU, exporting data from the EU to the UK might become a little more difficult.

At the moment the EU has a short list of "countries" to which it is OK, at least in some circumstances, to export personal data from the EU. It takes a while to be put on that list - there has to be what is known as an "adequacy decision" - and so there is some concern that free export of personal data from the EU to the UK will become more difficult. Not impossible - there are other ways of exporting data lawfully, they just require more work. By the way, the UK's equivalent "short list" of countries in the UK GDPR will include all the EU-adequate countries plus the EU, so export from the UK to the EU should not be a problem.
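As a toy model, the asymmetry described here can be sketched in a few lines of Python. The country lists below are illustrative placeholders, not the actual adequacy lists, and the real rules have many more routes (standard contractual clauses and so on) that this deliberately ignores:

```python
# Toy model of the adequacy rules described above (illustrative only;
# the country lists here are hypothetical stand-ins, not the real ones).

EU_MEMBERS = {"France", "Germany", "Ireland"}          # stand-in subset
EU_ADEQUATE = {"Japan", "New Zealand", "Isle of Man"}  # stand-in subset

# The UK GDPR's equivalent list: all EU-adequate countries plus the EU itself.
UK_ADEQUATE = EU_ADEQUATE | EU_MEMBERS

def export_allowed(origin: str, destination: str) -> bool:
    """Free export is allowed within the EU, or to a listed country."""
    if origin in EU_MEMBERS:
        return destination in EU_MEMBERS or destination in EU_ADEQUATE
    if origin == "UK":
        return destination in UK_ADEQUATE
    return False  # other origins and export routes are not modelled here

# Post-transition, without an adequacy decision or a treaty, EU -> UK fails:
print(export_allowed("France", "UK"))   # False
# ...while UK -> EU remains straightforward under the UK GDPR's list:
print(export_allowed("UK", "Germany"))  # True
```

The point of the sketch is simply that the two lists are independent: the UK can (and plans to) whitelist the EU unilaterally, while the reverse direction needs either an adequacy decision or a treaty.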

Of course the UK and EU are currently supposed to be preparing an agreement for 2021 and beyond. There is no reason at all this could not include an agreement on personal data, which would allow free export. There would not necessarily have to ever be an "adequacy decision". If the government sticks to its plan, the UK's data protection law will be essentially identical to that in the EU, at least for the time being, so this should not be hard to negotiate. It is of course impossible to tell.

Note that many people seem to assume - and have been saying to me - that the only way forward is for the UK to obtain an adequacy decision from the Commission. The argument is then about how fast that could be done; whether the Commission would want to cooperate in fast-tracking it; and what to do in the interim.

This may be correct politically but it is just not true legally. There is nothing stopping the UK and the EU agreeing, in a treaty between the two, that the UK would not be treated as a third country by the EU. Two examples of that already exist: first, the existing transitional agreement, which does just this; second, the relationship between the EU and the EEA. No EEA country is "adequate", but nor are they treated as "third countries". That is of course implicit in the framework of EU legislation, but there is no reason why that sort of relationship is impossible. Whether the EU and the UK could or would agree to do so is a completely different question.

What's more, as things stand, it would be very odd if free movement of data to the UK did not exist, because most of those adequate countries have agreed to allow free export of data to the UK, if the government's statement on the point is to be believed. I have only checked the Isle of Man legislative change to be sure, but it seems clearly to allow export of data from the Isle of Man to the UK. What this suggests to me is that data may be exported from the EU to one of these other places and then into the UK without any hindrance, so a sufficiently determined data exporter can "get around" the rules anyway.

[Read more]


January 31, 2020 | Pascal Crowe

APPG on Electoral Campaigning Transparency adopt ORG reforms to electoral landscape

Last summer, Open Rights Group gave oral evidence to the All Party Parliamentary Group (APPG) on Electoral Campaigning Transparency. It was convened by Fair Vote, the Electoral Reform Society and Stephen Kinnock MP. This APPG has been leading the charge to make our electoral laws fit for the digital challenges of the 21st Century. Its final report, ‘Defending Democracy in the Digital Age’, was published earlier this January.

Many APPGs are mere vanity projects for ambitious MPs that can lose momentum and fail to deliver their promised outputs. This is not one of those. The Secretariat and members of the APPG should be congratulated for such a speedy delivery. This speaks to the urgency of the issues at hand. 

The laws that regulate how much an election campaign can spend, and the ways that data are used in elections, are increasingly intertwined. Not least, this is because the value of the datasets used by a campaign is rarely captured by regulators, although this is nominally already required by regulation. This is partially because all spending reporting happens after the fact, so there is no way to track spending in real time, or to assess whether the campaigns themselves are being truthful about the size of their campaign assets.

Open Rights Group’s (ORG) Data and Democracy Project made several key recommendations to address this issue, which were adopted in the final report. Here is a selection:

  • Data sets should be assigned a market-based monetary value, which can then be included in regulated spending calculations. Although this is nominally required by existing regulation, a new calculus needs to be applied to work out (and perhaps limit) their changing financial value in specific campaigning contexts.

  • To facilitate this, the Electoral Commission and the ICO should form a joint task force to conduct ‘data audits’ of a campaign’s data assets, such as data sets, algorithms, and social networks. These audits should also screen for illegal and unethical behaviour.

  • The Electoral Commission and the ICO should reserve the right to carry out ‘drug tests’ during elections, to ensure that campaigns are complying with electoral and data protection law. 

Although changes to the laws that regulate this activity have never been more needed, frankly that change has never seemed further away. The replacement of the DCMS Select Committee chair, Damian Collins, has come as a shock to many. Collins’ tenure was defined by its fierce scrutiny of the digital campaigning landscape and the actors within it. Many eyes are watching his successor, Julian Knight, who ultimately gets to decide the committee’s direction of travel. There is concern that digital campaigning will now play second fiddle to scrutiny of the BBC. 

ORG hopes that the new chair continues his predecessor’s tenacity with regard to electoral reform. In particular, he should start by safeguarding the work of the sub-committee on misinformation. We look forward to working with him on these vital issues.

[Read more]