October 19, 2016 | Jim Killock

Fig leaves for privacy in Age Verification

The Digital Economy Bill mandates that pornographic websites must verify the age of their customers. Are there any powers to protect user privacy?

Yesterday we published a blog detailing the lack of privacy safeguards for Age Verification systems mandated in the Digital Economy Bill. Since then, we have been offered two explanations as to why the regulator designate, the BBFC, may think that privacy can be regulated.

The first and most important claim is that Clause 15 may allow the regulation of AV services, in an open-ended and non-specific way:

15 Internet pornography: requirement to prevent access by persons under the age of 18

  1. A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18

  2. [snip]

  3. The age-verification regulator (see section 17) must publish guidance about—

    (a) types of arrangements for making pornographic material available that the regulator will treat as complying with subsection (1);

However, this clause seems to regulate publishers who “make pornographic material available on the internet”, and what is regulated in 15 (3) (a) is the “arrangements for making pornographic material available”. It does not mention age verification systems, which are not really an “arrangement for making pornography available” except inasmuch as they are used by the publisher to verify age correctly.

AV systems are not “making pornography available”.

The argument, however, runs that the BBFC could under 15 (3) (a) tell websites which kinds of AV systems, with which privacy standards, they may use.

If the BBFC sought to regulate providers of age verification systems via this means, we could expect it to face legal challenge for exceeding its powers. A court may consider it unfair for the BBFC to impose new privacy and security requirements on AV providers or website publishers when those requirements are not spelled out and the providers are already subject to separate legal regimes such as data protection and e-privacy.

This clause does not provide the BBFC with enough power to guarantee a high standard of privacy for end users, as any potential requirements are undefined. The bill should spell out what the standards are, in order to meet an ‘accordance with the law’ test for intrusions on the fundamental right to privacy.

The second fig leaf towards privacy is the draft standard for age verification technologies developed by the Digital Policy Alliance. This is being edited by the British Standards Institution, as PAS 1296. It has been touted as the means by which commercial outlets will produce a workable system.

The government may believe that PAS 1296 could, via Clause 15 (3) (a), be stipulated as a standard that Age Verification providers must abide by in order to supply publishers, thereby giving a higher standard of protection than data protection law alone.

PAS 1296 provides general guidance and has no means of strong enforcement over companies that adopt it. It is a soft design guide that provides broad principles to adopt when producing these systems.

Contrast this, for instance, with the hard and fast contractual arrangements the government’s Verify system has in place with its providers, alongside firmly specified protocols. Or card payment processors, who must abide by strict terms and conditions set by the card companies, where bad actors rapidly get switched off.

The result is that PAS 1296 says little about security requirements, data protection standards, or anything else we are concerned about. It stipulates that age verification providers cannot be sued for losing your data. Rather, you must sue the website owner, i.e. the porn site which contracted with the age verifier.

There are also several terminological gaffes, such as referring to PII (personally identifiable information), which is a US legal concept, rather than the EU and UK’s ‘personal data’. This suggests that PAS 1296 is very much a draft; in fact, it appears to have been hastily cobbled together.

However you look at it, the proposed PAS 1296 standard is very generic, lacks meaningful enforcement and is designed to tackle situations where the user has some control and choice, and can provide meaningful consent. This is not the case with this duty for pornographic publishers. Users have no choice but to use age verification to access the content, and the publishers are forced to provide such tools.

Pornography companies meanwhile have every reason to do age verification as cheaply as possible, and possibly to harvest as much user data as they can, to track and profile users, especially where that data may in future, at the flip of a switch, be used for other purposes such as advertising-tracking. This combination of poor incentives has plenty of potential for disastrous consequences.

What is needed is a set of clear, spelt-out, legally binding duties for the regulator to provide security, privacy and anonymity protections for end users. To be clear, the AV Regulator, or BBFC, does not need to be the organisation that enforces these standards. There are powers in the Bill for it to delegate the regulator’s responsibilities. But we have a very dangerous situation if these duties do not exist.


October 18, 2016 | Jim Killock

A database of the UK's porn habits. What could possibly go wrong?

The Government wants people who view pornography to show that they are over 18, via Age Verification systems. This is aimed at reducing the likelihood of children accessing inappropriate content.

To this end, the Digital Economy Bill creates a regulator that will seek to ensure that adult content websites verify the age of users or face monetary penalties; in the case of overseas sites, it can ask payment providers such as VISA to refuse to process UK payments for non-compliant providers.

There are obvious problems with this, which we detail elsewhere.

However, the worst risks are worth going into in some detail, not least from the perspective of the Bill Committee who want the Age Verification system to succeed.

As David Austen of the BBFC, which will likely become the Age Verification Regulator, said:

Privacy is one of the most important things to get right in relation to this regime. As a regulator, we are not interested in identity at all. The only thing that we are interested in is age, and the only thing that a porn website should be interested in is age. The simple question that should be returned to the pornographic website or app is, “Is this person 18 or over?” The answer should be either yes or no. No other personal details are necessary.
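The single-bit exchange Austen describes could, in principle, be as narrow as the following sketch. Everything here is hypothetical and illustrative, not drawn from any actual AV provider’s interface; the point is simply that nothing beyond a yes/no answer needs to leave the verifier.

```python
from datetime import date

def is_over_18(date_of_birth: date, today: date) -> bool:
    """Return only a boolean: no name, address or other identity data.

    (Illustrative only; ignores the 29 February edge case.)
    """
    eighteenth_birthday = date_of_birth.replace(year=date_of_birth.year + 18)
    return today >= eighteenth_birthday

# The website learns nothing beyond the single bit it needs.
print(is_over_18(date(1990, 5, 1), date(2016, 10, 19)))  # True
print(is_over_18(date(2000, 1, 1), date(2016, 10, 19)))  # False
```

Whether any real system is built this narrowly is exactly what the Bill, as drafted, leaves unregulated.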

However, the Age Verification Regulator has no duties in relation to the Age Verification systems. It will make sites verify age, or issue penalties, but it is given no duty to protect people’s privacy or security, or to defend against cyber security risks that may emerge from the Age Verification systems themselves.

David Austen’s expectations are unfortunately entirely out of his hands.

Instead, the government appears to assume that Data Protection law will be adequate to deal with the privacy and security risks. Meanwhile, the market will provide the tools.

The market has a plethora of possible means to solve this problem. Some involve vast data trawls through Facebook and social media. Others plan to link people’s identity across web services, providing a way to profile people’s porn viewing habits. Still others attempt to piggyback upon payment providers and risk confusing their defences against fraud. Many appear to encourage people to submit sensitive information to services that the users, and the regulator, will have little or no understanding of.

And yet, for all the risks they pose, these solutions may be entirely data protection compliant. This is because data protection allows people to share pretty much whatever they agree to share, on the basis that they are free to make agreements with whoever they wish, by providing ‘consent’.

In other words: Data protection law is simply not designed to govern situations where the user is forced to agree to the use of highly intrusive tools against themselves.

What makes this proposal more dangerous is that the incentives for the industry are poor and lead in the wrong direction. They have no desire for large costs, but would benefit vastly from acquiring user data.

If the government wants to have Age Verification in place, it must mandate a system that increases the privacy and safety of end users, since the users will be compelled to use Age Verification tools. Also, any and all Age Verification solutions must not make Britain’s cybersecurity worse overall, e.g. by building databases of the nation’s porn-surfing habits which might later appear on Wikileaks.

The Digital Economy Bill’s impact on privacy of users should, in human rights law, be properly spelled out (“in accordance with the law”) and be designed to minimise the impacts on people (necessary and proportionate). Thus failure to provide protections places the entire system under threat of potential legal challenges.

User data in these systems will be especially sensitive, being linked to private sexual preferences and potentially impacting particularly badly on sexual minorities if it goes wrong, through data breaches or simple chilling effects. This data is regarded as particularly sensitive in law.

The Government, in fact, has at its hands a system called Verify which could provide age verification in a privacy-friendly manner. The Government ought to be explaining why the high standards of its own Verify system are not being applied to Age Verification, or indeed why it is not prepared to use its own systems to minimise the impacts.

As with web filtering, there is no evidence that Age Verification will prevent an even slightly determined teenager from accessing pornography, nor reduce demand for it among young people. The Government appears to be looking for an easy fix to a complex social problem. The Internet has given young people unprecedented access to adult content but it’s education rather than tech solutions that are most likely to address problems arising from this. Serious questions about the efficacy and therefore proportionality of this measure remain.

However, legislating for the Age Verification problem to be “solved” without any specific regulation for any private sector operator who wants to “help” is simply to throw the privacy of the UK’s adult population to the mercy of the porn industry. With this in mind, we have drafted an amendment to introduce the duties necessary to minimise the privacy impacts, which could also reduce if not remove the free expression harms to adults.



October 17, 2016 | Pam Cowburn

In 'vest'ing in crime fighting technology – accountability versus privacy rights?

The Met Police have announced that body-worn cameras will be rolled out across the force. ORG's Javier Ruiz and Pam Cowburn spoke to Alex Heshmaty about the initiative when it was first announced in 2014.

What impact is wearable technology likely to have on police safety and effective crime fighting? Conversely, what's the impact on police accountability and reliability of evidence?

This initiative would further increase the scope of surveillance in the UK. Already, we have one of the highest rates of CCTV cameras by population in the world. A 2013 survey estimated that there could be up to 5.9 million surveillance cameras in the UK, one for every 11 people.

Wearable technology may be even more intrusive than CCTV, capturing up-close visuals and audio recordings which, in the case of the police, could be of victims and perpetrators involved in violent and graphic crimes.

While it's important to make policing more transparent and accountable, we need to make sure that we don't over-rely on technology to achieve this. Change must also come through wider policies and attempts to change cultural working practices.

Similarly, the effectiveness of surveillance as a crime prevention measure should not be over-stated and may not always be justified by the cost. Other more low-tech measures – such as better street lighting – may be more effective in preventing crime.

Although video recordings may provide useful evidence that can help to secure convictions, as with other kinds of evidence, they can also be misleading if presented without relevant context. If cameras are on all the time, the police are effectively filming the public on a continual basis regardless of whether they are involved in a crime. In terms of making sure the police are accountable, it is less likely that police abuses would happen in public places. But, it might be preferable to have cameras in police vehicles – where there have been accusations of abuse and where it is less likely that bystanders will be filmed – in the same way that there are cameras in police stations.

Issues may arise on how audio-visual materials are used and how long they are kept for, particularly when the police are filming members of the public not involved in criminal activity.

Arguably there may be benefits to the police wearing cameras at demonstrations. Protesters may feel that this might deter heavy handed dispersal tactics by the police or provide evidence of them if they occur. Conversely, police officers may feel that they have evidence to counter any claims of police brutality or provide evidence of provocation. But cameras would also give the police a visual record of everyone who attended a particular demonstration. How might that footage be used afterwards? Could facial recognition software be used to identify people to keep a note for future demonstrations or investigations?

Won't it just be possible to turn the camera off (in the same way as a recording can be stopped)?

If it is possible to turn a camera off, there would need to be mechanisms within the camera to keep a proper audit of when it has been switched on and off, and why.

Continual recording would mean that all of a police officer's daily activities would be recorded and they would be fully accountable for their actions. But it would also mean that many members of the public, not involved in crimes, would be captured on film and this would be an unnecessary intrusion on their privacy. In addition, there are times when police officers have to use their discretion. If they were wearing cameras, they might feel obliged to pursue minor infractions, which they might deal with differently otherwise.

Conversely, selective recording could lead to accusations that video footage is misleading, has been taken out of context, or deliberately manipulated to secure a conviction.

Does the use of such technology present any challenges to current criminal law and police practice?

The use of CCTV by public authorities is regulated under the Protection of Freedoms Act 2012 (PFA 2012). The Surveillance Camera Code of Practice pursuant to PFA 2012 provides guidance to public authorities. This guidance acknowledges that, "there may be additional standards applicable where the system has specific advanced capability... for example the use of body-worn video recorders". However, it does not give much detail about what these standards are.

The Information Commissioner's Office (ICO) has published more detailed guidance, which spells out further what these standards mean: 'In the picture: A data protection code of practice for surveillance cameras and personal information'. This recognises the threats to privacy:

"BWV [body-worn video] systems are likely to be more intrusive than the more "normal" CCTV style surveillance systems because of its mobility. Before you decide to procure and deploy such a system, it is important that you justify its use and consider whether or not it is proportionate, necessary and addresses a pressing social need."

It also outlines the data protection issues and offers guidance that data should be stored, "in a way that remains under your sole control, retains the quality of the original recording and is adequate for the purpose for which it was originally collected".

What are the potential human rights or privacy implications for individuals?

The police spend a lot of time talking to victims, witnesses and other members of the public, not just apprehending criminals. By wearing a camera they could essentially be continuously filming in public places and this has privacy implications for everyone in those places. The government's guidance says, "people in public places should normally be made aware whenever they are being monitored by a surveillance camera system" but it is difficult to see how this could work in practice if the camera is being worn by an officer.

Given the appetite for footage of real criminals being arrested, there are also risks of videos being leaked, hacked or shared inappropriately and this is likely to breach rights of privacy.

What measures would police need to take to ensure that their use of such technology complies with data protection laws?

The police have broad powers to hold and process data, and there are a number of data protection opt-outs available to them. If they are to record and keep video footage, they must have systems in place that store audio-visual material securely. There also need to be strict controls over who can access it. The guidance from the ICO outlines these requirements clearly. However, it is not only data protection law but also the Human Rights Act 1998 that the police must comply with.

This article was first published by Lexis®PSL IP & IT on 17 November 2014.


September 16, 2016 | Paul Sanders

A fair way to close the value gap

The music industry says that artists, labels, and songwriters are getting a raw deal from services that allow users to upload content. The beef is that user-uploaded songs, which may generate advertising revenue for the service and the uploader, compete directly with those same songs uploaded by the copyright owner. The difference in revenue between a user upload and a professionally supplied version is what the music industry means by the ‘value gap’.

And they don’t like it. As explained by record company trade body IFPI’s Frances Moore:

"The value gap is about the gross mismatch between music being enjoyed by consumers and the revenues being returned to the music community."

Copyright terms and conditions always make the uploader responsible for any copyright permission or licences, but sometimes uploaders don’t have all the rights they need. If services remove the content promptly when asked, they benefit from what is known as a 'safe harbour', and the copyright holder has no claim against them for infringement or loss of revenue.

So what does the music industry want? Frances Moore again: "The 'safe harbour' regime designed for the early days of the internet should no longer be used to exempt user upload services that distribute music online from the normal conditions of music licensing. Labels should be able to operate in a fair functioning market place, not with one hand tied behind their back when they are negotiating licences for music."

Unusually for the music industry, the IFPI position has managed to generate broad support among artists and indie labels, as well as songwriters and publishers. Over 1,300 artists have now signed a letter to the EC President Juncker, which you can read online here.

The music industry is not calling for safe harbour to be abolished, rather for the qualification to benefit from it to be drawn far more narrowly, now that many platforms are less file hosting services and more media and advertising businesses. And the European campaign is mirrored by similar efforts in the US asking for changes to the DMCA safe harbour provisions.

The pushback has been quick and predictable, and based on the same set of positions that have been rehearsed over the last 20 years of Internet history. Some feel that copyright should be abolished, and artists who can play live should make money only from ticket and tee shirt sales. Some suspect the campaign is just the biggest artists and labels wanting to add even more millions to their already vast riches. Many think that the music industry should share out more equally what it already has before seeking to get stronger rights. Some think that over-zealous labels and artists are harming other creators by issuing unfair take-down notices.

Legal sophisticates will recognise some important principles wrapped up in this debate. Citizens and consumers are clearly right to demand that an industry with unfair practices is not rewarded. And in a commercial environment in which only a tiny proportion of new work achieves a return on investment, there is a balance to be found between the value of distribution and promotion to the creator, and the value of the content to the service. There is, too, a whole class of inadvertent, incidental, and innocent infringements where the uploader has no intent to profit to the detriment of the musicians.

But does the music industry have a point at all? The disparity in the money the music industry gets for the same consumer experience is real enough. IFPI calculated wholesale subscription revenue per user at just under $30 per year for 2015, while advertising brought in about $0.72 per user per year, albeit from a much larger user base. The advertising rates on UGC are generally much lower than on professionally supplied content; so where YouTube and other services are being used as a music jukebox, the hit to music industry revenue from this competition is very significant.
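Taking the IFPI figures quoted above at face value, the per-user disparity is easy to quantify. A rough back-of-envelope calculation (deliberately ignoring the very different sizes of the two user bases):

```python
# IFPI's 2015 wholesale revenue figures, per user per year,
# as quoted above ("just under $30" rounded to $30).
subscription = 30.00   # per subscription user
advertising = 0.72     # per ad-supported user

ratio = subscription / advertising
print(f"A subscription user is worth ~{ratio:.0f}x an ad-supported user")
# prints: A subscription user is worth ~42x an ad-supported user
```

A forty-odd-fold gap per user is the heart of the industry’s complaint, even allowing that the ad-supported base is far larger.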

But of course there is no way to know whether there’s more money to be found. The services currently benefiting from safe harbour have every incentive to increase their own revenue. Other old copyright businesses that are moving to internet economics seem to be suffering similarly, so it might just be inherent in the way Internet media economics works. You can watch Professor Scott Galloway for an entertaining rant about this.

Internet advertising has its own set of issues quite apart from any music industry griping. We are learning that the cost of relying on advertising to support media generally is paid partly in greater intrusion into our private lives as trackers try to squeeze more value out of our daily traces, mostly with nothing like informed consent. High quality journalism is expensive. It might be weakened as more people have greater access to publishing platforms, with subsequent harm to political process and public life.

For me one of the ironies of the long standing conflict between copyright and Internet businesses is that if copyright was a tech startup innovation it would be lauded as a thing of progressive genius. It would not be organised in national silos, nor looked after by people who refuse to cooperate with each other. But there’s no more natural way, in a world of infinite replication and trackability, to incentivise and reward creators. And that is what ad-supported UGC services do, through unique digital object ids, channels, and the massively complex world of consumer tracking and advertising markets.

There is a great deal both sides can do to show they are fit for purpose. The music industry should finally deliver on the promise of digital technology and make it easy for everyone to identify and pay the creators and owners of the music, just like YouTube does with its own creators. Services need to show they deserve a safe harbour by demonstrating respect for the rights and privileges of everyone, from fair dealing student to striving artist to privacy deserving citizen. I would like to see the value gap closed by giving songwriters and musicians more say in the deals that affect their livelihoods, and by demanding more transparency from services on what they do with all of our data.

Paul Sanders is a member of ORG's Advisory Council and the cofounder of several music and technology companies. This blog is his personal view and we hope it will start a debate on the 'value gap'.


September 14, 2016 | Jim Killock

GCHQ should not push ISPs to interfere with DNS results

GCHQ have a dual and rather contradictory mandate: they are asked to get around security measures, break into systems and snoop on citizens. They are also asked to protect the UK from cyber attacks by improving security protections.

While these two goals are not automatically in conflict, they are certainly in tension, which will also raise questions of trust. Is GCHQ’s strategy intended to secure our systems, or in fact to keep them vulnerable?

Today’s announcement that GCHQ’s National Cyber Security Centre wishes ISPs to manipulate DNS results to prevent access to phishing sites smacks of exactly this conflict. (The Domain Name System (DNS) is what resolves an ordinary web address into a unique number (IP address) that gets your web browser to the correct web server.)
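The resolution step in question is the same one any program performs before connecting. A minimal sketch (the `www.example.com` lookup in the comment is illustrative; the runnable line uses `localhost` so it needs no network):

```python
import socket

# Ask the system's configured resolver (usually the ISP's) to map a
# hostname to an IP address; this is the step DNS filtering intercepts.
# e.g. socket.gethostbyname("www.example.com") -> some public IP address
ip = socket.gethostbyname("localhost")  # resolves locally, no network needed
print(ip)  # 127.0.0.1
```

Whoever operates the resolver answering that query decides what answer comes back, which is why control of this step is so sensitive.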

Their Director General, Ciaran Martin, explained in a speech that:

The great majority of cyber attacks are not terribly sophisticated. They can be defended against. And if they get through their impact can be contained. But far too many of these basic attacks are getting through. And they are doing a lot of damage

we're exploring a flagship project on scaling up DNS filtering: what better way of providing automated defences at scale than by the major private providers effectively blocking their customers from coming into contact with known malware and bad addresses?

Now it's crucial that all of these economy-wide initiatives are private sector led. The Government does not own or operate the Internet. Consumers have a choice. Any DNS filtering would have to be opt out based. So addressing privacy concerns and citizen choice is hardwired into our programme.

There are a number of problems with this approach. Privacy and logging are one problem; so is the collateral damage that comes from DNS blocking. Phishing tends to abuse URLs rather than whole sites, so the impact of blocking entire sites can sometimes be huge. And there are alternatives targeting specific known problems, such as Chrome’s “Safe Browsing” service.

Having ISPs able to serve up “spoof” DNS results for whole websites is, perhaps coincidentally, tremendously useful when implementing censorship.
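Mechanically, a filtering resolver of the kind proposed is very simple: it substitutes its own answer for names on a blocklist and resolves everything else normally. A minimal sketch, with an invented blocklist entry and a reserved documentation address standing in for an ISP block page:

```python
import socket

# Hypothetical blocklist; 192.0.2.1 is a reserved TEST-NET documentation
# address, standing in here for the ISP's block-page server.
BLOCKLIST = {"phishing-site.example"}
BLOCK_PAGE_IP = "192.0.2.1"

def filtered_resolve(hostname: str) -> str:
    """Return a substituted answer for blocked names, a real one otherwise."""
    if hostname in BLOCKLIST:
        # This is the "spoof" result that DNSSEC validation is designed
        # to detect and reject.
        return BLOCK_PAGE_IP
    return socket.gethostbyname(hostname)

print(filtered_resolve("phishing-site.example"))  # 192.0.2.1
```

The same mechanism that redirects a phishing domain can redirect any domain, which is why the censorship concern follows directly from the design.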

The DNS blocking approach, even if “voluntary” and a matter of choice, would potentially run up against industry initiatives to improve security of customers through preventing the manipulation of DNS results, such as DNSSEC (among others). The aim of these projects is to prevent “spoof” DNS results, which allow intermediaries to interfere with web pages, replace adverts, or serve fake pages based on users mis-spelling domains. DNSSEC would have made it impossible for the Phorm model of interception of user web traffic to work, for instance.

Even if we trust ISPs and governments not to abuse their expanding powers of censorship, we ought to be worried that GCHQ are proposing at least one security measure which undermines international efforts to improve the integrity of the Internet, and thereby also, its security. Perhaps this reveals some of the weaknesses of a state-led approach to Internet security. It would also likely be redundant if clients switched to encrypted resolvers run by other parties.

For instance, GCHQ seems keener on working with a handful of big players who can make ‘major’ interventions to ‘protect’ the public. Rather than helping the market, the endpoints, and users themselves to do better, GCHQ no doubt finds it easier to work with people who can deliver change ‘at scale’.

Or to look at it another way, GCHQ’s proposed solution may not be mandatory, but could impose a certain kind of stasis on technical innovation in the UK, by retarding the adoption of better DNS security. Does GCHQ really know better than the technical bodies, such as the Internet Engineering Taskforce (IETF) and their commercial participants, who are promoting changes to DNS?

There is no doubt that GCHQ have information which would be useful for people’s security. However, precisely what their motivations are, and what their role should be, are much more open to question. For this reason, we have called for their cyber security role to be divorced from their surveillance capabilities and placed under independent management.

That aside, GCHQ’s idea to promote the tampering of DNS results may be superficially attractive in the short term, but would be a medium term mistake.


August 31, 2016 | Javier Ruiz

Bad news in leaked EU Copyright Directive

The leaked EU Copyright Directive ignores ordinary internet users and presents limited reforms to support creators, researchers, teachers and librarians, while providing a sledgehammer of protectionist measures for the incumbent news, music and film industries.

Several documents have been leaked from the European Commission providing a clear picture of the proposed reforms to copyright that will be presented later in the year. The picture is quite negative as the proposals range from the timid to the openly regressive, such as the introduction of a new ancillary right for news publishers. Several key initiatives have been dropped, including changes to the current exceptions for freedom of panorama that allow taking the pictures of public art and buildings.

The documents leaked include the Impact Assessment on the Modernisation of EU Copyright Rules, prepared by EU officials; an official communication titled ‘Promoting a fair and efficient European copyright-based economy in the digital single market’ due to be published later this month; and finally the full text of the proposed new Directive on Copyright in the Single Market.

The new Directive will complement and not replace current legislation such as the Infosoc Directive, although there are some minor technical modifications. Existing directives will not be reopened for discussion, thus limiting the possibilities for reform in key areas such as Digital Rights Management. The directive applies to the European Economic Area (EEA) and will probably be relevant to the UK whatever shape Brexit eventually takes.

These leaks make the past two years of pre legislative discussions about comprehensive copyright reform feel like a waste of everyone’s time, except of course for a few industry lobbyists. The EU is about to throw away the first chance in over a decade to adapt copyright to the digital world, instead choosing classic protectionism for incumbent industries. These measures will not promote the creation of a vibrant digital industry in Europe capable of standing up to Silicon Valley - as EU policymakers want.

Below is a summary of the main contents of the leaked Directive. There are other initiatives in the wider package, including: a Regulation for online broadcasting, implementation of the Marrakesh treaty on accessibility for the visually impaired, a broad package on copyright enforcement and more provisions to promote European works.

The contents of the Directive are a mix of initiatives that include some mandatory limited exceptions for culture and education and measures to help improve remunerations for creators, but in the main are openly about supporting right holders and European industry.

Protectionist measures

Publishers’ ancillary right

The most controversial reform is the creation of a completely new intellectual property right for news publishers, which lasts 20 years and will add a new layer of complexity to Internet regulation. The new right has the same scope as the rights of reproduction and making available, and it is covered by the same exceptions, including criticism or review. The EU is open about aiming to support the financial sustainability of news publishers, and the right does not cover scientific or academic publishers. We are not completely clear on the situation of blogs, but the right is meant to cover only publications by a “service provider”.

This right is meant to stop internet news aggregators from simply copying a portion of a news article and diverting revenue away from the original site. Similar initiatives in Germany and Spain had a disastrous effect on media access, but we will need more time to fully understand how bad this one is. The first analyses are extremely negative, as the new right seems even less constrained than those previous initiatives.

User uploaded content: YouTube Law

This is another major concession to industry, aiming to address the “value gap” created by the disparity between the number of people watching content on platforms - basically YouTube - and the revenues received. The Directive forces relevant online platforms to seek licences from rightsholders. While there may be a case for Google to share some of its profits with rightsholders, it is unclear that copyright law is the best way to do it. This law will extend beyond YouTube with unpredictable effects on Internet activities, as lawyers cotton on to the new powers given to industry.

This new power goes to the heart of Internet regulation: the (lack of) liability of intermediaries that enables content to be hosted and linked around, expressed in the E-commerce Directive. In principle the new power covers services that go beyond providing “mere physical facilities” and perform an “act of communication to the public” by taking an active role in curating or promoting content, but this is not always clear cut.

The Directive does not include an obligation to monitor preemptively - which would contravene other laws - but it forces the implementation of technological measures to protect works, such as Google’s Content ID, with transparency obligations towards rightsholders.

Fair remuneration for authors and performers

There are some positive measures to protect creators, including transparency over online media sales, powers to renegotiate contracts and alternative dispute resolution mechanisms. Overall they seem positive, albeit a bit weak when compared with the sledgehammers given to news publishers and the music industry.

New mandatory exceptions

These exceptions are positive but in all cases limited when compared to the initial demands of libraries, educators and cultural institutions. They do not include many of the more far-reaching reforms proposed by civil society and even the European Parliament.

The call for a mandatory exception for “freedom of panorama” campaigned for by many civil society groups including ORG fell on deaf ears. The Commission has simply stated in their documents that the status quo works fine, while politely asking all countries to implement the exception.

Text and Data Mining exception

This exception allows the making of copies to perform analysis for scientific research by non-profit or public interest organisations. There is no compensation for rights holders and an explicit ban on contractual clauses overriding the exception. Technical measures to restrict access or copying are allowed but should not affect the exception.

This is a positive move, although many research organisations and libraries had been asking for a broader scope as they feared that a lot of important research may be excluded.

Online Teaching exception

The rationale for this exception is the lack of clarity on whether existing exceptions in the Infosoc and Database Directives apply to online education, particularly cross-border access. The exception covers only “educational establishments”, which must control access to the resources, and will likely exclude many online educational initiatives. It also allows licensing schemes to take precedence over the exception, which could be used to weaken the provisions.

Digital Preservation

Libraries, archives and similar cultural heritage institutions will be allowed to make necessary copies of works for preservation, but only of works in their permanent collections. The exception is only for internal copies and not for online libraries.

Supporting the digital market

A couple of fairly minor initiatives that are positive but of limited impact in the context of the once-in-twenty-years reform of copyright.

Out-of-commerce works

Libraries have been lobbying for a long time to be allowed to enter collective licensing deals to digitise and distribute out-of-commerce works. They see this as both an extension of their mission and an opportunity to generate funds, although in principle it is framed as non-commercial recovery of the costs of mass digitisation. The exception only applies to works first published in the EU. This is not a full free-copying exception, but the option to enter extended collective licensing deals without the need to get approval from every author. There is a six-month compulsory notice period in case authors are around and object.

Video on demand 

The Directive forces member states to create a voluntary “negotiation mechanism”, supported by an impartial body, to help parties license works for video on demand (VoD) services.

In summary, this is a disappointing culmination of a two-year discussion that started with high hopes of seeing Europe take bold moves to really modernise copyright. The legislative process starts now, however, and while the UK is in the EU, ORG will continue to try to influence the shape of these laws as they go through the European Parliament. We must also remember that this is all based on leaked documents and the European Commission may still make some changes.


August 23, 2016 | Javier Ruiz

Review of bulk surveillance powers gives one side of the argument

The review of bulk surveillance powers by David Anderson supports the operational case for the current practices of the agencies. The review did not look at the wider impacts of mass surveillance and any further discussions need to take both aspects into account.

David Anderson’s Review of Bulk Powers gives broad endorsement to the contents of the Investigatory Powers Bill. This should come as no surprise, given his job was to determine their utility to the government and not any wider social impacts, including on human rights. The review also had a limited remit to look at potential alternatives to bulk, and found that these may exist for some of the cases examined, but are generally too cumbersome, slower or less effective. The review makes only one recommendation for reform: the creation of a new Technical Advisory Panel of independent security cleared experts to support the Investigatory Powers Commissioner.

We are concerned that the government will try to use Anderson’s review to end the public debate on mass surveillance. This would be a mistake, as the underlying issues will not go away. Countless human rights experts and bodies have condemned mass surveillance and a large proportion of the population remains concerned about their online privacy.

The IPB will be scrutinised again by the Lords in the Autumn, and the other side of the debate needs to be included in the discussions. Focusing on the utility of bulk - the operational case - is important in establishing the necessity for such powers. This does not tell us however whether wholesale collection and analysis of data is proportionate in a democratic society. As Anderson himself repeatedly makes clear in his review, these are matters for Parliament to decide.

Anderson arrived at his conclusions after looking at the classified activities of the Security and Intelligence Agencies - MI5, MI6 and GCHQ - including several visits by him and his team. The Review also summarised the most influential reports in this area, including those by the Intelligence and Security Committee, Anderson’s previous report, and some US reports that cover similar issues. Anderson found that all available evidence supports the operational case for bulk interception, bulk datasets and bulk acquisition of communications data.

The only area where in Anderson’s view the case is not yet fully proven is bulk “equipment interference”, aka bulk hacking, as this is a new power. He quotes extensively security personnel claiming this power is required to cope with the growing threats that encryption and anonymity present to other bulk collection methods. There is little discussion about what this power may look like in practice. Documents leaked by Snowden mention GCHQ’s interference activities causing disturbance to the internet traffic of whole countries, such as Pakistan. Unfortunately, Anderson does not engage with the leaked documents and glosses over “the potential of CNE to create security vulnerabilities or leave users vulnerable to damage” (p. 108).

The review was completed in a very short period of time, rushed to avoid delaying the legislative process. Concerns have been expressed from various quarters that the lack of time might have affected the results, but it seems unlikely that more time would have changed the conclusions. The Review was asked whether there is an operational case for bulk, and given the scope and sources of evidence the answer would almost certainly have been the same. Anderson neutralises the dissonant evidence, such as the last report from the Intelligence and Security Committee, which criticised bulk hacking, or the US Privacy and Civil Liberties Oversight Board report, which found there was no value in giving all phone records to the government for analysis.

The main new piece of information is the extended evidence from the security and intelligence agencies. Throughout the paper Anderson deals with criticisms of bulk by referencing new secret evidence that he has been able to see and was not available - and never will be - to critics. The argument is collapsed into a question of trust, in this case of trusting Anderson to vouch for the agencies.

Government will brandish this review’s headlines to support the Investigatory Powers Bill, but a more detailed follow up is needed. For example, the discussions on alternatives to bulk seem to point at some options, but these are quickly dismissed. It would be important for Parliament to probe these potential options in much more detail.

The recommendation for a Technical Advisory Panel can hardly be rejected, but its success will depend on how it is handled in practice. There are numerous advisory boards across government with very mixed results. Technically proficient staff should be part of the oversight and authorisation bodies, not just in advisory roles.

The review joins a veritable body of publications from security-related bodies and individuals that cross-referentially support the practices of the agencies while ostensibly asking for some reforms. Since the Snowden leaks three years ago there has been a huge improvement in the level of public debate in the UK about surveillance. It would be foolish to deny that we have a much better understanding of how these systems work, and not all of it from the Snowden leaks. At the same time we have to ask ourselves what has actually changed in practice.

Unfortunately, the most serious concerns about the agencies do not relate to whether surveillance powers are useful and not misused, but how secrecy and pervasive monitoring affect democracy and individuals’ personal development in the medium to long term. There is little that Anderson’s review can do to allay those fundamental concerns.

Anderson puts his finger briefly on these most basic aspects, quoting his own previous report to support the need for strong surveillance powers:

A perception that the authorities are powerless to act against external threats to the nation, or unable effectively to prosecute certain categories of crime (including low-level crime), can result in hopelessness, a sense of injustice and a feeling that the state has failed to perform its part of the bargain on which consensual government depends. (p. 120)

A very different conclusion could be reached. The pervasive sense of hopelessness and insecurity created by technological change, economic crises and broader geopolitical upheaval might be perceived as the failure of the state to successfully plan, regulate and intervene diplomatically. The development of an all-powerful mass surveillance state seems the wrong overcompensation mechanism, which will do little to address the underlying causes of insecurity, injustice and disenfranchisement.


July 22, 2016 | Slavka Bielikova

IPBill Committee stage latest sitting

The House of Lords debated the IPBill in Committee this Wednesday for the last time before the summer recess. The topics covered Internet Connection Records (ICRs), the Request Filter and equipment interference. Here is a brief breakdown of what was said. We will be back in September with more.

Throughout the debate, the Lib Dem Lords mounted strong opposition to the Government in all three areas being discussed.

Internet Connection Records

Lord Paddick and Baroness Hamwee presented their amendments requesting removal of the ICRs, arguing they fail to meet the basic test of necessity. Lord Paddick made excellent points to justify their position.

Internet connection records do not do what the Government claim they do. … At best, internet connection records provide only details of which communications platforms have been used, most of which are based in the United States.

Lord Paddick referred to earlier statements made by MI5, MI6 and GCHQ claiming they have no explicit need for ICRs because they have other ways of securing the data they require.

Lord Strasburger seconded their reasoning and pointed to the high costs and impracticalities of keeping the records. He made the case that ICRs would weaken the safety of British citizens. Strasburger said:

It is a matter of when, not if, these sensitive data get into the wrong hands.

The unique position of the UK in stubbornly requiring ICRs was questioned by Lord Oates. He pointed out that none of the Five Eyes countries, or any western democracy, collects ICRs. According to Lord Oates, it will be a horrific experience for the public when they discover that the Government insists on retaining details of every single person in the country's access to every single website.

The Government maintained their position that ICRs are necessary to combat crime but did not offer any compelling evidence, instead repeatedly stressing their desire for ICRs to future-proof the Bill.

Under the agreed amendments, ICRs can only be obtained by UK authorities if they are to be used to help prevent or detect crime. Lord Keen of Elie said that ICRs would "be able to be acquired only for offences that are sufficiently serious that an offender can be sentenced to at least six months’ imprisonment". The amendments were still criticised for their vague phrasing.

Request filter

The request filter was debated along similar lines. Lord Strasburger argued the filter would be too intrusive. He called it “a bulk power masquerading as an innocuous safeguard to reduce collateral intrusion.”

The Government's response was that the filter facilitates communications between public authorities and other stakeholders; Lord Keen even called the request filter a safeguard because the authorities would only see the data they need to see (no information was offered on how this is technically possible).

Strasburger's concerns about misuse were also shared by the Conservative Lord Lucas.

The potential for casual misuse or misuse suborned by journalists will be considerable. On top of that is potential misuse by government.

Equipment interference

The Lords also discussed several probing amendments on thematic warrants and hacking, including training and testing warrants. The training and testing warrants drew particular attention after it became clear that an innocent citizen could be made the subject of such warrants.

Baroness Jones submitted her proposal to create an Investigatory Powers Commission instead of an IP Commissioner. Her reasoning was that

this approach confuses and conflates the roles of authorisation and oversight. It is constitutionally inappropriate for those involved in decision-making to have responsibility for the oversight of those same decisions. Such conflation gives rise to a potential conflict of interest.

Mentions of the CJEU Advocate General's opinion

Lord Strasburger used the Advocate General's opinion, published on the morning of the debate, to challenge the collection of ICRs. However, Lord Keen refused to comment on the potential implementation of the opinion until the European Court of Justice rules in the DRIPA challenge.

The Government maintain that the existing regime for the acquisition of communications data and the proposals in the Investigatory Powers Bill are compatible with EU law, and clearly it would not be appropriate to comment further while legal proceedings are ongoing.
