Blog


March 27, 2017 | Ed Johnson-Williams

Encryption must not be a dirty word. Here're 5 ways we all rely on it

Encryption keeps us safe. Politicians must not threaten to weaken it.

British politicians are again putting pressure on Internet companies to make sure the Government can access end-to-end encrypted messages. We thought we'd remind them why encryption keeps us safe and secure.

1. Our national infrastructure depends on encryption

Our power stations, transport systems, hospitals and military all rely on encryption to communicate securely. They need encryption so they can reliably send and receive accurate, trustworthy information. Without that, our national infrastructure would be immeasurably more susceptible to attacks from other countries, non-state hackers, and criminals.

2. Our economy depends on encryption

Our banks, stock exchanges, payment systems, and shops also need to be able to send and receive reliable information without criminals or foreign powers' intelligence agencies intercepting or tampering with the transaction. We need to be confident that when we pay for something our data is secure. Our economy relies on that confidence and that confidence is made possible by encryption.

3. Our free press depends on encryption

When sources contact journalists with sensitive information about MPs' expenses, the Panama papers or the Snowden files, they rely on encryption to make sure they can blow the whistle. Sources can use encrypted communications to pass evidence of corruption and abuse to journalists in a secure way. That helps keep us informed and our press free.

4. Our online security depends on encryption

Nearly every major website's web address starts with HTTPS – keeping the connection between your computer and that website encrypted. Encryption stops someone snooping on your web use and intercepting your usernames and passwords when you're in a coffee shop.
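To make this concrete, here is a minimal sketch in Python, using only the standard library, that opens an HTTPS (TLS) connection and prints how the traffic is protected; "example.com" is just a placeholder hostname, not a site discussed in this post.

```python
import socket
import ssl

# A rough illustration: connect to a website over TLS (the encryption behind
# HTTPS) and report the protection that was negotiated. "example.com" is a
# placeholder hostname used purely for illustration.
hostname = "example.com"
context = ssl.create_default_context()  # also verifies the site's certificate

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("TLS version:", tls.version())  # e.g. 'TLSv1.3'
        print("Cipher suite:", tls.cipher())  # the negotiated encryption
        # Anything sent over this socket from now on -- usernames, passwords,
        # page contents -- travels as ciphertext, so someone on the same
        # coffee shop wifi sees only scrambled bytes.
```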

5. Our devices’ security depends on encryption

Once you’ve encrypted your laptop, tablet, or phone, if someone gets their hands on your locked or powered-off device, they would need your password to decrypt and access the data. This stops thieves from stealing your phone and then accessing your emails, contacts, and texts.
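As a rough sketch of the principle (not of how any particular phone or laptop actually implements disk encryption), the Python example below uses the third-party cryptography package to derive an encryption key from a password and show that the stored bytes are unreadable without it.

```python
import base64
import os

from cryptography.fernet import Fernet, InvalidToken
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_password(password: bytes, salt: bytes) -> bytes:
    """Turn a password into an encryption key using a key-derivation function."""
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(password))


salt = os.urandom(16)
key = key_from_password(b"correct horse battery staple", salt)
ciphertext = Fernet(key).encrypt(b"emails, contacts and texts")

# A thief who copies the encrypted data but guesses the wrong password
# gets nothing useful out of it.
try:
    Fernet(key_from_password(b"12345", salt)).decrypt(ciphertext)
except InvalidToken:
    print("Decryption failed without the right password.")

# The owner, who knows the password, can read the data as normal.
print(Fernet(key).decrypt(ciphertext))
```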

Join ORG

Help us to challenge politicians’ dangerous and misleading comments about encryption. Join ORG today!



March 27, 2017 | Jim Killock

Amber Rudd already has sweeping powers to attack encryption

Amber Rudd has engaged in another attack on people’s security by suggesting that companies must be able to ‘remove’ encryption.

The striking thing is that if she were genuinely serious about her suggestion, she would not be making public demands; she would be signing legal orders to force companies to change their products. She would not be telling us about this.

Last year, the UK Government passed the Investigatory Powers Act, which gives British law enforcement and intelligence agencies vast surveillance powers.

These powers already purport to grant the minister the ability to issue a “Technical Capability Notice” with which Amber Rudd could instruct WhatsApp to re-engineer their product to be surveillance-friendly.

The TCN could, for instance, instruct WhatsApp to enable an invisible “third recipient” in the case of targeted individuals. Thus, even without asking providers to remove or weaken encryption, the UK believes it has found a way to legally compel companies to provide information from supposedly secure products.

There are enormous problems with TCNs. They can be “appealed” to a technical committee but it is unclear how well the process will ever deal with wider security concerns, or risks to the companies or their users. The process seems focused on ‘feasibility’ rather than whether introducing weaknesses is a good idea.

Fundamentally, anything which enables GCHQ to listen in could be available to someone else, whether another government, or perhaps a criminal who learns how to abuse the weakness.

These notices are not subject to any public guidance about their use. Interception of communications, equipment interference (hacking), bulk communications data acquisition (mass surveillance), bulk personal datasets (everything the government knows about you) and National Security Notices (orders to act) all have public codes of practice that the Home Office claims to be "consulting on". There is no obligation for a Code of Practice on TCNs that might give some insight into how these issues would be balanced.

The codes that have been published for consultation contain 415 pages of dense detail accompanied by just 15 paragraphs of explanatory information, while the public, lawyers and businesses have been given a mere six weeks to work out what they mean.

As you can imagine, the powers outlined in the codes for interception of communications, equipment interference and bulk communications data acquisition will grant Ms Rudd many avenues to surveil the likes of Adrian Elms.

We should use Amber Rudd’s cheap rhetoric as a launch pad to ask ourselves why she has such sweeping powers, and what the constraints really amount to.

Join ORG

Help us to challenge politicians’ dangerous and misleading comments about encryption. Join ORG today!



March 22, 2017 | Javier Ruiz

MEPs start to push back on online copyright censorship

A report for a committee of the European Parliament has pushed back against proposed legislation that would force the filtering of user-uploaded online content and restrict the use of press content online.

The EU is introducing some major changes to copyright legislation under a programme to improve the European Digital Single Market. This reform has reached an important milestone with the publication of a long-awaited report on a proposed new Directive. These changes will affect the UK after it leaves the EU, although the exact impact will depend on the level of access to the single market achieved after the negotiations between the UK government and the EU.

The proposed Directive on Copyright in the Digital Single Market contains measures that could have a negative impact on citizens. They include restricting the reuse of press content online and forcing platforms to censor user uploaded content. These proposed measures seem designed to protect incumbent European media conglomerates from Silicon Valley but will do little to promote a vibrant European digital sector.

The final report on the Directive for the Legal Affairs (JURI) Committee, by Rapporteur MEP Therese Comodini Cachia (EPP, Malta), will provide the main point of scrutiny by the European Parliament, setting the lines for the final plenary vote. The draft report will now be open for amendments by MEPs from the committee until 30 March.

MEP Comodini Cachia should be commended for taking input from a variety of organisations - including ORG, through C4C - and not only the industry lobbies that have such a disproportionate hold on policymaking around intellectual property.

Possibly due to these diverse perspectives, her report marks a shift from the latest stages of the debate towards a more balanced approach that considers both the interests of copyright owners and the rights of users. We still think that Ms Comodini could have gone further in certain areas.

Removal of the proposed press publishers’ right

The report proposes to delete the Commission's proposal for a brand-new copyright-like right for press publishers, which was introduced to widespread dismay from copyright experts, who have made clear that a new right is unnecessary and would create a whole array of new problems.

The report rejects the argument that the Internet is intrinsically damaging to the press, with news portals and aggregators actually allowing readers to find more sources of news. This is evidenced by the experience of Spain, where a compulsory licence on news aggregators led Google to withdraw its news service from that country, as the company does not make any profit from it. This led to a drop in traffic to small publications.

There is broad agreement, though, that news publishers face serious challenges from online platforms. Some publishers have demanded a new right because they claim that they simply cannot enforce the existing copyright in their publications when news articles are copied by third parties, due to complex contractual relations with the authors of the pieces.

Ms Comodini proposes an alternative to a new right: giving publishers more powers to enforce existing copyright, which should provide enough protection. Her Amendment 52 gives press publishers legal standing to represent the authors of the works contained in their publications, allowing them to sue in their own name to defend those authors' rights over the digital use of their press publications.

This is a very sensible proposal that should be supported by MEPs.

Improvements to filtering of user content by online platforms

The proposals would amend Article 13 of the draft Directive, which sets out obligations for online platforms to monitor and filter user content. These obligations would now be restricted to ensuring that agreements are concluded with rightsholders for the use of their works, removing references to “prevent the availability on their services of works or other subject-matter identified by rightsholders”.

The draft Directive makes platforms primarily responsible for preemptively checking that user uploaded content does not breach copyright. The report changes this, with rightsholders being responsible for identifying any misuse of their works, rather than prescribing “content recognition technologies” to be applied by online platforms. These changes go a long way to soften some of the worst aspects of the Draft DSM Directive by removing automated censorship.

The report also makes clear that copyright exceptions must be respected in any measures implemented and users shall have access to a court to “assert their right of use” (Am 60). This is very important when it comes to parody or criticism.

There are, however, some potential problems. We share concerns raised by the Copyright 4 Creativity (C4C) group that certain amendments could drag a broader array of online providers under these measures.

The original proposals were meant to cover services that both stored user uploaded files and made these available online, squarely aimed at YouTube. Removing the references to “storage” when defining the type of online services that would be in scope for these measures could widen the scope to many more online services.

Despite the improvements introduced by Ms Comodini, we recommend that MEPs vote for amendments that simply delete these provisions rather than trying to minimise them, as we cannot predict how they will be used, or misused, in the future.

A handful of US digital companies - Google, Facebook, etc. - have cornered the market and are at war with traditional media from news to music. Internet users are caught in the middle of this war when they upload and share the things they like.

Proposing that intellectual property agreements must be in place for all platforms actively making available user-uploaded content will create massive barriers to entry for new projects and damage the interactive nature that has characterised the internet in recent times. The explosion of free expression by ordinary citizens is something that should be celebrated and protected by governments, not traded in backroom deals between giant industry groups.



March 17, 2017 | Slavka Bielikova

Government half-turn on data sharing

Last month, the Government responded to the report by the Delegated Powers and Regulatory Reform Committee on the Digital Economy Bill. The response shows that the Government has heard some of the Committee's criticism of the data sharing provisions and intends to amend the Bill to reflect the Committee's recommendations. Unfortunately, the Government's intentions will not fix everything.

ORG previously called for limits on the powers given to Ministers and for constraints on unlimited bulk sharing of civil registration data. We advocated for codes of practice to be made statutory and legally enforceable, and for privacy safeguards to be strengthened and put on the face of the Bill.

The Government’s response touched upon several of our concerns.

The good:

Narrow definitions of specified persons, and who gets access to data on fuel poverty

We particularly welcome the Government's commitment to specify, on the face of the Bill, the list of persons who may disclose and receive information for public service delivery and for debt and fraud related to the public sector. This will take the power away from Ministers, and the process of specifying persons entitled to participate in data sharing will be more transparent.

The Government further said they will amend the Bill to narrow down the list of fuel poverty schemes and persons that have access to the data. They plan to introduce a similar amendment to address water poverty schemes.

Close ties between functions and objectives

The response also addressed the specification of data sharing objectives for public service delivery. The Government pledged to introduce an amendment requiring a specific public authority to share data only for purposes that correspond to its functions.

Statutory codes of practice

The Committee's report put a lot of pressure on making codes of practice statutory and it appears that the Government found their criticism justified. They agreed to subject the codes of practice to an affirmative procedure. This means both Houses must approve the Statutory Instrument before it becomes law. It is particularly necessary to give the codes statutory force, as the safeguards are contained in the codes and not on the face of the Bill.

Henry VIII powers removed

After the Committee criticised the use of so-called Henry VIII powers, the Government decided to drop them completely from the Bill. These powers would have allowed the Government to amend or repeal the Bill after it became an Act of Parliament, meaning that Ministers could have made changes to Chapter 1 on Public Service Delivery through subordinate legislation with or without further parliamentary scrutiny.

The poor wording of the Henry VIII clause in the Bill could easily have resulted in the data sharing powers being expanded without any accountability once the Bill had passed both Houses of Parliament.

All of these changes are welcome and needed, but they don't plug all the holes in Part 5 on data sharing.

The bad:

Review only for fraud and debt powers

The Bill includes provisions on amending and repealing the chapters on debt and fraud after a review. Following the report, the Government decided to narrow the power to amend. This will prevent Ministers from broadening the debt and fraud powers or removing any safeguards from the Bill.

This is a sensible requirement that should be applied to all the powers in Part 5, not just debt and fraud powers. Applying reviews to all powers and limiting powers to amend and repeal will increase transparency of data sharing and prevent unjustified onward disclosure of data to other public authorities.

No exclusion of punitive objectives

The new Government amendment tying objectives for data sharing to the functions of public authorities still raises issues. For public service delivery, the Bill states only that the objective of data sharing should be the benefit of an individual or a household. Other powers in Part 5 (disclosure for debt and civil registration reasons and sharing with electricity and gas suppliers) don't have the requirement to be of benefit.

The functions of a public authority could include enforcement - if they do, the objectives for data sharing could be punitive in nature. The Government should clearly state that the relevant powers in Part 5 are only to benefit individuals, not to punish them.

Dehybridisation clauses remain

The Bill states that the provisions on sharing data for the purposes of public service delivery should not be regarded as hybrid instruments. A hybrid instrument is a piece of legislation that disproportionately affects a particular group of people within a class.

Part 5 of the Bill could have disproportionate effects, for example, on people in fuel poverty. The hybrid instrument procedure would give them an opportunity to present their arguments against the Bill to the House of Lords Hybrid Instruments Committee and then, possibly, to a select committee deciding whether or not the legislation should be approved by both Houses.

The Committee's report highlighted this issue and suggested that further safeguards be considered for the face of the Bill if this provision is to remain. The Government's response does not address the clauses on hybrid instruments.

Bulk sharing of civil registration data untouched

Neither the Committee nor the Government made any comment on the unconstrained sharing of bulk civil registration data. Chapter 2 provides for the sharing of civil registration data for any public body's functions, without restrictions. The power is intended for bulk sharing of the full civil register across government, but it hasn't been sufficiently justified by the Government.

Ministers have presented this chapter as a way of improving electronic government transactions by avoiding the need for paper certificates to be circulated, but it appears to be more about convenience for administrators than a clear social purpose.

The Bill doesn't require consent from the data subject to share their civil registration data. It is also unclear how these large databases will be stored and whether they will be encrypted at all. The Government said they have no intention to share the information with private companies but they did not provide a guarantee that they won't do so in the future.

The power shouldn’t be part of the Bill without appropriate guarantees and safeguards.

 



March 08, 2017 | Ed Johnson-Williams

CIA and GCHQ hacking – they must clear up their own mess

US intelligence agencies are working with the UK to stockpile vulnerabilities that they can use to hack Windows and Mac computers, iOS and Android smartphones, and smart TVs. The UK Government has serious questions to answer.

The agencies will use these vulnerabilities for targeted surveillance. However, vulnerabilities can also be discovered and exploited by criminals and other countries' intelligence agencies. GCHQ's decision to keep their exploits secret could have devastating effects for society at large.

It is likely that the CIA and GCHQ are not the only organisations with knowledge of these vulnerabilities and the capability to exploit them. The agencies have, possibly through their own mistakes, vastly increased the risks by failing to ensure that the vulnerabilities were either reported or kept to themselves.

Many of the vulnerabilities disclosed in the CIA's files came from UK intelligence agencies including GCHQ. The UK Government has some serious questions to answer. These include:

  1. How does the Government ensure that GCHQ’s process for deciding whether to exploit or report a vulnerability is adequate? Are they creating unnecessary risks for organisations and individuals?

  2. How do oversight bodies check that GCHQ’s policies for assessing the risk of keeping an active vulnerability secret are sufficiently robust?

  3. Did any hacking operations reduce the security and privacy of an individual/organisation with respect to other actors?

  4. Is the authorisation process sufficient to avoid future problems?

  5. How will the UK government and agencies work to clean up the mess created by their decision not to report these vulnerabilities to the vendors?

While targeted surveillance is a legitimate aim, we need to know that government regulation of this area is sufficient. Governments should be regulating the way their intelligence agencies hoard and use vulnerabilities that affect devices owned by millions of ordinary people. From what we learnt during the passage of the Investigatory Powers Act, it appears that the ‘creation’ of techniques is not really regulated at all.

Whatever benefits there may have been to GCHQ and the US agencies in stockpiling these vulnerabilities to use for "good", the race is now on to repair them as fast as possible. The NSA and GCHQ must disclose what they know about repairing these vulnerabilities, and about how they might be exploited, to assist in this effort. The agencies must now work with the manufacturers of internet-connected devices like phones, laptops, TVs and routers, but potentially also fridges, toasters and home automation systems, to repair the vulnerabilities.

Even if intelligence agencies report the vulnerabilities to device manufacturers, this does not mean that the devices will immediately be secure. The devices need to be updated to fix vulnerabilities. Manufacturers of Internet-connected devices have an ongoing responsibility to prioritise security, to actively test the security of the devices they sell, and to push out security updates to fix known vulnerabilities, which does not always happen.

At the moment, we have a secretive and unaccountable system of device hacking, badly in need of accountability and oversight. We should remember that the agencies themselves are only part of the worry; it is the results of their actions, especially in enabling criminality, that we most need to worry about.



March 07, 2017 | Ed Johnson-Williams

Yes, the CIA can hack phones but Signal and WhatsApp are still safe for nearly everyone

The CIA can hack phones but Signal and WhatsApp remain very good ways to communicate when using a mobile phone for nearly everyone. The worst thing to do would be to throw our hands up in the air and give up on our digital security.

Wikileaks have published documents claiming that the CIA can use some vulnerabilities in the iOS and Android operating systems to hack mobile phones and then monitor anything that happens on those phones. The vulnerabilities are expensive to buy or discover. In order to keep their existence secret for as long as possible, they are likely to have been used only on a targeted basis.

Some journalists – working for newspapers including the New York Times, the Independent and the Telegraph – have reported this story as showing that the CIA can bypass the encryption on messaging apps like Signal and WhatsApp. This is emphatically not accurate. The apps themselves are secure. The journalists are probably uncritically repeating a Wikileaks tweet to that effect.

If the CIA (or the NSA, MI5 or GCHQ for that matter) hacks your phone then they will be able to read messages on any messaging app, regardless of how good the app's encryption is.

There is a big difference between phone operating systems being hacked and message encryption being broken. If a messaging app’s encryption has been broken, that would affect every user of the app. The encryption in Signal and WhatsApp has not been broken.
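To illustrate the distinction, here is a short sketch using the third-party PyNaCl library (this shows the general idea of end-to-end encryption, not the actual protocol Signal or WhatsApp use): a network eavesdropper only ever sees ciphertext, while anything that controls the recipient's device sees the message after it has been decrypted.

```python
from nacl.public import Box, PrivateKey

# Each person's device holds its own private key; only public keys are shared.
alice_key, bob_key = PrivateKey.generate(), PrivateKey.generate()

# Alice's device encrypts the message for Bob. This ciphertext is what
# travels across the network, so it is all an interceptor can collect.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6pm")
print("What an eavesdropper sees:", ciphertext.hex()[:48], "...")

# Bob's device decrypts it. Spyware installed on Bob's phone through an OS
# vulnerability could read this plaintext after decryption, without ever
# 'breaking' the encryption itself.
receiving_box = Box(bob_key, alice_key.public_key)
print("What Bob's phone displays:", receiving_box.decrypt(ciphertext))
```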

If the CIA is so interested in you personally that they would hack your phone, then yes, you are vulnerable to attack. This is not new. Most of us, however, are not national security journalists reporting on sensitive state secrets, so the CIA hacking our phone is very unlikely. We can and should still use encrypted messaging apps to help keep our messages private and secure from people who a) aren't as powerful and well-resourced as the CIA and b) are far more likely to try to read our messages.

For nearly everyone, Signal and WhatsApp remain very good ways to communicate on a mobile phone.[1] The worst thing to do would be to throw our hands up in the air and give up on our digital security.

If someone tunnels into a bank vault, the unbreakable lock on the vault's door doesn't help very much. But we know it's still a good idea to put a really good lock on your bank vault to deal with other break-in attempts.

There are issues with the way the CIA and other intelligence agencies hoard and use device vulnerabilities without reporting them to the manufacturers of the devices. If vulnerabilities in the devices remain unfixed it means that people’s devices are also open to attack from criminals and from other countries’ intelligence agencies.

From a personal security point-of-view it’s important to keep all of this in perspective. Most people are at far greater risk of their devices being infected from clicking a link in a phishing email than they are of being hacked by the CIA using a vulnerability in their device.

[1] Issues about WhatsApp's data collection and contribution to Facebook's business model are covered here.

Update 8/3/2017, 10:30am - Added links to further incorrect media coverage that questioned the security of Signal and WhatsApp.



March 07, 2017 | Jim Killock

Scrap the DEBill Age Verification and censorship—before it is too late

The Digital Economy Bill (DEBill) is at the report stage in the House of Lords, a long way down the parliamentary process, and the concerns around Age Verification (AV) and the censorship of legal pornographic websites are still there. Today, further amendments were published but, unfortunately, they are too little and too late to limit the Bill's harm to the free expression and privacy of UK citizens.

The Bill is an example of how not to legislate for the Internet and complex social issues. Among other things, the DEBill attempts to address the issue of under-18s seeing pornography by forcing porn sites to implement Age Verification. This 'simple' solution has been fraught with problems from the start.

Age verification - an unseemly scramble to patch it up

Age Verification is in a terrible mess. There are no privacy safeguards on the face of the Bill, which means that UK citizens could be at risk of having private information about their porn habits and sexuality leaked, hacked or exploited. ORG has repeatedly called for the privacy concerns to be addressed.

The Government now says that it "agrees that parliamentary oversight is necessary".[1] Its changes are published as amendments. One Statutory Instrument will spell out what is and is not to be censored (page one, altering Clause 15). A second sees the guidance from the AV regulator given statutory force (page four, adding in a new clause). A third regulates the regulator, adding in duties that have until now been forgotten (page five, adding a new clause).

If this sounds like a dog’s breakfast, then that is because it is. The Government is admitting that it has done its work so badly it will need to patch up the DEBill with three pieces of secondary legislation at a later date.

Apart from the democratic deficit of leaving the big problems to future, as yet undrafted secondary legislation, we need to ask whether the problems with age verification can be fixed this way. We are not confident they can.

The Bill allows the regulator, the BBFC, to regulate pornographic websites and ensure that they verify the age of their customers. The new clause says that a Statutory Instrument (SI) will provide guidance for "types of arrangements for making pornographic material available that the regulator will treat as complying".

The Government claims this statutory “guidance” means that the BBFC can regulate the age checking tools and their privacy requirements.

However, this is rather dubious. AV providers are not mentioned in the Bill, only porn providers; the Bill does not spell out any power to regulate AV providers or what the duties might be.

Can the contents of this SI be applied to AV providers and impose privacy duties on them, if neither is mentioned in the Bill? How would the SI do this fairly and without exceeding the remit laid out for them in the Bill?

Website operators and Age Verification providers may wish to check customers' age in all kinds of novel and potentially intrusive ways. They will have strong commercial reasons for doing so. It may seem to them quite unreasonable for statutory instruments to go beyond the question of whether age is properly checked, and impose privacy duties not even referenced in the Bill that run against their commercial interests.

The Bill therefore needs to be clear that it can regulate the AV providers, and not just the websites.

Consent and free choice

The SI would also need to ensure that users can choose their own tools. This point is essential. If the websites choose the age checking tools, then these will be chosen for the benefit of the websites, as either cheapest, or the best for tracking people, or simply the tool that everyone signed up to, whether or not it is private or safe.

It is not clear that the Government has understood the need for free choice and an open market. Previously, ministers have fallen back on claiming that data protection laws are sufficient. They are not: data protection laws rely on consent and free choice when you decide to relinquish some of your privacy. Consent is absent here, as everyone must use the tools in order to access the content.

To legislate this kind of structure into existence through SIs, without any mention in the Bill, is pretty unusual. Such a structure would be needed, but it is difficult to justify creating it through secondary legislation, which is meant to be about detail, not principle.

We cannot be confident that these problems will be fixed in the SI and by then it will be too late for Parliament to do anything about it. The absence of privacy duties is very worrying, not least because it suggests that the Government has not accepted that anything actually needs to be done.

Would safe Age Verification be enough?

It isn’t clear that even ‘safe’ age verification tools would be enough to prevent free expression harms. Some people won’t want to use these tools, and may not trust them. Ensuring they are safe would reduce the “chilling effect” but it isn’t clear that it will be enough to stop them creating a barrier to people accessing legal content, nor is it clear that publishers really should be given this duty.

Nevertheless, the Government should take its duties seriously, and ensure that privacy and anonymity are duties for the regulator, spelt out in the Bill.

The same applies to the censorship regime, which was added at the last minute.

Inadequate censorship safeguards

The government has not changed its mind about giving wide censorship powers to an administrative body, the BBFC. The BBFC will be allowed to censor any pornographic website that does not provide an Age Verification tool.

Most websites of course won't implement Age Verification. The BBFC will therefore have a power that could be applied to millions of non-compliant websites. We are told that we should trust that the BBFC will only apply this power to block a small number, perhaps 100 major websites. This raises the question: what is the point? Blocking just 100 porn sites would not prevent under-18s from accessing pornography, as there will be plenty of other sites for them to access.

This power would be a hostage to future demands to apply it more widely, to perhaps thousands more websites, in order to “make the Internet safe” for children, which would imply attempting extremely expansive blocking.

Once politicians realise that the vast majority of pornography is neither subject to Age Verification, nor blocked, calls to ramp up the number of blocked websites will be made, and the power will be available to block them. All that will be needed is more money and a change of policy at the BBFC.

On appeals, the Government's response concedes that they should be considered by an independent body:

“The government accepts, however, that the appeal must be considered by someone independent from the original decision maker and will table appropriate amendments to require the Secretary of State to be satisfied that the appeal will be independent before designating the regulator. [...] Additionally, the Secretary of State may issue guidance to the regulator on this matter” [Digital Economy Bill: Government Response  Clause 17(4) - appeal arrangements]

However, it does not envisage these involving the courts. This would mean that appeals might be dealt with narrowly, and fail to deal with free expression and association impacts adequately.

The Bill is clearly in a mess

The only conclusion we can draw is that the Bill is so far from ready, so absent of safeguards, that these sections need to be dropped.

At this point, the Bill is meant to be in its final shape, and the Government is proposing its last amendments. There is of course a slim chance that the Opposition could force a vote to insert privacy safeguards. This would be unlikely to be enough, given the stage we are at.

The only sensible course of action is to remove the entire section from the Bill.


[1] “The manner in which age verification technology operates will be fluid given the pace of technological change … the regulator should remain responsible for the production of guidance about the types of arrangements for making pornographic material available that it treats as complying with being “not normally accessible” to persons under the age of 18, but. The government will table amendments so that that the guidance must be laid before Parliament subject to the affirmative procedure for first exercise of the power and the negative procedure thereafter.”

“We are planning to underpin these changes with the introduction of a clause which will give the Secretary of State Power to publish guidance, which the regulator must have regard to, as to how the its exercises its functions, including the guidance it produces.”   [Digital Economy Bill: Government Response Clauses 15(3), 21(9) and 22(7) - the regulator’s guidance]



March 06, 2017 | Jim Killock

Why the IPO needs to change the criminal offence for online copyright infringement

The IPO says no change is needed to its proposed criminal offence for online copyright infringement, punishable by ten-year sentences.

The IPO has responded to your letters to the minister Jo Johnson MP about the new 10-year sentences for online copyright infringement.

In the Bill, ten-year sentences are available where online publication of a copyright work means that a "loss" has occurred (including not paying licence fees) or a "risk of loss" is created.

We do not think the IPO have adequately explained why they cannot or should not introduce a threshold for criminality.

The IPO says:

It is important to note that the criminal offences apply to making material available to others, not to those just downloading material to their computers. Anyone seeking to enforce their rights for the downloading of material would be unlikely to refer to this legislation.

Ten year sentences would only be applied in the most serious of criminal circumstances. It is highly unlikely that small, unintentional infringement would be caught by this offence. [our emphasis]

As we have said, publication without a licence is often an intentional act, where people either know or ought to know that they are infringing copyright. The question is whether these usually minor offences are worthy of criminal sanctions.

Examples could include:

  • Using copyright pictures from other websites, such as images of politicians or famous buildings, on a personal blog or social media

  • Using images of musicians or actors found on news websites, for instance from award events, on a blog or social media

  • Sharing files (which includes uploading as well as downloading) via Bittorrent at small scale

In each case, a licence has not been paid, the user should understand that they are infringing copyright, and they are creating a further risk that other people might reuse or re-share the images or files.

In the case of file sharing, it is only ever detected when files are “uploaded” (and shared back to the copyright owner or their agent).

These acts appear to be criminal under the proposed offence. We understand that prosecutions, let alone sentences, are unlikely, but the question remains as to why these minor acts should be criminalised rather than being dealt with as civil claims.

The risk of an increase of ‘trolling’ is considered to be low but the government will periodically review and respond to any concerns.

We may never hear about many threats sent privately. Gathering evidence of harm will be extremely difficult except in the most egregious examples of letters sent in their thousands.

The proposed offence creates new opportunities for trolls, while there is a simple way to remove this risk: introducing thresholds. The statement says that:

It would not be practical for the government to set a specific level of loss or gain at which infringement becomes a criminal offence. This is because the circumstances of each infringement needs to be taken into account.

Our suggestions are not for "specific" levels of loss or gain, such as “acts causing under £200 of damages”.

Our proposal is to set a threshold of "commercial scale loss", and to revise "risk of loss" to "serious risk of commercial scale loss". These thresholds are flexible rather than "specific", so the government's objection does not make sense to us.

If the losses are small, and the risks are minor, why should “circumstances” mean that an act should be criminal?

Our changes would give the public, lawyers and courts a clear indication that minor acts of file sharing or unlicensed online publication would be unlikely to meet the thresholds of "serious risk" or "commercial scale" losses.

This would protect people who receive threatening letters - whether in bulk or privately, under the radar.

It is true that some minor acts of copyright infringement can be regarded as criminal today. The current offence criminalises “prejudicial effect”, which we agree is insufficiently narrow. The IPO argues that it has tried to narrow this by focusing on the intention of the infringer.

However, the proposed changes do not solve the original problem of criminalising ordinary internet users. Introducing the fuzzier “risk of loss” actually makes it more likely that grannies and teenagers will end up facing threats of criminal charges, perhaps agreeing to admit guilt, or simply paying up when faced with threats.

This change is small and sensible, and we ask the government to look at this again.

You can email the minister Jo Johnson MP to tell him to change it before it is too late.

 
