Blog


May 01, 2017 | Jim Killock

Automated censorship is not the answer to extremism

Unbalanced Home Affairs Committee recommendations would threaten free expression

Today’s report by the Home Affairs Select Committee brands social media companies as behaving irresponsibly in failing to remove extremist material.

It takes the view that the job of removing illegal extremist videos and postings is entirely the responsibility of the companies, and does not envisage a role for courts to adjudicate what is in fact legal or not.

This is a complex issue, where the companies have to take responsibility for content on their platforms from many perspectives, including public expectation. There are legitimate concerns.

The approach the committee advocates is, however, extremely unbalanced and could provoke a regime of automated censorship that would affect legal content, including material opposing extremism.

We deal below with two of the recommendations in the report to give some indication of how problematic the report is.

Government should consult on stronger law and system of fines for companies that fail to remove illegal content

Platforms receive reports from people about content; the committee assumes this content can be regarded as illegal. Sometimes it may be obvious. However, not every video or graphic will be “obviously” illegal. Who then decides that there is a duty to remove material? Is it the complainant, the platform, or the original publisher? Or an independent third party such as a court?

The comparison with copyright is enlightening here. Copyright owners must identify material and assert their rights: even when automatic content matching is used, a human must assert the owner’s rights to take down a YouTube video. The video’s author can, of course, object. Meanwhile, this system is prone to all kinds of errors.

For all its faults, however, there is a clear line of accountability. The copyright owner is responsible for asserting a breach of copyright; the author is responsible for defending their right to publish; and both accept that a court must decide in the event of a dispute.

With child abuse material, there is a similar expectation that material is reviewed by the IWF, which makes a decision about its legality or otherwise. It is not up to the public to report directly to companies.

None of this need for accountability and process is reflected in the HASC report, which merely asserts that reports of terrorist content by non-interested persons should create a liability on the platform.

Ultimately, fines for failure to remove content as suggested by the committee could only be reasonable if the reports had been made through a robust process and it was clear that the material was in fact in breach of the law.  

Social media companies that fail to proactively search for and remove illegal material should pay towards costs of the police doing so instead

There is always a case for funding the police through general taxation. However, hypothecated resources in cases like this are liable to generate more and more calls for specific “Internet taxes” to deal with problems that can be blamed on companies, even when the companies have little to do with the activity in reality.

We should ask: is the posting of terrorist content a problem generated by the platforms, or by other wider social problems? It is not entirely obvious that this problem has in some way been produced by social media companies. It is clear that extremists use these platforms, just as they use transport, mail and phones. It appears to be the visibility of extremists’ activities that is attracting attention and blame to platforms, rather than an objective link between the aims of Twitter and Facebook and those of terrorists.

We might also ask: despite the apparent volumes of content that is posted and reposted, how much attention does it really get? This is important to know if we are trying to assess how to deal with the problem.

Proactive searching by companies is something HASC ought to be cautious about. It is inevitably error-prone, and the incentives push in only one direction: towards over-zealous matching, for fear of leaving content up. In the case of extremist content, it is perfectly reasonable to assume that content opposing extremism, while quoting or reusing propagandist material, would be identified and removed.

The incentives that HASC proposes would lead to censorship of legal material by machines. HASC’s report fails to mention or examine this, assuming instead that technology will provide the answers.
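To make the over-matching problem concrete, here is a minimal, entirely hypothetical Python sketch of fingerprint-based filtering. The data and the `scan` function are invented for illustration; real systems use perceptual hashing rather than exact hashes, but the failure mode is the same: the filter sees bytes, not context.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hash a chunk of content into a comparable fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of fingerprints of known propaganda clips.
propaganda_clip = b"<footage released by an extremist group>"
blocklist = {fingerprint(propaganda_clip)}

# A journalist's debunking video that quotes the same footage.
counter_speech = b"Here is why this claim is false: " + propaganda_clip

def scan(upload: bytes, window: int) -> bool:
    """Return True if any window-sized slice matches the blocklist."""
    return any(
        fingerprint(upload[i:i + window]) in blocklist
        for i in range(len(upload) - window + 1)
    )

# The counter-speech video is flagged, even though its purpose
# is to oppose the propaganda it quotes.
assert scan(counter_speech, len(propaganda_clip)) is True
```

A machine applying this logic at scale cannot tell removal-worthy propaganda from reporting or criticism that reuses it; only context, which the filter discards, makes that distinction.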

 

[Read more] (1 comments)


April 05, 2017 | Jim Killock

A privacy disaster waiting to happen—the #DEBill on third reading

Today the Lords have their final debate on the Digital Economy Bill. No substantial changes are planned. This means all of the very severe problems with age verification, censorship and copyright sentencing still exist. Only in Part 5, about data sharing, has the government made significant improvements, although problems remain.

Age Verification: a privacy disaster waiting to happen

Age Verification is fraught, and likely to result in a chilling effect, where adults avoid visiting websites because of fears around the age verification technology. It is unclear that it addresses a pressing social need; and while children do need support and education, a solution aimed at all adults is the wrong way to attempt it.

However, let us turn to the specifics.

Despite assurances that pornographic publishers will be obliged to use age verification tools that are privacy-friendly, the approach is almost certain to go wrong.

The government has chosen to leave the market to specify and provide the actual tools. They expect websites, rather than users, to choose which age checking product is used.

At this point we should remember that one website operator, MindGeek, controls the majority of the UK porn market. They are also keen to implement Age Verification, according to the government. The result will be that they will choose, and probably own, the dominant age verification product.

While we cannot know exactly what MindGeek would do, we should remember that they will be able to shape the AV product how they like. They could allow users to opt into lots of convenient services, such as saving their porn preferences, getting recommendations, and having their credit card details ready for quick and easy payment.

So long as these services, and the tracking of vast numbers of UK porn users, remain nominally voluntary, there is little that could be done to challenge them.

The consequences for privacy are enormous. New risks of tracking people’s sexual preferences will be created, and possibilities of data leaks will abound. It is the government’s decisions that will have created this problem, as it failed to impose sufficient safeguards upon the age verification market.

Censorship: how much blocking would you like?

Any commercial pornographic website that doesn’t offer Age Verification can be blocked under the powers in the Digital Economy Bill. This blocking is meant to be a punishment: but the result will be the censorship of legal material.

The BBFC and government have attempted to assure people privately that the numbers of blocks will be low, and based on market share.

However, the power in the Bill is not limited in this way. How much is blocked is purely a policy and financial choice. The door is open for the government to be lobbied to block vast numbers of entirely legal websites. And there are plenty of people who think this would be a wise and necessary step, including MPs.

Copyright: dangerous criminal penalties for online infringement

For whatever reason, the government resisted our suggestion to limit criminal sanctions to “criminal scale” infringements or serious risks of “criminal scale” infringement.

The result is that any intentional infringement is a criminal matter. This is very different to the offline world, where large scale organised activity is required before criminal charges can be brought.

This cannot be proportionate; and it is not sufficiently foreseeable. While minor infringements may never be brought to court, it is impossible to know when something might attract a criminal charge. For individuals, the risk of “copyright trolls” issuing threats, or of lawyers giving clients bad advice, can only increase.

Data sharing

The data sharing part of the Bill has undergone significant changes and will leave the House of Lords in an improved state. Following pressure from several civil liberties groups and the Delegated Powers and Regulatory Reform Committee, the Government tabled and passed important amendments on codes of practice and brought forward changes that narrow down definitions of public authorities.

We welcome that the Government specified the list of persons who may disclose and receive information both for public service delivery and for debt and fraud related to the public sector on the face of the Bill. The process of specifying persons entitled to participate in data sharing will be more transparent by not leaving all of these powers up to the Minister.

The Government also amended the Bill to require a specific public authority to access data only for purposes in line with its functions. The Bill ties a public authority’s functions and its objectives more closely together, creating a more transparent environment in which public authorities are prevented from accessing data for purposes outside the scope of their functions.

The Codes of Practice were made statutory by the Lords. Both Houses of Parliament must approve the Statutory Instrument before it becomes law. We repeatedly advocated for this amendment, since most of the safeguards are placed in the Codes of Practice and not on the face of the Bill. Without statutory footing, the safeguards in the Codes would not be enforceable.

However, the Government has not addressed the bulk use of civil registration data at all, and it has not changed its stance on reviews for all the powers in Part 5 on data sharing.

Chapter 2 provides for the sharing of civil registration data for any public body’s functions, without restrictions. The power is intended for bulk data sharing of the full civil register across government, but it has not been sufficiently justified by the Government.

This Chapter leaves several questions without clear answers. We don’t know how these large databases will be stored, or whether they will be encrypted at all. The Government said they have no intention of sharing the information with private companies, but they did not provide a guarantee that they won’t do so in the future. We still believe this power should be removed from the Bill.

The Bill includes provisions on amending and repealing the chapters on debt and fraud after a review. The provisions will prevent Ministers from broadening these powers or removing safeguards from the Bill.

ORG would have liked to see reviews in place for all the powers under Part 5 of the Bill to increase transparency of data sharing and prevent unjustified onward disclosure of data to other public authorities.

The Bill doesn’t clearly state that the relevant powers in Part 5 should be for the benefit of individuals and not for punitive purposes. This could leave wiggle room for future changes to the purposes of data collection.

What is ORG going to do now?

ORG will be considering our options, including Judicial Review. These are very serious matters and nobody else will be stepping up to deal with them.

Join today

If you want to help our work, please join today. By joining you will help us beef up our legal team, led by Myles Jackman. We can only win with support from people like you.



[Read more] (5 comments)


March 27, 2017 | Ed Johnson-Williams

Encryption must not be a dirty word. Here're 5 ways we all rely on it

Encryption keeps us safe. Politicians must not threaten to weaken it.

British politicians are again putting pressure on Internet companies to make sure the Government can access end-to-end encrypted messages. We thought we'd remind them why encryption keeps us safe and secure.

1. Our national infrastructure depends on encryption

Our power stations, transport systems, hospitals and military all rely on encryption to communicate securely. They need encryption so they can reliably send and receive accurate, trustworthy information. Without that, our national infrastructure would be immeasurably more susceptible to attacks from other countries, non-state hackers, and criminals.

2. Our economy depends on encryption

Our banks, stock exchanges, payment systems, and shops also need to be able to send and receive reliable information without criminals or foreign powers' intelligence agencies intercepting or tampering with the transaction. We need to be confident that when we pay for something our data is secure. Our economy relies on that confidence and that confidence is made possible by encryption.

3. Our free press depends on encryption

When sources contact journalists with sensitive information about MPs' expenses, the Panama papers or the Snowden files, they rely on encryption to make sure they can blow the whistle. Sources can use encrypted communications to pass evidence of corruption and abuse to journalists in a secure way. That helps keep us informed and our press free.

4. Our online security depends on encryption

Nearly every major website's web address starts with HTTPS – keeping the connection between your computer and that website encrypted. Encryption stops someone snooping on your web use and intercepting your usernames and passwords when you're in a coffee shop.
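As a small illustration (a sketch using Python's standard `ssl` module, not a complete client), the guarantees HTTPS rests on are visible in the defaults of a modern TLS context: the server must prove its identity before any data is exchanged, and broken legacy protocols are refused.

```python
import ssl

# The default client context encodes the guarantees HTTPS relies on.
context = ssl.create_default_context()

# The server must present a valid certificate for the right hostname,
# so a snooper on coffee-shop wifi cannot silently impersonate the site.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# The long-broken SSLv3 protocol will not be negotiated.
assert context.options & ssl.OP_NO_SSLv3

# In use (requires a network connection):
#   import socket
#   with socket.create_connection(("example.org", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="example.org") as tls:
#           ...  # bytes sent from here on are encrypted in transit
```

Weakening encryption means weakening exactly these defaults, for everyone, not just for surveillance targets.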

5. Our devices’ security depends on encryption

Once you’ve encrypted your laptop, tablet, or phone, if someone gets their hands on your locked or powered-off device, they would need your password to decrypt and access the data. This stops thieves from stealing your phone and then accessing your emails, contacts, and texts.
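The reason a password protects an encrypted device can be sketched in a few lines: the key that unlocks the data is derived from the password with a deliberately slow function, so without the password the stored data is just noise. This is an illustrative Python fragment using the standard library's PBKDF2, not any particular phone's actual scheme.

```python
import hashlib
import secrets

# A random salt, stored on the device, makes precomputed attacks useless.
salt = secrets.token_bytes(16)

def derive_key(password: str) -> bytes:
    # A slow key-derivation function makes each password guess expensive.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

key = derive_key("correct horse battery staple")

# The right password reproduces the decryption key; a thief's guess does not.
assert derive_key("correct horse battery staple") == key
assert derive_key("123456") != key
```

Without the key there is nothing to steal but ciphertext, which is why device encryption protects your emails, contacts, and texts even when the hardware is in someone else's hands.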

Join ORG

Help us to challenge politicians’ dangerous and misleading comments about encryption. Join ORG today!

[Read more] (1 comments)


March 27, 2017 | Jim Killock

Amber Rudd already has sweeping powers to attack encryption

Amber Rudd has engaged in another attack on people’s security by suggesting that companies must be able to ‘remove’ encryption.

The striking thing is that if she was genuinely serious about her suggestion, she would not be making public demands; she would be signing legal orders to force companies to change their products. She would not be telling us about this.

Last year, the UK Government passed the Investigatory Powers Act, which gives British law enforcement and intelligence agencies vast surveillance powers.

These powers already purport to grant the minister the ability to issue a “Technical Capability Notice” with which Amber Rudd could instruct WhatsApp to re-engineer their product to be surveillance-friendly.

The TCN could, for instance, instruct WhatsApp to enable an invisible “third recipient” in the case of targeted individuals. Thus, even without asking providers to remove or weaken encryption, the UK believes it has found a way to legally compel companies to provide information from supposedly secure products.
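What a “ghost recipient” means in practice can be sketched with a toy model. In many end-to-end designs, the sender encrypts a fresh message key once per recipient; an order of the kind described could compel the provider to quietly add one more entry. The XOR “wrapping” below stands in for real public-key encryption and is not actual cryptography; all names and keys here are invented for illustration.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    """Toy stand-in for encrypting one key under another."""
    return bytes(x ^ y for x, y in zip(a, b))

# Each recipient shares a key with the sender; the per-message key
# is "wrapped" once per recipient in the message envelope.
recipient_keys = {"alice": secrets.token_bytes(32), "bob": secrets.token_bytes(32)}
message_key = secrets.token_bytes(32)

envelope = {name: xor(message_key, k) for name, k in recipient_keys.items()}

# A compelled provider could add a "ghost" entry, invisible to the
# real participants, wrapped under a key the agency holds:
ghost_key = secrets.token_bytes(32)
envelope["ghost"] = xor(message_key, ghost_key)

# The ghost can now recover the message key like any legitimate recipient.
assert xor(envelope["ghost"], ghost_key) == message_key
```

Nothing about the message encryption itself is "removed" or weakened; the compromise lives in the key distribution, which is why such an order is so hard for users to detect.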

There are enormous problems with TCNs. They can be “appealed” to a technical committee but it is unclear how well the process will ever deal with wider security concerns, or risks to the companies or their users. The process seems focused on ‘feasibility’ rather than whether introducing weaknesses is a good idea.

Fundamentally, anything which enables GCHQ to listen in could be available to someone else, whether another government, or perhaps a criminal who learns how to abuse the weakness.

These notices are not subject to any public guidance about their use. Interception of communications, equipment interference (hacking), bulk communications data acquisition (mass surveillance), bulk personal datasets (everything government knows about you) and National Security Notices (orders to act) all have public codes of practice, which the Home Office claims to be “consulting on”. There is no equivalent obligation for a Code of Practice on TCNs, which might give some insight into how these issues would be balanced.

The codes that have been published for consultation contain 415 pages of dense detail and a mere 15 paragraphs of explanatory information, while the public, lawyers and businesses have been given just six weeks to work out what they mean.

As you can imagine, the powers outlined in the codes for interception of communications, equipment interference and bulk communications data acquisition will grant Ms Rudd many avenues to surveil the likes of Adrian Elms.

We should use Amber Rudd’s cheap rhetoric as a launch pad to ask ourselves why she has such sweeping powers, and what the constraints really amount to.

Join ORG

Help us to challenge politicians’ dangerous and misleading comments about encryption. Join ORG today!

[Read more] (2 comments)


March 22, 2017 | Javier Ruiz

MEPs start pushing back on online copyright censorship

A report for a committee of the European Parliament has pushed back against proposed legislation that would force the filtering of user-uploaded online content and restrict the use of press content online.

The EU is introducing some major changes to copyright legislation under a programme to improve the European Digital Single Market. This reform has reached an important milestone with the publication of a long-awaited report on a proposed new Directive. These changes will affect the UK after it leaves the EU, although the exact impact will depend on the level of access to the single market achieved after the negotiations between the UK government and the EU.

The proposed Directive on Copyright in the Digital Single Market contains measures that could have a negative impact on citizens. They include restricting the reuse of press content online and forcing platforms to censor user uploaded content. These proposed measures seem designed to protect incumbent European media conglomerates from Silicon Valley but will do little to promote a vibrant European digital sector.

The final report on the Directive for the Legal Affairs (JURI) Committee by Rapporteur MEP Therese Comodini Cachia (EPP, Malta) will provide the main point of scrutiny by the European Parliament, setting the lines for the final plenary vote. The draft report will now be open for amendments by MEPs from the committee until 30 March.

MEP Comodini Cachia should be commended for taking input from a variety of organisations, including ORG through C4C, and not only the industry lobbies that hold such disproportionate sway over policymaking around intellectual property.

Possibly due to these diverse perspectives, her report marks a shift from the latest stages of the debate towards a more balanced approach that considers both the interests of copyright owners and the rights of users. We still think that Ms Comodini could have gone further in certain areas.

Removal of the proposed press publishers’ right

The report proposes to delete the Commission’s proposal for a brand new copyright-like right for press publishers, which was introduced to widespread dismay from copyright experts, who have made clear that the new right is unnecessary and would create a whole array of new problems.

The report rejects the argument that the Internet is intrinsically damaging to the press, with news portals and aggregators actually allowing readers to find more sources of news. This is evidenced by the experience of Spain, where a compulsory licence on news aggregators led Google to withdraw its news service from that country, as it made no profit from it. This led to a drop in traffic to small publications.

There is broad agreement though that news publishers face serious challenges from online platforms. Some publishers have demanded a new right because they claim that they simply cannot enforce the existing copyright in their publications when news articles are copied by third parties, due to complex contractual relations with the authors of the pieces.

Ms Comodini proposes an alternative to a new right by giving publishers more powers to enforce existing copyright, which should provide enough protection. Her Amendment 52 gives press publishers legal standing to represent the authors of the works contained in their publications, being able to sue in their own name to defend the rights of such authors for the digital use of their press publications.

This is a very sensible proposal that should be supported by MEPs.

Improvements to filtering of user content by online platforms

The proposals would amend Article 13 of the draft Directive, which sets out obligations for online platforms to monitor and filter user content. These obligations would now be restricted to ensuring that agreements are concluded with rightsholders for the use of their works, removing references to “prevent the availability on their services of works or other subject-matter identified by rightsholders”.

The draft Directive makes platforms primarily responsible for preemptively checking that user uploaded content does not breach copyright. The report changes this, with rightsholders being responsible for identifying any misuse of their works, rather than prescribing “content recognition technologies” to be applied by online platforms. These changes go a long way to soften some of the worst aspects of the Draft DSM Directive by removing automated censorship.

The report also makes clear that copyright exceptions must be respected in any measures implemented and users shall have access to a court to “assert their right of use” (Am 60). This is very important when it comes to parody or criticism.

There are, however, some potential problems. We share concerns raised by the Copyright 4 Creativity (C4C) group that certain amendments could drag a broader array of online providers under these measures.

The original proposals were meant to cover services that both stored user uploaded files and made these available online, squarely aimed at YouTube. Removing the references to “storage” when defining the type of online services that would be in scope for these measures could widen the scope to many more online services.

Despite the improvements introduced by Ms Comodini, we recommend that MEPs vote for amendments that simply delete these provisions rather than trying to minimise them, as we cannot predict how they will be used, or misused, in the future.

A handful of US digital companies - Google, Facebook, etc. - have cornered the market and are at war with traditional media from news to music. Internet users are caught in the middle of this war when they upload and share the things they like.

Proposing that intellectual property agreements must be in place for all platforms actively making available user-uploaded content will create massive barriers to entry for new projects and damage the interactive nature that has characterised the internet in recent times. The explosion of free expression by ordinary citizens is something that should be celebrated and protected by governments, not traded in backroom deals between giant industry groups.

[Read more] (3 comments)


March 17, 2017 | Slavka Bielikova

Government half-turn on data sharing

Last month, the Government responded to the report by the Delegated Powers and Regulatory Reform Committee on the Digital Economy Bill. The response shows the Government has heard some of the Committee’s criticism of the data sharing provisions and intends to amend the Bill to reflect its recommendations. Unfortunately, the Government’s intentions will not fix everything.

ORG previously called for the powers given to Ministers to be limited and for constraints to be put on unlimited bulk sharing of civil registration data. We advocated for codes of practice to be made statutory and legally enforceable, and for privacy safeguards to be boosted and put on the face of the Bill.

The Government’s response touched upon several of our concerns.

The good:

Narrow definitions of specified persons, and who gets access to data on fuel poverty

We particularly welcome the Government’s commitment to specify, on the face of the Bill, the list of persons who may disclose and receive information for public service delivery and for debt and fraud related to the public sector. This will take the power away from Ministers, and the process of specifying persons entitled to participate in data sharing will be more transparent.

The Government further said they will amend the Bill to narrow down the list of fuel poverty schemes and persons that have access to the data. They plan to introduce a similar amendment to address water poverty schemes.

Close ties between functions and objectives

The response also addressed the specifying of data sharing objectives for public service delivery. The Government pledged to introduce an amendment that will require a specific public authority to share data only for purposes in line with its functions.

Statutory codes of practice

The Committee’s report put a lot of pressure on making codes of practice statutory and it appears that the Government found their criticism justified. They agreed to subject the codes of practice to an affirmative procedure. This means both Houses must approve the Statutory Instrument before it becomes law. It is particularly necessary to give these statutory force as the safeguards are contained in the codes and not on the face of the Bill.

Henry VIII powers removed

After the Committee criticised the use of the so-called Henry VIII powers, the Government decided to drop them completely from the Bill. These powers would have allowed the Government to amend or repeal the Bill after it had become an Act of Parliament, meaning that Ministers could have made changes to Chapter 1 on Public Service Delivery by subordinate legislation with or without further parliamentary scrutiny.

The poor wording of the Henry VIII clause in the Bill could have easily resulted in expanding the data sharing powers without any accountability after the Bill passes in both Houses of Parliament.

All of these changes are welcome and needed, but they don’t plug all the holes in Part 5 on data sharing.

The bad:

Review only for fraud and debt powers

The Bill includes provisions on amending and repealing the chapters on debt and fraud after a review. Following the report, the Government decided to narrow the power to amend. It will prevent Ministers from broadening the debt and fraud powers or removing any safeguards from the Bill.

This is a sensible requirement that should be applied to all the powers in Part 5, not just debt and fraud powers. Applying reviews to all powers and limiting powers to amend and repeal will increase transparency of data sharing and prevent unjustified onward disclosure of data to other public authorities.

No exclusion of punitive objectives

The new Government amendment tying together objectives for data sharing with functions of public authorities still raises issues. The Bill only states that the objective of data sharing should be for the benefit of an individual or a household regarding public service delivery. Other powers in Part 5 (disclosure for debt and civil registration reasons and sharing with electricity and gas suppliers) don’t have the requirement to be of benefit.

The functions of a public authority could include enforcement; if they do, the objectives for data sharing could be of a punitive nature. The Government should clearly state that the relevant powers in Part 5 are only to benefit individuals, not to punish them.

Dehybridisation clauses remain

The Bill includes provisions stating that provisions on sharing data for the purposes of public service delivery should not be regarded as hybrid instruments. A Hybrid Instrument is a piece of legislation that disproportionately affects a particular group of people within a class.

Part 5 of the Bill could have disproportionate effects, for example, on people in fuel poverty. Hybrid instrument procedure would give them an opportunity to present their arguments against the Bill to the House of Lords Hybrid Instruments Committee and then, possibly, to a select committee deciding whether or not the legislation should be approved by both Houses.

The Committee’s report highlighted this issue and suggested that further safeguards be put on the face of the Bill if this provision is to remain. The Government’s response does not address the clauses on hybrid instruments.

Bulk sharing of civil registration data untouched

Neither the Committee nor the Government made any comment on the unconstrained sharing of bulk civil registration data. Chapter 2 provides for the sharing of civil registration data for any public body’s functions, without restrictions. The power is intended for bulk data sharing of the full civil register across government, but it has not been sufficiently justified by the Government.

Ministers have presented this chapter as a way of improving electronic government transactions by avoiding the need for paper certificates to be circulated, but it appears to be more about convenience for administrators instead of a clear social purpose.

The Bill doesn’t require consent from the data subject to share their civil registration data. It is also unclear how these large databases will be stored and if at all encrypted. The Government said they have no intention to share the information with private companies but they did not provide a guarantee that they won’t do so in the future.

The power shouldn’t be part of the Bill without appropriate guarantees and safeguards.

 

[Read more]


March 08, 2017 | Ed Johnson-Williams

CIA and GCHQ hacking – they must clear up their own mess

US intelligence agencies are working with the UK to stockpile vulnerabilities that they can use to hack Windows and Mac computers, iOS and Android smartphones, and smart TVs. The UK Government has serious questions to answer.

The agencies will use these vulnerabilities for targeted surveillance. However, vulnerabilities can also be discovered and exploited by criminals and other countries’ intelligence agencies. GCHQ's decision to keep their exploits secret could have devastating effects for society at large.

It is likely that the CIA and GCHQ are not the only organisations with knowledge of these vulnerabilities with the capability to exploit them. The agencies have, possibly through their own mistakes, increased the risks vastly by failing to ensure that the vulnerabilities are either reported or kept to themselves.

Many of the vulnerabilities disclosed in the CIA's files came from UK intelligence agencies including GCHQ. The UK Government has some serious questions to answer. These include:

  1. How does the Government ensure that GCHQ’s process for deciding whether to exploit or report a vulnerability is adequate? Are they creating unnecessary risks for organisations and individuals?

  2. How do oversight bodies check that GCHQ’s policies for assessing the risk of keeping an active vulnerability secret are sufficiently robust?

  3. Did any hacking operations reduce the security and privacy of an individual/organisation with respect to other actors?

  4. Is the authorisation process sufficient to avoid future problems?

  5. How will the UK government and agencies work to clean up the mess created by their decision not to report these vulnerabilities to the vendors?

While targeted surveillance is a legitimate aim, we need to know that government regulation of this area is sufficient. Governments should be regulating the way their intelligence agencies hoard and use vulnerabilities that affect devices owned by millions of ordinary people. From what we learnt during the passage of the Investigatory Powers Act, it appears that the ‘creation’ of techniques is not really regulated at all.

Whatever benefits there may have been to GCHQ and the US agencies in stockpiling these vulnerabilities to use for "good", the race is now on to repair them as fast as possible. To assist in this effort, the NSA and GCHQ must disclose what they know about how these vulnerabilities can be exploited and repaired. The agencies must now work with the manufacturers of internet-connected devices, such as phones, laptops, TVs and routers, and potentially also fridges, toasters and home automation systems, to repair the vulnerabilities.

Even if intelligence agencies report the vulnerabilities to device manufacturers, this does not mean that the devices will immediately be secure. The devices need to be updated to fix vulnerabilities. Manufacturers of Internet-connected devices have an ongoing responsibility to prioritise security, to actively test the security of the devices they sell, and to push out security updates to fix known vulnerabilities, which does not always happen.

At the moment, we have a secretive and unaccountable system of device hacking, badly in need of oversight. We should remember that our worry is only partly the agencies themselves. It is the results of their actions, especially where they enable criminality, that we most need to worry about.



March 07, 2017 | Ed Johnson-Williams

Yes, the CIA can hack phones but Signal and WhatsApp are still safe for nearly everyone

The CIA can hack phones but Signal and WhatsApp remain very good ways to communicate when using a mobile phone for nearly everyone. The worst thing to do would be to throw our hands up in the air and give up on our digital security.

Wikileaks have published documents claiming that the CIA can use some vulnerabilities in the iOS and Android operating systems to hack mobile phones and then monitor anything that happens on those phones. The vulnerabilities are expensive to buy or discover. In order to keep their existence secret for as long as possible, they are likely to have been used on a targeted basis.

Some journalists – working for newspapers including the New York Times, the Independent and the Telegraph – have reported this story as showing that the CIA can bypass the encryption in messaging apps like Signal and WhatsApp, probably uncritically repeating a Wikileaks tweet to that effect. This is emphatically not accurate: the apps themselves remain secure.

If the CIA (or the NSA, MI5 or GCHQ for that matter) hacks your phone then they will be able to read messages on any messaging app, regardless of how good the app's encryption is.

There is a big difference between phone operating systems being hacked and message encryption being broken. If a messaging app’s encryption has been broken, that would affect every user of the app. The encryption in Signal and WhatsApp has not been broken.
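The distinction can be made concrete with a toy Python sketch. This is not real cryptography and not how Signal or WhatsApp work internally (they use vetted protocols such as the Double Ratchet); the names and the one-time-pad cipher here are illustrative assumptions, chosen only to show where the plaintext lives when a device, rather than the encryption, is compromised:

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy one-time pad: XOR each byte with a random key of equal length.
    # Illustrative only; real messaging apps use vetted protocols.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"

# --- On the sender's phone ---
key = secrets.token_bytes(len(message))  # stand-in for a shared secret
ciphertext = encrypt(key, message)

# An eavesdropper on the network sees only the ciphertext. Breaking the
# *encryption* would mean recovering `message` from `ciphertext` alone,
# which is what has NOT happened to Signal or WhatsApp.

# Malware on the phone itself, however, runs before any encryption: it
# can simply read the message from memory or log the keystrokes.
stolen_by_phone_malware = message  # no cryptography is defeated here

# --- On the recipient's phone ---
received = decrypt(key, ciphertext)
```

In other words, hacking the endpoint sidesteps the encryption rather than breaking it, which is why the vulnerability affects only the phones that are actually attacked, not every user of the app.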

If the CIA is so interested in you personally that they would hack your phone, then yes, you are vulnerable to attack. This is not new. Most of us, however, are not national security journalists reporting on sensitive state secrets, so the CIA hacking our phones is very unlikely. We can and should still use encrypted messaging apps to help keep our messages private and secure from people who a) aren’t as powerful and well-resourced as the CIA and b) are far more likely to try to read our messages.

Signal and WhatsApp remain very good ways to communicate when using a mobile phone for nearly everyone.[1] The worst thing to do would be to throw our hands up in the air and give up on our digital security.

If someone tunnels into a bank vault, the unbreakable lock on the vault's door doesn't help very much. But we know it's still a good idea to put a really good lock on your bank vault to deal with other break-in attempts.

There are issues with the way the CIA and other intelligence agencies hoard and use device vulnerabilities without reporting them to the manufacturers of the devices. If vulnerabilities in the devices remain unfixed it means that people’s devices are also open to attack from criminals and from other countries’ intelligence agencies.

From a personal security point-of-view it’s important to keep all of this in perspective. Most people are at far greater risk of their devices being infected from clicking a link in a phishing email than they are of being hacked by the CIA using a vulnerability in their device.

[1] Issues about WhatsApp's data collection and contribution to Facebook's business model are covered here.

Update 8/3/2017, 10:30am - Added links to further incorrect media coverage that questioned the security of Signal and WhatsApp.
