Response to the Joint Committee on Human Rights inquiry into Freedom of Expression

This inquiry focuses on freedom of expression as an essential foundation of democratic society, guaranteed by the common law and by Article 10 of the European Convention on Human Rights. See also the oral testimony provided by Jim Killock.

Written evidence from Open Rights Group (FOE0214)

1 Open Rights Group are a digital rights campaigning organisation. We seek to help build a society where rights to privacy and freedom of speech online are respected, protected and fulfilled. 

2 Is greater clarity required to ensure the law is understood and fair?

2.1 We feel there are three areas where the law is unclear.

2.2 Online speech regulation

2.2.1 The Government is advancing legislative proposals on Online Harms, and intends to place new obligations on online service providers to restrict potentially unlawful or harmful content. The Government wants platforms to make their Terms and Conditions, which establish the boundaries of acceptable speech, more clearly enforceable. The framework attempts to safeguard free expression by imposing a duty to protect controversial opinions. Supervision of these duties (the “duty of care”) is placed with a state regulator, Ofcom.

2.2.2 The Government says that Ofcom will not be responsible for reviewing individual takedowns, but this position is not sustainable. For instance, the removal of Donald Trump’s account can be viewed either as evidence of inaction until the harm was done, or as evidence that controversial views are not being protected. Policies and boundaries set by companies will be supervised by a state body, and the speech of millions of UK citizens will be policed by guidelines negotiated between private companies and a state regulator.

2.2.3 This would not be accepted if it were the press being regulated; newspapers have rightly insisted on being exempted. However, when newspaper content appears on regulated services, it will be hard for that content not to be regulated. Over time, the difference between platforms and the press diminishes.

2.2.4 In contrast, the European proposals in the Digital Services Act avoid direct state regulation, opting instead for systems to ensure that risks are audited, and for practical protections around content takedown: notification systems, reviews, and penalties for actors who abuse takedown mechanisms.

2.3 Government takedowns and removals

2.3.1 Government agencies take down a surprising amount of content through voluntary arrangements. Police and other agencies rely on their own judgment that content is unlawful, and ask for material to be removed or for domains to be suspended.

2.3.2 The Counter-Terrorism Internet Referral Unit (CTIRU) of the Metropolitan Police had secured the removal of 310,000 items by 2019.[1] There are few safeguards and little supervision. Platforms provide limited remedies for content removals, there is no system of independent appeal, and platforms are not required to inform users that takedown requests came from the police.

2.3.3 We have little evidence of the quality of these requests, but a few have entered the public domain, as WordPress.com’s parent company Automattic was submitting unredacted takedown notices to the online transparency database known as Lumen.[2] In one case, Automattic was asked by CTIRU to remove a website which trolled UKIP leaders as racist and homophobic. The site is in very poor taste, but it is not in breach of terror legislation.[3]

2.3.4 Recently, Google was issued a notice asking it to remove part of its transparency report detailing a takedown request from CTIRU. It is unclear why CTIRU considered Google’s transparency report to be in breach of terror legislation.

2.3.5 Independent oversight of these requests is necessary, and independent appeals processes need to be in place. Without these, agencies are granted wide powers to remove content informally, and both users and platforms may be disinclined to question these ‘requests’.

2.3.6 A number of police and law enforcement agencies issue “Domain Suspension Requests” to the .uk registry Nominet. A domain suspension makes it impossible to reach a particular domain, such as fakegoods.co.uk, although a different domain can of course be used in its place, such as fake-goods.co.uk. As many as 36,000 domain suspensions have been made in a year, mostly at the request of the City of London Police in relation to trademark infringement, but these also include a small number of suspensions for speech offences. In total, 16,000 suspensions were made last year. Nominet have stated that domain owners wishing to appeal are referred to the agency concerned for internal review, but there is no independent review procedure.
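For illustration, suspension operates at the level of the domain name system: the registry withdraws the domain so that it no longer resolves (or, since the splash-page change described in the next paragraph, points it at a landing page). The following is a minimal sketch, using only the Python standard library, of how one might check from the outside whether a domain still resolves; the two domains are the illustrative examples from the paragraph above, not real cases.

    # Minimal sketch: check whether a domain still resolves in DNS.
    # A domain withdrawn from the .uk zone fails to resolve, while a
    # near-identical replacement registered in its place resolves as normal.
    import socket

    def resolves(domain: str) -> bool:
        """Return True if the domain currently resolves to an address."""
        try:
            socket.gethostbyname(domain)
            return True
        except socket.gaierror:
            return False

    # Illustrative domains from the text above:
    for domain in ("fakegoods.co.uk", "fake-goods.co.uk"):
        print(domain, "resolves" if resolves(domain) else "does not resolve")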

2.3.7 Nominet agreed with our 2019 recommendation[4] to introduce splash pages for suspended domains, after a public consultation.[5] These provide transparency, but do not explain how mistakes can be challenged.

2.3.8 In both of these cases, Nominet domain suspensions and CTIRU takedowns, Government agencies have mechanisms which avoid the application of law through the courts, relying instead on informal processes that lack due legal process, independent oversight, and independent review. The motivations, such as the scale of the problem and the lack of practical legal mechanisms, are understandable, but the Committee should be concerned to ensure that independent oversight, prior scrutiny and appeals are applied to safeguard online free expression.

2.4 Online speech offences

2.4.1 We are particularly concerned about broad laws that are unspecific about the harm they target, or that create inconsistency between online and offline speech. Section 127 of the Communications Act criminalises speech which is “grossly offensive” when sent over a public communications network.[6] We do not believe that mere “offensiveness” can be a test for the criminalisation of speech: there are clearly truths and opinions that some will hold to be (grossly) offensive. Criminalisation must instead be tied to a specific and criminal level of harm.

2.4.2 Because the Communications Act applies only to speech on public networks, speech which is merely offensive, and which is legal when spoken in public or printed and distributed, can be unlawful online.

2.4.3 These laws are being reviewed by the Law Commission,[7] which proposes that they be replaced by a test for causing “mental distress”. Such a test could capture speech which is bound to offend but is legitimate; this is compounded by basing the offence on a “risk” or “likelihood” of harm to a “likely audience”, making the proposal very broad and open to wide interpretation. The Commission proposes a defence of “reasonable excuse”, but this reverses the proper burden for speech restrictions: speech should be regarded as legitimate unless there is a very clear reason why it must be restricted.

2.4.4 We recommend instead that the offences in the Public Order Act are made consistent online and offline. This would make it clear that online speech acts causing alarm or distress, threatening behaviour, and harassment are criminal. This has the added benefit that public order offences are already well understood and established.[8]

3 Does everyone have equal protection of their right to freedom of expression?

3.1 No. Speech online is controlled by platforms, which restrict speech within the boundaries of what is legal, and which also tend to make mistakes both in where their boundaries are set and in judging what material does or does not cross those lines.

3.2 Social media platforms lack diversity and choice. There are a handful of dominant platforms; the investment of time and effort in building a presence cannot be transferred; alternatives for content delivery cannot be accessed. “Social media diversity” should be as big a question as “media diversity”.

3.3 The lack of social media diversity contributes to the “attention” market overriding user concerns. Businesses do not need to worry about users going elsewhere; instead, they need to compel users to engage with them. This is a poor dynamic for speech, as “interesting” content is usually also shallow, and tends to bear a loose relationship with the truth: it is easier to exaggerate or lie if attracting interest is your goal. These dynamics are well known from the tabloid newspaper market.

3.4 Other mass communications platforms are open to competition and switching. With mobile phone or email services, multiple providers allow users to change who delivers their service, and users are not prevented from interacting with users of other phone networks or email providers. These services are “interoperable”.

3.5 Competition policy is beginning to ask how this might be delivered.[9] The challenge is to embed interoperability requirements so that, for example, Facebook and Twitter users and their messages become shareable across services, as they are with email or phones. With interoperability in place, platforms could compete on user experience, methods of prioritising content, and moderation policies, to the benefit of free expression.
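By way of illustration, email interoperability rests on open standards: the recipient’s address, not the sender’s provider, determines where a message is delivered, because any sender can look up the recipient’s mail servers in the public DNS. The sketch below shows this lookup; it assumes the third-party Python package dnspython, and the addresses are hypothetical examples.

    # Minimal sketch of email interoperability: any provider can discover
    # where to deliver mail for any address via a public DNS MX lookup.
    import dns.resolver  # third-party package: pip install dnspython

    def mail_servers(address: str) -> list:
        """Return the public mail servers (MX records) for an email
        address, in order of delivery preference."""
        domain = address.rsplit("@", 1)[1]
        answers = dns.resolver.resolve(domain, "MX")
        return [str(r.exchange).rstrip(".")
                for r in sorted(answers, key=lambda r: r.preference)]

    # Hypothetical addresses; any sender, on any provider, can do this:
    print(mail_servers("someone@gmail.com"))
    print(mail_servers("someone@outlook.com"))

No equivalent public mechanism exists for delivering a message to a Facebook or Twitter user from outside those platforms; that is the gap which interoperability requirements would close.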

3.6 Current UK legislation lacks notice and takedown procedures, except in relation to defamation. (In copyright, the US DMCA system is applied as a de facto standard.) Notice and takedown systems typically allow someone to complain about a problem, and for content to be removed unless the original author states that they are willing to defend their publication in court. This allows copyright or defamation issues to be dealt with fairly. Such systems need safeguards against abusive and spurious notifications.
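To illustrate the shape of such a system, the sketch below models the flow described above: a notice takes content down, and a counter-notice from the author restores it, moving the dispute to the courts. It is an illustrative sketch only; the names and structure are hypothetical, not any statutory procedure.

    # Illustrative notice-and-takedown flow (hypothetical, not statutory).
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        visible: bool = True
        notices: list = field(default_factory=list)

    def receive_notice(post: Post, complainant: str, grounds: str) -> None:
        """A complaint takes the content down pending any counter-notice.
        A real system would also need penalties for spurious notices."""
        post.notices.append((complainant, grounds))
        post.visible = False

    def receive_counter_notice(post: Post) -> None:
        """The author states they will defend publication in court;
        the content is restored and the dispute leaves the platform."""
        post.visible = True

    post = Post(author="alice")
    receive_notice(post, complainant="bob", grounds="alleged defamation")
    assert post.visible is False
    receive_counter_notice(post)
    assert post.visible is True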

3.7 Companies need to provide independent review procedures for content they believe breaches their terms and conditions. In this regard, the Facebook Oversight Board is a good step, although we are unsure whether it will have sufficient reach. It also currently has no power to review account suspensions without a request from Facebook itself, as in the case of Donald Trump.[10]

3.8 Such issues have barely been touched on in the UK Online Harms process, but they are central to safeguarding free expression online.

31/01/2021


[1] https://wiki.openrightsgroup.org/wiki/Counter-Terrorism_Internet_Referral_Unit

[2] https://wiki.openrightsgroup.org/wiki/Counter-Terrorism_Internet_Referral_Unit/Lumen_reports

[3] https://transparencyreport.google.com/government-removals/overview?authority_search=country:GB&lu=request_country&request_country=period:;authority:GB;p:2

[4] See https://www.openrightsgroup.org/blog/informal-censorship-nominet-domain-suspensions/ and https://www.openrightsgroup.org/publications/uk-internet-regulation/

[5] https://www.nominet.uk/nominet-and-pipcu-tackle-cyber-crime-with-launch-of-new-landing-pages-for-suspended-criminal-domains/

[6] https://wiki.openrightsgroup.org/wiki/Section_127 and https://www.legislation.gov.uk/ukpga/2003/21/section/127

[7] https://www.lawcom.gov.uk/project/reform-of-the-communications-offences/

[8] https://www.openrightsgroup.org/publications/open-rights-group-response-to-the-law-commission-reform-of-the-communications-offences/

[9] https://openforumeurope.org/publications/ofa-research-paper-the-technical-components-of-interoperability-as-a-tool-for-competition-regulation/

[10] https://www.lawfareblog.com/facebook-oversight-board-should-review-trumps-suspension; see also https://oversightboard.com/news/236821561313092-oversight-board-accepts-case-on-former-us-president-trump-s-indefinite-suspension-from-facebook-and-instagram/