
Censorship

There are three types of censorship of the Internet in the UK. The first is the blocking of child abuse images, which has been managed by the Internet Watch Foundation since 1996 and whose recommendations are implemented by BT in the form of Cleanfeed. In 2013, the leading search engines agreed to filter out search results leading to such material. The second is court-ordered blocking of sites accused of facilitating copyright infringement, such as Newzbin2 and The Pirate Bay. The third is the wholesale blocking of "objectionable material" on the basis that children need to be protected from accidentally stumbling upon it. Mobile operators have long operated such blocks by default, and in December 2013, under government pressure, BT turned on a system of "family-friendly" filters, bought in from the US company Nominum, that subscribers must actively opt out of. Besides sex-related material (including sex education and gay and lesbian sites), the categories include violent material, "extremist related content", "anorexia and eating disorder websites", "suicide related websites", "alcohol", and "smoking", as well as "web forums", "esoteric material", and "web blocking circumvention tools". Parents are not given tools to test what will be blocked in practice when signing up. The main public wifi providers have also committed to applying "family-friendly" filters wherever children are likely to be present. The upshot is that the UK has one of the most comprehensive Internet censorship systems in the world.

Background

There are many technical reasons why blocking is ineffective. While it prevents accidental discovery by the majority and allows authorities to claim they have "done something", blocking gives a false sense of security. It is trivially easy to bypass for the portion of the population – both site owners and users – who are determined, motivated, and technically adept. In addition, blocking is a crude instrument, carrying with it the significant risks of over-blocking, of insufficient redress, of damage to innovation, and of driving the widespread adoption of avoidance measures such as encryption and anonymising technologies. Going a step further to ban or break these technologies would have serious repercussions for the privacy and security of consumers online. In areas such as hate speech and adult material, blocking glamorises the target, giving it the excitement of "forbidden fruit" and its purveyors the aura of martyrs. Some blocking options may even be actively harmful: O2's mobile blocking, for example, enables an abusive adult to block access to Childline and the Samaritans.

Politicians and mass media tend to conflate these three types of censorship. The Open Rights Group, however, draws a strong distinction between blocking material that is illegal and that which is not. Child abuse images are illegal in almost every country in the world, and there is general agreement on the appropriateness of blocking and removing them from the Internet. The other categories listed above, however, are much less easy to agree on or define. Search engines and ISPs are not copyright experts, nor are they qualified to pass judgment on what constitutes hate speech or pornography; the upshot will be over-blocking of legal material out of an excess of caution and a desire to save on costs. The better answer to online copyright infringement, for example, is fast, reliable, high-quality, reasonably priced legal services, which the entertainment industry has been slow to develop.

 

ORG's view

The Open Rights Group believes it is right to give parents the tools to manage Internet access at home according to their own values. But we believe these tools should provide transparency and choice about what is being blocked, and that social problems should be solved by appropriately drafted and publicly debated social policy. Specifically, ORG advocates: transparency to end users about what is blocked (for example, by using Error 451); transparency to site owners about whether their sites are blocked, by whom, and how to pursue redress; and defaults that make filtering opt-in rather than opt-out.
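The "Error 451" mechanism mentioned above is now standardised as HTTP status code 451 ("Unavailable For Legal Reasons", RFC 7725), which also defines a "blocked-by" link relation so the blocking party can identify itself. As a minimal sketch of what a transparent block response could look like, here is an illustrative Python function that assembles such a response; the URLs and wording are hypothetical examples, not any ISP's actual block page:

```python
# Sketch of an HTTP 451 "Unavailable For Legal Reasons" response (RFC 7725).
# All host names, URLs, and message text below are illustrative only.

def build_451_response(blocked_by_url: str, explanation: str) -> str:
    """Assemble a raw HTTP/1.1 451 response as a string."""
    body = explanation
    headers = [
        "HTTP/1.1 451 Unavailable For Legal Reasons",
        # RFC 7725 defines the "blocked-by" link relation, identifying
        # the entity implementing the block (e.g. the ISP or filter operator).
        f'Link: <{blocked_by_url}>; rel="blocked-by"',
        "Content-Type: text/plain; charset=utf-8",
        f"Content-Length: {len(body.encode('utf-8'))}",
    ]
    return "\r\n".join(headers) + "\r\n\r\n" + body

response = build_451_response(
    "https://isp.example/blocking-policy",  # hypothetical ISP policy page
    "Access to this site is blocked by your ISP's family-friendly filter.",
)
print(response.splitlines()[0])
# prints: HTTP/1.1 451 Unavailable For Legal Reasons
```

A response like this gives end users the transparency ORG calls for: the status code states plainly that the block is deliberate and legally or policy driven, and the "blocked-by" link tells the user who imposed it and where to seek redress.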

If used at all, blocking should be necessary, proportionate, and the best way of achieving the stated goals, based on independent evidence assessed by a court. It should respect fundamental human rights such as freedom of expression. It should be transparent. And it should be implemented through a fair and clear legal process. Censorship should not be implemented through "streamlined" quasi-judicial arrangements with ISPs. So far, none of the government's initiatives passes these tests.

What you can do

 

all mobile network operators currently do)