ORG’s work on free speech has become critical over the last year, as Internet censorship has become a fashionable and simplistic response from lazy policy makers. In some cases, it seems politicians are prone to thinking that they need an “Internet dimension” to their field, and the easiest response they can think of is to advocate censorship. Such thinking is not merely lazy: it is dangerous and counter-productive.
There are very few cases where censorship is likely to produce the desired results. The reasons are often very simple. Only in the most morally clear and outrageous circumstances is censorship likely to be acceptable to the majority of people.
The first, basic characteristic of censorship is that while it may work for the majority of the population, who have no particular desire to access the material, anyone determined to get at whatever is banned can do so. Thus censorship may punish the innocent majority, but will not restrict the determined, sympathetic or criminal. This is particularly true of network censorship, as the Internet is inherently designed to let content travel as freely as possible, and to create many, many ways to communicate.
The second characteristic is that the target of censorship will appear both victimised and more important than they really are. Censorship is the tool of the state, which wields a great deal of power. Anyone censored is easily seen as a David in a struggle against a Goliath, and is by implication creating a threat worth restricting. Both perceptions give impetus to the person or group suffering censorship.
Censorship often also allows governments to pretend that the social problem has been dealt with. Sometimes, the people using or generating the censored content continue unabated, while the pressure to deal with serious crime is reduced, as it “appears” to have been made inaccessible.
While censorship is generally poor public policy, when it is proposed, there are duties that governments must follow if they are to abide by international human rights norms. Censorship must be ordered by a court, it must balance the interests of the parties involved, and limit the impact on free expression. It must not take place as the result of automated procedures, algorithmic selection, or through private selection of material, under the guise of “self regulation” by industry.
Nominet have for around two years been responding to police notification that .uk domains are being used for criminal activity, mostly in relation to fraud, false ticket sales and fake branded goods. Several thousand domains have been suspended, and on a few occasions complaints by their owners have been referred to and adjudicated by the police units requesting the suspensions.
The London Met’s e-crime unit have been making the notifications, although they have no specific power to compel Nominet to suspend a domain. The police and other agencies have argued that Nominet can or should act by enforcing its terms and conditions with its customers. Additionally, they have stated that Nominet may be liable under the Proceeds of Crime Act.
Nominet created an “Issue Group”, composed of police, industry and civil society representatives including ORG, to look at creating a mechanism for UK police and law enforcement to notify Nominet of domains used for criminal activity. Nominet wished to know how it might protect the interests of its customers and the general public, including by helping to make criminal material harder to find where Nominet was the only reasonable means to take action.
Although the group did try to find a workable way to balance the interests of everyone involved, it became apparent that the positions of civil society and the Internet industry were at odds with Nominet’s practice and with law enforcement’s desires.
LINX, ISPA, Privacy International and ORG were of the opinion that court orders are the only proper protection of people’s fundamental right to access to justice. This right is not something that can be given up. Where the police ask for domains to be suspended, they need to be fully accountable to open legal processes, and any internal procedure created by Nominet would not deliver that. If new police powers are needed, that is something that Parliament should debate.
At this point, Nominet are deciding what to do with the advice they have been given. ORG has scored a success in being represented on the inside, and being able to help a coalition of civil society groups to influence the development of policy.
This summer, politicians from David Lammy to the Prime Minister were keen to put Blackberry Messenger and other tools in the spotlight after rioting took place across the UK. These tools had been used to organize the rioting, it was alleged, and they should therefore be shut down in times of disquiet.
Such an idea is obviously dangerous: it would create unbalanced powers, set a troubling example for undemocratic regimes elsewhere, and even place individuals at physical risk when trying to escape threats such as burning and looting.
ORG responded by mobilizing free speech groups, firstly to write to the government about their approach, and then, in the autumn, to contrast William Hague’s public commitments to a free and open Internet and respect for human rights with the UK government’s stance on a number of human rights issues, including default censorship of adult content and the apparent desire to shut down communications networks during the UK riots.
ORG’s suspicion was that the government’s intentions would shift from switching networks off to finding new ways to increase surveillance. New ideas may find their way into the Communications Capabilities Development Plan, but so far nothing has been publicly released.
UK censorship proposals
Mobile and adult content censorship
Last summer, Claire Perry started a campaign for adult content to be blocked on Internet networks by default. She argued that children and others could access adult material by accident, and that this could be harmful to them. She built a broad coalition of MPs to campaign for default blocking that adults could ‘opt out’ of, much as happens on the mobile Internet today.
The censoring of websites through automated, opt-out systems is likely to serve nobody well. It poses privacy risks and creates a moral pressure to live with censorship: who wants to say, yes, I want pornography? Most of all, it makes it very difficult, even today, for websites to find out when they have been accidentally blocked and to get those blocks removed.
We also need to be sensitive to the harms that automated child-safety filtering causes to children themselves.
Right to parody
Parodies can fall foul of copyright law. If your parody “copies” anything from a film or book you are parodying, then you can be accused of copyright infringement in the UK. Yet many countries, including France, Australia and the USA, permit people to use copyright material in this way.
For ORG, this is a simple free speech issue. You can’t be expected to get someone’s permission to make fun of them. And you can’t avoid using material like logos and designs when making a parody. Particularly for campaigners who want to parody corporations to show up their double standards, or for artists who want to engage with the everyday experience of commercial culture, these rights are vital.
We therefore started a right to parody campaign in 2011. Our aim was to find campaigners and artists who knew what it meant to be on the receiving end of these restrictions. The campaign was picked up by the B3TA community and has been a very significant success: their parody image challenge generated some great parodies. Since then, we have collected evidence from artists such as Swede Mason and Cassetteboy, who have suffered as a result of these copyright restrictions and whose creative careers have arguably been stifled.
Frank La Rue, the UN and international human rights evaluations
The UK and France were singled out for criticism by UN Human Rights Rapporteur Frank La Rue last year.
This would not have happened but for the campaign ORG and others ran against the Bill. We drew special attention to the impacts of website blocking and of disconnecting families for alleged copyright violations; Frank La Rue’s report for the UN confirmed our view that civil offences should not result in punishments that directly impact people’s right to communicate.
The UN assesses its members’ human rights records in a process called ‘Universal Periodic Review’. This year, it is the UK’s turn. Last year, we engaged with this process and submitted evidence to the UN on the full range of ORG’s work, citing the report from Frank La Rue. His views are hard to dismiss, and it is ORG’s campaigning work that has put us in a position to challenge the UK directly on the Act’s human rights impacts.