June 13, 2013 | Jim Killock

Website filtering problems are a “load of cock”

On Tuesday, I spoke at an event organised by the Sunday Times and Policy Exchange about online pornography and child protection. This was in the run-up to the opposition debate that took place in Parliament on Wednesday on these topics.


The motion laid down by Labour says:

That this House deplores the growth in child abuse images online; deeply regrets that up to one and a half million people have seen such images; notes with alarm the lack of resources available to the police to tackle this problem; further notes the correlation between viewing such images and further child abuse; notes with concern the Government's failure to implement the recommendations of the Bailey Review and the Independent Parliamentary Inquiry into Online Child Protection on ensuring children's safe access to the internet; and calls on the Government to set a timetable for the introduction of safe search as a default, effective age verification and splash page warnings and to bring forward legislative proposals to ensure these changes are speedily implemented.

The "1.5m" statistic has been debunked elsewhere, but the alarming point here is the deliberate conflation of child abuse images with legal material that children might access. The motion slips from talking about child abuse images to 'safe searches' meant to protect children from seeing adult material. Just as worrying is Labour's adoption of a position in favour of default blocking. You can read a transcript of the debate on Hansard.

[Image: Claire Perry at Policy Exchange. Photo: Policy Exchange, CC-BY]

This is a symptom of a wider problem with this debate: a failure to properly distinguish between different categories of content, and the different methods of dealing with them. That requires at least some understanding of the technology; the details matter.

A further problem is the unwillingness of some MPs to appreciate, or even acknowledge, the problems with technical solutions. In the debate on Tuesday, I tried to outline the problems with filtering, including the over- and under-blocking of content.

Claire Perry helpfully described such problems as a "load of cock". Helpfully, because such a comment would very likely be caught by a filter and cause the page to be blocked, while not, of course, being pornographic.
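The over- and under-blocking problem is easy to demonstrate. Here is a hypothetical sketch in Python of a naive keyword filter; no ISP publishes its actual rules, so the blacklist and matching logic below are invented for illustration:

```python
# Hypothetical sketch (not any ISP's real implementation) of a naive
# keyword filter, showing why such filters both over- and under-block.
BLACKLIST = {"cock", "porn"}

def is_blocked(text: str) -> bool:
    """Block a page if it contains any blacklisted substring."""
    lowered = text.lower()
    return any(word in lowered for word in BLACKLIST)

# Over-blocking: non-pornographic text is caught by substring matching.
print(is_blocked("Website filtering problems are a load of cock"))  # True
print(is_blocked("Scunthorpe cockerel breeders association"))       # True

# Under-blocking: a trivial misspelling slips straight through.
print(is_blocked("Hot pr0n here"))                                  # False
```

This failure mode is old enough to have a name, the "Scunthorpe problem", and real filters, however much more sophisticated, face the same fundamental trade-off between catching too much and too little.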

Claire also got applause for suggesting that blocked websites were simply collateral damage necessary to protect children. This is the kind of woolly thinking that was, thankfully, rejected by her own government, which recognised, for instance, the economic harm that stems from blocking legitimate websites. After all, if you can protect children and avoid blocking for adults, why not? Can some balance not be struck?

Unfortunately, in the eyes of many MPs, arguing for balance is betraying children. If any children can access more porn than we can technically prevent, then we have failed. Of course, filters don't always work and can be easily got round, but if our solution helps a bit, surely that is better than nothing?

These kinds of positions, once you examine them, are pretty incoherent. Filters that don't work well will probably get switched off. Defaults that block too much may encourage people to remove the filters altogether. Parents may assume their children are safe when filters are switched on. Software design is iterative, not legislative; yet legislation is often favoured over industry engagement.

The child protection debate over the last two years has won Claire Perry many friends, who believe she has raised the profile of an issue and got results. Certainly, the fact that ISPs are building network-level filters points to this, but I was intrigued by a question at the debate on Tuesday. Apparently children are installing Chrome because, it was suggested, it helps them access porn sites and get round filters.

We did try to tell Claire this kind of thing would happen, before she persuaded ISPs to spend millions of pounds on network filters. Even with filters in place, if parents leave children with admin privileges, they will be able to use their computers to trivially defeat any blocks. Some MPs in the debate in Parliament suggested that only 'very clever' folk will be able to get round filtering. This isn't true: most children will find it easy.

Which leaves us with harms on all sides: to websites, to adults and to children, without the supposed benefits.

Labour have essentially made the same mistake as Culture Secretary Maria Miller did in her letter to online companies, in which she invited Internet companies to a proposed 'summit':

Recent horrific events have again highlighted the widespread public concern over the proliferation of, and easy access to, harmful content on the internet. Whether these concerns focus on access to illegal pornographic content, the proliferation of extremist material which might incite racial or religious hatred, or the ongoing battle against online copyright theft, a common question emerges: what more can be done to prevent offensive online content potentially causing harm?

It is clear that dangerous, highly offensive, unlawful and illegal material is available through basic search functions and I believe that many popular search engines, websites and ISPs could do more to prevent the dissemination of such material.

The debate and the letter confuse legal, illegal and potentially harmful content, each of which requires very different tactics to deal with. Without a greater commitment to evidence and rational debate, poor policy outcomes are the likely result. There's a pattern here, much the same as with the Digital Economy Act or the Snooper's Charter.

Start with moral panic; dismiss evidence; legislate; and finally, watch the policy unravel, either delivering unintended harms, even to children in this case, or simply failing altogether.

ORG, Index on Censorship, English PEN and Big Brother Watch have written to the Culture Secretary Maria Miller demanding that civil society be present at her 'summit', to make sure these issues are addressed. We have yet to receive a reply.



Comments (4)

  1. Eamonn:
    Jun 13, 2013 at 12:46 PM

    I saw an interview (I think it was Newsnight) where an eminent politician said Google should do more to block illegal content BEFORE it reaches the internet... what was he suggesting? That Google should implement an intent filter: "oh look, you are thinking about uploading 'porn', you can't do that..."

    do these politicians have the slightest clue how the internet actually works?

  2. Pete:
    Jun 13, 2013 at 01:39 PM

    It is my responsibility (and no one else's) as a parent to supervise my children's use of lawful telecommunications: phone, email, post, internet, television, newsprint.

    No filter will ever discriminate between art, science, and porn. No filter will compensate for the relative maturity of the user. No amount of filtering will ever protect children from insidious threats like grooming or bullying.

    It is just a complete fallacy.

    We should be educating our children about the value of free speech, freedom of association, freedom of expression, democracy, and their right to private communication free from unwarranted fascist state surveillance.

  3. Earl:
    Jun 17, 2013 at 12:11 PM

    There's a deliberate juxtaposition going on between child abuse imagery and the ability of children to access pornographic content. As pointed out by the article; they are two entirely separate issues, but are being used together as a rather incoherent excuse to propose a censorship framework.

    It makes me wonder whenever I read about these debates - does anyone ever suggest education? A well educated parent can very effectively prevent their child from accessing inappropriate content. Why do we always come to the conclusion that censorship and suppression are the answer to our problems? More regulation, more law, more control over the media that we choose to consume.

    Child abuse is a separate issue and must be respected in its own right if we are serious about tackling it. I'm afraid those with their own anti-pornography agenda are confusing the issue and muddying the water for a lot of people who are sincere about tackling child abuse.

  4. Tim:
    Jun 22, 2013 at 09:53 AM

    I heard Jim on Bringing Up Britain earlier this year. Although the panelists unfortunately failed to engage with his points, one said that filtering of the kind ORG says won't work is already working in Germany. Jim had finished by then, so nobody challenged it.

    Does anybody know about the German experience of porn filtering?

    PS
    I tried to search for "germany porn filtering", but my search engine returned results for "germany filtering". Still, it led to this: http://www.zeropaid.com/news/9960/german_minister_announces_plans_for_mandatory_web_filtering/ . However, I haven't yet found anything about the consequences of the German policy.



This thread has been closed from taking new comments.