A new report from Open Rights Group and LSE Media Policy Project reveals widespread over-blocking on mobile networks, helping to demonstrate why we shouldn't accept default-on adult Internet filtering
Today we're launching a new report called "Mobile internet censorship: what's happening and what we can do about it", a joint publication with LSE Media Policy Project. You can download a PDF of the report, and LSE Media Policy Project will be posting responses on their blog, which we'll also be cross-posting on ORGZine today.
The report is about how mobile operators' child protection filters work. It shows how systems designed to help parents manage their children's access to the Internet can actually affect many more users than intended and block many more sites than they should. It reveals widespread overblocking, problems with transparency and difficulties correcting mistakes.
We argue that mobile operators need to offer an 'active choice', be far more transparent and open, and provide easier ways to address errors.
More broadly, the report helps emphasise that the 'neo Mary Whitehouse' campaign for default blocks, led by Claire Perry MP, is calling for the wrong solution in seeking 'default on' filtering. The lessons from mobile filtering suggest fixed-line Internet filtering should concentrate on users and devices rather than networks, be properly described as 'parental controls' (because the content blocked is far broader than adult sexual material) and, above all, involve an 'active choice', not be set by default.
Without that guarded approach, seemingly simple, laudable goals such as protecting children through technical intervention may have significant harmful and unintended consequences for everybody’s access to information.
The report is based on reports of inappropriate blocks submitted to our website Blocked.org.uk from January to March. These were cases where sites or services were blocked that should not have been. Working with a small group of volunteers, we received over 60 reports, covering personal and political blogs, restaurant sites, and community sites. Here are some examples:
- Biased-BBC (www.biased-bbc.blogspot.co.uk) is a site that challenges the BBC’s impartiality. We established it was blocked on O2 and T-Mobile on 5th March.
- St Margarets Community Website (www.stmgrts.org.uk), is a community information site ‘created by a group of local residents of St Margarets, Middlesex.’ Their ‘mission is simple - help foster a stronger community identity.’ We established it was blocked on Orange and T-Mobile on 8th March.
- The Vault Bar (www.thevaultbar.co.uk) in London. We established that the home page of this bar was blocked on Vodafone, Orange, and T-Mobile on 6th February.
- Shelfappeal.com was reported blocked on 15th February 2012 on Orange. This is a blog that features items that can be placed on a shelf.
- ‘Tor’ (www.torproject.org). We established that the primary website of this privacy tool (meaning the HTTP version of the Tor Project website, rather than connections to the Tor network) was blocked on at least Vodafone, O2 and Three in January.
- La Quadrature du Net (www.laquadrature.net/en). The website of this French ‘digital rights’ advocacy group was reported blocked on Orange’s ‘Safeguard’ system on 2nd February. La Quadrature du Net has become one of the focal points for European civil society’s political engagement with an important international treaty called the Anti-Counterfeiting Trade Agreement. The block was removed shortly after we publicised the blocking.
Last week - too late to be included in this report - we wrote about how the site of peace advocates Conciliation Resources was blocked on Orange, O2 and Vodafone. O2's URL checker classified the site as pornography (the mistake now seems to have been corrected).
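To establish a block like those above, you typically request the same URL from a filtered mobile connection and an unfiltered one and compare the results. The sketch below shows one way the comparison step might be automated: a heuristic that flags responses which look like a filter intervened. The block-page path patterns and the use of status codes 403/451 are our illustrative assumptions, not any operator's documented behaviour.

```python
from typing import Optional

# Hypothetical substrings that might appear in a filter's "blocked"
# landing-page URL; real operators' block pages vary and are often
# undocumented, so these are placeholders for illustration only.
BLOCK_PAGE_PATTERNS = [
    "contentcontrol",   # hypothetical
    "adultfilter",      # hypothetical
]

def classify_response(status: int, location: Optional[str]) -> str:
    """Heuristically classify an HTTP response fetched over a
    (possibly filtered) mobile connection.

    status   -- HTTP status code returned for the test URL
    location -- value of the Location header on a redirect, or None
    """
    if status in (403, 451):
        # Explicit refusal; 451 is "Unavailable For Legal Reasons"
        return "possibly blocked"
    if 300 <= status < 400 and location:
        # Filters commonly redirect to a branded landing page
        if any(p in location.lower() for p in BLOCK_PAGE_PATTERNS):
            return "possibly blocked"
        return "redirect (inconclusive)"
    if 200 <= status < 300:
        return "reachable"
    return "error (inconclusive)"
```

A reachable verdict on the filtered connection only means the page loaded; confirming a block still needs a side-by-side check against an unfiltered connection, which is essentially what our volunteers did by hand.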
Our mystery shopper exercise also helped show that the mobile operators find it difficult to respond to reports of mistakes, especially when a site is stuck behind a filter for no good reason.
There are serious consequences to badly implemented, default child protection blocking systems. They include restrictions on markets, censorship, a failure to address young people's diverse needs and a false sense of security for parents.
Some simple changes to how mobile operators run their filtering services would help address many of the problems with mobile filtering, including better ways to choose to activate filtering on an account, more transparency about how the filtering works, and simpler, more effective ways of addressing mistakes.
In the longer term there should be an effort to move away from filtering at the ‘ISP level’ towards device-based filtering.
You can read the full report for more. We hope it helps contribute to a sensible child protection strategy, rather than one based on the overly simplistic, albeit emotionally appealing, proposition that children need to be protected from seeing things parents don't want them to see. We need tools that help parents manage, through responsible and engaged parenting, their children's access to the Internet - that does not have to mean unresponsive and broad network-level filtering.
Update: responses to the report
As part of encouraging debate around the report we've been inviting responses from a variety of organisations. The first of these are now live.
- Hamish MacLeod, Chair of the Mobile Broadband Group, on ORGZine.
- Jo Glanville, editor of Index on Censorship, has a response up on the LSE Media Policy Project blog.
- Jamie Bartlett, Head of the Violence and Extremism Programme at Demos, has responded with the post 'Filters are not the answer'.