August 01, 2013 | Jim Killock

Diane Abbott responds on web forum blocking

The word about the breadth of nudge censorship or default filtering is spreading. Categories such as "web forums" may well be pre-selected when adults enable filters.

On a cycling forum where members are rightly worried that their forum may be blocked by default filters, Skydancer posted a response he was given by Diane Abbott:

I do not believe that the arrangements to protect children from hard core porn online will affect a forum to discuss cycling! I think that men, who think that viewing hard core porn without let or hindrance is some kind of human right, are deliberately exaggerating the effect of the suggested arrangements

I asked Diane Abbott about this, and to her credit she replied. The conversation was private, so I won't quote her replies, but I think it is common knowledge that Diane believes default filters are more effective. I think it is also common knowledge that she prioritises child safety and wants everything possible done to block children's access to pornography, which she regards as seriously harmful. However, the statement above makes clear that she has made a factual error. Forums and social media are very likely to be targets of default parental filters.

The question for Diane Abbott should be: given that categories like "web forums" are likely to be blocked by some of the filters, and that any such box is very likely to be pre-selected, does she believe these sites should be blocked by default? Or does she not care?

But it isn't just her policy. Claire Perry and David Cameron have taken credit for it, and the ISPs have agreed to it.

Let's define the intent as:

To limit access of children to pornography by maximising the number of households where pornography is blocked for children

and examine the “nudge censorship” policy against that.

Problem one: breadth for adults and children

The "pre-selected" categories may include very broad types of content, such as "alcohol, drugs and tobacco," "social media" and "web forums". The impact is broader than just "pornography".

Let's measure this purely against the objective above. Firstly, blocking this broader content is irrelevant to that objective. Secondly, if web filters appear to block too much content, there is surely a danger that some parents will simply switch them off again, not least because their children constantly ask for sites to be unblocked.

Problem two: collateral damage

Collateral damage in this case means the websites of climbing clubs, cycling clubs, pubs and bars being blocked. We can add outright mistakes to this as well.

Collateral damage is something that any sensible policy should minimise, from a public policy point of view. It is no good to say "your climbing club is worth less than my child"; a decently designed policy should aim to meet the needs of both.

Problem three: commercial sites want access to their market, adults want access to porn

This is a problem where the outcome is difficult to predict. If 30% of your potential market is suddenly lost to you, as adults in households with children find porn blocked, then these publishers may go to some effort to get it back. This might make filtering less effective; we don't know. One can imagine apps being used as a means of distribution, for instance. Perhaps pornographic spam will become more popular, reaching children indiscriminately.

Equally, adults who find pornography blocked are placed in a difficult situation, where the outcome for children is again unpredictable. Perhaps irresponsible parents will simply switch the filters off, leaving vulnerable children even more vulnerable. Perhaps, if filters were targeted at children, and aimed to keep adults out of them, this would be less of a worry.

What we are dealing with is software design, not social engineering

What I am getting at is that for government to insist on particular ways of setting up a software system (specifying that categories are "pre-selected", that buttons must say "next", that filtering must be network based) is simply to make a category error.

Government has an objective, but is negotiating the form of the user experience as if it had omniscient knowledge of user behaviour. It doesn't – it has a few gut instincts. It needs to separate the objective of limiting children's access to pornography from the means to deliver it, especially as we get into the details.

In government policy terms, it is extraordinary for Claire Perry and David Cameron to be staking their reputations on whether or not boxes are "pre-selected", and on whether a forced choice ("active choice") is better than a kind of default option.

Lawrence Lessig famously said that "Code is Law", meaning that the operations of machines increasingly determine social outcomes, such as automatic content identification and DMCA takedowns.

Without really knowing why, politicians are stumbling on this concept, and becoming amateur software UX (user experience) designers. Unpredictable consequences will ensue. Please sign the petition!

Footnote: interestingly, Richard Thaler did his best yesterday to distance himself and the Number 10 unit from these proposals. I am not sure if he is correct (maybe this is 'bastard-nudge') but he clearly doesn't want to be associated with "nudge censorship".


July 31, 2013 | Jim Killock

Government wants default blocking to hit small ISPs

"Preselected" parental filters are now official policy, and should extend to small ISPs, according to the Department for Culture, Media and Sport's (DCMS) new strategy paper.

Announced without fanfare, this is the result of several years' work on a Communications Bill, now parked, it seems.

The strategy says "we need good filters that are preselected to be on ... the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on"

They state that "We expect the smaller ISPs to follow the lead being set by the larger providers".

Finally, DCMS demand ISPs give them magic beans (“We want industry to continue to refine and improve their filters to ensure they do not – even unintentionally – filter out legitimate content”) and threaten them with regulation if they do not answer to future demands, or “maintain momentum”.

Take action and sign our petition against default Internet filtering.


Currently 91% of children live in households with internet access and a greater proportion of children aged 12-15 own smartphones than adults. While consenting adults should be free to watch the legal content they choose, children and young people are important consumers of digital content and their ability to access harmful and age inappropriate content should be limited as far as possible.

The Government has been working through the UK Council for Child Internet Safety (UKCCIS), which brings together more than 200 organisations across the information and communication industries, law enforcement, regulators, academia, and charities – to pursue a voluntary approach to child internet safety and has called on industry to make the right tools available to allow parents to protect children online.

We are seeing good progress in this area:

• Where children could be accessing the internet, we need good filters that are preselected to be on, and we need parents aware and engaged in the setting of those filters. By the end of this year, when someone sets up a new broadband account, the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on.
• By the end of next year ISPs will have prompted all existing customers to make an unavoidable decision about whether to apply family friendly filters.
• Only adult account holders will be able to change these filters once applied.
• All mobile phone operators will apply adult filters to their phones.
• 90% of public Wi-Fi will have family friendly filters applied to wherever children are likely to be present.
• Ofcom will regularly review the efficacy of these filters.

But we are clear that industry must go further:

• We expect the smaller ISPs to follow the lead being set by the larger providers.
• We want industry to continue to refine and improve their filters to ensure they do not – even unintentionally – filter out legitimate content.
• We want to see mobile network operators develop their child safety services further; for example, filtering by handset rather than by contract would provide greater flexibility for parents as they work to keep their children safe online.

And while Government looks to the industry to deliver, through the self-regulatory mechanisms already established under UKCCIS, we are clear that if momentum is not maintained, we will consider whether alternative regulatory powers can deliver a culture of universally-available, family-friendly internet access that is easy to use.


July 31, 2013 | Jim Killock

Twitter abuse debate moves on

The Twitter abuse debate has moved on significantly, onto the question of what the police are doing, and what difference that can make.

The police reacted swiftly to try to relinquish responsibility. For instance:

Andy Trotter, who leads on social media for Britain's police forces, told the Guardian he feared that "a whole new tranche" of web-based hate crimes could "cause great difficulty for a hard-pressed police service" trying to deal with what could amount to thousands of allegations.

"We want social media companies to take steps to stop this happening. It's on their platforms this is occurring. They must accept responsibility for what's happening on their platforms ...
"They can't just set it up and walk away. We don't want to be in this arena. They are ingenious people, it can't be beyond their wit to stop these crimes, particularly those particularly serious allegations we have heard of over the weekend."

As Lilian Edwards asks:

What exactly do we have police for, then, if not to investigate specific, repeated and documented crimes? Giving up on policing Twitter is no more defensible than abandoning a town like, say, Walthamstow to the criminal elements.

For a senior policeman, Mr Trotter also seems sadly ignorant of the law. Even leaving aside the issue of threat of rape as a common law crime, which might involve some difficult issues of sufficiently proving intention (though not many), the Protection from Harassment Act 1997, especially s 4(1), makes it very clear that two attempts to "cause another to fear that violence will be used against him [sic]" form a course of conduct which is a crime. In the Perez and Creasy cases there are apparently hundreds of such threatening tweets, many retweeted or screencapped.

It is impossible to understand how police who went ahead with investigating cases which involved poorly framed jokes on Twitter can now say they do not have the money to take on genuine, vicious and entirely humorless threats of rape. It seems much more likely that they fear they do not have the technical ability to understand how to police the Net, or the resources, and are terrified, and also worried that having destroyed their credibility on the Net once (see below), things can only get worse. But in that case the remedy is to acquire expertise, not to retreat to a pre-1996 position of declaring the social Internet terra incognita where elephantine trolls roam.

As Lilian implies, asking the police to investigate these crimes may not always inspire confidence among people who are regular users of Twitter. You may be reminded of the heavy-handed tactics they employed against Paul Chambers after his joke, or the prosecutions of other people for suggesting that British soldiers deserve to die for their crimes in Afghanistan.

These prosecutions took place using Section 127 of the Communications Act, which criminalises "grossly offensive" speech using a public communications network. It covers a very wide range of potential speech, and it is unclear why the offence exists. In any case, its abuse has led to the Crown Prosecution Service advising that a "high threshold" should be employed in relation to use of S127, because of the free expression impacts. In their advice they note that human rights courts do not hold that mere offensiveness should result in criminalisation of speech. Offensiveness may be necessary to challenge ideas, after all.

The CPS says that communications that are either "credible threats" or "which specifically target an individual or individuals and which may constitute harassment or stalking within the meaning of the Protection from Harassment Act 1997" "should be prosecuted robustly".

Thus the CPS seems to believe that the police should be playing their part by investigating behaviour which would clearly be illegal. This is different from non-credible threats, and activity which does not constitute harassment.

To balance their calls for strong action on illegal activity, Stella Creasy and Yvette Cooper should call for Parliament to repeal Section 127, so that matters of offensive but non-threatening, non-harassing speech remain clearly out of scope.


July 30, 2013 | Ed Paton-Williams

A quick guide to Cameron's default Internet filters

Last week, David Cameron announced plans to introduce default adult Internet filters for everyone. Here's our quick guide to the issues around default Internet filtering.

David Cameron wants all British Internet users to make an "unavoidable choice" on whether to switch on default filtering.

Crucially, he thinks people should have to actively opt out if they don't want Internet filters. The boxes that accept the filters would be pre-ticked.

Households that leave the filters turned on would, in theory, be unable to access websites with material categorised as inappropriate for under-18s.

Why is David Cameron proposing default Internet filtering?
David Cameron sees default adult Internet filters as the easy way to protect children online. "One click to protect your whole home and keep your children safe" is how he described his plan last week.

Will the filters only block pornography?
No. The filters will block all sorts of websites. Open Rights Group has spoken to the Internet Service Providers who would be responsible for implementing Cameron's plan.

If you are in a household with filters turned on, you would be unable to access websites that fall into categories such as web forums, violent material, alcohol, smoking and web blocking circumvention tools, as well as suicide-related websites, anorexia and eating disorder websites, and pornography.

Are filters a reliable way to regulate access to the Internet?
Based on the evidence so far, no. UK mobile operators like O2 and Vodafone already block websites that are thought to be unsuitable for under-18s. There are problems with mobile blocking that will probably be replicated with broadband blocking.

ORG has found that mobile operators regularly block websites that shouldn't be blocked. Sites blocked by mistake include church websites (because they mention wine), shops selling tobacco pipes, political blogs miscategorised as hate speech, and lingerie shops for no clear reason, among many others.

There are also concerns that people will find it harder to access crucial advice on sexual health, sexuality and relationships as these sites may be mistakenly blocked.
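To see why category filters misfire like this, here is a minimal sketch (the keyword lists and matching rule are entirely hypothetical, not any ISP's actual implementation) of crude keyword-based categorisation, which flags a church site for mentioning communion wine:

```python
# Hypothetical category keyword lists, for illustration only.
CATEGORY_KEYWORDS = {
    "alcohol": {"wine", "beer", "vodka"},
    "smoking": {"tobacco", "pipe", "cigar"},
}

def categorise(page_text):
    """Return every category whose keywords appear anywhere in the page."""
    words = set(page_text.lower().split())
    return {cat for cat, kws in CATEGORY_KEYWORDS.items() if words & kws}

# A church homepage mentioning communion wine trips the "alcohol" category,
# just like the mistakenly blocked church sites described above.
church_page = "Join us for communion with bread and wine this Sunday"
print(categorise(church_page))  # {'alcohol'}
```

Keyword matching with no notion of context is the cheapest way to classify millions of pages, which is precisely why it produces the false positives ORG keeps finding.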

If a site is blocked by mistake, how hard can it be to just unblock it?
When people find mobile operators blocking websites by mistake, they have found it very difficult to get them to remove the block. It may be hard for website owners to know if their site has been blocked.

Broadband providers would need to train their customer service staff to quickly handle complaints about incorrectly blocked websites.

Adults will be able to choose to switch off filters. Won't they just disable them immediately?
People tend to accept defaults. The 'nudge theory' that Cameron uses to try to influence our decisions and behaviour takes that as a given.

Encouraging everyone to accept adult Internet filters means millions of adults will lose access to all sorts of material rightly or wrongly categorised as inappropriate for under-18s.

How would you turn the filters off after you'd turned them on?
David Cameron says "filters can only be changed by the account holder." His intention is to stop children turning the filters off without their parents' knowledge.

This approach will also cause some problems though. We know that sites will be blocked by mistake. So, for example, people in an abusive relationship who want to access a site about domestic violence may be unable to do so. That site might be blocked by mistake and they wouldn't be able to turn the filter off to access it without the abuser knowing.

What are the alternatives to default Internet filtering?
Parents should be able to manage their children's Internet access, and some people do want household-wide filtering. Cameron's message of 'set it and forget it' is unhelpful, though, as it risks giving people a false sense of security.

People should be asked to make an active and informed choice about what sorts of websites devices in their household can visit. This means that the boxes to choose which filters to turn on should not be pre-ticked and there should be real transparency about which sites the filters would block.

The Government should ensure parents are aware that turning filters on does not immediately make the Internet safe. Government should also encourage parents to talk to their children about what they do online and offline.

The Government hasn't done enough to encourage and promote easy-to-use device-based filters.

Some sexual health groups have called for the Government to ensure children have high quality sex and relationship education (SRE) but the Government recently voted down mandatory SRE.

What can I do about default Internet filtering?
Open Rights Group has launched a petition calling on David Cameron to drop his plans for default Internet filtering. Click here to sign the petition.


July 29, 2013 | Ed Paton-Williams

Take action: Call out Cameron on online censorship

David Cameron is asking Britain to sleepwalk into censorship. Everyone agrees that we should try to protect children from harmful content. But unprecedented filtering of legal content for everyone is not the answer.

David Cameron wants your ISP to push you into switching on web filters. These proposals are incredibly dangerous. Cameron should drop them immediately.

ORG's been spending a lot of time speaking to the media to make sure Cameron and the pro-blocking lobby don't go unchallenged.

But we need another way of putting some pressure on and letting him know how many people think his plans are mistaken.

Tell David Cameron to drop his plans to turn on adult filters by default. Sign our petition to him now!

The Government should help people make informed choices about what sorts of websites their household can visit.


Introducing adult filters as default is not an effective way to help make children safe and will cause huge issues that Cameron can't just dismiss as teething problems.

"Set it and forget it" is the wrong message to send to parents. It risks giving people a false sense of security. Just because ISPs block adult content, children are not automatically safe. Content that you might want to block will still get through. And filters won't stop sexting, bullying or stalking on the Internet.

Instead of trying to make decisions for parents, the Government should give them better support. Help parents to actively decide which sites to make more difficult for their children to visit. Concentrate on encouraging them to talk to their children about their online and offline activity.

Let David Cameron know his plans aren't the answer to keeping children safe. Sign the petition now.

We've seen mobile operators apply parental controls with terrible results. When ISPs use adult filters, they block sites by mistake. Lots of people need to be able to access educational material on sexual health and sexuality. But adult filters can miscategorise this sort of advice as pornography and stop people of all ages finding crucial advice.

David Cameron's wrong to put people's access to crucial information at risk. Please sign the petition to tell him you're against his plans.

David Cameron mentions adult filtering and stopping child abuse in the same breath. But accessing adult content is legal. Accessing illegal content is clearly a crime and should be treated completely differently.

So creating a register of people who want to access adult content or just want unfiltered Internet access brings serious and unjustified privacy issues. How will people on this register be treated? Who will have access to these databases? Will it be secure? The Government needs to address these questions.

Adult filtering affects everyone, not just children or families. So after you've signed the petition, can you share the link with your friends, family and followers?



July 29, 2013 | Jim Killock

Twitter abuse: let’s debate what the police are doing

Calls for abuse buttons on Twitter may help a bit, but we should look to the police for law enforcement.

Rape threats are vile. They are also illegal. Harassment is also an offence. The recent spate of such threats against Caroline Criado-Perez resulted in a call for a Twitter 'abuse' button.

Now that somebody has been arrested for threatening Caroline Criado-Perez, the debate should shift to where it should have started. How should the police react to complaints of online harassment and threats of violence?

From a campaigning standpoint, focusing on Twitter seems to make sense. Twitter have a customer base and reputation to protect. Rape threats are unacceptable, and Twitter will be under immense pressure to take action; inaction looks like protecting the bottom line. People will understand that campaigning can have an effect in raising the issue of online threats and abuse. Labour have joined in, with Yvette Cooper calling Twitter's response 'inadequate'.

Several campaigners speaking in favour of a new 'abuse button' said they had focused on Twitter because the police had failed them.

Kate Smurthwaite said that she had on some occasions approached the police, who advised her to ignore the behaviour, or come back if she had any information about who it was. Caroline Criado-Perez also reported that she had complained to the police, but did not think they would act. Hence the focus on Twitter.

It is a common complaint I have heard even from MPs who have suffered online abuse.

Their complaints to the police were about criminal acts, beyond the point where free speech is at issue. The actions that Twitter can take are limited, and truly 'inadequate'. Twitter could at best delete an abuser's account, leaving them unpunished and able to set up a new account and carry on abusing. They might pass information to the police, but this could easily lack context and would be prone to errors of judgement.

Depending on the volume of complaints, abuse buttons would use automated sifting techniques. Thus they could be prone to abuse, or at the very least, mistakes. This may not mean that they are irrelevant, but they may not be as useful as is hoped.
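As a rough illustration of why automated sifting is gamable, here is a toy sketch (the threshold and account names are invented for illustration): volume-based triage actions a lawful account that a coordinated campaign mass-reports, while a single genuine threat stays below the threshold.

```python
from collections import Counter

# Hypothetical rule: action any account receiving at least this many reports.
REPORT_THRESHOLD = 5

def accounts_to_action(reports):
    """Return the set of reported accounts meeting the volume threshold."""
    counts = Counter(reports)
    return {account for account, n in counts.items() if n >= REPORT_THRESHOLD}

# One genuine threat reported once; one lawful but unpopular account
# mass-reported by a coordinated campaign.
reports = ["@genuine_threat"] + ["@unpopular_opinion"] * 8
print(accounts_to_action(reports))  # {'@unpopular_opinion'}
```

Any real system would be more sophisticated than this, but the underlying problem remains: report volume measures attention, not illegality.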

We should worry if we make companies the main source of 'redress' for lawbreaking. From copyright through to libel, they are often badly placed to make fair judgements. When a company takes action, legal risk carries the greatest weight, rather than justice for the person complaining or complained about. Companies making judgements can be a kind of privatisation of law enforcement that removes pressure on the police to deal with online offences.

After all, if a company deals with the problem statements, why shouldn't the police assume the problem has been solved and deal with something "more serious"?

There is a parallel with the proposals Cameron pushed last week, trying to blame search engines for inaction on paedophilia, and ISPs for failing parents. Even if companies can take some steps (however difficult and complicated that may be), the most important initiatives are in the hands of the state. For child abuse, policing is preferable to giving up and asking companies to obscure access to content, whether by search or 'web blocking'. In the case of children avoiding pornography, effective education and discussion are much more critical than whether ISPs sign up to "nudge censorship".

In the case of victims of harassment, the police need to investigate, arrest and prosecute offenders. No doubt, after a few cases, people would start to avoid crossing a line. Just as importantly, it could create confidence among the victims that their complaint might be dealt with sympathetically by the police.

So far, incidents do seem to be resolved when they are drawn to the police's attention through mass media and online campaigning. This is not acceptable, however, if equally threatening situations are not taken seriously purely because of a lack of public attention.

Once Twitter have provided better reporting tools, campaigners should focus back on the police, rather than starting from a point of view where they regard them as a lost cause.


July 27, 2013 | Jim Killock

Who exactly is responsible for 'nudge censorship'?

We have no legislation, a contradictory official government policy, and ISPs promising that they will deliver a 'pre-selected' censorship approach.

In essence, DCMS's Maria Miller, Claire Perry and David Cameron's staff have hijacked agreed cabinet policy, pushed for something very different, and persuaded ISPs to implement significantly worse policies than originally envisaged.

This is what was agreed in December 2012: some kind of compulsory prompt for parents to enable filters, one that "does not impose a solution on adult users or non-parents". Network filtering was never specified, although easy 'whole home' solutions were preferred. These could be in the hands of parents, at the router, rather than in centralised ISP equipment that is dangerously easy to reconfigure.

For whatever reason, DCMS and Perry have been pushing both network filtering and 'nudge censorship' onto ISPs. ISPs have agreed; now those of us who think government has got it wrong have nobody clear to pressurise.

ISPs appear to have caved in to the overwhelming PR issue that child protection can be, especially when conflated with the separate issue of child abuse images. But by refusing to insist that the government legislate, if it wants such specific provisions, they have opened themselves up to a number of problems:

  1. Are ISPs responsible for incorrect blocks?
  2. Are ISPs financially liable for incorrect blocks?
  3. What happens when government suggests that 'terrorist content' be blocked with no 'opt out'?
  4. Are ISPs responsible for adopting the nonsense 'preselected censorship' policy – as it is not official government policy, but apparently the personal position of Claire Perry and DCMS heads such as Maria Miller?
  5. Will Claire Perry continue to have a personal veto on the nature of broadband set up screens?

Finally, we would like to know if the Lib Dems are happy for Claire Perry and Cameron to go off piste in this way.

Here's the official government policy from December, so you can read what ISPs are actually meant to be doing, and how different that appears to be from what they have agreed to with Perry and Miller.

23. Although there was only minority support among parents for the three options consulted on, the Government does not believe parents are uninterested in their children's safety online: the very high percentages of parents who think they have the responsibility for their children's safety suggests otherwise. However, the offer to parents should be reformulated in a way that ensures that children can be given the levels of protection their parents think is appropriate for them, reduces the risk of uninterested parents avoiding online safety issues, and does not impose a solution on adult users or non-parents.

24. Our approach to child internet safety should therefore evolve in ways so that it: 

  • actively helps parents to make sure they have appropriate safety features in place when their children access the internet and also encourages them to think about issues such as grooming, bullying and sexting as well as potentially harmful or inappropriate content
  • covers existing ISP customers as well as new ones
  • prompts or steers parents towards those safety features
  • makes it easier for parents to take charge of setting up the internet access their children will have, and less likely that they will abdicate this responsibility to their children

25. The Government is now asking all internet service providers to actively encourage people to switch on parental controls if children are in the household and will be using the internet. This approach should help parents make use of the available safety features without affecting internet users aged 18 and over who can choose not to set up controls.

26. Internet service providers have made great progress to date in implementing "active choice" controls where all new customers are asked if they want to switch on parental controls. The Government is urging providers to go one step further and configure their systems to actively encourage parents, whether they are new or existing customers, to switch on parental controls. The Government believes providers should automatically prompt parents to tailor filters to suit their child's needs e.g. by preventing access to harmful and inappropriate content. We also expect ISPs to put in place appropriate measures to check that the person setting up the parental controls is over the age of 18. This builds on the child internet safety approach already established by the four main ISPs by steering parents towards the safety features and taking responsibility for setting up those that are most appropriate for their own children. It will also help parents think about the knowledge and skills children need to prevent harm from the behaviour of other people on the internet: we are clear from the consultation that parents are conscious of these risks as well as those posed by age-inappropriate content.

27. This is only one part of the approach which the Government is pressing for. All of the information and communication industries, including retailers and device manufacturers, should work to develop universally-available family-friendly internet access which is easy to use. The Government wants to see all internet- enabled devices supplied with the tools to keep children safe as a standard feature.


We've launched a petition calling for David Cameron to drop his plans for default Internet filtering. Sign the petition here:


July 25, 2013 | Jim Killock

Sleepwalking into censorship

After brief conversations with some of the Internet Service Providers that will be implementing the UK's "pornwall" we've established a little bit about what it will be doing. To be fair, the BBC were pretty close.

The essential detail is that they will assume you want filters enabled across a wide range of content, and unless you un-tick the option, network filters will be enabled. As we’ve said repeatedly, it’s not just about hardcore pornography.

You'll encounter something like this:

EDIT NOTE: the category examples are based on current mobile configurations and broad indications from ISPs

(1) Screen one

"Parental controls"
Do you want to install / enable parental controls
☑ yes
☐ no


(2) Screen two [if you have left the box ticked]

“Parental controls”

Do you want to block

☑ pornography
☑ violent material
☑ extremist and terrorist related content
☑ anorexia and eating disorder websites
☑ suicide related websites
☑ alcohol
☑ smoking
☑ web forums
☑ esoteric material
☑ web blocking circumvention tools

You can opt back in at any time


The precise pre-ticked options may vary from service to service.

What's clear here is that David Cameron wants people to sleepwalk into censorship. We know that people stick with defaults: this is part of the idea behind 'nudge theory' and 'choice architecture' that is popular with Cameron.
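The difference between the two setup flows can be sketched in a few lines (a toy model, not anyone's real sign-up code): with a pre-ticked box, clicking straight through leaves filtering on, while an 'active choice' flow cannot complete until the user actually decides.

```python
def preselected_flow(decision=None):
    """Nudge flow: the box starts ticked. A user who clicks straight
    through (decision is None) keeps the default: filters on."""
    return True if decision is None else decision

def active_choice_flow(decision):
    """Active choice: no default is pre-set, so setup cannot
    complete without an explicit yes or no."""
    if decision is None:
        raise ValueError("active choice requires an explicit decision")
    return decision

# An inattentive user clicking "next" through setup:
print(preselected_flow())  # True: filters enabled without a real choice
```

The behavioural point is entirely in that `None` default: the pre-selected flow converts inattention into consent, which is exactly what "people stick with defaults" means in practice.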

The implication is that filtering is good, or at least harmless, for everyone, whether adult or child. Of course, this is not true; there is not just the question of false positives for web users, but the effect on a network economy of excluding a proportion of a legitimate website's audience.

There comes a point at which it is simply better to place your sales through Amazon and eBay, and circulate your news and promotions exclusively through Facebook and Twitter, as you know none of these will ever be filtered.

Meanwhile ISPs face the unenviable customer relations threat of increased complaints as customers who hadn't paid much attention find websites unexpectedly blocked.

Just as bad, filters installed with no thought cannot be expected to be set appropriately for children of different ages.

Of course, all of this could be easily avoided by simply having an 'active choice' as the ISPs originally suggested: with no preset defaults, forcing customers to specify whether they wanted filters, or not.

It's really very surprising that Cameron's campaign has spent six months insisting on a system designed to fail consumers, threatening ISPs with legislation if they didn't use the inaccurate, error prone method that Number 10 seem to believe in.

If it all seems to work badly, at what point is it ok for ISPs to start running their own businesses, and change the setup screens?


We've launched a petition calling for David Cameron to drop his plans for default Internet filtering. Sign the petition here:

