

August 14, 2013 | Lee Maguire

Virgin and Sky blindly blocking innocent sites

The blind over-blocking of innocent sites by UK ISPs apparently continues.

As reported by PC Pro, the systems implemented by both Virgin and Sky to stop access to websites blocked by the courts appear to be blocking innocent third-party sites, with little or no human oversight. At least one unrelated website was reported to have been blocked.

In order to understand why this specific issue happened, you need to be familiar with a quirk in how DNS is commonly used in third-party load-balanced site deployments.

Many third-party load-balanced systems, for example those using Amazon's AWS infrastructure, are enabled by pointing CNAME records at names controlled by those third-party systems. For example "www.example.com" may be pointed at a hostname provided by the load-balancing service. However, the bare domain "example.com" usually cannot be directly given a CNAME record (CNAME records cannot be mixed with the other record types needed at the zone apex, such as those pointing to nameservers and mailservers). A common approach is to point "example.com" to a server that merely redirects all requests to "www.example.com".
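The redirect workaround described above can be sketched in a few lines. This is purely an illustrative sketch, not any hosting provider's actual code, and the example.com names are placeholders:

```python
def apex_redirect(host: str, path: str) -> tuple[int, str]:
    """Answer any request to the bare apex domain with a 301
    redirect to the equivalent "www" URL - the name that can
    carry the CNAME to the load balancer."""
    return 301, f"http://www.{host}{path}"

# A request for http://example.com/about is bounced to the www name:
status, location = apex_redirect("example.com", "/about")
print(status, location)  # 301 http://www.example.com/about
```

Because every apex domain using the service funnels through the same small fleet of redirect servers, many unrelated sites end up sharing the same A records.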

From forum posts we can see that it's this redirection system - in this specific case an A record used for the bare domain - that has been blocked by the ISPs, probably because a court-order-blocked site is also using the service, making numerous sites unavailable for any request made without the "www" prefix.
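A toy resolver makes the asymmetry concrete. In this hypothetical setup, several unrelated apex domains share one redirect server's IP while each "www" name resolves elsewhere; blocking that single shared IP breaks every bare-domain request at once:

```python
# Hypothetical records: unrelated apex domains share one redirect
# server's A record, while each "www" name resolves to its own
# load balancer. All names and IPs here are placeholders.
DNS = {
    "example.com":     "203.0.113.10",   # shared redirect server
    "example.org":     "203.0.113.10",   # shared redirect server
    "www.example.com": "198.51.100.7",   # load balancer A
    "www.example.org": "198.51.100.8",   # load balancer B
}

BLOCKED_IPS = {"203.0.113.10"}  # ISP blocks one "infringing" IP

def reachable(name: str) -> bool:
    """A name is reachable only if its IP is not on the blocklist."""
    return DNS[name] not in BLOCKED_IPS

# Blocking one shared IP breaks every bare-domain request, while
# the same sites still work with the "www" prefix:
print(reachable("example.com"))      # False
print(reachable("www.example.com"))  # True
```

This is exactly the pattern reported on the forums: sites work with "www" but fail without it.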

These incidents strongly suggest that the opaque approach to website blocking by ISPs, and the apparent lack of oversight, has the potential to be hugely damaging to the internet. Open Rights Group calls for greater transparency in this area, beginning with making the court orders available for public inspection.

[Read more] (6 comments)

August 09, 2013 | Lee Maguire

Website blocking measures lead to inadvertent censorship

A technical decision made by Sky in implementing website blocking has led to the blocking of news site TorrentFreak.

TorrentFreak reports today that Sky is currently blocking access to their site. Not as a deliberate act of censorship, but as an entirely predictable by-product of its system for complying with court-ordered website blocks.

When the owner of EZTV (a site ordered blocked on the 25th of July) automatically pointed UK visitors to TorrentFreak's website, Sky's blocking system (which from court documents we believe to be codenamed "Hawkeye") apparently automatically added TorrentFreak's IP address to its blacklist.

Inadvertent denial-of-service by pointing DNS records at innocent third-parties is an entirely predictable possibility for anyone attempting to implement blocking systems. If this explanation for blocking proves to be the case, we'd be extremely surprised if the possibility had not occurred to the engineers responsible.
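The failure mode is easy to see in miniature. The sketch below is an assumption about how such an automated blacklist might work - the "Hawkeye"-style refresh loop and all names are hypothetical, not Sky's actual implementation:

```python
def refresh_blacklist(court_ordered, resolve):
    """Re-resolve each court-ordered domain and blacklist whatever
    IP it currently points at - with no check that the IP still
    belongs to the infringing site."""
    return {resolve(domain) for domain in court_ordered}

# Before: the blocked site resolves to its own server.
records = {"eztv.example": "192.0.2.50"}
print(refresh_blacklist(["eztv.example"], records.get))  # {'192.0.2.50'}

# The site's owner re-points the domain at an innocent third party...
records["eztv.example"] = "198.51.100.99"  # an innocent site's IP

# ...and the next automatic refresh blacklists the innocent IP.
print(refresh_blacklist(["eztv.example"], records.get))  # {'198.51.100.99'}
```

The owner of the blocked domain, not the ISP, controls where it resolves - which is why following DNS automatically hands third parties a denial-of-service lever.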

Open Rights Group continues our call for more transparency in the ways these blocks are performed, including access to the orders that would presumably limit the legal scope of blocking. If merely blocking the handful of sites that have received blocking orders in the past 12 months results in collateral damage (such as the blocking of TorrentFreak), we hold little confidence in the ISPs being able to implement David Cameron's default network filtering plans without causing significant disruption.

[Read more] (4 comments)

August 09, 2013 | Javier Ruiz

Tackling “thorny issues” of open government at the OGP London summit

A look at some of the tricky issues and tensions in open government being discussed at the upcoming OGP summit.

The Open Government Partnership summit in London is gaining momentum, as evidenced by the growing engagement from civil society organisations. The OGP is reaching an important milestone, with the closure of its first cycle of country commitments and independent assessments.

The summit will be an inclusive space where governments can announce inspiring projects and collaborate with civil society. But this does not have to mean shying away from tackling difficult questions around open government.

Last week, UK civil society organisations held a meeting to discuss the summit. One proposal was making these areas of potential conflict explicit by creating a specific track for “thorny issues”. This would show the OGP is a confident process that takes these matters seriously.

The following areas would be suitable for inclusion. Some have already been proposed as a concrete session, while others are just an idea looking for more partners:

1. Transparency and private public services

Private companies have an important role to play in many of the areas covered by the OGP, such as the extractive industries and fiscal transparency. But this session will focus on the increasing provision of public services by private companies.

These companies tend to be excluded from “Right to Information” laws. Where there is information available, this is normally limited to narrow terms of contract delivery, making it difficult to assess overall performance and value for money.

2. Openness and privacy

Open data and transparency programmes can have privacy impacts, which could also lower acceptance and engagement from citizens. From a different perspective, we may also find that privacy can be used as an excuse to hinder transparency.

In some cases these tensions will involve personal data that is published in the public interest, such as subsidies, taxes, registers, judicial documents, etc. Another potential conflict is the publication of data from public services - schools, hospitals, welfare, etc. This kind of data is normally “anonymised”, but there are growing concerns about the risks of re-identification of individuals by combining different data sources.

An international workshop on this topic will have to analyse how to balance diverse regulatory approaches with upholding fundamental principles on privacy and the protection of personal data.

Privacy International and Open Rights Group are coordinating this session.

3. Surveillance and national security

The recent confirmation of the existence of mass internet surveillance programmes by several industrialised nations is a game-changer that brings into question some of the assumptions that have underpinned the relations between open government, surveillance and national security.

Few will question that there is a role for secrecy and special powers. But the blanket exemptions for national security from most transparency programmes and right to information laws may have gone too far. In some countries there is no basic information on the legal basis of surveillance programmes, or the size of their overall budget. Many civil society organisations are demanding more targeted surveillance and better accountability.

More fundamentally, we may need to revisit the unspoken presumption in open government circles that there is no need to justify collecting increasing amounts of data on citizens because eventually something good will come out of it.

Open Society Foundations, Open Rights Group and Tactical Technology Collective are coordinating this session.

4. Protection for whistleblowers

There are growing concerns that despite an increase in commitment to openness, many OGP countries are actually ratcheting up the persecution of whistleblowers. Besides several high-profile cases with international resonance, there are many lesser-known cases throughout the world.

Several organisations, including OSF, have expressed interest in organising sessions on this important topic. Please get in touch.

5. Citizens’ rights, practical tools and government commitments

Groups involved in the OGP have alternative approaches to openness. This has been characterised, in simple terms, as involving on one side Right to Information veterans, who have long focused on getting governments to implement legislation. On the other side would be Open Data activists who, instead of driving policy, develop practical technology solutions to provide access to public information. Of course the reality is more complex. Nowadays most people in the field will agree that transparency and accountability require both laws and tools, plus citizen engagement and infomediaries.

There are concerns, however, that the OGP may be skewing this balance with its focus on voluntary commitments by the executive branches of government that lack legally enforceable mechanisms. The problems arise when the same governments that propose national plans with excellent aspects are simultaneously weakening Right to Information legislation or the role of civil society.

The Campaign for Freedom of Information are coordinating this proposal.

The proposals above are all in a shared online document that attempts to collate all the sessions proposed by civil society groups. Please add the details of any proposals you are developing to that spreadsheet, and get in contact with anyone who is developing an idea you would be interested in supporting.

It is important to get international collaborations to shape the sessions. Particularly, let us know if you know of any government representatives from your country who are coming to the summit and may be interested in participating in these panels.

There is a growing consensus that the summit should reflect the diversity and multistakeholder nature of the OGP. One criterion for acceptance into the programme should be that panels are gender balanced and include representation from the majority world.

The deadline for presenting complete proposals to the OGP summit team is the 1st of September.

This blog was also posted to the Open Government site.  

[Read more]

August 08, 2013 | Peter Bradwell

Nominet trying again with .uk proposals

Nominet are again consulting on their idea to introduce .uk domain registration. But the proposals are little better than before.

Nominet's new .uk proposals, described in more detail on their website, include:

  1. The ability to register .uk domains directly. The proposals would mean, for example, that if you run example.co.uk then you could also register example.uk.
  2. An effort to verify registrants' details for second level registration through checks against a third party database, and a requirement to supply a UK address for service.
  3. Reserving names for a period so that those who first registered a domain name string have priority over the .uk registration. So, for example, if I registered example.co.uk in 2003, and nobody registered example.org.uk or example.me.uk and so on before me, then I would have the first opportunity to register example.uk.
  4. Charging a wholesale price of £4.50 a year for multiple year registrations or £5.50 for single year registrations.

Nominet are effectively arguing that they will make a lot more money through these proposals, and this is good because they will then be able to do more of their work improving the trust and security of the .uk namespace. I'm paraphrasing Nominet's argument. (See Leslie Cowley's blog on the Nominet website for more on the thinking behind the changes).

However, they make little or no case for this. There are no details about how much they expect the proposals to raise, or how they plan to use the extra money to improve trust and security in the .uk namespace.

Haven't we been here before?

This is the second consultation Nominet have run on this idea. The first was at the end of last year. They received lots of negative feedback last time. We responded and were critical of the proposals, and recommended they be dropped. We argued that the plans would lead to:

1. the creation of a 'walled garden' that would undermine confidence in the rest of the UK domain space, including .co.uk
2. the imposition of additional cost burdens on website operators, which are likely to be particularly significant for SMEs
3. the positioning of Nominet in an inappropriate role, by setting them up as arbiters of trust online and giving them additional and somewhat unchecked powers. This would effectively create for Nominet a monopoly over 'trust' and security in the UK domain space.

What has changed from the last consultation?

Not a whole lot. Two main things:

  1. dropped some of the security services that would have been offered exclusively to those with .uk sites.
  2. changed the charging structure, cutting the fee from £20 in the original consultation to a wholesale price of £4.50 a year for multi-year registrations or £5.50 for single-year registrations

Why do Nominet think this is a good idea?

The shortest answer: because they will make an awful lot of money from it. Nominet say this proposal will 'keep the namespace competitive', and that the namespace needs to 'develop and innovate to remain competitive and relevant.' Further on in the consultation document Nominet specify four benefits:

  1. Maintain the relevance of the .uk name space in a rapidly developing market;
  2. Provide additional choice for registrants in the .uk space and meet market demand;
  3. Fulfil Nominet's public purpose by increasing security and trust in the .uk name space; and
  4. Progress Nominet's commercial development.

We have serious doubts about whether the proposals for greater verification of registrants' details will have any effect on consumer confidence. For example, it seems it will still be fairly easy for somebody to simply register a .uk domain using a 'real' name and address that is not theirs. Nominet certainly provide no evidence of the likely effects of the new process.

The key argument Nominet make seems to be this: the commercial development of Nominet is a good thing because it will enable them to do more to make the .uk domain space more trusted and secure. I asked Nominet about this on Twitter:

@peterbradwell: @Nominet thanks! is the idea that nominet's commercial growth via new .uk sales will improve nominet's ability to meet its public purpose?

@Nominet: @peterbradwell yes or at least help us to maintain the ability in the face of the changing domain name landscape.

The lack of a justification for the .uk proposals

Figures, estimates or otherwise, of the costs and benefits of the proposals are absent from the consultation document or background paper. There is no estimate of the extra income this will generate for Nominet or the registrars, and no estimation of the costs to businesses. There are no proposals for exactly what Nominet will do with the extra money to further their public purpose work. Nominet say there is no reason to provide a business case. 

All of this makes it hard if not impossible to consider whether this is the best way to improve the trust and security of the .uk namespace. The relationship between Nominet's continued commercial growth and improvements in trust and security of .uk namespace seems to be taken as given. 

One blog post estimates Nominet could make upwards of £25m from the proposals - doubling their revenue - and lists a number of important questions that have not been addressed.

Nominet's .uk plans still represent an effort to exploit their position to create new online 'real estate'. We're currently putting together our formal response to Nominet, which we'll post on the website as soon as possible.

More detail on the consultation and information on how you can respond are on the Nominet site.


[Read more]

August 01, 2013 | Jim Killock

Diane Abbott responds on web forum blocking

The word about the breadth of nudge censorship or default filtering is spreading. Categories such as "web forums" may well be pre-selected when adults enable filters.

On a cycling forum whose members are rightly worried that their forum may be blocked by default filters, user Skydancer posted a response he was given by Diane Abbott:

I do not believe that the arrangements to protect children from hard core porn online will affect a forum to discuss cycling! I think that men, who think that viewing hard core porn without let or hindrance is some kind of human right, are deliberately exaggerating the effect of the suggested arrangements

I asked Diane Abbott about this, and to her credit she replied. The conversation was private, so I won't quote her replies, but I think it is common knowledge that Diane believes that default filters are more effective. I think it is also common knowledge that she is prioritising child safety and wants everything possible done to block children's access to pornography, which she regards as very seriously harmful. However, in the statement above she has clearly made a factual mistake. Forums and social media are very likely to be the target of default parental filters.

The question for Diane Abbott should be: given that categories like web forums are likely to be blocked by some of the filters, and any box is very likely to be pre-selected, does Diane believe these sites should be blocked by default? Or does she not care?

But it isn't just her policy. Claire Perry and David Cameron have taken credit for it, and the ISPs have agreed to it.

Let's define the intent as:

To limit access of children to pornography by maximising the number of households where pornography is blocked for children

and examine the “nudge censorship” policy against that.

Problem one: breadth for adults and children

The "pre-selected" categories may include very broad types of content, such as "alcohol, drugs and tobacco," "social media" and "web forums". The impact is broader than just "pornography".

Let's measure this purely against the objective above. Firstly, blocking this broader content is irrelevant to that objective. Secondly, if web filters appear to be blocking too much content, there is surely a danger that some parents will simply switch them back off, not least because their children constantly ask for sites to be unblocked.

Problem two: collateral damage

Collateral damage in this case means climbing and cycling clubs, pubs and bars being blocked. We can add outright mistakes to this as well.

Collateral damage is something that any sensible policy should minimise, from a public policy point of view. It is no good to say "your climbing club is worth less than my child"; a decently designed policy should aim to meet the needs of both.

Problem three: commercial sites want access to their market, adults want access to porn

This is a problem where the outcome is difficult to predict. If 30% of your potential market is suddenly lost to you, as adults in households with children find porn blocked, then these publishers may go to efforts to get it back. This might make filtering less effective, we don't know. One can imagine apps being a means to distribute, for instance. Perhaps pornographic spam will become more popular, reaching children indiscriminately.

Equally, adults who find pornography blocked are placed in a difficult situation, where the outcome for children is again unpredictable. Perhaps irresponsible parents will simply switch the filters off, leaving vulnerable children even more vulnerable. Perhaps, if filters were targeted at children, and aimed to keep adults out of them, this would be less of a worry.

What we are dealing with is software design, not social engineering

What I am getting at is that when government insists on certain ways of setting up a software system - specifying that categories are "pre-selected", that buttons must say "next", that filtering must be network-based - it is simply making a category error.

Government has an objective, but is negotiating the form of the user experience as if it had omniscient knowledge of user behaviour. It doesn't – it has a few gut instincts. It needs to separate the objective of limiting children's access to pornography from the means to deliver it, especially as we get into the details.

In government policy terms, it is extraordinary for Claire Perry and David Cameron to be staking their reputations on whether or not boxes are "pre-selected" and whether a forced choice (active choice) is better than a kind of default option.

Lawrence Lessig famously said that "Code is Law", meaning that the operations of machines increasingly determine social outcomes, such as automatic content identification and DMCA takedowns.

Without really knowing why, politicians are stumbling on this concept, and becoming amateur software UX (user experience) designers. Unpredictable consequences will ensue. Please sign the petition!

Footnote: interestingly, Richard Thaler did his best yesterday to distance himself and the Number 10 unit from these proposals. I am not sure if he is correct (maybe this is 'bastard-nudge') but he clearly doesn't want to be associated with "nudge censorship".

[Read more] (16 comments)

July 31, 2013 | Jim Killock

Government wants default blocking to hit small ISPs

"Preselected" parental filters are now official policy, and should extend to small ISPs, according to the Department for Culture, Media and Sport's (DCMS) new strategy paper.

david_cameron_cc-by_gpaumier.jpg Announced without fanfare, this is the result of several years work on a Communications Bill, now parked, it seems.

The strategy says "we need good filters that are preselected to be on ... the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on"

They state that "We expect the smaller ISPs to follow the lead being set by the larger providers".

Finally, DCMS demand ISPs give them magic beans (“We want industry to continue to refine and improve their filters to ensure they do not – even unintentionally – filter out legitimate content”) and threaten them with regulation if they do not answer to future demands, or “maintain momentum”.

Take action and sign our petition against default Internet filtering.


Currently 91% of children live in households with internet access and a greater proportion of children aged 12-15 own smartphones than adults. While consenting adults should be free to watch the legal content they choose, children and young people are important consumers of digital content and their ability to access harmful and age inappropriate content should be limited as far as possible.

The Government has been working through the UK Council for Child Internet Safety (UKCCIS), which brings together more than 200 organisations across the information and communication industries, law enforcement, regulators, academia, and charities – to pursue a voluntary approach to child internet safety and has called on industry to make the right tools available to allow parents to protect children online.

We are seeing good progress in this area:

• Where children could be accessing the internet, we need good filters that are preselected to be on, and we need parents aware and engaged in the setting of those filters. By the end of this year, when someone sets up a new broadband account, the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on.
• By the end of next year ISPs will have prompted all existing customers to make an unavoidable decision about whether to apply family friendly filters.
• Only adult account holders will be able to change these filters once applied.
• All mobile phone operators will apply adult filters to their phones.
• 90% of public Wi-Fi will have family friendly filters applied to wherever children are likely to be present.
• Ofcom will regularly review the efficacy of these filters.

But we are clear that industry must go further:

• We expect the smaller ISPs to follow the lead being set by the larger providers.
• We want industry to continue to refine and improve their filters to ensure they do not – even unintentionally – filter out legitimate content.
• We want to see mobile network operators develop their child safety services further; for example, filtering by handset rather than by contract would provide greater flexibility for parents as they work to keep their children safe online.

And while Government looks to the industry to deliver, through the self-regulatory mechanisms already established under UKCCIS, we are clear that if momentum is not maintained, we will consider whether alternative regulatory powers can deliver a culture of universally-available, family-friendly internet access that is easy to use.

[Read more] (6 comments)

July 31, 2013 | Jim Killock

Twitter abuse debate moves on

The Twitter abuse debate has moved on significantly, onto the question of what the police are doing, and what difference that can make.

The police reacted swiftly, trying to shift responsibility onto the social media companies. For instance:

Andy Trotter, who leads on social media for Britain's police forces, told the Guardian he feared that "a whole new tranche" of web-based hate crimes could "cause great difficulty for a hard-pressed police service" trying to deal with what could amount to thousands of allegations.

"We want social media companies to take steps to stop this happening. It's on their platforms this is occurring. They must accept responsibility for what's happening on their platforms ...
"They can't just set it up and walk away. We don't want to be in this arena. They are ingenious people, it can't be beyond their wit to stop these crimes, particularly those particularly serious allegations we have heard of over the weekend."

As Lilian Edwards asks:

What exactly do we have police for, then, if not to investigate specific, repeated and documented crimes? Giving up on policing Twitter is no more defensible than abandoning a town like, say, Walthamstow to the criminal elements.

For a senior policeman, Mr Trotter also seems sadly ignorant of the law. Even leaving aside the issue of threat of rape as a common law crime, which might involve some difficult issues of sufficiently proving intention (though not many), the Protection from Harassment Act 1997, especially s 4(1), makes it very clear that two attempts to "cause another to fear that violence will be used against him [sic]" form a course of conduct which is a crime. In the Perez and Creasy cases there are apparently hundreds of such threatening tweets, many retweeted or screencapped.

It is impossible to understand how police who went ahead with investigating cases which involved poorly framed jokes on Twitter can now say they do not have the money to take on genuine, vicious and entirely humorless threats of rape. It seems much more likely that they fear they do not have the technical ability to understand how to police the Net, or the resources, and are terrified, and also worried that having destroyed their credibility on the Net once (see below), things can only get worse. But in that case the remedy is to acquire expertise, not to retreat to a pre-1996 position of declaring the social Internet terra incognita where elephantine trolls roam.

As Lilian implies, asking the police to investigate these crimes may not always inspire confidence among people who are regular users of Twitter. You may be reminded of the heavy handed tactics they employed against Paul Chambers after his joke, or the prosecutions of other people for suggesting that British soldiers deserve to die for their crimes in Afghanistan.

These prosecutions took place using Section 127 of the Communications Act, which criminalises "grossly offensive" speech using a public communications network. It covers a very wide range of potential speech, and it is unclear why the offence exists. In any case, its abuse has led to the Crown Prosecution Service advising that a "high threshold" should be employed in relation to use of S127, because of the free expression impacts. In their advice they note that human rights courts do not hold that mere offensiveness should result in criminalisation of speech. Offensiveness may be necessary to challenge ideas, after all.

The CPS says that communications that are either "credible threats" or "which specifically target an individual or individuals and which may constitute harassment or stalking within the meaning of the Protection from Harassment Act 1997" "should be prosecuted robustly".

Thus the CPS seems to believe that the police should be playing their part by investigating behaviour which would clearly be illegal. This is different from non-credible threats, and activity which does not constitute harassment.

To balance their calls for strong action on illegal activity, Stella Creasy and Yvette Cooper should call for Parliament to repeal Section 127, so that offensive but non-threatening, non-harassing speech remains clearly out of scope.

[Read more] (10 comments)

July 30, 2013 | Ed Paton-Williams

A quick guide to Cameron's default Internet filters

Last week, David Cameron announced plans to introduce default adult Internet filters for everyone. Here's our quick guide to the issues around default Internet filtering.

David Cameron wants all British Internet users to make an "unavoidable choice" on whether to switch on default filtering.

Crucially, he thinks people should have to actively opt out if they don't want Internet filters. The boxes that accept the filters would be pre-ticked.

Households that leave the filters turned on would, in theory, be unable to access websites with material categorised as inappropriate for under-18s.

Why is David Cameron proposing default Internet filtering?
David Cameron sees default adult Internet filters as the easy way to protect children online. "One click to protect your whole home and keep your children safe" is how he described his plan last week.

Will the filters only block pornography?
No. The filters will block all sorts of websites. Open Rights Group has spoken to the Internet Service Providers who would be responsible for implementing Cameron's plan.

If you are in a household with filters turned on, you would be unable to access websites that fall into categories such as web forums, violent material, alcohol, smoking and web blocking circumvention tools, as well as suicide-related websites, anorexia and eating disorder websites, and pornography.

Are filters a reliable way to regulate access to the Internet?
Based on the evidence so far, no. UK mobile operators like O2 and Vodafone already block websites that are thought to be unsuitable for under-18s. There are problems with mobile blocking that will probably be replicated with broadband blocking.

ORG has found that mobile operators regularly block websites that shouldn't be blocked. Sites blocked by mistake include church websites (because they mention wine), shops selling tobacco pipes, political blogs miscategorised as hate speech, lingerie shops blocked for no clear reason, and many more.

There are also concerns that people will find it harder to access crucial advice on sexual health, sexuality and relationships as these sites may be mistakenly blocked.

If a site is blocked by mistake, how hard can it be to just unblock it?
When people find mobile operators blocking websites by mistake, they have found it very difficult to get them to remove the block. It may be hard for website owners to know if their site has been blocked.

Broadband providers would need to train their customer service staff to quickly handle complaints about incorrectly blocked websites.

Adults will be able to choose to switch off filters. Won't they just disable them immediately?
People tend to accept defaults. The 'nudge theory' that Cameron uses to try to influence our decisions and behaviour takes that as a given.

Encouraging everyone to accept adult Internet filters means millions of adults will lose access to all sorts of material rightly or wrongly categorised as inappropriate for under-18s.

How would you turn the filters off after you'd turned them on?
David Cameron says "filters can only be changed by the account holder." His intention is to stop children turning the filters off without their parents' knowledge.

This approach will also cause some problems though. We know that sites will be blocked by mistake. So, for example, people in an abusive relationship who want to access a site about domestic violence may be unable to do so. That site might be blocked by mistake and they wouldn't be able to turn the filter off to access it without the abuser knowing.

What are the alternatives to default Internet filtering?
Parents should be able to manage their children's Internet access and some people do want household-wide filtering. Cameron's message of 'Set it and forget it' is unhelpful though as it risks giving people a false sense of security.

People should be asked to make an active and informed choice about what sorts of websites devices in their household can visit. This means that the boxes to choose which filters to turn on should not be pre-ticked and there should be real transparency about which sites the filters would block.

The Government should ensure parents are aware that turning filters on does not immediately make the Internet safe. Government should also encourage parents to talk to their children about what they do online and offline.

The Government hasn't done enough to encourage and promote easy-to-use device-based filters.

Some sexual health groups have called for the Government to ensure children have high quality sex and relationship education (SRE) but the Government recently voted down mandatory SRE.

What can I do about default Internet filtering?
Open Rights Group has launched a petition calling on David Cameron to drop his plans for default Internet filtering. Click here to sign the petition.

[Read more] (5 comments)
