October 17, 2019 | Jim Killock

Age Verification is dead – for now

Compulsory Age Verification for adult content was cancelled yesterday by Nicky Morgan, the Culture Secretary. However, the plans may well come back.


Porn viewing histories for 20m people: what could possibly go wrong?

Open Rights Group welcomes this change for two reasons. Firstly, the privacy protections within the scheme were merely optional for AV providers, leaving users at risk of having their porn habits profiled, recorded and leaked. Secondly, the plans included the prospect of widespread Internet blocking of non-compliant sites, raising the prospect of the UK becoming the most prolific Internet censor in the democratic world.

Attempting to regulate all Internet content to ensure it is safe for children is, unfortunately, not an achievable aim. Any steps taken will in practice be partial and come at a cost. It is very unclear that Age Verification, especially when combined with Internet censorship of legal content, would strike a reasonable balance.

A more reasonable approach to child protection would firstly help parents to use content controls, such as filters, on devices, which can be tailored to suit the child’s development; and secondly it would ensure that children are empowered through education, so that they know how to manage their own risks and make sensible choices.

Whether we like it or not, the simple existence of easily distributed pornographic content means that it will always be within the easy reach of under 18s, whatever controls are put in place. Much like drugs, smoking or drinking alcohol, teenagers will ignore the rules if they wish to. Files are extremely easily swapped and shared. While this does not mean we abandon attempting controls, it does mean we should assume they are not a replacement for open discussion with under 18s.

Government policy, we are told, will now focus on a forthcoming Online Harms Bill. This may well bring age verification back into the policy mix, now or at a later date. The proposed approach of a “duty of care” to users is vague, and we believe it is likely to lead to risk-averse platforms over-censoring material, for instance by machine identification, merely to reduce risk. Age verification could easily become a general requirement for platforms, to manage risk more precisely. However, at this stage we are merely speculating about future policy.

For now we hope that Parliamentarians look back at the last two Digital Economy Acts, of 2010 and 2016. Both proposed widespread web censorship: of copyright-infringing sites in 2010, and of adult content in 2016. Both proposed measures that crashed and burned: the 2010 provisions for disconnecting alleged file-sharing households died without trace, and the 2016 Act's complex measures to police some, but not all, adult content now look like they are gone.

What both acts had in common was that the favoured solutions of particular groups – the copyright holders in 2010, and some child protection and anti-porn campaigners in 2016 – were taken up by government as the best way to regulate. Insufficient attention was paid to those who pointed out the obvious flaws.

There is an easy way to avoid these kinds of policy failures: discuss the means of reaching an objective fully with all parties, including industry and the whole of civil society, before choosing a policy to pursue.