Digital Privacy

Regulating Age Verification

As the Online Safety Act’s age verification and age assurance duties take effect on 25 July 2025, we recap what the Act failed to do: regulate these tools and protect the privacy of the children and adults who engage with them.

The need for a mandatory privacy scheme

When the UK last proposed age verification for adult content online, we analysed the deficiencies of the proposed code of practice for AV, which, like the current Ofcom codes, left providers to decide for themselves what counts as adequate security. In our view, at a minimum, a mandatory scheme is needed to ensure that the AV systems initiated by the Online Safety Act 2023 universally reach a high standard of data security and privacy. This would not eliminate AV’s privacy impact, nor guarantee that AV achieves its goals, but it would at least prevent the current free-for-all and the privacy debacles it invites. The UK stands alone on this among major European countries: other jurisdictions are at least attempting to address privacy concerns, whether through legislation or through their regulators.

Likewise, other European jurisdictions appear to be clearer about when Age Verification is required. In the UK, by contrast, companies are interpreting their OSA duties much more broadly.

Ofcom cannot mandate privacy standards within its existing OSA powers, but Ofcom and the ICO can work to establish a voluntary standard that would help providers comply with their privacy duties under the Online Safety Act [Section 22(3)].

Industry can currently propose voluntary schemes under existing data protection legislation, overseen by the ICO. Likewise, the ICO could take on a role similar to the CNIL’s and provide clearer guidance on what is acceptable or best practice.

The OSA creates incentives for poor security and privacy

By providing no minimum standard for Age Assurance, and instead relying solely on data protection law, the OSA leaves the main drivers for age assurance primarily commercial: Convenience, Cost and Compliance. The commercial incentives for choosing a provider and AV system include:

  • Convenience to the platform, usually meaning a provider that is US-based and local to the platform;
  • Cost, meaning that some platforms will choose low-cost AV providers that invest less in security, or that collect user data in order to monetise it as part of their business model. This can extend to monetisation of data by the platform itself, which may collect more than it needs in order to increase the benefits it receives from AV;
  • Compliance, meaning that platforms use AV to meet their broader legal duties, such as identifying which of their users are children and which are adults, for a wide variety of content or service decisions.

US-based services are inherently problematic, as providers are used to permissive legal conditions for personal data and are subject to US surveillance laws. It is harder for the ICO to regulate a market of numerous small US, UK and EU providers, each devising its own privacy arrangements, than a market governed by an agreed standards body.

Furthermore, breaches of data protection law are rarely, if ever, enforced by the UK’s Information Commissioner, with the exception of spam and cold calling. This means that companies run no real financial risk even if they seriously break data protection rules. Reputational risks do not go away, but they are insufficient to stop bad decisions from being made, especially in the face of other business pressures.

Additionally, the range and potential ubiquity of unregulated AV systems risks users engaging with actual scams, since users have little or no control over how they verify their age.

This risky and dangerous situation regarding Age Assurance is the opposite of what legislation should be seeking. As we have detailed elsewhere, the risks to users can be extremely high should their data ever leak. We are already seeing a proliferation of tools with poor data practices that could pose risks to users. In this document, we outline the key steps needed to make the Age Assurance required under the OSA sufficiently safe for users to engage with.

Recommendations from our report

In our report on How to Fix the Online Safety Act, we made a number of recommendations:

  • Recommendation 28
    Platforms should provide users with detailed documents regarding the use of their data so that they can understand the risks to their privacy and data.
  • Recommendation 29
    Ofcom and the ICO should work with industry to create a high standard for privacy in age verification.
  • Recommendation 30
    Ofcom should recommend that age verification solutions include the use of high, independently managed data protection standards, and meet interoperability and accessibility needs.
  • Recommendation 31
    Future legislation should incorporate privacy, accessibility, and interoperability requirements for age verification and assurance.
  • Recommendation 32
    Section 81 duties should be redrafted in future legislation to ensure no impact on privacy and as little impact as possible on free expression.

Existing AV standards and work in the EU

A number of existing standards have been developed, such as the Age Check Certification Scheme (ACCS) and PAS 1296, but they are general in nature. While worthwhile, they are in our view insufficiently focused on the problems that must be addressed for the OSA’s purposes.

All AV standards and legislative attempts have been criticised for potential privacy problems and for their likely ineffectiveness, but every European Union effort contrasts with the UK in attempting to address the privacy questions seriously and directly.

European Union

The European Union expects that Age Verification will be needed for adult sites, although its use is likely to be narrower than in the UK. The EU is debating how to establish high-security, privacy-preserving techniques across member states, and as a first step has published an open source demonstration app. While the EU’s approach has difficulties, and also remains voluntary at the EU level, the UK could draw on this work as it progresses, especially as it aims to produce a “user-friendly and privacy-preserving age verification method”, backed by legal requirements to protect fundamental rights.

Some EU member states have also shown significant regulatory concern regarding age verification. For example, France’s law targeting access to pornography (Article 227-24 of the French Criminal Code) does not discuss privacy duties; however, the French data protection authority, the CNIL, has been active in setting out privacy requirements, developing recommendations since 2021 and, in 2022, recommending and demonstrating zero-knowledge systems for age verification. It has also taken steps to prohibit pornographic websites from collecting identification documents. Likewise, the Spanish data protection authority has established a Decalogue of Principles for online age verification.
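The systems the CNIL has demonstrated rest on a separation of knowledge: the party that verifies age never learns which site the user is visiting, and the site never learns who the user is, receiving only a signed “over 18” attestation. The sketch below illustrates that information flow in Python, using an ordinary Ed25519 signature from the cryptography package. It is our simplified illustration, not the CNIL’s specification: the class names are hypothetical, and a production design would add blind signatures or zero-knowledge proofs so that even the token itself could not be used to link the two sides.

    # Simplified sketch of privacy-preserving age verification.
    # Illustration only: class names are hypothetical, and a real
    # deployment would use blind signatures or zero-knowledge proofs
    # so the token cannot link the user to the site they visit.
    import base64
    import json
    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    class AgeVerifier:
        """Checks age out-of-band (e.g. a document or bank check), then
        issues a bare attestation. Never learns the destination site."""

        def __init__(self) -> None:
            self._key = Ed25519PrivateKey.generate()
            self.public_key = self._key.public_key()

        def issue_token(self, user_is_over_18: bool) -> bytes | None:
            if not user_is_over_18:
                return None
            # The claim carries only the age assertion and a short expiry:
            # no name, no document number, no destination site.
            claim = json.dumps({"over_18": True,
                                "exp": int(time.time()) + 300}).encode()
            sig = self._key.sign(claim)
            return base64.urlsafe_b64encode(claim) + b"." + base64.urlsafe_b64encode(sig)

    class AdultSite:
        """Verifies the attestation. Never sees identity documents, only
        the verifier's public key and the signed claim."""

        def __init__(self, verifier_public_key) -> None:
            self._pk = verifier_public_key

        def admit(self, token: bytes) -> bool:
            claim_b64, sig_b64 = token.split(b".", 1)
            claim = base64.urlsafe_b64decode(claim_b64)
            try:
                self._pk.verify(base64.urlsafe_b64decode(sig_b64), claim)
            except InvalidSignature:
                return False
            payload = json.loads(claim)
            return bool(payload.get("over_18")) and payload["exp"] > time.time()

    verifier = AgeVerifier()
    site = AdultSite(verifier.public_key)
    token = verifier.issue_token(user_is_over_18=True)
    print(site.admit(token))  # True: age proven, identity never disclosed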

How to regulate Age Assurance under the OSA

Define the goals of Age Assurance

Age Assurance or verification is being employed to satisfy OSA compliance goals, but compliance is not the same as a social purpose. As a result, it is being used in new and unintended ways.

For example, it is being used to check whether someone is an adult or a child merely to gate access to certain features or content, at a high and unintended cost to free expression. As noted above, it is also being used by platforms and providers for new business purposes, such as gathering more personal data.

For regulators to be able to define and minimise the risks, the OSA needs to define closely when and where third-party age assurance tools are actually needed, and ensure that uses are narrow rather than expansive.

Define minimum legal requirements for Age Assurance

The Online Safety Act must be amended to require that Age Assurance tools:

  • meet high standards of data security
  • ensure data minimisation for the user
  • reduce data retention to the minimum necessary for the AV provider
  • provide as little information as possible to the platform / publisher
  • are prohibited from using data for secondary purposes, such as profiling and behavioural tracking
  • are interoperable, so users can choose their own trusted Age Assurance tool
  • meet the needs of vulnerable adult users
  • are certified as meeting these standards

and that the certification body be obliged to:

  • make their standards public and freely available
  • make their register of approved providers public
  • make regular public reports, including details of any security or privacy issues arising in the reporting period

Not all age verification systems need to comply with an OSA standard, but in our view those employed for OSA purposes must, given their widespread and compulsory nature and their use in relation to adult content and services.
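To make the certification and interoperability requirements above concrete, here is a minimal sketch of how a platform might consume the certification body’s public register of approved providers: an attestation is accepted only if it is signed by a provider currently on the register, so lapsed or revoked certifications stop working as soon as the register is refreshed. The register format and provider names are our illustrative assumptions, not a published scheme.

    # Sketch of a platform checking attestations against the certification
    # body's public register of approved providers. Names and the register
    # format are illustrative assumptions.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
        Ed25519PublicKey,
    )
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    def raw(pk: Ed25519PublicKey) -> bytes:
        return pk.public_bytes(Encoding.Raw, PublicFormat.Raw)

    certified = Ed25519PrivateKey.generate()    # a provider on the register
    uncertified = Ed25519PrivateKey.generate()  # a provider that never certified

    # Published by the certification body; platforms refresh it regularly,
    # so lapsed certifications drop out automatically.
    register: dict[str, bytes] = {
        "certified-av-example": raw(certified.public_key()),
    }

    def accept_attestation(provider_id: str, claim: bytes, sig: bytes) -> bool:
        """Accept an age attestation only if signed by a provider that is
        currently listed on the public register."""
        key_bytes = register.get(provider_id)
        if key_bytes is None:
            return False  # not certified, or certification has lapsed
        try:
            Ed25519PublicKey.from_public_bytes(key_bytes).verify(sig, claim)
            return True
        except InvalidSignature:
            return False

    claim = b'{"over_18": true}'
    print(accept_attestation("certified-av-example", claim, certified.sign(claim)))  # True
    print(accept_attestation("unknown-provider", claim, uncertified.sign(claim)))    # False

Because the register is public, any certified provider’s tokens would work on any compliant platform, which is what gives users a genuine choice of tool.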

Make Certification Mandatory

The government must establish mandatory certification requirements for all age assurance providers operating under the Online Safety Act. Voluntary standards or schemes are insufficient to ensure security, privacy and public confidence. Ofcom can currently recommend such schemes in its guidance (s.82 OSA), but it is not expected that Ofcom would have any role in shaping them. Likewise, Ofcom’s guidance can refer to the ‘principle’ of interoperability, i.e. users choosing their own AV tool, but Ofcom cannot mandate it.

Making a scheme mandatory would require a change to the Online Safety Act. Given the problems we note above, a single high standard should be adopted and enforced.

Recommended Actions:

  • Amend the Online Safety Act to require that AV tools used under the Act are certified
  • Amend the OSA to ensure a single standard is adopted and a certification body is appointed
  • Amend the OSA to give the ICO or Ofcom the power to appoint the certification body, which would create and maintain a detailed standard, with the state regulator approving changes to that standard

Enforce the standard for Online Safety Age Assurance

Either Ofcom or the ICO must ensure that the certification standards body meets the standards required by the Act. Depending on the mechanism, one or the other would approve the standards body. Age Assurance providers would then sign up with the certification body, which would carry out checks to ensure compliance.

In the event of non-compliance, Ofcom or the ICO should be able to:

  • Issue financial penalties
  • Receive grievances and complaints, and deal with potential non-compliance
  • Require secure data deletion where standards are breached or certification lapses
  • Compensate individuals who are directly impacted by a breach

Developing a detailed standard

Much of the work of developing detailed security standards has already been done. Without going into the full detail of what one could look like, we would expect a standard to cover:

  • data and metadata minimisation and other requirements made by the amended act
  • use of zero-knowledge techniques where possible
  • cryptographic requirements for data in transit or at rest
  • limits to logging
  • industry standards for cybersecurity that will be employed
  • vulnerability testing to be employed, especially threat-led testing
  • bounties for vulnerability detection and systems for responsible vulnerability reporting
  • use of standardised risk assessments and response plans and their review by the standards body
  • disclosure of breaches and reporting of vulnerabilities
  • processes for swift fixing of vulnerabilities that are detected

The detailed standard can reference other existing Age Verification standards and schemes, but in our view needs to exist in its own right to meet the requirements created by making Age Assurance ubiquitous and mandatory for certain sites and users.
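Several of these items translate into straightforward engineering controls. As one small illustration of “limits to logging”, the sketch below (our own example; the redaction patterns and logger name are assumptions) strips direct identifiers from audit log records before they are written, so logs retain the event and its outcome but not who was checked.

    # Sketch of a "limits to logging" control: a filter that redacts
    # direct identifiers from log records before they are written.
    # The patterns shown are illustrative, not exhaustive.
    import logging
    import re

    IDENTIFIER_PATTERNS = [
        re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),     # email addresses
        re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"),  # IPv4 addresses
    ]

    class RedactingFilter(logging.Filter):
        def filter(self, record: logging.LogRecord) -> bool:
            message = record.getMessage()  # render %-style arguments first
            for pattern in IDENTIFIER_PATTERNS:
                message = pattern.sub("[redacted]", message)
            record.msg, record.args = message, None
            return True

    logging.basicConfig(level=logging.INFO, format="%(name)s: %(message)s")
    audit = logging.getLogger("av-audit")
    audit.addFilter(RedactingFilter())

    audit.info("age check passed for %s from %s", "alice@example.com", "203.0.113.7")
    # Output: av-audit: age check passed for [redacted] from [redacted]

A standard would of course go further, covering what may be logged at all, retention periods and aggregation, but the point is that these requirements are concrete, testable and enforceable by a certification body.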

REGULATE AGE ASSURANCE PROVIDERS!

Tell the Department for Science, Innovation and Technology that Age Assurance done under the Online Safety Act must be safe, private and trusted.

Sign our open letter

Age verification facts

ORG’s website explains how age verification and age estimation checks (age assurance) under the Online Safety Act 2023 work.

Find out more