
A Policy briefing

Criminal liability for content harmful to children

Amendment NC2 – Sir William Cash and Miriam Cates

Drafted by Dr Monica Horten – 16 Jan 2023

The Amendment is not about child protection. It is about content that is harmful to children, and about restricting their access to it.

The Amendment inserts a criminal liability into Section 11 of the Bill. This is confusing, because Section 11 is headed “Safety Duties Protecting Children”. In fact, duties related to child protection, such as blocking child sexual abuse material, are addressed elsewhere in the Bill. Other duties that could fall into this category, such as those relating to suicide or self-harm content, are either already addressed elsewhere or will be tackled by new government amendments to be tabled in the Lords.

There is no clear and precise definition on the face of the Bill of what content is harmful to children. The situation is similar to that of “legal but harmful” content for adults, which was removed by the government in December. Content harmful to children will be determined by government Ministers after the Bill has been passed into law.

Without any definition, tech companies do not know what they are being held liable for. A criminal liability would be created for offences yet to be defined, based on the actions of others.

They will not know what content is supposed to be taken down or where they should restrict access to it. Fear of a jail sentence will lead to over-moderation, with lawful content removed. It portends the use of upload filters – systems that sweep in and block content before it is posted. It may mean that content is sanitised to a level suitable for a child.

The strengthening of the age assurance requirement in Section 11 (by amendment last December at Re-committal Stage) will have the effect of making age-gating compulsory. This will restrict access to content for children and for adults alike. People will be subject to even more algorithmic decision-making than they are at present. This is a poor template for the UK to set internationally: the UK has traditionally had a high reputation on the international stage for protecting human rights and freedom of expression.

Any move to criminalise those who facilitate and disseminate speech online risks damaging consequences: hostile and repressive states could duplicate it, not for child protection but for political repression.

Published by Open Rights Group, 12 Dukes Road, London, WC1H 9AD, a UK-based organisation that works to preserve digital rights and freedoms by campaigning on digital rights issues. Open Rights is a non-profit company limited by guarantee, registered in England and Wales, no. 05581537.
CC BY-SA 3.0: free to reuse except where stated.