Views Around the Table on the Online Safety Bill

On 27 September 2022, Open Rights Group (ORG) held a roundtable with civil society – from women’s rights advocacy groups to organisations representing over-surveilled communities – to discuss some of the most concerning aspects of the Online Safety Bill, currently at the report stage in the House of Commons.

Two particular issues were raised during the event: content moderation and encryption. Dr. Edina Harbinja from Aston University and network security expert Alec Muffett gave participants an overview of the Bill’s impact in these two areas.

ORG’s freedom of expression policy manager Dr. Monica Horten chaired the discussion, explaining that the main aim was to hear from participants whether the Bill strikes the right balance between privacy and protection online.

Implications of the Bill

A summary report of the discussion can be found online. Among the issues touched on was how “illegal content”, listed in Schedule 7 of the Bill, would be handled; Harbinja said this would require technology to monitor content so that anything illegal is removed efficiently.

It also covered what is meant by “legal but harmful” content, whether any content moderation algorithm has the nuance required to detect such content, and the free speech implications of monitoring it. There have been some updates on this aspect of the Bill, as discussed below.

Technology capability notices (TCNs) were also explained: they give Ofcom, as the regulator, the power to facilitate an interception warrant, potentially allowing private messaging platforms to be monitored and their encryption broken.

The capability to break end-to-end encryption also makes platforms more vulnerable to hacking and risks surveilled messages being taken out of context, participants heard.
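For readers unfamiliar with the underlying model, the sketch below illustrates the basic idea of end-to-end encryption using the PyNaCl library. It is purely illustrative (the names and message are invented, and real messaging platforms use more elaborate protocols), but it shows why a platform that only relays ciphertext cannot read messages unless an access point is inserted somewhere.

```python
# Illustrative sketch of end-to-end encryption (PyNaCl); not any platform's
# actual implementation. Real protocols add forward secrecy, key verification
# and more, but the core property is the same: only the endpoints hold keys.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
# Private keys never leave the device; the platform sees only public keys.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts a message that only Bob's private key can decrypt.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(b"See you at 6pm")

# The platform relays 'ciphertext' but cannot decrypt it.
# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_private, alice_private.public_key).decrypt(ciphertext)
print(plaintext)  # b'See you at 6pm'
```

Any obligation to scan message content therefore means creating a point where plaintext or keys are accessible to someone other than the endpoints, which is the vulnerability participants warned about.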

Article 8

Privacy was put forward as a right worth protecting in itself, regardless of content, because endless surveillance cannot be allowed “just in case” something untoward is detected.

The trickier part of the discussion, however, was the recurring question of how to get the balance right, given that abuse and crime do occur online and can stir up emotive debate, particularly where those subjected to abuse are women and children.

Some put forward solutions. For instance, there could be transparency around the technology used to monitor services; a cost-benefit analysis of the resources used to find abuse victims was also suggested. In addition, hashing technology has been touted in some rights circles as a possibility, with participants exploring how it could be applied to image-based abuse.
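By way of illustration, hash-matching works by comparing an uploaded file against a database of digests of already-identified material. The sketch below uses an exact cryptographic hash for simplicity; deployed systems typically use perceptual hashes, which also match resized or re-encoded copies. The hash value and function name here are hypothetical.

```python
# Simplified sketch of hash-matching against known image-based abuse material.
# An exact hash (SHA-256) only matches byte-identical files; real systems use
# perceptual hashing so that edited or re-encoded copies still match.
import hashlib

# Hypothetical database of digests of images already verified as abusive.
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the uploaded image's digest is on the known-abuse list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ABUSE_HASHES

# Example: a platform checks an upload before it is published or shared.
print(matches_known_image(b"example image bytes"))  # False
```

How and where such matching runs, and whether it can be done without scanning everyone’s private content, is exactly the kind of trade-off the roundtable debated.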

The sharp end

Widening the lens, however, there were also reminders that online abuse arises from a broader context, and that a holistic response is needed, with resources put into prevention, social services and healing justice, for example.

At the end of the day, the harms being committed warrant a public health and criminal justice response, particularly regarding violence against women and girls. And the impact falls disproportionately on Black and minoritised communities.

A cost-benefit analysis may not reveal those at the sharp end on either side, i.e., those who experience harm and those who are subject to TCNs, over-surveillance and securitisation of their online space.

Policy shift

One caveat to the discussion, however, was that the recent change in the country’s leadership meant the proposals discussed at the time were still subject to debate and finalisation.

Since the roundtable event, the Bill has returned to Parliament with a range of amendments. The government claims these create a ‘triple shield’: removing illegal content, a so-called user empowerment provision allowing users to filter out content, and a requirement for platforms to enforce their own terms and conditions. The implications of this policy shift are set out in our recent blog post. Read it here.

Listening between the soundbites

While the government says its proposals will not hinder free speech or privacy, such assertions do not stack up against a broader consideration of how such content moderation and chat surveillance would work in practice.

And if “legal but harmful” remains in the Bill for children, age verification may need some form of biometric surveillance. 

Read the Online Safety Bill – Sector Support Roundtable: Summary Report here.

Links for further information

Open Rights Group Online Safety Bill Campaign Hub

Open Rights Group Primer on the Online Safety Bill

Alec Muffett’s Primer on End-to-End Encryption

Privacy International’s Encryption Report

Liberty’s Report on the Online Safety Bill

Child Rights International Report on Encryption

Open Rights Group Blog on Immigration Impact of Online Safety Bill

End Violence Against Women Coalition Code of Practice

Racial Justice Network Campaign – Stop the Scan

Racial Justice Network Report on Biodata Collection