Palestine Action ruling: Human rights organisations call for Ofcom to issue guidance on content takedowns

Human rights organisations, academics and writers have written to ask Ofcom to provide immediate guidance to tech platforms following the High Court ruling that the UK Government’s proscription of Palestine Action as a terrorist organisation was unlawful. The Government’s decision to appeal the ruling has left uncertainty over how online content relating to Palestine Action should be moderated.

The letter, organised by Open Rights Group, has asked Ofcom to clarify whether platforms are still expected to remove content, how they will implement new duties to remove terrorist content and whether content will be restored if the government loses its appeal.

Open letter to Ofcom


Shortly after the High Court ruling, the Met said that officers will no longer arrest people at protests who express support for Palestine Action, for example by holding up banners, although they will continue to gather evidence. The Chief Magistrate, Paul Goldspring, said that ongoing court cases will also be adjourned as “there is no merit in hearing the cases until we know what is going on with the appeal.”

However, there has been no clarification from Ofcom about what the ruling will mean for online platforms, which have duties to remove ‘terrorist’ content under the Online Safety Act. This will become an even more urgent issue if new requirements to proactively scan for illegal content come into effect later this year, as expected.

Sara Chitseko, Pre-Crime Programme Manager at Open Rights Group said:

“The UK’s vague definition of terrorism and legal duties under the Online Safety Act already risk content being wrongly defined as illegal and removed. Now there is additional confusion over whether tech companies are targeting and removing online content relating to Palestine Action.

“In light of the court’s judgment and commentary on freedom of expression, Ofcom need to provide immediate guidance to ensure that important public debates about Palestine are not being censored.”

As the government is appealing the High Court decision, Palestine Action is still proscribed. The legal position is that content supportive of Palestine Action must be removed when a platform finds it or when it is reported to them. This is clearly excessive, especially in light of the court’s judgment that the proscription amounted to a “very significant interference” with the rights to freedom of speech and freedom of assembly.

In August 2025, ORG and other human rights organisations wrote to Ofcom and tech platforms to raise concerns that legal duties under the Online Safety Act combined with the UK’s vague definition of terrorism would lead to content being wrongly identified as illegal and removed.

They warned that “the proscription of Palestine Action may result in an escalation of platforms removing content, using algorithms to hide Palestine solidarity posts and leave individuals and those reporting on events vulnerable to surveillance or even criminalisation for simply sharing or liking content that references non-violent direct action.”

Open Rights Group met with Ofcom to discuss our concerns, but little light was shed on how tech platforms would ensure that legal content was not censored.

People in the UK who believe they have been wrongly deplatformed or had content removed have no independent mechanism for redress — only internal platform appeals processes. Ofcom has also been found to have encouraged platforms to adopt over-cautious ‘bypass strategies,’ censoring content beyond what the law strictly requires to avoid regulatory scrutiny.

It is clear that UK duties to remove terrorist content are not only open to abuse, but also make it impossible for platforms to balance free expression rights, as they require blanket action.

The threat to Palestine-related political content could be made worse by the expected introduction of new Online Safety powers, which may come into effect before the government’s appeal goes to court.

These powers were outlined in yet another lengthy and complex Ofcom consultation last summer and included plans to restrict livestreaming, scan for illegal content and algorithmically suppress content. The following duties are of particular concern for political content related to Palestine.

Pre-publication scanning: Tech platforms will have to scan posts and files being shared for ‘illegal’ content and remove it before it’s even published. Automated filters can’t understand context or nuance — legitimate political content about Palestine (and more) could be flagged and removed before anyone sees it.

Algorithmic suppression: Recommender systems will have to de-prioritise content that might be illegal until it is reviewed. This is supposed to stop the spread of extremist content, but it could mean lawful activism and protest footage is hidden from feeds — even if it breaks no rules. Open Rights Group has already seen its own content about surveillance de-prioritised on TikTok after being wrongly flagged.

It is unclear how unpublished algorithms can meet minimum standards of transparency and accountability. This is, however, an area where Ofcom could push for progress, so that we can understand how these systems impact speech.

Police emergency takedown powers: During a designated ‘crisis,’ police will have direct lines to platforms to demand immediate content removal. Without independent oversight, live protest footage and dissenting political voices could be silenced in real time.

As Open Rights Group has warned, if introduced, these new duties pose a clear risk that political speech and activism on Palestine and other issues could be censored.

Fix the Online Safety Act