End Pre-Crime
Data and content are being weaponised to criminalise people without cause, fuelled by facial recognition technology, AI and surveillance.

What is Pre-Crime?
The police and criminal justice authorities are increasingly using tech, data and AI to identify people who they believe are at ‘risk’ of committing crimes. This results in action being taken against people, for example through joint enterprise prosecutions, even though they have not committed a crime.
Flawed technologies like facial recognition, police gang matrices, the data mining of social media and the Prevent programme undermine our presumption of innocence.
Biases embedded in these systems and ‘gang narratives’ that are used to build joint enterprise cases exacerbate discrimination that’s inherent in the criminal justice system already.
We urge transparency, scrutiny and a moratorium on the use of policing technologies that amplify systemic oppression.
Ban ‘crime-predicting’ police tech
Predictive policing is built on existing, flawed police data. Over-policed communities are more likely to be identified as at ‘risk’ of criminality, leading to more racist policing.
Sign the petition
AI and Predictive Policing
Online content is being mined and used as digital evidence to create gang narratives. Alongside extended criminal liability, the weaponisation of content and data is increasingly being used to imprison young Black people and people of colour for offences they have not committed.
WRITE TO YOUR COUNCIL: BAN PREDICTIVE POLICING
Tell your councillor to ban predictive policing in your community.
Take Action
Why predictive policing must be banned
We have the right to be presumed innocent, not predicted guilty
Find out more
The Prevent Duty
Prevent operates in the pre-crime space, where no offence has taken place but people are nonetheless surveilled and viewed as suspicious. It works by extracting data and policing information, further securitising the spaces of marginalised and vulnerable communities.
Prevent and the pre-crime state
ORG’s report on how unaccountable data sharing is harming a generation
Find out more
Mass surveillance
Live facial recognition and surveillance technology generates sensitive biometric data. Its disproportionate misidentification of younger Black men in particular highlights the discriminatory nature of the technology.
Facial recognition: no oversight, no consent
Wales cross-party group on surveillance and facial recognition technology in the UK
Find out more
Home Office CCTV: free mass surveillance?
Government programme to supply security equipment to faith institutions
Find out more
The case against police body-worn video cameras
How police surveillance technologies may enable police abuse of power
Find out more
KEEP UP TO DATE WITH OUR CAMPAIGNS
Subscribe to our newsletter to receive updates on the latest developments affecting your digital rights.
Sign up now