
Why ‘Predictive’ Policing Must Be Banned

The UK Government is trying to use algorithms to predict which people are most likely to become killers, using sensitive personal data from hundreds of thousands of people. The secretive project, originally called ‘The Homicide Prediction Project’, was discovered by Statewatch. They described how “data from people not convicted of any criminal offence will be used as part of the project, including personal information about self-harm and details relating to domestic abuse.”

It may sound like something from a sci-fi film or dystopian novel, but the “Homicide Prediction Project” is just the tip of the iceberg. Police forces across the UK are increasingly using so-called “predictive policing” technology to try to predict crime. Police claim these tools “help cut crime, allowing officers and resources to be deployed where they are most needed.” In reality, the tech is built on existing, flawed police data.

As a result, communities who have historically been most targeted by police are more likely to be identified as “at risk” of future criminal behaviour. This leads to more racist policing and more surveillance, particularly for Black and racialised communities, lower-income communities and migrant communities. These technologies infringe human rights and are weaponised against the most marginalised in our society. It is time that we ban them for good.

That is why we are calling for a ban on predictive policing technologies, to be included in any future AI Act or in the current Crime and Policing Bill. We are urgently asking MPs to demand this ban from the government before these racist systems become further embedded in policing.

Lack of transparency and accountability

So-called “predictive policing” systems are not only harmful in that they reinforce racism and discrimination; there is also a lack of transparency and accountability over their use. In practice, this means people often do not know when or how they, or their community, have been subjected to “predictive policing”, yet they can still be affected across many areas of their lives.

This includes being unjustly stopped and searched, handcuffed and harassed by police. And because data from these systems is often shared between public services, people can experience harms in multiple areas of their lives, including in their dealings with schools and colleges, local authorities and the Department for Work and Pensions. This can affect people’s access to education, benefits, housing and other essential public services.

Even when individuals seek to access information on whether they have been profiled by a tool, they are often met with blanket refusals or contradictory statements. The lack of transparency means people often cannot challenge how or why they were targeted, or all the different places that their data may have been shared.

In an age where “predictive policing” technologies are being presented as a silver bullet for crime, police forces should be legally required to disclose all the “predictive policing” systems they are using, including what they do, how they are used, what data they rely on and the decisions they influence.

It should also be legally required that individuals are notified when they have been profiled by “predictive policing” systems, with clearly defined routes to challenge the profiling and to trace everywhere their data is held. Without full transparency and enforceable accountability mechanisms, these systems risk eroding the very foundations of a democratic society.

Beyond ‘predictive’ policing, towards community safety

The failures of “predictive” policing have been well documented – from reinforcing racist policing to undermining human rights. But rejecting these technologies does not mean giving up on public safety. On the contrary, it means shifting resources and attention to solutions that are proven to work, that respect human rights and that are based on trust, not fear. This means investing in secure housing, mental health services, youth centres and community-based support services for people experiencing hardship or distress. If safety is the goal, prevention, not prediction, should be the priority.

Ban crime-predicting police tech

‘Crime-predicting’ AI doesn’t prevent crime – it creates fear and undermines our fundamental right to be presumed innocent.

Sign the petition
End Pre-Crime