South Wales Police state their case for using facial recognition

On 28 June, the parliamentary session’s final Senedd cross-party group (CPG) on digital rights and democracy, for which Open Rights Group (ORG) acts as secretariat, took place online. Over 20 attendees tuned in to hear a follow-up to a previous group session on facial recognition technology.

During the last meeting in February, we discussed the landmark ruling that found the use of facial recognition by South Wales Police breached the right to privacy under the European Convention on Human Rights. South Wales Police has since resumed use of the technology, and Chief Inspector Scott Lloyd from the force addressed the CPG to explain why. He was joined by Lee Jones – Chief Executive, Police and Crime Commissioner for South Wales – and Madeleine Stone – Legal and Policy Officer, Big Brother Watch. Sarah Murphy MS chaired the meeting.

‘Convinced’ by facial recognition

Lloyd, who is attached to the National Police Chiefs’ Council’s biometrics function – a small team helping law enforcement navigate biometric technology and its adoption – presented the case for live facial recognition as a legitimate and extremely effective capability, particularly as the criminal landscape evolves and becomes more complex. He also stressed that the members of the public the force has engaged with are keen to see the technology used more broadly.

Stressing the distinction between retrospective facial recognition, in use since 2017, and operator-initiated facial recognition – a new form that will be on police mobile devices to help identify people they interact with daily – Lloyd acknowledged that live facial recognition is a more contentious iteration.

Retrospective facial recognition involves using the technology only after an incident, attempting to match a suspect’s image against custody images. Live facial recognition cameras, deployed only where supported by an intelligence case, run software that compares a live feed against a watchlist, after which the image and the biometric template derived from it are immediately deleted.
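The matching loop described above can be sketched roughly as follows. This is a minimal illustration only: the template format (plain number lists), the cosine-similarity scoring and the threshold value are stand-ins assumed for the example, not details of the actual South Wales Police system.

```python
import math

# Minimal, self-contained sketch of a live facial recognition matching loop.
# Biometric templates are modelled as lists of numbers and compared with
# cosine similarity; the real system's template format, scoring method and
# threshold are unknown, so everything here is an illustrative stand-in.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def process_frame(frame_templates, watchlist, threshold=0.9):
    """Compare each face template from a live frame against the watchlist.

    Returns names flagged for an officer to review; templates that match
    nothing are simply discarded, mirroring the immediate-deletion step."""
    alerts = []
    for template in frame_templates:
        for name, ref in watchlist.items():
            if cosine_similarity(template, ref) >= threshold:
                alerts.append(name)
    return alerts

watchlist = {"wanted_person": [0.9, 0.1, 0.4]}
frame = [[0.89, 0.12, 0.41],   # close to the watchlist entry -> alert
         [0.10, 0.90, 0.20]]   # no match -> nothing retained
print(process_frame(frame, watchlist))  # ['wanted_person']
```

The key design point the speakers emphasised is the last branch: a non-matching face generates no record at all, which is what distinguishes this pipeline from retrospective searching of stored images.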

Overall, Lloyd described it as a game changer for police: criminals are apprehended and suspects move through the criminal justice system far more quickly. South Wales Police is one of only a few forces currently deploying live facial recognition; it has been deployed on 70 occasions, resulting in 75 arrests, with no false arrests and no complaints specifically concerning the technology save for two court cases, he explained.

He pointed out that facial recognition was widely available on people’s mobile phones but that “it is right for law enforcement to be subject to the highest levels of scrutiny.”

The courts have asked South Wales Police to look at who they put on watchlists and where they deploy the technology, and said a code of practice and amendments to local policies are needed to remedy the issues identified in the judgment.

As a result, the surveillance camera code was amended in August 2021 to adopt elements of the judgment, as were South Wales Police policy documents. The College of Policing, the professional body for policing, has also since published authorised professional practice for the overt use of live facial recognition.

To understand whether there was any potential bias in the technology, South Wales Police also approached the National Institute of Standards and Technology (NIST) for help, sought clarity from suppliers, and commissioned the National Physical Laboratory to conduct a large study of live facial recognition. As the technology evolves, however, the legal landscape will need to be revisited.

Scrutinising live facial recognition

Jones explained the Commissioner’s role in providing oversight of the use of the technology without getting involved in operational policing: while the force decides what resources can be used, the Commissioner provides oversight that the rationale is justified.

That oversight operates through formal and informal channels and thorough governance arrangements, including formal scrutiny committees. Their approach is guided by standard policy but also takes account of the public’s concerns and academic expertise.

“[The Commissioner] has an independence perhaps people aren’t aware of,” said Jones. Mechanisms include:

  • A Police and Crime Panel, consisting of members from each of the seven local authorities in the South Wales area and independent members; panel members receive demonstrations of the technology to see its progress.
  • A police accountability and legitimacy group – independent members drawn from the community, race equality groups and young people’s organisations – which brings different perspectives, particularly around stop and search and facial recognition.
  • A scrutiny board, on which Jones and his team proactively develop feedback through public engagement, expert input and reports from academia.
  • A Commissioner strategic board – the most senior accountability board, chaired by the Commissioner – which receives reports on technological aspects and other insights, and addresses escalations, including court cases and matters referred to the Police and Crime Panel.
  • An independent ethics committee and experts, involved since the beginning of the technology’s development; independent members with various backgrounds are appointed from within the community.
  • An annual survey is sent to the public where they can give feedback on policing styles.

Changes have been introduced, explained Jones, but it is an iterative process. So far, they’ve identified issues around accuracy when identifying people of colour and younger people, but as the technology is used more, further problems will arise.

Explaining the rationalisation process, he said each deployment is decided individually based on the intelligence picture. For example, the Harry Styles concert warranted live facial recognition because it drew a younger demographic, could attract sex offenders and could be a terrorist target. It was also a concert everyone wanted to attend, so the potential to locate missing persons was high. There is therefore no carte blanche to use the technology, he said.

Starting with surveillance rather than suspicion

Stone set out Big Brother Watch’s perspective on live facial recognition, returning to the distinction between it and the retrospective version. She stressed that finding individuals responsible for criminal wrongdoing differs from surveilling football fans at matches or protesters.

Live facial recognition poses the most significant threat to human rights, privacy and civil liberties in the UK because it is untargeted “mass surveillance”, scanning everyone who walks past it, she said:

“We believe that is a privacy-obliterating technology. There are serious issues with accuracy and discrimination. And lastly, this technology is enormously undemocratic.”

Essentially, using live facial recognition turns policing on its head. Typically, suspicion precedes surveillance: an individual is surveilled because evidence warrants it, because there is a risk of them committing a crime, or for other investigative reasons. Live facial recognition, by contrast, starts with the mass surveillance of everybody who walks past a camera and only then compares them against a watchlist.

The trajectory is that the more we normalise this technology, the more widely it will be used and we could expect it within CCTV networks eventually. We should expect a certain amount of privacy when we walk down the street; this type of technology forms the backbone of a police state. 

Stone also made the point that safeguards are minimal, guidelines unenforceable and the technology still imperfect, despite the millions invested in it.

She also pointed out that statistics indicating a 1 in 60,000 false positive rate are misleading, as they do not account for people who are on watchlists but should not be, owing to the broad-ranging guidelines authored by the College of Policing. In practice, 90% of all matches to date have been false positives. And as watchlists grow, the accuracy and efficacy of the technology will weaken, she said.
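The apparent tension between a tiny per-scan false positive rate and a high share of false alerts is a base-rate effect, which a few lines of arithmetic make concrete. The crowd size and number of genuine matches below are hypothetical figures chosen for illustration, not numbers from the meeting:

```python
# Base-rate arithmetic: a low per-face false positive rate can still mean
# most alerts are wrong when genuine watchlist matches are rare in the crowd.
# The crowd size and true-match count here are hypothetical.

def alert_breakdown(crowd_size, false_positive_rate, true_matches):
    """Return (false_alerts, share_of_alerts_that_are_false_positives)."""
    false_alerts = crowd_size * false_positive_rate
    return false_alerts, false_alerts / (false_alerts + true_matches)

# 540,000 faces scanned at a 1-in-60,000 false positive rate,
# with just one genuinely wanted person in the crowd:
false_alerts, false_share = alert_breakdown(540_000, 1 / 60_000, 1)
print(round(false_alerts, 6))   # 9.0 false alerts
print(round(false_share, 2))    # 0.9 -> 90% of alerts are false positives
```

In other words, both statistics can be true at once: the per-face error rate stays tiny while the overwhelming majority of actual alerts are still wrong, because almost everyone scanned is not on the watchlist.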

The study commissioned by South Wales Police acknowledged bias and found that young Black women are the most susceptible to mismatch. It is important to understand that deployments are taking place in largely Black communities; the fact that the first deployment took place at Notting Hill Carnival, where London’s Black British and African Caribbean communities come together, speaks volumes, Stone added.

Use of facial recognition against protesters not ruled out

Fielding questions from attendees, Lloyd confirmed that there are additional checks and balances concerning some protected characteristics, particularly children, because the technology’s accuracy is lower for younger subjects. He also clarified that the technology has settings at which no bias has been detected, though these may be changed if there is an intelligence case to do so – for example, a lower level of accuracy might be accepted to find a particular target.

There is wide concern that facial recognition will be used on protests, as happened at the King’s Coronation, and although South Wales Police have not used it for such purposes, they did not rule out that possible use.

Community engagement was an important issue to attendees. While there has been engagement, it was acknowledged that much work remains to be done to reach the right community members and address specific concerns, particularly racial bias. Much trust still needs to be built with communities.

Stone explained the broad nature of watchlists, which encompass more than criminal wrongdoing – people with mental health issues and those on particular NHS lists can find themselves on them. ORG asked whether people referred to Prevent, who may end up on certain police databases, could also come onto these watchlists, noting that the Prevent policy already suffers from a high false positive rate and that there are racialised dimensions to who is referred, including erroneously.

Murphy finished by asking whether data could be shared between departments, noting that migrant women experiencing domestic abuse found themselves contacted by the Home Office after reporting it, while the details of people with visible disabilities at a protest in Manchester were passed to the Department for Work and Pensions to threaten their benefits, the reasoning being that “if they were well enough to go to a demonstration they could also work.”

Murphy asked whether facial recognition technology would facilitate that. Lloyd said that would be concerning and that he was not aware it was happening. Jones said he would pick up these points and raise them within the Commissioner’s office.

The CPG will resume later in the year.
