UK Facial Recognition – No Oversight, No Consent

On 3 February 2023, the Wales cross-party group on digital rights and democracy – for which Open Rights Group serves as the secretariat – held its fourth session on surveillance and facial recognition technology in the UK. Chaired by Sarah Murphy MS, the session heard powerful accounts from campaigner Ed Bridges, who launched the groundbreaking legal challenge Ed Bridges vs South Wales Police; Madeleine Stone from Big Brother Watch; and researcher Stephanie Hare.

While justice and policing are not devolved in Wales, Murphy stressed how important it was to have the discussion even though the decisions on rights are being made elsewhere – in Westminster and the Home Office. And as the speakers reiterated, the technology is having a resounding effect on the right to privacy, on discrimination and bias, and on children’s rights.

“Imagine if South Wales Police or another force stopped people while they walked down their local high street and took their fingerprints on the off chance that they were a wanted criminal; people would object to that.”

Ed Bridges

Indiscriminate and disproportionate intrusion into privacy

Bridges kicked off after Murphy summarised the legal challenge he launched in 2018. Bridges acted because he took exception to the fact that he could be walking down the street, doing nothing wrong, and a facial recognition camera could still identify him.

In a judgment handed down in August 2020, the Court of Appeal agreed with submissions from Liberty, with whom Bridges brought the appeal, and found that South Wales Police’s use of facial recognition technology breached Bridges’ right to privacy under Article 8 of the European Convention on Human Rights, along with data protection and equality laws, and that the force had failed to properly investigate whether the software exhibited gender bias.

Bridges said there were three main reasons why facial recognition was problematic:

1. Lack of consent – The tech makes a biometric map of your face and checks it against a database. That biometric map is unique to the individual, more akin to a fingerprint than an image.

“Imagine if South Wales Police or another force stopped people while they walked down their local high street and took their fingerprints on the off chance that they were a wanted criminal; people would object to that,” explained Bridges.

2. The chilling effect – The second time Bridges was scanned, he was at an arms fair protest in Cardiff. He reflected that the technology made people feel like they were being criminalised, which could deter them from attending peaceful demonstrations in the future. He mused that if the same thing happened in China or Russia, it would be labelled an abuse of power.

3. The system’s inherent bias – Facial recognition can only ever be as good as the algorithm which supports it. There’s plenty of data showing that facial recognition software is less effective at recognising female faces and non-white faces, he said.

Incompatible with democracy

Stone, who specialises in surveillance technology and its impact on human rights, spoke of Big Brother Watch’s experience tracking the growth of facial recognition in the UK from its first deployments. Describing what it is like for people on the ground when the organisation monitors deployments of live facial recognition, she gave several examples of how the tech, as well as violating privacy, also misidentifies people – predominantly young Black men.

In one deployment in Romford, London, a 14-year-old Black schoolboy in uniform was misidentified by facial recognition on his way home from school. It was his first such encounter, and he didn’t know what was happening when plainclothes officers surrounded him and called him into an alleyway, asking for his fingerprints and ID. He didn’t realise they were police officers until uniformed officers appeared, and he carried no ID because he was only 14; he tried to use his school tie to show which school he attended and verify his identity. The experience shook him.

“That is a really negative experience to have with police at such a young age. And it leaves him not only with the idea that the police might be racist but also that technology might be racist. And that’s something that is a lot more difficult to fight back against if you feel like the various systems that police are using have biases against you,” she said.

In addition, there is very little oversight: no legislation, parliamentary debate or public consultation, which leaves police forces to write their own rules and make their own decisions on how facial recognition tech is used. Forces have been reluctant to offer full transparency, and it has taken organisations like Big Brother Watch and Liberty several years of pressure, including legal challenges, to get the policies in use fully disclosed. The Met Police even commissioned a review into how live facial recognition complies with the law and what human rights risks it poses; when the review concluded that the technology posed a serious threat to human rights, it was buried, said Stone.

Children can’t consent

Hare presented her chapter on facial recognition from her book, Technology is Not Neutral: A short guide to technology ethics.

Particularly troubling to Hare was that facial recognition technology was being used on children. Britain made worldwide headlines last year because of facial verification technology, a form of facial recognition, being used in schools in North Ayrshire, Scotland.

When Hare asked the Scottish Biometrics Commissioner about this, they said it was not within their remit, as it didn’t involve crime, but they thought that children in school in Scotland should be able to sit in class or take school meals without being watched and recorded. In other words, it was not a proportionate or necessary use of the tech.

The ICO had also just declared that the technology had been deployed in a manner likely to have infringed data protection law and was unlikely to be compliant with the GDPR.

Hare said it was important to ask whether children can consent in any situation involving facial recognition. Could they really read all the research and the legislation to make an informed choice? They will likely comply rather than object, simply to get on with their day, so the power imbalance is apparent. Thus, in Hare’s view, this technology should not be used on children, and special protections should be put in place.

Data protection

It was noted that data protection legislation provides some rights to individuals in the case of surveillance technologies such as CCTV. For instance, if you are in footage taken by the police at a football match, data protection legislation allows you to write to the relevant police force and ask for the footage. Yet there is no equivalent right around the use of facial recognition.

“It is the Wild West in legal terms,” said Bridges.

To find out more about the developing landscape around surveillance, digital rights and inequity, Stone suggested following groups such as Big Brother Watch, Liberty, Open Rights Group, the Algorithmic Justice League, the Runnymede Trust, the Racial Equality Network and the Joint Council for the Welfare of Immigrants.

Open Rights Group is also doing important work to raise awareness of the weakening of data protections being proposed by the government.

And to keep the issue relevant in the eyes of lawmakers, it is important to write to your elected representative and keep the problems associated with surveillance and facial recognition tech firmly in their consciousness.

Links to further resources:

End Pre-Crime – Data and content is being weaponised to criminalise people without cause.