ORG has signed this joint letter to the Metropolitan Police calling on them to scrap plans to use automated facial recognition at Notting Hill Carnival.
This letter is also available as a PDF on Liberty's website.
Cressida Dick, CBE, QPM
New Scotland Yard
Wednesday 16th August 2017
To the Commissioner of the Metropolitan Police,
We are writing to express our concern regarding the Metropolitan Police’s plans to use mobile facial recognition at this year’s Notting Hill Carnival.
We are calling on you to scrap plans to use automated facial recognition at Notting Hill Carnival and to urgently start a dialogue with civil society and Parliament about the use of this technology.
There is no clear legal basis for the use of automated facial recognition in public spaces. Whilst the Protection of Freedoms Act 2012 introduced the regulation of overt public space surveillance cameras in England and Wales, the Act does not contain a single reference to facial recognition. As such, Parliament has never considered or scrutinised automated facial recognition technology. We note that the Government has not published plans for the use of facial recognition technology and has missed, by four years, its own deadline for a biometrics strategy, which would address the use of facial recognition technologies. The Surveillance Camera Commissioner raised the issue again in his March 2017 report, asking, “As new technology becomes available or widely used how do we ensure that it is used within the legislative framework – for example under what legislative footing is automatic facial recognition (…)[?]”.1 The absence of a clear statutory footing and the lack of any parliamentary scrutiny raise a serious and urgent question as to the lawfulness of automatic facial recognition in public spaces. The use of automatic facial recognition in public spaces presents such a significant interference with the right to private life in particular that its use is likely to constitute a breach of the Human Rights Act.
There is no independent statutory oversight of the Metropolitan Police’s use of automated facial recognition technology. The oversight of facial recognition is not within the remit of the Surveillance Camera Commissioner. He noted: “Clarity regarding regulatory responsibility is an emerging issue, for example in automatic facial recognition use by police – which regulator has responsibility – the Biometric Commissioner, the Information Commissioner or Surveillance Camera Commissioner [?]”2 In 2015, the Science and Technology Committee recommended that such oversight fall within the Biometrics Commissioner’s remit, as did the (then) Commissioner. However, the suggestion has not been acted on and facial recognition remains without oversight.
There is an unacceptable lack of transparency around the Metropolitan Police’s use of automated facial recognition. We do not know, for example, in which public places automated facial recognition is in use; where and how the images captured are stored; how long the images are stored for; how and when the images are deleted; what databases the images are being matched against; whether the software is applied to recorded CCTV footage after the fact; whether images are linked up to other databases; whether any or all of the footage, images, or other data is shared with a third party, such as other public authorities or the software provider; who the software provider is and how much is paid for the service; how accurate the software is; and whether the software has been tested for accuracy biases.
Research has shown that facial recognition software can carry racial accuracy biases,3 with some software found to more frequently misidentify black people and women.4 Notting Hill Carnival is an event that specifically celebrates the British African Caribbean community. The complete lack of transparency from the Metropolitan Police about trials of this technology; the lack of information regarding what software the Force intends to use, whether it has been independently tested for demographic biases, or how accurate it is; and the lack of a clear legal basis and independent oversight present a real concern that automated facial recognition could inadvertently lead to discriminatory policing at Notting Hill Carnival. This risk is unacceptable. The choice to use Notting Hill Carnival to trial, yet again, this invasive technology unfairly targets the community that Carnival exists to celebrate. There has been a pattern of black deaths following police contact that has caused considerable concern about discriminatory policing. Deploying this technology at Carnival will only exacerbate concerns about abuses of state power. We reject the notion that year-on-year deployment constitutes a ‘trial’ – for which, by the Force’s own admission, there are “no timescales”;5 and we reject the proposition that it is appropriate to test biometric surveillance on this community.
If the Met were to repeat use of automated facial recognition at Notting Hill Carnival, this would demonstrate a disregard for democratic scrutiny, a disavowal of the Met’s human rights obligations, and indifference to the serious risk of discrimination posed by this technology.
We urge you to abandon plans to use automated facial recognition at Notting Hill Carnival this year. This is not policing by consent. Automated facial recognition is a technology that presents unique threats to human rights and civil liberties. The privacy concerns that this technology gives rise to are on a par with those associated with other forms of biometric surveillance such as DNA databases and ID cards. It is imperative that a serious and meaningful conversation is had involving Parliament, civil society and law enforcement regarding plans for this technology before it is ‘trialled’ or deployed and risks unduly interfering with individuals’ human rights.
Martha Spurrier, Director of Liberty
Deborah Coles, Director of INQUEST
Ratna Dutt OBE, Chief Executive of the Race Equality Foundation
Liz Fekete, Director of Institute of Race Relations
Dr Gus Hosein, Executive Director of Privacy International
Dr Omar Khan, Director of Runnymede
Jim Killock, Executive Director, Open Rights Group
Renate Samson, Chief Executive of Big Brother Watch
Black Lives Matter, UK
The Monitoring Group
Police Action Lawyers Group
1. A National Surveillance Camera Strategy for England and Wales – Surveillance Camera Commissioner, March 2017, para. 35, p.12
2. Review of the impact and operation of the Surveillance Camera Code of Practice – Surveillance Camera Commissioner, Feb 2016, p.15
3. Facial Recognition Software Might Have a Racial Bias Problem – Clare Garvie and Jonathan Frankle, The Atlantic, 7 April 2016: https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991 (accessed 7 August 2017)
4. See evidence provided to the full House Committee on Oversight and Government Reform (US), 22 March 2017: https://oversight.house.gov/hearing/law-enforcements-use-facial-recognition-technology (accessed 7 August 2017)
5. In email correspondence between Detective Inspector Richard Lines (SO15) and Silkie Carlo (Liberty), 2 August 2017.