Police hail improved accuracy of facial recognition tech as campaigners urge ban
Research has found minimal discrepancies for sex and race at certain settings but human rights groups have criticised police use of the technology.
Police use of facial recognition technology is a form of mass surveillance that “turns us into walking ID cards” and should be banned, campaigners have said.
The Metropolitan Police, Britain’s largest force, welcomed a research report published on Wednesday that found there were minimal discrepancies for race and sex when the technology is used at certain settings.
It was also found to correctly distinguish between identical twins.
South Wales Police had paused its use of the technology amid concerns over discrimination but will resume in the wake of the report.
Human rights groups Liberty, Big Brother Watch and Amnesty have said the technology is oppressive and has no place in a democracy.
The research, carried out by the National Physical Laboratory, was commissioned by the Met and South Wales Police in late 2021 following fierce public debate about police use of the technology.
Director of intelligence for the Met Lindsey Chiswick said: “This is a significant report for policing as it is the first time we have had independent scientific evidence to advise us on the accuracy and any demographic differences of our facial recognition technology.
“We know that at the setting we have been using it, the performance is the same across race and gender and the chance of a false match is just one in 6,000 people who pass the camera.
“All matches are manually reviewed by an officer. If the officer thinks it is a match, a conversation will follow to check.
“We understand the concerns raised by some groups and individuals about emerging technology and the potential for bias. We have listened to these voices.
“This research means we better understand the performance of our algorithm. We understand how we can operate to ensure the performance across race and gender is equal.”
South Wales Chief Constable Jeremy Vaughan said the system is “a force for good”.
He added: “I believe the public will continue to support our use of all the available methods and technology to keep them safe and thanks to the work of the National Physical Laboratory and the results of its independent evaluation I believe we are now in a stronger position than ever before to be able to demonstrate that the use of facial recognition technology is fair, legitimate, ethical and proportionate.”
But human rights groups hit out at use of the technology, which the research found to be less accurate for black people at other settings, and less accurate for people under the age of 20.
Katy Watts, lawyer at Liberty, said: “We should all be able to live our lives without the threat of being watched, tracked and monitored by the police.
“Facial recognition technology is a discriminatory and oppressive surveillance tool that completely undermines this basic right.
“This report tells us nothing new – we know that this technology violates our rights and threatens our liberties, and we are deeply concerned to see the Met Police ramp up its use of live facial recognition.
“The expansion of mass surveillance tools has no place on the streets of a rights-respecting democracy.”
She said the technology “sows division” and will be disproportionately used on communities of colour.
A recent review by Baroness Louise Casey found that the Metropolitan Police is institutionally racist, misogynist and homophobic.
Ms Watts added: “It’s impossible to regulate for the dangers created by a technology that is oppressive by design. The safest, and only, thing to do with facial recognition is ban it.”
Liberty was among 14 campaign groups, including Big Brother Watch and Black Lives Matter UK, that wrote to Met Commissioner Sir Mark Rowley when he began the job in September demanding an end to police use of the system.
Madeleine Stone, legal and policy officer from Big Brother Watch, said: “Live facial recognition is suspicionless mass surveillance that turns us into walking ID cards, subjecting innocent people to biometric police identity checks.
“This Orwellian technology may be used in China and Russia but has no place in British policing.
“This report confirms that live facial recognition does have significant race and sex biases, but says that police can use settings to mitigate them.
“Given repeated findings of institutional racism and sexism within the police, forces should not be using such discriminatory technology at all.”
False identifications during Met use of the technology include a 14-year-old black schoolboy in uniform, and a French exchange student who had only been in the country for a few days.
Ms Stone added: “One in 6,000 people being wrongly flagged by facial recognition is nothing to boast about, particularly at deployments in large cities where tens of thousands of people are scanned per day.
“If rolled out across the UK, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence.
“Live facial recognition is not referenced in a single UK law, has never been debated in Parliament, and is one of the most privacy-intrusive tools ever used in British policing.
“Parliament should urgently stop police from using this dangerously authoritarian surveillance tech.”
Oliver Feeley-Sprague, Amnesty International UK’s military, security and police director, said: “Against the appalling backdrop of the Casey report and evidence of racist policing with stop and search, the strip-searching of children and the use of heavily biased databases like the gangs matrix, it’s virtually impossible to imagine that faulty facial recognition technology won’t amplify existing racial prejudices within policing.
“The Met’s reliance on police-commissioned research to back up blanket surveillance tactics is just the latest example of a police force in denial over its past failings and unwilling to properly listen to its critics.
“Facial recognition systems are Orwellian and involve a massive intrusion into all our lives – they should have no place in any well-run, properly accountable police service.”