Police used facial recognition technology lawfully, High Court rules in landmark challenge
Ed Bridges vows to appeal High Court ruling after judges find human rights and data protection not violated
An activist has lost the world’s first legal challenge over police use of facial recognition, after the High Court found the technology was being used lawfully in Wales.
Ed Bridges said his human rights had been violated by the “intrusive surveillance tool”, after he was scanned at a protest and while Christmas shopping in Cardiff.
But High Court judges dismissed his case against South Wales Police after finding that human rights and data protection laws had not been violated.
Speaking after the ruling, Mr Bridges vowed to appeal the decision after crowdfunding almost £7,000 for the legal battle.
He said South Wales Police had been using the controversial technology indiscriminately against thousands of people without their consent.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance,” Mr Bridges added.
He was supported by the Liberty human rights group, which vowed to campaign for an outright ban on facial recognition in Britain, amid concerns over its use by private companies.
“This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms,” said Liberty lawyer Megan Goulding.
“It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”
The judgment said 500,000 faces may have been scanned by South Wales Police so far, and that the legal framework governing facial recognition must be subject to periodic review.
In their ruling, Lord Justice Haddon-Cave and Mr Justice Swift said: “We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales Police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.”
Lord Justice Haddon-Cave said the case was the “first time that any court in the world had considered” automatic facial recognition (AFR).
“The algorithms of the law must keep pace with new and emerging technologies,” he added.
During a three-day hearing in May, Mr Bridges’ lawyers told the High Court that the technology allowed the police to “monitor people’s activity in public in a way they have never been able to do before”, without having to gain consent or use force.
Dan Squires QC said: “The reason AFR represents such a step change is you are able to capture almost instantaneously the biometric data of thousands of people.
“It has profound consequences for privacy and data protection rights, and the legal framework which currently applies to the use of AFR by the police does not ensure those rights are sufficiently protected.”
Mr Squires said Mr Bridges had a “reasonable expectation” his face would not be scanned in a public space and processed without his consent while he was not suspected of wrongdoing.
South Wales Police chief constable Matt Jukes said the force had conducted “careful legal” and ethical work.
“I recognise that the use of AI and face-matching technologies around the world is of great interest and at times, concern,” he added.
“So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme. With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.
“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
Facial recognition technology maps faces in a crowd by measuring the distance between features, then compares results with a watch-list of images – which can include wanted criminals, missing people and persons of interest.
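The matching process described above can be sketched in simplified form. The snippet below is an illustration of the general technique only, not South Wales Police’s actual system: each face is reduced to a short numeric “faceprint” (here, made-up inter-feature distances), and a probe face matches a watch-list entry when the Euclidean distance between the two faceprints falls under a threshold. All names, values, and the threshold are hypothetical.

```python
import math

def euclidean(a, b):
    # Straight-line distance between two faceprints of equal length.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_against_watchlist(faceprint, watchlist, threshold=0.3):
    """Return the name of the closest watch-list entry within the
    threshold, or None if no entry is close enough."""
    best_name, best_dist = None, float("inf")
    for name, stored in watchlist.items():
        d = euclidean(faceprint, stored)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical watch-list of stored faceprints (wanted criminals,
# missing people, persons of interest in the article's terms).
watchlist = {
    "wanted_person": [0.32, 0.58, 0.41],
    "missing_person": [0.75, 0.22, 0.63],
}

print(match_against_watchlist([0.33, 0.57, 0.40], watchlist))  # near match
print(match_against_watchlist([0.10, 0.90, 0.10], watchlist))  # no match
```

The choice of threshold drives the trade-off the article goes on to describe: set it too loosely and innocent passers-by are wrongly flagged as “false positives”; set it too tightly and genuine matches are missed.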
As well as a high rate of “false positive” alerts, where innocent people are wrongly flagged to police, campaigners have raised concern over evidence of racial bias in the software.
South Wales Police has been using the technology since 2017 and is considered the national lead force on its use.
London’s Metropolitan Police is facing a separate legal challenge over its own facial recognition trial, in which 96 per cent of the alerts generated reportedly misidentified members of the public as criminals.
A spokesperson for the Information Commissioner’s Office, which intervened in the case, said it would be reviewing the judgment carefully.
“We welcome the court’s finding that the police use of live facial recognition (LFR) systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018,” the watchdog added.
“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”
The Information Commissioner’s Office has conducted its own investigation into trials by South Wales Police and Scotland Yard, and said it would consider the High Court’s findings before publishing national recommendations.