
Information commissioner threatens legal action against police using 'dangerous and inaccurate' facial recognition technology

Report brands software ‘dangerous and inaccurate’ after research by The Independent finds 98 per cent of matches returned by the Met Police system were not criminals

Lizzie Dearden
Home Affairs Correspondent
Tuesday 15 May 2018 13:22 EDT
The technology automatically compares people’s faces to police databases in real time, flagging up potential matches (PA)

The information commissioner has threatened to bring legal action against police forces using controversial facial recognition technology, to protect the public’s privacy and human rights.

The watchdog issued its warning following research by The Independent that showed 98 per cent of returns by software used by the Metropolitan Police were “false positives”.
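The figure works as a simple proportion: of all the alerts the system raised, 98 per cent pointed at people who were not on any wanted list. A minimal sketch of the arithmetic, using hypothetical counts (the article reports the percentage, not the underlying totals):

```python
# Illustrative only: the article reports the 98 per cent figure but not the
# underlying counts, so the numbers below are hypothetical.
true_alerts = 2    # alerts that matched someone genuinely on a watchlist
false_alerts = 98  # alerts against innocent members of the public

false_positive_rate = false_alerts / (true_alerts + false_alerts)
print(f"{false_positive_rate:.0%} of alerts were false positives")  # -> 98%
```

On the same arithmetic, the 91 per cent figure reported later for South Wales Police would mean roughly nine in every ten alerts were wrong.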

A report by Big Brother Watch, presented in parliament, branded the technique “dangerous and inaccurate” and called for all public authorities to stop using the surveillance camera technology.

Speaking at the event, Tottenham MP David Lammy said such powers “must have scrutiny to ensure that they are not abused”.

“That is clearly not the case here,” he added. “This is not effective policing, this is not efficient policing, and innocent members of the public are getting harassed by the police in being asked to prove their identity and innocence.”

Mr Lammy voiced concern about the potential for “conscious and unconscious bias” and profiling of black communities after his own review found inequalities at all stages in the criminal justice system.

Police leaders have defended the software, saying that they do not arrest suspects based on a match alone, have checks and balances in place and delete images that do not generate an alert.

Scotland Yard has deployed its system at the Notting Hill Carnival and Remembrance Sunday commemorations, while South Wales Police used it at last year’s Champions League final in Cardiff.

A police officer watches revellers at Notting Hill Carnival (PA)

Elizabeth Denham, the information commissioner, said the reach of facial recognition technology (FRT) was increasing because of its ability to link to mobile and fixed cameras in real time.

“There may be significant public safety benefits to enable the police to apprehend offenders and prevent crimes from occurring,” she added.

“But how facial recognition technology is used in public spaces can be particularly intrusive. It’s a real step change in the way law-abiding people are monitored as they go about their daily lives.

“There is a lack of transparency about its use and a real risk that the public-safety benefits derived from the use of FRT will not be gained if public trust is not addressed.”

She listed many “unanswered questions” around the technology, including its accuracy, bias and effectiveness, and the lack of national coordination.

The commissioner, who has raised concerns with the Home Office and National Police Chiefs’ Council, warned that any police force using it must prove it solves a specific problem in a way that less intrusive methods cannot.

“Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public,” she added.

Big Brother Watch’s report, which was supported by more than a dozen other groups, said the technology had been rolled out over the past two years in city centres, at political demonstrations, sporting events and festivals.

Its research claimed that use of the software has been “lawless” and could breach the right to privacy under the Human Rights Act.

South Wales Police has defended its use of FRT after it was revealed that 2,000 people at the 2017 Champions League final were wrongly identified as potential criminals (PA)

Scotland Yard wrongly identified 95 people as potential criminals at the 2017 Notting Hill Carnival alone but plans seven more deployments this year, the report said, while a separate system used by South Wales Police was found to be 91 per cent inaccurate.

Researchers said the force stores, for a year and without their knowledge, photos of innocent people whose faces do not flag up on police databases.

The software used in Britain has not yet been tested for demographic accuracy, but in the US concerns have been raised that facial recognition is less reliable for women and black people.

Silkie Carlo, director of Big Brother Watch, said: “Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”

Norman Lamb, chair of the Science and Technology Committee, called on the government to review its approach to facial recognition technology.

The Liberal Democrat MP said misidentification was being caused by “technical inadequacy” and the failure to remove acquitted people’s images from a police database that holds more than 19 million photos.

“It is wrong that the police are using technology which has the potential to wrongly incriminate individuals on UK streets,” he added. “We have already discovered that the systems currently in use are not up to the task, and that law enforcement is unable to effectively manage the images they possess.”

The Metropolitan Police said it was trialling facial recognition technology to help officers in “identifying known offenders in large events in order to protect the wider public”.

A spokesperson said deployments will be “overt”, rather than hidden, and fully evaluated when the experiment ends later this year.

“There have been no arrests resulting from the use of facial recognition technology,” he added.

“Regarding ‘false’ positive matches – we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts.

“All alerts against the watchlist are deleted after 30 days. Faces in the video stream that do not generate an alert are deleted immediately.”
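The Met has not published its implementation, but the retention rules its spokesperson describes reduce to two branches: faces that generate no alert are never kept, and alerts expire after 30 days. A minimal sketch under those stated rules, with all names and structure hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

ALERT_RETENTION = timedelta(days=30)  # stated policy: alerts kept 30 days

@dataclass
class Detection:
    face_id: str
    seen_at: datetime
    matched_watchlist: bool  # did the system raise an alert?

def apply_retention(detection: Detection, alert_store: dict) -> None:
    """Hypothetical logic mirroring the stated policy: faces that generate
    no alert are never persisted; watchlist alerts are stored for 30 days."""
    if detection.matched_watchlist:
        alert_store[detection.face_id] = detection
    # else: the frame is discarded immediately and nothing is written

def purge_expired(alert_store: dict, now: datetime) -> None:
    """Remove stored alerts older than the 30-day retention window."""
    for face_id in [k for k, d in alert_store.items()
                    if now - d.seen_at > ALERT_RETENTION]:
        del alert_store[face_id]
```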

South Wales Police has also defended its capability, which it said helped fight crime and protect the public – generating more than 2,000 positive matches in nine months, helping secure successful convictions and finding vulnerable people in mental health crises.

“Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops,” the force added.

“No one has been arrested where a ‘false positive alert’ has occurred and no members of the public have complained. This is due to the importance we place on human judgement.

“In all cases, an operator will consider an initial alert and will either disregard it, which happens in the majority of cases, or dispatch an intervention team where a match is considered to have been made.”
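That workflow (system alert, then human review, then either a stand-down or a dispatch) is simple to state precisely. A minimal sketch, with hypothetical names; the force has not published its actual procedure:

```python
from enum import Enum

class Action(Enum):
    DISREGARD = "disregard alert"           # the majority of cases
    DISPATCH = "dispatch intervention team"

def triage(alert, operator_confirms_match) -> Action:
    """Hypothetical human-in-the-loop step mirroring the force's account:
    no action is taken on a raw system alert until an operator has
    reviewed it and judged the match to be genuine."""
    if operator_confirms_match(alert):
        return Action.DISPATCH
    return Action.DISREGARD

# Example: an operator rejects a low-similarity match, so no team is sent.
assert triage({"similarity": 0.62},
              lambda a: a["similarity"] > 0.9) == Action.DISREGARD
```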
