
Facial recognition trial in London results in zero arrests, Metropolitan Police confirm

Opponents have criticised the technology over the rate of 'false positives' and privacy and human rights concerns

Lizzie Dearden
Home Affairs Correspondent
Tuesday 03 July 2018 16:15 EDT
Police are trialling controversial facial recognition technology in Stratford

Police have admitted that no one was arrested during a trial of controversial facial recognition technology, which sparked privacy and human rights concerns.

Scotland Yard had hailed the pilot in Stratford, which The Independent revealed to be one of several planned across London this year, as a method of identifying wanted violent criminals and cracking down on attacks.

But observers saw only one person stopped after being identified as a potential match between live camera images and photos of a known suspect.

Scotland Yard did not say how many people were questioned in total but confirmed there were no arrests resulting from facial recognition equipment that was positioned on a pedestrian bridge.

Detective Superintendent Bernie Galopin, the force’s lead for the technology, said: “This deployment formed an important part of ongoing trials and a full review of its use will take place once they have been completed.

“It is important to note all the faces on the watchlist used during the deployment were of people wanted by the Met and the courts for violence-related offences.

“If the technology generated an alert to signal a match, police officers on the ground reviewed the alert and carried out further checks to confirm the identity of the individual.

“All alerts against the watchlist will be deleted after 30 days and faces in the database that did not generate an alert were deleted immediately.”

Police officers stand among shoppers in Stratford, East London. (The Independent)

Hannah Couchman, an advocacy and policy officer at Liberty who monitored Thursday’s trial and had access to the Met’s operations room, said she saw only one “match” in around two hours – a young black man.

“The alert came up on the computer screen and it seemed quite clear to me and my colleague that there wasn’t a great deal of similarity between the live image and the ‘probe’ image,” she told The Independent.

“We had been assured by the Met that where an alert comes up there has to be that human interaction where an officer makes a verification as a human being and says ‘that looks sufficiently similar for us to intervene’.

“We didn’t see that happen – we saw an immediate radio out with a description and by the time I walked out on the bridge there were already two officers who had taken this man to one side.”

Ms Couchman said the man took his rucksack off and had his pockets turned out by police, handing over an Oyster transport card used by under-18s and students that displays the user’s name and photo.

Officers talked to the man and communicated with colleagues to confirm he was not the suspect whose image had been flagged as a potential match before letting him go.

Afterwards, Ms Couchman said the man “said he didn’t completely understand what had just happened and he found it quite frustrating”.

“He was given a leaflet by an officer at the end when it was all over and that was the only one I saw the whole day,” she added.

Scotland Yard said the Stratford operation would be “overt” and that members of the public passing the cameras would be handed leaflets and talked to by police, but The Independent did not observe any information being proactively given out.

Amid the throngs of shoppers, the vast majority of people passing through a line of police officers straddling a bridge between two shopping centres appeared not to see the posters stating that facial recognition technology was in use.

Liberty is among the campaign groups raising privacy and human rights concerns over the technology, and is backing potential legal action against South Wales Police by a Cardiff resident who believes his face was scanned at a peaceful anti-arms protest and while doing his Christmas shopping.

Opponents argue that the software currently being used by British police forces is “staggeringly inaccurate” and has a chilling effect on society, while supporters see it as a powerful public protection tool with the ability to help track terrorists, wanted criminals and vulnerable people.

Det Supt Galopin said automatic facial recognition technology was being tested in a variety of environments including protests, sporting events and crowded public spaces.

“The [Stratford] deployment formed part of the Met’s ongoing trial of the technology, and was used to further assess how it can support standard policing activity and assist in tackling violent crime,” he added.

“We will now consider the results and learning from this use of the technology.”

Two suspects were arrested during the Stratford trial after a knife arch operated separately by the Met’s Violent Crime Task Force and British Transport Police detected two weapons being carried by people near the Westfield shopping centre.

Police have committed to at least 10 separate trials, following previous deployments at Notting Hill Carnival, Remembrance Day services and at the Port of Hull, ahead of a full evaluation at the end of this year.

The Independent previously revealed that Scotland Yard’s software was returning “false positives” – images of people who were not on a police database – in 98 per cent of alerts.

Commissioner Cressida Dick will be questioned on the use of facial recognition at a hearing of the London Assembly’s Police and Crime Committee on Wednesday.

Last week the government announced the creation of a new oversight and advisory board for facial recognition in law enforcement, which could be expanded to ports, airports, custody suites and on police mobile devices.

The Home Office’s strategy on biometrics, which also includes fingerprints and DNA, said the board would make recommendations on policy changes and oversight arrangements for the technology, which is currently being purchased ad hoc by police forces.

The use of facial recognition is more prevalent in the US, where it was used to track down an alleged mass shooter following a massacre at a newspaper's office last week.
