As Live Facial Recognition (LFR) cameras are rolled out across the capital, data analysis by City News suggests the software may be less accurate than previously claimed. According to the most recent data, nearly one percent of people flagged to police by the technology are being misidentified.

The Met Police stressed the chance of someone being misidentified as a suspected criminal is vanishingly small, or about one in every three hundred thousand people who walk past a camera. But if so-called false alerts are measured against total alerts – raised when the technology flags an individual as a match for someone on a criminal watchlist – the rate is much higher.

A spokesperson for the Met Police told City News it followed scientific and independent guidance on calculating the false alert rate.

Across 203 deployments from September 2024 to 2025, the Met recorded ten false alerts out of 3,147,436 faces scanned by live facial recognition cameras. Measured instead against the 2,077 total alerts raised, the rate is significantly higher: roughly one misidentification in every 208 alerts.
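To illustrate how the same ten errors produce such different rates, the short Python sketch below recomputes both figures from the numbers reported above (the variable names are ours, for illustration):

    # Two denominators for the same ten false alerts (Sept 2024 to 2025 figures)
    faces_scanned = 3_147_436   # everyone who walked past a camera
    total_alerts = 2_077        # times the system flagged someone as a watchlist match
    false_alerts = 10           # flags later found to be misidentifications

    per_face = faces_scanned / false_alerts    # ~314,744 – the Met's headline measure
    per_alert = total_alerts / false_alerts    # ~208 – measured against alerts raised

    print(f"1 false alert per {per_face:,.0f} faces scanned")
    print(f"1 false alert per {per_alert:,.0f} alerts raised")

The gap is a base-rate effect: millions of faces pass the cameras, but only a few thousand trigger alerts, so the same ten errors look negligible against one denominator and far more significant against the other.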

In the first month of 2026, the rate was higher still: one false alert in every 125 alerts.

The Met Police told City News it has stringent safeguards in place to protect people’s privacy and has never arrested an individual who was subject to a false alert.

Live Facial Recognition cameras are becoming part of life in London, and British Transport Police has launched a six-month pilot at London Bridge station.

Sainsbury’s has announced five new London supermarkets will be trialling LFR, despite one of its customers being misidentified and instructed to leave a store in Elephant & Castle earlier this year.

Warren Rajah, 42, was escorted out of his local Sainsbury’s after staff incorrectly used Live Facial Recognition technology.

This apparent disparity between the Met’s headline figure and the operational reality could add to controversy around the technology.

The government’s consultation on a legal framework for the use of LFR by law enforcement closes on Thursday, with ministers signalling a rapid expansion of the technology across policing and retail.

The Home Secretary, Shabana Mahmood, has already announced 40 new LFR vans to be rolled out across the UK.

In a statement the Met Police told City News:

“The Met follows guidance from the International Organization for Standardization (ISO) and the National Physical Laboratory on calculating the false alert rate.”

They pointed to data showing LFR has taken more than 1,700 dangerous offenders ‘off the streets’ since the start of 2024.

On the impact of false alerts, the force said:

“We have never arrested an individual who was subject to a false alert. From our annual report, there were no arrests on the back of the 10 false alerts and four of the individuals were not even spoken to. The other six were spoken to for just five minutes or less.”

Being stopped, questioned or singled out – even briefly – can have real consequences. Shaun Thompson, an anti-knife-crime activist who was wrongly stopped by police in February 2024 after LFR misidentified him, told the BBC in an interview that the stop outside London Bridge station was “intimidating” and “aggressive.”

“They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest.”

Left to right: Shaun Thompson, Big Brother Watch director Silkie Carlo, and MP David Davis

Shaun Thompson brought a legal challenge against the Met Police in the High Court last week, and judgment is pending. He is supported by Silkie Carlo, director of the civil liberties group Big Brother Watch, and by Conservative MP David Davis.

We approached Facewatch, the company that supplies the cameras to more than 100 retailers. A spokesperson said:

“Every alert generated by the system is reviewed by a specialist Facial Analyst before action is taken. Decisions are made in conjunction with human oversight, not by the technology alone. Importantly, the system is designed so that technology alone never triggers enforcement.”