As Live Facial Recognition cameras are rolled out across the capital, analysis by City News suggests the technology’s accuracy looks very different depending on how you measure “false alerts”.

Across 203 deployments between September 2024 and September 2025, the Met recorded 10 false alerts. Measured against the 3,147,436 faces scanned, the force says, the chance of a member of the public being misidentified is vanishingly small: around one in every 300,000.

But set those same 10 false alerts against the 2,077 total alerts, the instances where the technology flagged someone as a possible watchlist match, and the rate is far higher. On that measure, false alerts account for around 0.48% of all alerts: roughly one misidentification in every 208.

In January 2026, the Met’s published figures show, the rate was higher still: one false alert in every 125 alerts. The Met told City News it has safeguards in place and has never arrested anyone as a result of a false alert.
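The gap between the two figures comes down to the choice of denominator. For readers who want to check the arithmetic, here is a minimal Python sketch using the figures above (the variable names are ours):

```python
# The Met's published figures, September 2024 to September 2025,
# as reported by City News.
false_alerts = 10
faces_scanned = 3_147_436  # faces scanned across 203 deployments
total_alerts = 2_077       # possible watchlist matches flagged

# The Met's headline measure: false alerts per face scanned.
print(f"Per face scanned: 1 in {faces_scanned / false_alerts:,.0f}")
# -> 1 in 314,744, reported as "around one in every 300,000"

# The operational measure: false alerts per alert generated.
rate = false_alerts / total_alerts
print(f"Per alert: {rate:.2%}, or 1 in {total_alerts / false_alerts:.0f}")
# -> 0.48%, or 1 in 208
```

On the same arithmetic, the January 2026 figure of one false alert in every 125 alerts corresponds to a per-alert rate of 0.8%.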

Live Facial Recognition is increasingly visible in London. British Transport Police says it has launched a six-month pilot at London Bridge station. Sainsbury’s has also announced five new London supermarkets will trial LFR, after a customer said he was misidentified and told to leave a store in Elephant and Castle earlier this year.

Warren Rajah, 42, says he was escorted out of his local Sainsbury’s after being misidentified by its Live Facial Recognition technology

This apparent disparity between the Met’s headline figure and the operational reality could add to controversy around the technology.

The government’s consultation on the legal framework for police use of LFR closes on Thursday, with ministers signalling a rapid expansion of the technology across policing and retail.

The Home Secretary, Shabana Mahmood, has already announced 40 new LFR vans to be rolled out across the UK.

In a statement the Met Police told City News: “The Met follows guidance by the International Organisation for Standardisation (ISO) and National Physical Laboratory on calculating the false alert rate.”

We have approached both organisations for comment, and are awaiting a response.

The Met pointed to data showing Live Facial Recognition (LFR) has taken more than 1,700 dangerous offenders ‘off the streets’ since the start of 2024.

On the impact of those false alerts, the force added:

“We have never arrested an individual who was subject to a false alert. From our annual report, there were no arrests on the back of the 10 false alerts and four of the individuals were not even spoken to. The other six were spoken to for just five minutes or less.”

Being stopped, questioned or singled out, even briefly, can have real consequences. Shaun Thompson, a knife-crime activist, was wrongly stopped by police in February 2024 after LFR misidentified him. He told the BBC that the stop, outside London Bridge station, was “intimidating” and “aggressive.”

“They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest.”

Left to right: Shaun Thompson; Silkie Carlo, director of Big Brother Watch; and MP David Davis

Shaun Thompson brought a legal challenge against the Met Police in the High Court last week; a ruling is pending. He is supported by Silkie Carlo, director of the civil liberties group Big Brother Watch, and by Conservative MP David Davis.

We approached Facewatch, the company that provides the cameras to more than 100 retailers. It said:

“Every alert generated by the system is reviewed by a specialist Facial Analyst before action is taken. Decisions are made in conjunction with human oversight, not by the technology alone. Importantly, the system is designed so that technology alone never triggers enforcement.”