Most people who have been wrongly flagged by the Metropolitan Police’s live facial recognition (LFR) technology over the past two years were people of colour.

Though the overall number of false alerts has fallen since the Met first piloted the technology a decade ago, it continues to disproportionately affect people of colour in a city where almost half the population are from a minority background.

A request under the Freedom of Information Act for data covering 2021 to 2026 was answered for 2024 and 2025 only, as demographic data on false alerts prior to 2024 was not held. In 2024, 17 of the 27 people misidentified were Black.

In 2025, eight out of ten people falsely identified were Black.

[Image: a live facial recognition van]

The Met Police declined to comment on these specific findings. However, a spokesperson said the force continues to monitor and review the tool and its false alert rate to ensure ‘fairness, transparency and accountability’ in its use of LFR.

The Met Police twice raised the system’s similarity threshold across 2024 and 2025 (the score above which an alert is triggered when a live face matches a watchlist photo), which appears to have improved its accuracy. The force said it has an exceptionally low false alert rate of 0.0003%, a figure over which a previous City News investigation raised concerns.

The National Physical Laboratory, which independently tests the cameras, said the current threshold of 0.64 created ‘balance across ethnicity and gender’.
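In outline, a threshold of this kind works as follows: the system converts each face into a numerical embedding, scores the live face against every watchlist photo, and raises an alert only if the best score clears the threshold. The sketch below illustrates that logic; the use of cosine similarity, the embeddings and the function names are illustrative assumptions, not details of the Met’s actual system, though the 0.64 figure is the threshold reported by the National Physical Laboratory.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two face embeddings (illustrative metric)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(live_embedding, watchlist, threshold=0.64):
    """Return the ID of the best watchlist match whose score clears the
    threshold, or None if no alert should be raised."""
    best_id, best_score = None, threshold
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

Raising the threshold, as the Met did twice, means fewer live faces clear the bar: false alerts fall, but at the cost of potentially missing some genuine matches.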

The Met Police acknowledged the disparity in false alerts in its most recent annual report, but said cameras were commonly deployed in ‘crime hotspots’ where Black males are ‘statistically overrepresented in both crime and victimisation data’.

Areas where the cameras have recently been deployed include Walthamstow Central, Dalston Kingsland and Oxford Street.

[Image: shoppers on Oxford Street, London, on Christmas Day]

The force said the technology had helped to remove more than ‘1,700 dangerous offenders from London’s streets since the start of 2024’. It added that none of the false alerts in 2025 resulted in an arrest: four of the ten individuals were not approached, while the remaining six were spoken to for five minutes or less.

Concerns over racial bias with positive identifications, highlighted in a University of Cambridge study, prompted Essex Police to temporarily suspend their use of LFR cameras last week.

The study called for further investigation after finding that the cameras identified Black people more accurately than other groups.

The Information Commissioner’s Office (ICO), which audited Essex Police, said ‘all forces should also be conducting routine testing for bias and discriminatory outcomes’, including where bias arises from technology design, training data or watchlist composition.
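The ICO’s call for routine bias testing can be made concrete with a simple statistical check. The sketch below uses the false alert figures reported in this article to test whether the share of Black people among those wrongly flagged is consistent with a reference share; the 13.5% baseline is a placeholder assumption, not an official figure, and scipy’s binomtest is just one of many ways to run such a check.

```python
from scipy.stats import binomtest

# Figures from this article: 17 of 27 false alerts in 2024 and
# 8 of 10 in 2025 involved Black people.
false_alerts = {2024: (17, 27), 2025: (8, 10)}

# Hypothetical reference share, assumed purely for illustration; a real
# audit would compare against the demographic make-up of the faces
# actually scanned, which is not public.
BASELINE = 0.135

for year, (black, total) in false_alerts.items():
    result = binomtest(black, total, BASELINE)
    print(f"{year}: {black}/{total} ({black / total:.0%}) of false alerts "
          f"were Black; p = {result.pvalue:.2g} vs a {BASELINE:.1%} baseline")
```

The choice of baseline is the contentious part: the Met’s point about deployment in ‘crime hotspots’ is, in effect, an argument about which reference population a test like this should use.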

The civil liberties group Big Brother Watch commented:

“It’s disturbing that 80% of the innocent people wrongly flagged by facial recognition were black.”