Big Brother Watch says UK has a "reckless approach to face surveillance"

“Live facial recognition turns the streets of Britain into AI-powered police line-ups,” according to Madeleine Stone, Senior Advocacy Officer at Big Brother Watch, a civil liberties and privacy campaign organisation.

“Innocent members of the public are being subjected to biometric identity checks as they go about their business,” she said. Her comments come as the Metropolitan Police expands its rollout of Live Facial Recognition (LFR) in south London, including Croydon.

What is LFR?

The force said LFR is a “vital tool” in combating crime. It works by matching a person’s unique facial features against faces on criminal watchlists.

“LFR can be used in a number of ways to prevent and detect crime, find wanted criminals, safeguard vulnerable people, and to protect people from harm – all to keep the people we serve safe,” according to the force’s website.

“LFR is not a ubiquitous tool that uses lots of CCTV cameras from across London to track every person’s movements. It is a carefully deployed overt policing tactic to help locate a limited number of people the police need to find in order to keep London safe,” it said.

It has been used at large public events such as the King’s Coronation, but is now being rolled out en masse to combat offences such as petty crime and shoplifting.

The force has so far reported 45 arrests in the borough of Croydon, assisted by eight deployments of specialised LFR vans.

Privacy Concerns

Big Brother Watch welcomes the crackdown on crime (“everyone wants dangerous criminals off the street”) but is concerned about how LFR data will be used and stored.

“Police have written their own rules on how this technology can be used, allowing innocent people, such as victims and witnesses of crimes, to be placed on watchlists, as well as peaceful protesters and people with mental health conditions,” said Stone.

“The UK’s reckless approach to face surveillance makes us a total outlier in the democratic world, and more in step with the likes of China and Russia,” she said.

There is no specific legislation governing LFR technology, and campaigners argue the public cannot give explicit consent to its use. “They might not know it’s happening. They can’t consent to it properly,” she said.

Alleged Discrimination

Big Brother Watch also raised concerns that the LFR algorithm might be discriminatory. As early as 2019, an independent report published by the University of Essex found it is “highly possible” that the use of LFR by the Metropolitan Police “would be held unlawful if challenged before the courts.”

According to police deployment data, the sole purpose of LFR is to “locate those suspected of having committed, or where there are reasonable grounds to suspect that the individual depicted is about to commit an offence or where there are reasonable grounds to suspect an individual depicted to be committing an offence.”