Facial recognition tech sparks debate in Croydon

Credit: (Jordan Pettitt/PA) (PA Wire)

Croydon (Parliament Politics Magazine) – Croydon police face both support and public backlash as they introduce permanent facial recognition technology, raising civil liberties and privacy concerns.

As reported by East London Lines, while some people applauded the Metropolitan Police’s decision to use facial recognition technology permanently in Croydon, the move has also raised concerns about the force’s over-policing of minority areas.

Croydon will be one of the first sites for the new scheme, with permanent cameras installed on its streets.

The Met has promoted the technology as a way to “prevent and detect crime, find wanted criminals, safeguard vulnerable people, and protect people from harm.”

Installed on street furniture, the cameras will only be switched on when local officers are using the technology to scan passers-by in real time and match them against the Met’s “watchlist.”

Red signs warning that police facial recognition technology is in use will also be posted in the vicinity.

According to a Financial Times investigation from last year, live face recognition had been deployed 180 times in 2024, particularly in areas with a higher concentration of Black people, like Thornton Heath in Croydon, Northumberland Park in Haringey, and Deptford High Street in Lewisham.

In its report Police use of live facial recognition technology, the Met stated that there were “no statistically significant differences in performance based on gender or ethnicity at the settings the police use.”

StopWatch, an advocacy group that monitors police practices, issued a statement in response to the measure:

“In a context where it has been widely reported that marginalised and racialised communities are overpoliced and disproportionately likely to be the target of police powers, these demographics in turn are disproportionately represented in police data, skewing and biasing the data which then goes on to form the basis upon which the algorithm is trained.”

The Met said in an impact assessment:

“The accuracy of the LFR system algorithm was tested by the National Physical Laboratory. At a threshold of 0.6 the False Positive Identification Rate has been found to be equitable between gender, ethnicity and age, in order for there to be no statistically significant imbalance between demographics with a False Positive Identification rate of 0.017% (1 in 6,000).”

A Black activist called Shaun Thompson was detained by police in Croydon for about 30 minutes in June 2024 after he was mistakenly identified as a suspect by the Met’s facial recognition system. He later brought a legal challenge against the Metropolitan Police.

According to MyLondon, Thompson said after the incident that he felt he was being treated as “guilty until proven innocent” and called the technology “flawed.”

Some residents remain optimistic about the new scheme. Omi, a shopkeeper in Croydon, told EastLondonLines:

“I think the police are just doing their job. If they do nothing, they will receive a complaint. It’s fair.”

He added:

“If you haven’t done anything wrong, then why are you bothering?”

He said the cameras are not a bad thing because there will be less criminal activity, and that they make him feel safer. In his opinion, police should install cameras everywhere, not just in Croydon.

Yunns, a resident and shop owner in Croydon, also told ELL he supported the scheme:

“There’s a lot of drug dealers and troublemakers here, not just at night but during the day as well. We need security, outside there are problems like drugs and fighting.”

He went on to say that he does not mind the cameras and does not feel as though he is being watched by the government.

The rollout also comes as police jobs are being cut. A £450 million funding shortfall could force the Metropolitan Police to lay off 2,300 officers and 400 civilian staff, according to Sir Mark Rowley, Commissioner of the Metropolitan Police.

That would amount to a 7% reduction in officers and a 3.5% reduction in civilian staff. He called the potential effect on the service “seriously detrimental.”

How will the community react to the permanent installation of facial recognition cameras in Croydon?

Many local residents and councillors feel the decision was made without proper community consultation or consent, describing the technology as a “gross invasion of privacy” and expressing frustration at the lack of meaningful public discussion or opportunity to object. This has led to feelings of distrust and alienation among parts of the community.

Civil rights groups and campaigners highlight that facial recognition technology disproportionately misidentifies and targets Black and minority ethnic people, increasing risks of over-policing and discrimination. 

Croydon’s diverse population, including a large Black community, raises particular concerns about exacerbating racial disparities in policing. This fuels anxiety that surveillance could normalise racial profiling and deepen community divisions.

Privacy advocates and groups like Big Brother Watch warn that permanent facial recognition normalizes mass surveillance, threatens freedoms, and lacks sufficient legal safeguards or oversight. 

Beth Malcolm

Beth Malcolm is a Scotland-based journalist studying French and British Sign Language at Heriot-Watt University. She is originally from the north west of England but is living in Edinburgh to complete her studies.