“You know what they say: if you ain’t a criminal, you ain’t got nothing to worry about,” one man said as he passed Romford train station, where the Metropolitan Police this week held a trial of facial recognition software.
Live facial recognition (LFR) cameras, which scan passersby for possible matches to wanted individuals, have been trialled by the Met across London since last year. Police say the measure could become “invaluable” in day-to-day operations.
The system streams digital images directly to a database containing a watch list of people taken into police custody. This week’s test in east London’s Romford town centre saw eight people, half of them teenagers, arrested. A live trial of the technology in Westminster led to two arrests in December.
But its trials have been protested by various anti-surveillance groups, including Big Brother Watch, and as people passed the van in Romford, one campaigner said: “Well, we’ve all got something to worry about when it comes to this”.
The deployments have run for eight hours on each day of the tests, and police say that if the tech generates an alert of a match, officers on the ground will stop and carry out further checks to confirm the person’s identity, explaining to the individual why they have been stopped.
It has previously been used at Notting Hill Carnival in 2016 and 2017, Remembrance Day events in 2017, the Port of Hull docks and London’s Stratford station last year.
The scheme is part of efforts to crack down on rising levels of violent crime in the capital. London saw the highest number of homicides in a decade last year, with 132 people killed.
When HuffPost UK visited Romford this week, it was the eighth trial in a series of 10. It was a bitterly cold, sunny winter morning in the town centre, where plainclothes and uniformed officers were patrolling the area surrounding a marked police vehicle, kitted out with two visible CCTV-style cameras.
About five metres away from the van stood a Met-branded information board explaining that live facial recognition technology was in place.
But if the many people walking by failed to notice it, their attention was likely caught by a group of protesters from civil liberties groups Liberty and Big Brother Watch, holding leaflets and placards campaigning against the use of the technology.
Gavin, an onlooker, stopped to read the police sign, and told us he supports the measures as an effective way to tackle knife crime.
“It’s a better way to stop and search that way than it is to stop and search a group of people or someone that they’ve got no inclination of. Sometimes they stop someone on race alone, and nothing else. At least with facial recognition, it can pinpoint someone specifically if they’ve got a record of [them],” he said.
“They can only use that information or that picture if they have evidence that that person was there and that particular incident took place and they’ve got to put a face to the name.
“Honestly, I’m more than fine with it.”
Ray, a bystander who stopped to take pictures of the information sign, said he felt intimidated by the idea.
“I wouldn’t like a police officer studying my face and then remembering me for something that maybe I’ve never done, you know? I find it offensive, really.
“I do feel targeted – I just walked past these officers here and they looked at me. Whether they studied me, I don’t know. I just felt intimidated by it, really.”
Meanwhile, campaigners continued to inform onlookers. “Facial recognition, it’s against your rights”. Some people stopped to listen and nod in agreement. Others stepped forward to argue and to ask “why?”.
It’s a question the demonstrators were happy to answer. “Liberty are here today to raise awareness about the use of this technology on our streets and let people know why it’s a breach of their human rights for their sensitive, biometric data to be snatched from them without their consent, sometimes even without their knowledge,” Hannah Couchman, from the group, told HuffPost UK.
A trial by South Wales Police last year resulted in 91% of the matches identifying the wrong people, it was revealed following a Freedom of Information request by Big Brother Watch.
Couchman said she believed the technology threatens individual rights and privacy and disproportionately targets women and black people, as well as “chilling” freedom of expression, because being watched affects the decisions people make about where they go and who they travel with.
She added: “The police have said that if you don’t choose to be scanned in these facial recognition deployments they may treat that as suspicious, so that means that if this is something you don’t want to participate in, you may face further action from the police, which we feel is completely inappropriate.”
The Met denies this, saying that those who decline a scan “will not necessarily be viewed as suspicious”, and that officers will use their judgement to identify dubious behaviour.
While the force told HuffPost UK it would not disclose the cost of the operation, Big Brother Watch claims more than £200,000 has been spent on the trial.
Commander Ivan Balhatchet, strategic lead for LFR, said: “The technology used in Romford forms part of the Met’s ongoing efforts to reduce crime in the area, with a specific focus on tackling violence. Use of the equipment at Romford Town Centre resulted in several arrests for violent offences.”
The Metropolitan Police said it would now evaluate the technology once the trials conclude.
The force said in a statement: “Tackling violent crime is a key priority for the Met and we are determined to use all emerging technology available to support standard policing activity and help protect our communities.”
This article previously stated that Romford was located in Essex, not in east London. This has now been corrected.