You may have spotted a video showing a London man being stopped by police after he covered his face while passing a facial recognition van.
“How would you like it if you walk down the street and someone grabbed your shoulder? You wouldn’t like it, would you?” the man is heard telling officers.
The man was stopped, his photograph taken, and he was fined £90 for masking his face.
The incident, captured by BBC cameras, highlights why campaigners are concerned about the technology, which has been trialled by the Metropolitan Police at various locations around London in recent years.
Police say the scheme is part of efforts to crack down on rising levels of violent crime in the capital, with trials taking place at Notting Hill Carnival in 2016 and 2017, Remembrance Day events in 2017, the Port of Hull docks and Romford.
But campaigners, who would often protest near the deployment sites, say they witnessed “shocking” police incidents during the trials.
Among them was a 14-year-old black child in school uniform who was stopped by plainclothes officers and fingerprinted after being misidentified.
Civil liberties organisation Big Brother Watch previously branded the technology “China-style mass surveillance”.
Ahead of the first legal challenge over the technology, brought by human rights group Liberty next week, here is what you need to know about how it might affect you.
Do I Need To Be Worried?
The Met says an independent evaluation of the technology is being carried out, and that it is working to ensure the system is as accurate as possible.
According to the force, passersby have a right to object to the scanning, which is designed to match up faces with images of wanted criminals on a database.
Police have previously said the trials in “real-life conditions” were intended to “get accurate data and learn as much as possible from it”. The force made eight arrests across the 10 trials.
It says its use of the technology is justified by the Human Rights Act 1998, the Freedom of Information Act 2000, the Protection of Freedoms Act 2012, and the Data Protection Act 2018.
But a legal challenge is set to take place next week after father-of-two Ed Bridges sought a judicial review of South Wales Police’s use of the technology. His face was scanned both at an anti-arms protest and while he was doing his Christmas shopping.
The crowdfunded case, brought by human rights organisation Liberty, will be the first of its kind.
Liberty’s policy and campaigns officer Hannah Couchman told HuffPost UK that the case will argue that the technology doesn’t have a basis in law.
“The case will focus on the breach, or the potential breach, of the equality duty that the police are under as a public body, which feeds into a wider concern as an organisation that this technology is discriminatory.
“It’s evident by academic study that this kind of technology will misidentify BAME people and women far more than it will white men.
“That means that there are sections of the community more likely to be subject of a false stop, the police stop them but actually, they’re not the person being searched for.”
What Can I Do If I Am Concerned About This Technology And Use Of My Data?
Couchman says you can get in touch with the Liberty advice team, whose details can be found on the organisation’s website.
She added: “If people are concerned about this technology, we’re running a campaign called Neighbourhood Watched, with a company called Privacy International.
“We’ll shortly be releasing a campaign to allow people to contact their police and crime commissioner. They are people who you can express your concerns to and are intended to hold the local police force to account on behalf of the local community.”
Who Does The Technology Target?
The Metropolitan Police says the system uses “NeoFace” technology to analyse the details of faces on its watch list – i.e. people who have been taken into police custody.
The force says that when the camera detects a face, it creates a digital version of it and “searches it against the watch list”.
Amid concerns that the technology disproportionately targets women and people of colour, the force recently told parliament’s science and technology committee that it is “working to further mitigate potential impact” of “the system response with respect to different demographics”.
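For readers curious about the mechanics, the process the force describes – detect a face, create a digital version of it, then search it against the watch list – broadly resembles the generic one-to-many matching sketched below. This is a minimal illustration in Python, not the Met’s or NEC’s actual NeoFace implementation: the embedding size, similarity measure, threshold and names are all assumptions.

```python
import numpy as np

# Hypothetical illustration of one-to-many face matching against a watch list.
# Real systems such as NeoFace are proprietary; the embedding model, threshold
# and data structures here are assumptions for illustration only.

WATCH_LIST = {
    "custody_image_001": np.random.rand(128),  # placeholder 128-dimension face embeddings
    "custody_image_002": np.random.rand(128),
}
MATCH_THRESHOLD = 0.6  # illustrative value; tuning it trades false alerts against misses


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def search_watch_list(probe_embedding: np.ndarray):
    """Return the best watch-list match above the threshold, or None."""
    best_id, best_score = None, MATCH_THRESHOLD
    for image_id, enrolled in WATCH_LIST.items():
        score = cosine_similarity(probe_embedding, enrolled)
        if score > best_score:
            best_id, best_score = image_id, score
    return best_id


# A face detected by the camera is converted to an embedding and searched:
alert = search_watch_list(np.random.rand(128))
print("Alert raised for:", alert)
```

Where that threshold sits in a real deployment determines how often the system raises a false alert: set it too loosely and more innocent passersby are flagged, which is the failure mode campaigners highlight.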
How Successful Is It?
Not very.
Last month, Big Brother Watch revealed that the trials had a shocking 4% success rate.
The technology, which was used 10 times between 2016 and 2018, misidentified innocent members of the public 96% of the time over eight of the trials, a Freedom of Information request by the campaign group showed.
Two separate deployments at Westfield shopping centre in Stratford scored a 100% misidentification rate.
It follows an earlier FoI request by Big Brother Watch to South Wales Police, which showed that 91% of matches had identified the wrong people.
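The headline figures are simply the share of the system’s alerts that turned out to be wrong. A minimal worked example follows, with invented counts purely for illustration – the FOI responses report the percentages, not these numbers.

```python
# Illustrative arithmetic only: the counts below are invented; only the published
# percentages (96% for the Met, 91% for South Wales Police) come from the FOI data.
total_alerts = 100   # people the system flagged as watch-list matches
false_alerts = 96    # flagged people who were not the person sought
misidentification_rate = false_alerts / total_alerts
print(f"Misidentification rate: {misidentification_rate:.0%}")  # -> 96%
```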
The campaigners criticised the technology as “disastrous for policing”.