Automatic facial recognition challenged in UK court

Collecting biometric data such as facial scans in mass surveillance schemes undermines human rights, say activists.

You’re a law-abiding citizen. How would you feel if your face was scanned by police cameras and your biometric details collected in a mass surveillance programme?

Would it bother you? It bothered Ed Bridges, who was scanned by a vehicle-mounted Automatic Facial Recognition (AFR) camera while Christmas shopping in South Wales in 2017.

But it really bothered him when, the following year, he was again scanned at a protest against the arms trade.


“On that occasion, the van was parked opposite the crowd,” Bridges told Al Jazeera. “I felt it was there to sort of intimidate people and dissuade them from using their peaceful right to protest.”

This week in a landmark case, Bridges mounted a legal challenge against South Wales Police after crowdfunding a campaign to establish a judicial review.

AFR has given the police “extraordinary power”, his lawyer Dan Squires told the court in Cardiff on Thursday, the third and final day of the hearing.


Police say a person’s biometric details are not stored – except for those who match someone on their watch list – but this is not a legal requirement in the UK.

“The way South Wales Police have operated to date has been responsible and limited,” Squires said. “But none of that comes from the law. That comes from self-restraint.”

Bridges is calling for a code of conduct to be established. 

Civil liberties

Lawyer Gerry Facenna, speaking for Britain’s Information Commissioner, suggested a legal framework needed to be drawn up for AFR use, saying there was a lack of clarity about how police forces compiled these “watch lists” and then used facial recognition to search for suspects.


Civil liberties groups are also concerned.

“As they walk past the camera, it takes their really sensitive biometric data from them without their knowledge or consent,” said Megan Goulding, a lawyer working with human rights group Liberty.

“So, it’s really akin to them walking down the High Street and the police forcibly taking their fingerprint or their DNA. And so we say it belongs in a police state and not in a democracy like ours.”


The technology has already been banned in the US city of San Francisco. A Metropolitan Police trial of AFR in London has now ended, a force spokesperson told Al Jazeera. But before it did, the court heard on Thursday, a man was fined for disorderly conduct after covering his face while passing a police van in the British capital.

Al Jazeera’s Paul Brennan said “deep learning” artificial intelligence technology had greatly improved the accuracy of AFR.

“That still leaves wider questions,” he said, reporting from Cardiff, “of governance and purpose, and when exactly the protection of the public becomes intrusion and oppression.”

Facenna, speaking on Thursday at the judicial review, said there was “serious doubt” whether the legal framework currently in place was good enough. “It’s all a bit ad hoc,” he said. “There’s nothing sufficiently precise and specific.”

Home Secretary Sajid Javid was represented in court by Richard O’Brien, who “welcomed” the claim against the police and agreed that guidance should be provided to ensure the public’s rights were not abused.

A judgement in Bridges’ legal case against South Wales Police is expected to be handed down in the autumn.

Source: Al Jazeera, News Agencies
