
Divisional Court dismisses legal challenge over police use of automated facial recognition

A Divisional Court has dismissed a challenge to police use of Automated Facial Recognition (AFR) in what is thought to be the first time any court in the world has considered the technology.

Edward Bridges claimed his civil liberties were infringed on two occasions when AFR was used while he was in Cardiff: once in the vicinity of a football match and again near a defence equipment exhibition.

Mr Bridges, who was supported by the civil rights group Liberty, argued that the use of AFR was a breach of his rights under the Data Protection Act 1998 and article 8 of the European Convention on Human Rights.

He further claimed that using AFR was contrary to the police’s public sector equality duty.

The case of Bridges, R (On Application of) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin) was heard by Haddon-Cave LJ and Swift J.

They said: “This case raises novel and important issues about the use of AFR by police forces. The central issue is whether the current legal regime in the United Kingdom is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilised society.

“At the heart of this case lies a dispute about the privacy and data protection implications of AFR.”

The judges noted that many technological innovations had become available to the police over the years without the need for specific powers to use them.

“We consider the police's common law powers to be ‘amply sufficient’ in relation to the use of AFR,” they said. “The police do not need new express statutory powers for this purpose.”

The judges said there was a clear and sufficient legal framework governing AFR’s use and “the fact that a technology is new does not mean that it is outside the scope of existing regulation, or that it is always necessary to create a bespoke legal framework for it”.

They said they were satisfied both that the current legal regime is adequate and that South Wales Police's use of AFR had been consistent with the Human Rights Act and data protection legislation.

Commenting on the case, Matrix Chambers, whose Dan Squires QC and Aidan Wills acted for Mr Bridges, instructed by Liberty, said live AFR captures the facial biometrics of people passing within range of video cameras and compares this data to the facial biometrics of people on police watchlists.

Matrix said: “The court held that live AFR engages the Article 8 rights of anyone whose face is scanned (or is at risk of being scanned) and constitutes the (sensitive) processing of their personal data.

“The judges nevertheless concluded that South Wales Police’s use of AFR did/does not breach the claimant’s privacy or data protection rights.

“The judgment recognises that a public authority data controller’s compliance with the duty to undertake a data protection impact assessment is amenable to judicial review but rejected the challenge to the South Wales Police’s assessment. A challenge to South Wales Police’s discharge of the public sector equality duty was also dismissed.”

Monckton Chambers, from which Gerry Facenna QC and Eric Metcalfe acted for the Information Commissioner, who was granted permission to intervene in the proceedings, said police use of AFR was “lawful on the basis that the interference struck a fair balance and was not disproportionate”.

Megan Goulding, a lawyer with Liberty, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms.

“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all. It is time that the Government recognised the danger this dystopian technology presents to our democratic values and banned its use. Facial recognition has no place on our streets.”

Mark Smulian

(c) HB Editorial Services Ltd 2009-2018