High Court ‘recognises responsibility’ shown by South Wales Police in its use of facial recognition technology
South Wales Police use of automatic facial recognition (AFR) technology to search for people in crowds is lawful, the High Court in Cardiff has ruled.
The force has been trialling AFR since April 2017 but faced a legal challenge from campaigning group Liberty on behalf of former councillor Ed Bridges, who claimed his human rights were breached when he was filmed by the system while Christmas shopping.
Lord Justice Haddon-Cave, sitting with Mr Justice Swift, concluded that South Wales Police’s use of live facial recognition “met the requirements of the Human Rights Act”.
The judges also ruled that existing data protection law offered sufficient safeguards for members of the public whose faces were scanned by facial recognition cameras, and that South Wales Police had considered the implications.
Three UK forces have used facial recognition in public spaces since June 2015: the Metropolitan Police Service (MPS), Leicestershire Police and South Wales Police.
Lawyers for South Wales Police told the hearing that facial recognition cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured. The technology was likened to police use of DNA. Those not on a watch list would not have their data stored after being scanned by AFR cameras.
The chief constable of South Wales Police, Matt Jukes, said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern. So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.
“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
South Wales police and crime commissioner Alun Michael said his priority had been to ensure the police “make best use of technology to keep the public safe while also working within the law and protecting civil liberties”.
The Home Office welcomed the judgment confirming there was a “clear and sufficient legal framework” for the use of AFR.
The MPS said the ruling’s implications would be carefully considered before a decision was taken on any future use of live facial recognition technology.
Leicestershire Police said it uses facial recognition technology in criminal investigations, within locally agreed guidelines and legislation, to identify possible suspects. “It was last used at a public event in 2015, as a pilot scheme, and it has not been used in that way since,” a force spokesperson said.
The Information Commissioner’s Office (ICO), which has previously voiced concerns over the use of the technology, released a statement in response to the ruling. It warned that facial recognition has the potential to “undermine rather than enhance confidence in the police”.
“We will now consider the court’s findings in finalising our recommendations and guidance to police forces,” said an ICO spokesperson.
Mr Bridges has announced his intention to appeal the ruling. “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms,” said Mr Bridges’ lawyer Megan Goulding.
The legal decision came on the same day Mayor of London Sadiq Khan acknowledged that the MPS had participated in the deployment of facial recognition software at the King’s Cross development in central London between 2016 and 2018, sharing some images with the property company running the scheme.
That contradicted previous assurances the mayor had been given about the relationship with King’s Cross; he asked the MPS “as a matter of urgency” to explain what images of people had been shared with the developer and other companies.
The ruling also coincided with the release of a public opinion survey on facial recognition technology. Of the 4,000 adults polled by the Ada Lovelace Institute, 55 per cent said they wanted the Government to impose restrictions on police use of facial recognition technology, while nearly half (49 per cent) supported its use in day-to-day policing, assuming appropriate safeguards were in place.