MPS called out over false claim of LFR support
The UK’s Biometrics Commissioner has criticised the Metropolitan Police Service (MPS) for falsely claiming he supported the use of live facial recognition (LFR) technology.
In a post on his official website, Professor Paul Wiles – a former chief scientific adviser to the Home Office – wrote: “I am aware that the Metropolitan Police Service have produced an equality impact assessment in relation to their deployment of LFR. In that document they claim that I ‘supported the concept of LFR’.
“In fact, I have continually said that we need proper governance of new biometric technologies such as LFR through legislation. In my view it is for parliament to decide whether LFR ought to be used by the police and, if so, for what purposes.”
The equality impact assessment referred to by the Commissioner was released in 2018 and contained the claim of Professor Wiles’s support within a consultation log.
A spokesperson for the MPS said the error had since been corrected: “The MPS welcomes the Biometric Commissioner’s interest in developing guidance to cover use of biometric systems and information.
“We have been keeping the Biometrics Commissioner informed about the MPS’s deployment of LFR and look forward to any opportunities to work with him about the use of new biometrics in law enforcement. We have updated the equality impact assessment to accurately reflect his position.”
However, Professor Wiles’s complaint revealed other gaps in the force’s consultation process: it had published documents and gone ahead with operational deployment of LFR despite not having received responses from groups including the black and Sikh police associations and its trans network association.
The MPS reported that its first operational use of LFR was met with an “overwhelmingly positive” response from members of the public.
Acting Chief Inspector Chris Nixon of the north east basic command unit said: “My officers worked closely with the technology team to use the technology effectively and would be keen to deploy it again. No positive alerts were generated by the system on this occasion and there were no false alerts or incorrect identifications.”
However, campaign group Big Brother Watch, which opposes the use of the technology, said members of the public it spoke to felt “watched and targeted”.