Facial recognition technology ‘needs to be consistent’
“Considerable investment and changes to police operating procedures” are required, according to the first independent academic evaluation of Automated Facial Recognition (AFR) systems in use across UK police forces.
Cardiff University examined the use of the technology at a number of sporting and entertainment events in Cardiff over the course of a year, including the UEFA Champions League Final and the Autumn Rugby Internationals.
The technology works in two ways. Locate is a live, real-time application that scans faces within CCTV feeds and searches for possible matches against a pre-selected database of facial images of individuals deemed persons of interest by the police. Identify takes still images of unidentified persons (usually captured via CCTV or mobile phone camera) and compares them against the police custody database to generate investigative leads.
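Both modes boil down to comparing a probe image against a database of reference faces. A minimal sketch of that watchlist-matching step is below; it assumes (as most face-recognition systems do, though NEC has not disclosed its method) that faces are reduced to numeric embedding vectors and compared by cosine similarity against a threshold. All names, vectors and the threshold value here are invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return (identity, score) pairs for watchlist entries whose
    similarity to the probe meets the threshold, best score first."""
    candidates = []
    for identity, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score >= threshold:
            candidates.append((identity, score))
    # The system only proposes candidates; a human operator makes
    # the final identification decision.
    return sorted(candidates, key=lambda c: c[1], reverse=True)

# Toy 3-dimensional "embeddings"; real systems use hundreds of dimensions.
watchlist = {
    "person_A": [0.9, 0.1, 0.2],
    "person_B": [0.1, 0.8, 0.5],
}
probe = [0.88, 0.12, 0.21]
print(match_against_watchlist(probe, watchlist))
```

In Locate mode this comparison runs continuously over faces detected in a CCTV feed; in Identify mode it runs once per submitted still against the custody database.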
The report found that in 68 per cent of submissions made by police officers in Identify mode, the image quality was too low for the system to work.
The Locate mode showed improvements over the year and could correctly identify a person of interest 76 per cent of the time.
Given the effort needed for AFR to be consistent, the researchers recommended that the technology would be more appropriately named “assisted facial recognition”, as opposed to “automated facial recognition”.
They added that ‘automated’ implies that the identification process is completed by an algorithm alone. Rather, the system serves as a decision-support tool to assist human operators in making identifications.
The technology also ends up being deployed in uncontrolled environments, and so is affected by external factors such as lighting, weather and crowd flows.
Additionally, South Wales Police encountered an “unanticipated consequence” of the technology when, mid-way through the project, it came to light that the quality of images taken in custody suites had to be improved.
However, the report noted that, given the complexity surrounding the use of AFR, some of the results obtained from the technology were “impressive”.
This was particularly the case when South Wales Police updated the technology with NeoFace, which was provided by NEC.
The company originally provided the police with its ‘S17’ algorithm but later supplied its updated ‘M20’ version.
The research report said it is noteworthy that the particulars of these algorithms are ‘black boxed’, meaning that NEC has not revealed to the police or the evaluation team precisely how matches are calculated by the system.
Because of this, “it [is] very difficult for police to explain and account for the results that they are having returned to them”, it added.
Another consequence of the technology is that a small number of people have a face type which has an increased chance of triggering a false-positive match.
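The false-positive problem can be illustrated with a threshold-based alerting model (an assumption on my part, consistent with the matching approach described above). The scores below are invented: a “lookalike” whose face happens to resemble a watchlist entry can score above the operational threshold and trigger an alert, even though they are not the listed person.

```python
# Hypothetical similarity scores for three probes compared against a
# single watchlist entry (values invented for illustration).
scores = {"true_match": 0.97, "lookalike": 0.92, "unrelated": 0.35}

def alerts_at(threshold):
    """Names whose score meets the threshold and would trigger an alert."""
    return [name for name, s in scores.items() if s >= threshold]

# At a threshold of 0.9, the lookalike also triggers an alert: a
# false positive that a human operator must screen out.
print(alerts_at(0.9))   # ['true_match', 'lookalike']
# Raising the threshold suppresses the false positive, but a stricter
# cut-off risks missing genuine matches in poorer-quality footage.
print(alerts_at(0.95))  # ['true_match']
```

This trade-off is why the report frames the system as decision support: someone whose face type sits persistently close to a watchlist entry will keep scoring above any practical threshold.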
The researchers suggested that police and the justice sector consider the ethical implications of the technology; for example, whether it would be more appropriate to use AFR for “preventing and solving only more serious kinds of crime, rather than all offences”.
They recommended that police powers be clarified through new legislation and said consideration should be given to whether the use of custody suite images is appropriate, or whether the images used should be limited to individuals arrested, charged and convicted of an offence.
Professor Martin Innes, who led the evaluation, said: “There is increasing public and political awareness of the pressures that the police are under to try and prevent and solve crime. Technologies such as Automated Facial Recognition are being proposed as having an important role to play in these efforts. What we have tried to do with this research is provide an evidence-based and balanced account of the benefits, costs and challenges associated with integrating AFR into day-to-day policing.”
Deputy Chief Constable Richard Lewis from South Wales Police said: “It was fitting that we participated in the independent evaluation of our use of facial recognition technology in policing. We have learnt much about the technology during the evaluation period, and its ability to help prevent and detect often serious crimes, along with how it can assist our officers in supporting the vulnerable.
“The report provides a balanced perspective of our use of the technology and hopefully it will help to demystify some of the misunderstandings and misinformation that have proliferated across the press.
“South Wales Police remains committed to the continuous use of the technology in a proportionate and lawful way to protect the public, whilst also remaining open and transparent about how and when we use it.”