Facing the facts
Jan 20, 2020 @ 15:05

Steve Ainsworth, Director of Operational Safety at Northgate Public Services, explores whether facial recognition technology is the ‘Big Brother’ we need to fear.

Automated facial recognition (AFR) presents both a risk and an opportunity for society and, as a consequence, evokes strong views. While it is important to acknowledge and respond to concerns it could lead to a ‘Big Brother’ society, there needs to be a balanced debate. In my view, those who call for a total ban on the use of real-time AFR in policing are misguided.

Addressing public concern

As is often the case, advances in technology can rapidly outpace legislation and there is now a pressing need for a defined regulatory approach.

Balancing individual citizens’ right to privacy with police forces’ need to protect public safety has always been contentious. Think CCTV, ANPR and stop and search: all effective policing tools, and all subjects of intense debate and scrutiny. And rightly so, as no one wants a pervasive Orwellian surveillance state, least of all the police.

It is clear that appropriate safeguards need to be put in place to allay public concern. So, what can be done to counter the popular narrative that police use of AFR will morph into a dystopian surveillance system? The current regulatory void surrounding AFR has exacerbated concerns about the privacy of innocent members of the public, and about what constitutes the proper threshold for inclusion on a computerised watch list. For example, should it contain images of those arrested but not charged? Or those with minor or spent convictions? Will the image of a law-abiding citizen going about their everyday activities be recorded and stored on a police database?

The truth is, we are already constantly monitored and tracked. Our images are recorded on CCTV in stores, on transport systems and in city centres, and kept far longer than the milliseconds for which a non-matched image captured during AFR is retained. The difference is that there are protocols and agreed frameworks in place, and CCTV is now perceived by the public as an important investigative tool.

To ensure AFR technology is similarly viewed as a tool for making society safer, rather than a case of ‘Big Brother is watching’, there needs to be transparency around the process and sufficient privacy tools built into the system, such as blurring the faces around the ‘matched face’ of a person of interest, and ensuring ‘captured’ faces that do not match those on a ‘watch list’ are immediately deleted rather than stored on a police database.
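
By way of illustration only, the sketch below shows how a ‘delete on non-match’ safeguard might sit inside a matching loop. The names, structure and threshold are hypothetical assumptions for the example, not any vendor’s actual system.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        face_id: str
        embedding: list[float]  # feature vector produced by the face detector

    def similarity(a: list[float], b: list[float]) -> float:
        # Cosine similarity between two unit-length embeddings.
        return sum(x * y for x, y in zip(a, b))

    def process_frame(detections: list[Detection],
                      watch_list: dict[str, list[float]],
                      threshold: float) -> list[tuple[str, str]]:
        # Return (face_id, watch_list_id) pairs for matches only.
        # Non-matching detections are discarded immediately: nothing about
        # them is logged, stored or returned, mirroring the 'delete on
        # non-match' safeguard described above.
        matches = []
        for det in detections:
            best_id, best_score = None, 0.0
            for wl_id, wl_embedding in watch_list.items():
                score = similarity(det.embedding, wl_embedding)
                if score > best_score:
                    best_id, best_score = wl_id, score
            if best_id is not None and best_score >= threshold:
                matches.append((det.face_id, best_id))
            # else: the detection goes out of scope and is never retained
        return matches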

Eliminating bias

Bias in AFR is a key public concern. Whilst much has been done to improve the quality of both provided and captured images within systems, little has been done to encourage a standardised approach to compiling, testing and curating the data sets that train the algorithm.

That is why tech companies need to take an ethical approach when compiling the control data sets, to ensure they are fully representative of both gender and ethnicity. A recognised, uniform approach to the curation and testing of data sets will minimise the opportunity for bias; a toy sketch of one such test follows.
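
As an illustration of what ‘testing a data set’ for representativeness could mean in practice, the sketch below tallies group shares in a labelled training set and flags under-represented groups. The metadata fields and the five per cent floor are invented for the example, not drawn from any real curation standard.

    from collections import Counter

    def representation_report(labels):
        # `labels` is a list of (gender, ethnicity) tuples attached to each
        # training image -- hypothetical metadata that a real curation
        # pipeline would draw from documented annotations.
        counts = Counter(labels)
        total = sum(counts.values())
        return {group: count / total for group, count in counts.items()}

    def underrepresented(labels, floor=0.05):
        # Flag any group whose share of the data set falls below `floor`.
        return [group for group, share in representation_report(labels).items()
                if share < floor]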

The need to improve the accuracy of the technology is often covered in the press and quoted as a reason to curtail its use. What is less well understood, however, is that this is more a question of ‘fine-tuning’ the system to set the threshold appropriate to the operational objective. A counter-terror unit with intelligence on a known individual planning an attack in a specific area will set a very different threshold to a team using the system to look for a missing person. Technology will never replace the bobby on the beat or outsmart a detective; its role is to assist, not replace.
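
To make that trade-off concrete, here is a purely illustrative sketch; the threshold figures are invented for the example rather than drawn from any deployed system.

    # Illustrative only: a higher threshold yields fewer, higher-confidence
    # alerts; a lower one casts a wider net and relies on the operator to
    # sift more candidate matches.
    OPERATIONAL_THRESHOLDS = {
        "counter_terror": 0.95,   # near-certain matches only
        "missing_person": 0.80,   # accept more candidates for human review
    }

    def alerts_for(objective, scored_candidates):
        # `scored_candidates`: (identity, similarity) pairs produced
        # upstream by the matching engine.
        threshold = OPERATIONAL_THRESHOLDS[objective]
        return [(identity, score) for identity, score in scored_candidates
                if score >= threshold]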

The benefits of AFR to officers are enormous. It can help in pre-planned operations, support safeguarding strategies or provide an investigative lead in a time-critical situation. Only recently, it enabled New York Police Department officers to identify and arrest a suspect, Larry Griffin II, within hours of his causing major panic and a subway shutdown by entering the rail network with a suspect device.

The argument that the technology should be described as ‘assisted’ rather than ‘automated’, as mooted in a report by the Crime and Security Research Institute, is a cogent one, as the action taken following a match is down to the human operator. For example, how accurate is the match? Should officers be dispatched to stop and search the individual? The misconception that the technology itself makes the decision needs to be addressed.

Transparent and auditable

The technological capabilities of AFR are advancing at an accelerated pace. Increasing levels of accuracy and the potential to apply the technology to body-worn video will deliver tangible benefits to society, helping officers identify those who wish to do it harm, as well as those who are at risk of harm.

Is it possible to balance civil liberties with AFR? Yes, I believe so. But for the technology to gain traction in the public arena there needs to be ‘self-policing’ of the system in the absence of regulation and oversight. Transparency, proportionality and an agreed framework around the process, as with ANPR, for example, are fundamental to securing public confidence, eliminating bias and allaying the fear that ‘Big Brother’ is watching.
