
Robocops: are the police robots coming?
30 Sep 2016

Rick Muir: no area of policing immune from technological innovation
Rick Muir discusses the role of automation in policing and asks whether the public have been consulted on the future of digital policing.

The pace of technological change is such that much of what was once science fiction is now becoming reality. In Paul Verhoeven’s dystopian 1987 film RoboCop, a Detroit police officer is gunned down and transformed by an evil corporation into a prototype law enforcement cyborg. Fast forward to 2016 and we find that police robots are actually being deployed on the streets in places as far afield as China and the Democratic Republic of Congo. Less visibly, automated software is playing a major role in areas like digital forensics and economic crime prevention.

As ever, the science is speeding ahead of public consciousness and debate. This article addresses that gap by asking which components of policing could technically be automated, and then whether or not they should be.

The rise of the robots
In their pioneering 2013 study of the US labour market, Carl Benedikt Frey and Michael Osborne predicted that as many as 47 per cent of jobs in America are at risk from computerisation over the next ten to 20 years. A contrasting study has subsequently estimated that the figure for OECD countries is more like nine per cent. What everyone agrees on is that very many of the tasks currently performed by human beings will in the coming years be performed by robots and algorithms.

Driving this trend is the exponential increase in the capabilities of computerised machines. In the 20th century computer technology was mainly a threat to routine manual workers, whose roles could easily be broken down into discrete tasks and coded. Now the increased power of computerised devices means that non-routine work can be broken down into tasks, enabling them to be coded and acted upon autonomously.

Non-routine cognitive labour is under threat because of the rise of big data, which means that patterns can be detected by algorithms across huge data sets, enabling machines to think in ways that surpass human capabilities. These algorithms can think on a larger scale than we can and they lack human biases. Sensor technologies are bringing in more data and advanced user interfaces mean that computers are ever more responsive to human requests. Algorithms can now make the kind of subtle judgments that previously only human beings could or at the very least can be powerful aids to human decision-making.

Non-routine manual labour is at risk because of advances in robotics. Enhanced sensors and manipulators mean that robots can now manage tasks, such as driving autonomously in busy traffic, that were previously considered the preserve of human workers.

In their book The Future of the Professions, Daniel Susskind and Richard Susskind predict that traditional professions such as the law, medicine and architecture are now at risk. These professions have for decades acted as gatekeepers, maintaining, interpreting and applying practical expertise to many of the most complex problems we face. Susskind and Susskind argue that in a technology-based internet society there are new, cheaper and more transparent solutions to the problem of limited understanding than paying to consult a human expert.

So, for example, if we can secure access to legal advice via free online portals with algorithms able to sort and answer our questions, why pay for an expensive lawyer? eBay has an automated online dispute resolution facility and there is no reason why such systems could not deal with some questions of family or civil law, for instance.

The impact of automation on policing
The irony for the police service is that just as it has decided to pursue an agenda of professionalisation the whole notion of the expert is under assault from ever more powerful technology.

So, to what degree is policing likely to be subject to automation? Note that here we are interested in whether or not components of the police role could technically be automated, rather than the moral question of whether they should be.

Table 1 (below) distinguishes between those parts of policing that could be supplemented by automated technology and those that might be fully replaced by capable machines. Unsurprisingly there is no area of policing that is likely to be immune from technological innovation. Indeed there are some areas of criminal investigation, traffic enforcement, surveillance and crime analysis that might be mainly or wholly carried out by automated systems. For instance, algorithms are already used as part of digital forensic investigations to match coders with their creations. Researchers at Princeton have produced a study which used algorithms to automate the analysis of coding styles from 1,600 programmers, correctly attributing authorship with 94 per cent accuracy.
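The Princeton study used far richer features and a trained classifier; purely as an illustration of the underlying idea, stylometric attribution can be sketched as fingerprinting an author's layout and naming habits and matching a code sample to its nearest known author. The feature set, author names and code snippets below are hypothetical, not taken from the study:

```python
def style_features(code: str) -> dict:
    """Crude stylometric fingerprint: a few layout and naming habits."""
    lines = code.splitlines()
    n = max(len(lines), 1)
    return {
        "avg_line_len": sum(len(l) for l in lines) / n,   # typical line length
        "tab_indent": sum(l.startswith("\t") for l in lines) / n,  # tabs vs spaces
        "blank_ratio": sum(not l.strip() for l in lines) / n,      # vertical spacing
        "snake_case": code.count("_") / max(len(code), 1),         # naming style
    }

def distance(a: dict, b: dict) -> float:
    """Euclidean distance between two feature fingerprints."""
    return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5

def attribute(sample: str, corpus: dict) -> str:
    """Return the known author whose style is nearest to the sample."""
    f = style_features(sample)
    return min(corpus, key=lambda author: distance(f, style_features(corpus[author])))
```

A real system would extract hundreds of lexical and syntactic features (the study parsed abstract syntax trees) and use a trained classifier rather than nearest-neighbour matching on four hand-picked features.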

Another example where algorithms are already working at a scale that far surpasses human capabilities would be the use of anti-fraud and money laundering software by banks to spot unusual financial transactions.
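The screening systems banks actually deploy are proprietary and far more sophisticated, but the core idea of flagging transactions that deviate sharply from an account's normal pattern can be sketched with a simple statistical threshold. The function name and the three-standard-deviations cut-off below are illustrative assumptions:

```python
import statistics

def flag_unusual(amounts, threshold=3.0):
    """Return indices of transactions more than `threshold` standard
    deviations from the account's mean transaction amount."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts) or 1.0  # avoid divide-by-zero on uniform history
    return [i for i, a in enumerate(amounts) if abs(a - mean) / stdev > threshold]
```

For example, a history of routine £100 payments followed by a single £10,000 transfer would see the transfer flagged for review, while ordinary day-to-day variation would pass unremarked.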

However, there are very many policing tasks where full automation would be practically, as well as morally, inappropriate. This includes areas such as community engagement, victim liaison, the use of police powers to stop or arrest, and much of the order maintenance function.

The reason for this is that these tasks fall into one or more of the following categories. First, many police tasks require complex perception. This means deep and broad human perception that is capable of making sense of highly unstructured data. For example, think of a police officer trying to manage a large crowd of people.

Second, policing requires considerable manual dexterity or an ability to respond to sudden events in a physically agile way, to a degree that robots would find difficult to match. For example, consider an officer pursuing an offender through a densely populated urban area.

Third, policing requires social intelligence or a deep understanding of human heuristics and an ability to relate and communicate on an emotional level with other people. Just imagine Paul Verhoeven’s cyborg trying to provide reassurance to a vulnerable victim of crime.

Finally, and this is where efficacy meets morality, policing constantly requires moral judgments to be made. Take the use of stop and search powers, for instance. Now it might be suggested that a robot officer could achieve better outcomes because, for instance, it could be programmed not to think with the unconscious biases that very likely lead to stop and search being used disproportionately against young black and Asian men.

However, even if this were true, we would still feel uncomfortable about a robot officer having a power to stop and search. This is partly because it seems unlikely we would ever be able to properly code for moral decision making. But even if it were technically possible, for example, for a robot to be able to distinguish right from wrong in a way that mimics human decision-making, we would still be concerned about giving this power to a robot.

This is because we want our police officers to take responsibility for the moral judgments they make. We want another human being to have reflected upon and agonised over decisions that matter and have moral weight. It is because law enforcement intrinsically involves making moral judgments about the appropriate use of police powers that so much of it is so unsuited to automation.

The need for public debate
The technological revolution will transform the way the police work. Before that happens we need to openly debate the implications. Where is automation appropriate and where is it not? What happens to the accountability of policing when much of what the police do will be determined by complex algorithms that only small numbers of experts can understand?

Unlike the retail, taxi or music industries, policing is an accountable public service. This means that it has been slower than other sectors to adapt to technological innovation, but it also means that changes are subject to public debate. Even if robotics and algorithms can make policing more effective and efficient, the public will still need to be convinced that their application in any particular instance would be right. Police decision makers would do well to start thinking now about some of the ethical as well as technical issues raised by the technological revolution we are living through.

Rick Muir is Director of the Police Foundation. The implications of technological change for the police workforce and the ethical issues raised will be discussed at the Police Foundation’s Annual Conference on 1st December.

Table 1. Which policing tasks could be supplemented or replaced by automated systems?


Copyright © 2015 Police Professional