Professor Colin Rogers examines whether the use of algorithms in decision-making could unintentionally introduce bias into assessments of risk and predicting future offending.
At this time of year it is usual to make resolutions, reflect on the past and, of course, try to make predictions for the coming year. Predicting future events is not an exact science, as recent political elections and referendums testify. However, partly in answer to economic cuts, a reduced workforce and other demands on their services, police forces in England and Wales have naturally looked to technology to assist. This technology includes automatic number-plate recognition, the increased use of drones, hand-held computers for officers and various methods of analysing data and information to help provide adequate resources at times of high demand.

The use of technology and computer-assisted decision-making has even reached the hallowed halls of the custody office. Durham Constabulary is reportedly using artificial intelligence to help officers decide whether or not a suspect should be kept in custody. The system, the Harm Assessment Risk Tool (HART), classifies suspects as low, medium or high risk of offending and was trained on five years of offending data drawn from Durham police records between 2008 and 2012. It appears the system was tested during 2013, with the results, showing whether suspects did in fact offend or not, monitored over the following two years. Forecasts that a suspect was low risk turned out to be accurate 98 per cent of the time, while forecasts that a suspect was high risk were accurate in 88 per cent of assessments.
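The kind of validation quoted above can be made concrete with a minimal sketch. This is not the HART model itself (which is reportedly a random-forest classifier); it simply shows how "accuracy of low-risk forecasts" and "accuracy of high-risk forecasts" can be measured once follow-up outcomes are known. The records below are invented for illustration.

```python
# Sketch only: measuring per-band forecast accuracy against follow-up
# outcomes. Each record pairs the risk band forecast at the custody
# decision with whether the suspect was observed to reoffend afterwards.

def band_accuracy(records, band):
    """Share of forecasts in `band` that matched the observed outcome.

    A 'low' forecast counts as accurate if the suspect did NOT reoffend;
    a 'high' forecast counts as accurate if the suspect DID reoffend.
    Returns None if no forecasts were made in that band.
    """
    relevant = [r for r in records if r["forecast"] == band]
    if not relevant:
        return None
    correct = sum(
        1 for r in relevant
        if r["reoffended"] == (band == "high")
    )
    return correct / len(relevant)

# Invented follow-up data for the sketch:
history = [
    {"forecast": "low", "reoffended": False},
    {"forecast": "low", "reoffended": False},
    {"forecast": "low", "reoffended": True},
    {"forecast": "high", "reoffended": True},
    {"forecast": "high", "reoffended": False},
]

print(band_accuracy(history, "low"))   # 2 of 3 low forecasts held
print(band_accuracy(history, "high"))  # 1 of 2 high forecasts held
```

Note that a single accuracy figure per band says nothing about the cost balance of the two kinds of error, a point that matters when the decision is whether to hold someone in custody.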
And Kent Police, considered one of the pioneers of this type of predictive policing in the UK, reported that its trial of software based on algorithmic data analysis was 60 per cent better at spotting where crimes would take place than the force's analysts. These kinds of initiatives, and similar approaches to the use of data, could well become the norm as police continue to rationalise their working processes. They fit with the already well-established use of geospatial analysis of crime and incident data and the general intelligence-led policing approach. These approaches are, of course, dependent on what is commonly referred to as big data and on its analysis, particularly through algorithms.

Big data and algorithms

Big data is a term that describes the large volume of data, both structured and unstructured, that inundates an organisation and its business on a day-to-day basis. It includes the incorporation of other databases to form one larger database. For the police, therefore, recorded crime statistics, stop and search figures, victim statistics and the like are all part of what could be incorporated into big data for this purpose. Risk assessment algorithms are formulae that analyse big data sets and, as in the case of Durham Constabulary, weigh up a variety of factors relating to recidivism: the likelihood that an individual will commit another crime, or whether they should be granted bail. Predictive policing, then, is based on the application of analytical techniques, particularly quantitative techniques, to identify likely targets for police intervention and to prevent crime, or solve past crimes, by making statistical predictions.

Creating the data

All data is socially constructed. Policing is a people business, and social interactions between people are complex to say the least, especially, perhaps, in fraught situations.
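The risk-assessment formulae described above, and the human choices embedded in them, can be sketched in a few lines. Everything here is hypothetical: the factor names, the weights and the band thresholds are invented for illustration and are not drawn from HART or any real tool. The point is that each weight and cut-off is a choice made by whoever builds the model, which is exactly where the social construction of the underlying data matters.

```python
# Hypothetical sketch of a weighted-factor risk formula bucketed into
# low/medium/high bands. All factors, weights and thresholds are
# invented for illustration.

WEIGHTS = {
    "prior_offences": 0.5,      # weight applied per recorded prior offence
    "age_under_21": 1.0,        # 1 if suspect is under 21, else 0
    "failed_bail_before": 1.5,  # 1 if suspect has previously breached bail
}

def risk_band(suspect):
    """Map a suspect's recorded factors to a low/medium/high band."""
    score = sum(WEIGHTS[k] * suspect.get(k, 0) for k in WEIGHTS)
    if score < 1.0:
        return "low"
    if score < 3.0:
        return "medium"
    return "high"

print(risk_band({"prior_offences": 0}))                           # low
print(risk_band({"prior_offences": 2, "age_under_21": 1}))        # medium
print(risk_band({"prior_offences": 4, "failed_bail_before": 1}))  # high
```

Because the inputs ("prior offences", "failed bail") are themselves products of earlier policing decisions, any bias in how those records were created flows directly into the score.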
This interaction involves a certain amount of interpretation by officers in their dealings with citizens to determine whether a formal instance of crime will be recorded. It contains the opinions, stereotyping, beliefs and attitudes of the people involved in its construction.