Forces criticised for use of predictive algorithms

The human rights campaign group Liberty has called for UK police forces to end their use of predictive policing mapping programs, which it says rely on problematic historical arrest data and encourage the over-policing of marginalised communities.

Feb 5, 2019
By Neil Root

A new report by Liberty has revealed that at least 14 police forces are using predictive policing algorithms to identify crime hotspots. It calls the practice “biased” and says there is “a severe lack of transparency” in its use.

It says the use of algorithms by police is “entrenching pre-existing discrimination, directing officers to patrol areas which are already disproportionately over-policed”. 

And it concludes that the software lends “unwarranted legitimacy to biased policing strategies” that disproportionately focus on black, Asian and minority ethnic (BAME) and lower income communities. 

But despite its deep ethical concerns, Liberty acknowledges that the predictive software being used could be cost-efficient in the current policing financial climate.

The claims are made in the report ‘Policing by Machine’ published on Sunday (February 3) after Liberty made 90 Freedom of Information requests to police forces. 

An algorithm is a set of instructions that performs a specific task, in this case identifying and mapping criminal activity and predicting the location of future offences, allowing forces to deploy officers to hotspots.
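In its simplest form, a predictive mapping program of this kind counts past incidents within grid cells on a map and flags the busiest cells for patrol. The sketch below is purely illustrative and not based on any force’s actual system; the incident coordinates, grid size and number of hotspots are assumed values.

```python
from collections import Counter

# Hypothetical historical incident locations (x, y) in arbitrary map units.
incidents = [(1.2, 3.4), (1.3, 3.5), (4.8, 0.9), (1.1, 3.6), (4.9, 1.0), (7.2, 7.3)]

CELL_SIZE = 1.0  # assumed width/height of each grid cell


def cell_of(point):
    """Return the grid cell that contains a coordinate."""
    x, y = point
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))


# Count how many past incidents fall in each cell.
counts = Counter(cell_of(p) for p in incidents)

# Flag the busiest cells as "hotspots" for patrol deployment.
TOP_N = 2
for cell, n in counts.most_common(TOP_N):
    print(f"hotspot cell {cell}: {n} past incidents")
```

Because the only input is historical incident data, a program like this simply reproduces past recording patterns, which is the basis of Liberty’s criticism.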

The research says forces that “are using or trialling, or planning to use or trial predictive mapping programs” are Avon and Somerset Constabulary, Cheshire Constabulary, Dyfed-Powys Police (in development), Greater Manchester Police, Kent Police, Lancashire Police, Merseyside Police, the Metropolitan Police Service, Norfolk Constabulary (uses a solvability algorithm), Northamptonshire Police, Warwickshire Police and West Mercia Police (in development), West Midlands Police and West Yorkshire Police. 

A strategic adviser to the West Midlands police and crime commissioner, Tom McNeil, said: “We are determined to ensure that any data science work carried out by West Midlands Police has ethics at its heart. These projects must be about supporting communities with a compassionate public health approach.”  

Additionally, three forces – Avon and Somerset, Durham and West Midlands – are using individual risk-assessment programs, which Liberty says “predict how people will behave, including whether they are likely to commit, or even be victims of, certain crimes”.

Durham Constabulary has been developing its Harm Assessment Risk Tool (HART) algorithm for more than five years. It uses a method known as ‘random forests’, analysing huge numbers of combinations of ‘predictor values’, most of which focus on the suspect’s offending history, as well as age, gender and geographical area.  
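For illustration, the general shape of such a random-forest risk model can be sketched with an off-the-shelf library (scikit-learn here); the predictor names, training data and output below are hypothetical and do not represent HART’s actual predictors, data or scoring.

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical predictor values per individual:
# [number of prior offences, age at arrest, months since last offence]
X_train = [
    [0, 34, 60],
    [5, 22, 3],
    [1, 45, 24],
    [8, 19, 1],
    [2, 30, 18],
    [6, 27, 2],
]
# Hypothetical labels: 1 = re-offended within two years, 0 = did not.
y_train = [0, 1, 0, 1, 0, 1]

# A random forest averages many decision trees, each fitted to a random
# subset of the data and predictors - the general technique the article refers to.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Advisory output for a new (hypothetical) case: an estimated probability
# of re-offending, not a decision in itself.
new_case = [[3, 25, 6]]
print(model.predict_proba(new_case)[0][1])
```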

A spokesperson for Durham Constabulary said: “We are proud of HART, which is part of our intervention programme to help repeat offenders turn their lives around, break away from the revolving door of prison and reduce crime.

“All decisions are ultimately made by an experienced custody officer, but the HART advisory tool gives them a clear indication as to who might be more at risk of re-offending – not so they are stereotyped, but so we can give them more support to turn away from crime.” 

Assistant Chief Constable Jon Drake, National Police Chiefs’ Council lead for intelligence, said: “Policing is underpinned in the UK by a strong set of values and ethical standards, as well as a significant amount of legislation.  

“At all times we seek to balance keeping people safe with people’s rights. This includes the way in which we police hot-spots for crime.

“For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to develop new approaches to achieve these aims.” 

Liberty wants forces to fully disclose information about the use of predictive mapping and says investment in digital solutions to policing should focus on developing algorithms that actively reduce biased approaches to policing. 

Additionally, forces should examine more widely their use of data, including independent reviews into the Gangs Matrix and Prevent programme, with targeted criteria for inclusion and removal of data – although Mayor of London Sadiq Khan ordered a “comprehensive overhaul” of the Gangs Matrix on December 21 last year. 

The Law Society is also running evidence sessions on the issue – ‘Technology and the Law Policy Commission: Algorithms in the Justice System’ – with two sessions in Cardiff and London scheduled for this month. 
