‘Ethical oversight needed’ for police data analytics

A lack of an evidence base, poor data quality and insufficient skills and expertise are three major barriers to the successful use of algorithms in policing, according to research conducted by the Royal United Services Institute (RUSI).

Feb 24, 2020
By Website Editor

Despite new data technologies clearly having the potential to improve police effectiveness and efficiency, the development of policing algorithms is often not underpinned by a robust empirical evidence base regarding their claimed benefits, scientific validity or cost effectiveness, the RUSI study found.

And it identified a need for meaningful and independent ‘ethical oversight’ of police data analytics projects.

Funding requirements for bespoke digital ethics committees could be prohibitive at force level, it added, which means forces and police and crime commissioners should consider a regional model of digital ethics committees subject to “consistent terms of reference and transparency requirements”.

While predictive policing tools have been criticised for being ‘racially biased’, the RUSI study found insufficient evidence to assess the extent to which bias in police use of algorithms actually occurs in practice in England and Wales, and whether this results in unlawful discrimination.

It was unable to say whether findings of racial bias in police algorithms, identified in analysis conducted in the US, are transferable to the UK context. However, there remains a legitimate concern that the use of algorithms may replicate or amplify the disparities inherent in police-recorded data, the study claims, and forces need to track any risk of discrimination at all stages of a police data analytics project, from problem formulation and tool design to testing and operational deployment.

The report, published on February 23, said: “A clear business case [for the introduction of a data analytics project] is often absent. In the context of statistical forecasting, claims of ‘predictive accuracy’ are often misunderstood or misinterpreted, making it difficult for the force to assess a tool’s real-world benefits.

“Furthermore, capability development is largely driven by data science, with comparatively little focus on the underlying conceptual framework, criminological theory or legal requirements.”

Commissioned by the Centre for Data Ethics and Innovation (CDEI), the independent study into the use of data analytics by police forces in England and Wales, with a focus on algorithmic bias, involved engaging closely with senior police officers, government officials, academics, legal experts, regulatory and oversight bodies and civil society organisations.

The project revealed widespread concern across the UK law enforcement community regarding the lack of official national guidance for the use of algorithms in policing, with respondents suggesting that this gap should be addressed as a matter of urgency.

The report authors said police use of algorithms has expanded significantly in scale and complexity, driven by a marked increase in the volume and complexity of digital data, ongoing austerity measures and an expectation that the police service adopt a preventative, rather than reactive, posture, with greater emphasis on anticipating potential harm before it occurs.

However, the use of advanced algorithms, predictive analytics and ‘data scoring’ tools raises various legal and ethical concerns, they explain, with human rights implications not always considered at the outset.

“The deployment of such technology as a direct response to resourcing constraints prompts significant questions regarding necessity and proportionality: in some cases, it could be argued that the use of such tools would not be ‘necessary’ if the police force had the resources needed to deploy a non-technological solution to the problem at hand, which may be less intrusive in terms of its use of personal data.”

The report recommends police forces conduct an integrated impact assessment – covering data protection, human rights, discrimination risk, empirical accuracy and operational effectiveness, as well as any other relevant legal requirements – before any new police analytics project takes place, to establish whether there is a clear justification for using the tool.

It said the National Police Chiefs’ Council (NPCC) and the Home Office, with input from the College of Policing, should produce nationally approved guidelines to ensure the legitimate development and deployment of statistical algorithms for policing. And Her Majesty’s Inspectorate of Constabulary and Fire and Rescue Services (HMICFRS) should inspect forces’ compliance against these standards as part of crime data integrity inspections.

The NPCC and Association of Police and Crime Commissioners should establish a national coordinating group for data analytics, which “should maintain a high-level catalogue of all algorithms used by police forces nationwide to inform operational decision-making”, the report recommends. The group should also assess the feasibility of establishing a mechanism for police forces to access a centralised team of specialist legal advisers, in the same way that government departments can access specialist legal advice via the Government Legal Department.

Other recommendations include: changing terminology from ‘predictive policing’ or ‘risk assessment’ tools to ‘classification and prioritisation systems’; HMICFRS establishing an External Reference Group for police use of data analytics drawing on expertise of the Information Commissioner’s Office and the Equality and Human Rights Commission; and conducting further empirical research to assess the extent to which racial bias in police use of algorithms occurs in practice in the UK.

NPCC lead for Information Management, Commissioner Ian Dyson, said: “Data-driven technology can help us to keep the public safe. Police chiefs recognise the need for guidelines to ensure legal and ethical development of new technologies and to build confidence in their ongoing use.

“We welcome the report’s contribution and we will work closely with government, the College of Policing and regulators to consider their recommendations.”
