
Durham Police to use artificial intelligence to aid custody decisions

The AI software is set to be used as a reference tool for some cases.

Police in Durham are set to begin using a new artificial intelligence tool to help officers decide whether a suspect should be kept in custody.

The Harm Assessment Risk Tool (Hart) has been trained with data from five years of offending histories, and uses the information to classify suspects as low, medium or high risk of offending if released.
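As a rough illustration of the kind of system described, the sketch below trains a three-class risk classifier on synthetic data. The features, the data and the choice of model are assumptions made for illustration, not details of Hart itself.

```python
# A sketch of a three-class risk classifier; every feature, value and
# the model choice here are illustrative assumptions, not details of
# the actual Hart system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-in for five years of offending histories: each row is a suspect,
# with hypothetical features such as prior arrests, age and offence count.
X = rng.integers(0, 20, size=(1000, 3))
y = rng.choice(["low", "medium", "high"], size=1000)  # known outcomes

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Classify a new suspect as low, medium or high risk.
print(model.predict([[3, 17, 1]]))
```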

Police officer (Joe Giddens/PA)

It was first tested in 2013, and its classifications of suspects as low risk were accurate 98% of the time, while its classifications of suspects as high risk were accurate 88% of the time.
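One way to read those figures is as per-class accuracy: of the suspects the model labelled low risk, what fraction actually turned out to be low risk. The snippet below shows how such a figure might be computed on made-up labels; this is an interpretation of the quoted numbers, not the force's actual evaluation method.

```python
# Per-class accuracy on toy data: of the cases predicted "low", how many
# were actually low? The labels below are invented for illustration.
import numpy as np

def accuracy_for_predicted_class(predicted, actual, risk_class):
    predicted, actual = np.asarray(predicted), np.asarray(actual)
    mask = predicted == risk_class
    return (actual[mask] == risk_class).mean()

predicted = ["low", "low", "low", "high", "high"]
actual    = ["low", "low", "medium", "high", "medium"]
print(accuracy_for_predicted_class(predicted, actual, "low"))   # ~0.67
print(accuracy_for_predicted_class(predicted, actual, "high"))  # 0.5
```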

Durham Constabulary’s head of criminal justice, Sheena Urwin, told the BBC: “I imagine in the next two to three months we’ll probably make it a live tool to support officers’ decision making.”

During the latest experiment, officers will refer to the system in randomly selected cases so that its impact can be contrasted with cases where it is not used.
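A minimal sketch of that trial design might randomly assign each incoming case either to a group in which officers consult the tool or to a control group in which they do not; the case identifiers here are invented.

```python
# Randomly split incoming cases between a group where officers consult
# the tool and a control group where they do not. Case IDs are invented.
import random

random.seed(42)
cases = [f"case_{i}" for i in range(6)]
groups = {case: random.choice(["consult_tool", "control"]) for case in cases}

for case, group in groups.items():
    print(case, group)
```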

Handcuffs (David Cheskin/PA)

The system has been programmed to err on the side of caution, and will most likely classify suspects as a medium or high risk in order to avoid suggesting the release of someone who may commit a crime.
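One simple way a system could be made to err on the side of caution is to demand high confidence before issuing a low-risk label, while flagging high risk on comparatively weak evidence. The thresholds below are invented purely for illustration and are not taken from Hart.

```python
# Sketch of "erring on the side of caution": rather than picking the
# most probable class, require strong confidence before labelling a
# suspect low risk. Thresholds are invented for illustration.
def cautious_label(p_low, p_medium, p_high):
    if p_low >= 0.9:    # only call "low" when very confident
        return "low"
    if p_high >= 0.3:   # flag "high" on comparatively weak evidence
        return "high"
    return "medium"

print(cautious_label(0.70, 0.20, 0.10))  # "medium": 0.70 is not enough for "low"
print(cautious_label(0.40, 0.25, 0.35))  # "high": a 0.35 high-risk score triggers caution
```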

However, the tool is limited in that it works only with data from Durham Constabulary, so it cannot take into account crimes committed outside the force's area.

Questions over the viability of artificial intelligence (AI) as a decision-making tool are still being raised.

Experts have raised concerns about possible bias within AI systems and the impact such bias could have.

Laptop keyboard (Dominic Lipinski/PA)

Some have argued that because the humans programming this software are themselves inherently biased, whether they realise it or not, AI technology cannot be completely unbiased.

An investigation in the US last year claimed that an algorithm used to determine the likelihood of a suspect committing another crime in the future had a racial bias.

It was claimed the algorithm made more negative forecasts about black suspects than white suspects, something denied by those behind the system.

Durham Police said they plan to use their system as an advisory tool, with discretion remaining with the officers involved.