Case study

Machine learning helps flag issues with police forces sooner 

New proof-of-concept tool uses crime data to provide an early warning of potential problems, enabling quicker action from inspectors. 

His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) regularly inspects and monitors police forces using the PEEL (police effectiveness, efficiency and legitimacy) assessment. 

A police force is escalated into enhanced monitoring, also known as Engage, when a serious problem is found and the force's response is not deemed sufficient. However, by this stage, service to the public may already have been affected.

HMICFRS approached the Accelerated Capability Environment (ACE) to see if it could create an early-warning predictor tool to estimate PEEL assessment grades before inspections. This would allow HMICFRS to prioritise its inspection programme, visiting flagged forces more urgently with the aim of resolving problems – and making communities safer – sooner.

Applying machine learning to crime data 

A decision was taken to focus on one of the PEEL assessment questions: how well forces investigate crime. Working with supplier The London Data Company, the team pulled together a proof of concept for a machine-learning algorithm in just eight weeks.

This used publicly available data on levels of crime and crime outcomes from sources such as 999 calls, the Home Office and the Office for National Statistics. The tool correctly predicted a force's PEEL grade in around 60% of cases, and was within one grade (either higher or lower) in around 90% of cases.
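The two headline figures are accuracy measures on an ordinal grading scale: an exact-match rate and a within-one-grade rate. A minimal sketch of how such measures could be computed, assuming the standard five-point PEEL grade labels and entirely hypothetical predictions (this is not the actual tool or its data):

```python
# Illustrative sketch only: the example predictions below are hypothetical,
# not HMICFRS data. It shows how "correct grade" and "within one grade"
# rates can be computed for an ordinal grading scale.

# PEEL grades form an ordered scale; map each label to an integer rank.
GRADES = ["inadequate", "requires improvement", "adequate", "good", "outstanding"]
RANK = {g: i for i, g in enumerate(GRADES)}

def grade_accuracy(predicted, actual):
    """Return (exact-match rate, within-one-grade rate) for two grade lists."""
    assert len(predicted) == len(actual) and actual
    exact = sum(p == a for p, a in zip(predicted, actual))
    within_one = sum(abs(RANK[p] - RANK[a]) <= 1 for p, a in zip(predicted, actual))
    n = len(actual)
    return exact / n, within_one / n

# Hypothetical example: model predictions vs inspection outcomes for five forces.
predicted = ["good", "adequate", "requires improvement", "good", "adequate"]
actual    = ["good", "good",     "requires improvement", "outstanding", "inadequate"]

exact_rate, near_rate = grade_accuracy(predicted, actual)
print(f"exact: {exact_rate:.0%}, within one grade: {near_rate:.0%}")
# → exact: 40%, within one grade: 80%
```

Treating the grades as an ordered scale, rather than unrelated categories, is what makes the "within one grade" figure meaningful: a prediction of "good" for an "outstanding" force is a near miss, not an outright error.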

Jacquie Hayes, HMICFRS insight portfolio director, said: “Our inspection process looks at a massive amount of data from a force, and broadly this tool is coming to a very similar conclusion.

“We are now exploring what more we can do with the data that we collect, as well as what other PEEL questions we could expand this to.”

HMICFRS, which has an ambition to be more data-driven, is now working with The London Data Company on how it can deploy the initial demonstrator tool into its live systems and overall inspection process over the next 18 months. 

In terms of potential next applications, Hayes said: “Fire and rescue is also on the list – but it’s a very long list, because we would like to do a lot of things with it! 

“You can’t replace our inspection teams with artificial intelligence, but we can certainly think about what this means for how we inspect, and I think this will have an implication on that.”

She added: “This tells us where there might be a problem, but not why there might be a problem, which is why we’d then want to follow up with inspection activity to check if that is right, what the source of the problem is and what the force is doing about it.  

“We want to make communities safer, so it’s not just about pointing out problems, it’s trying to support forces to improve those as well.”

Published 22 April 2024