A police department in the United Kingdom is testing an AI-powered system that could potentially help solve cold cases by condensing decades of detective work into mere hours, Sky News reports.
But there's no word yet on the accuracy of this Australian-developed platform, dubbed Soze, which is a major concern: AI models tend to spit out wildly incorrect results or hallucinate made-up information.
The Avon and Somerset Police Department, which covers parts of South West England, is putting the program through its paces by having Soze scan and analyze emails, social media accounts, video, financial statements, and other documents.
Sky reports that the AI was able to scan evidence from 27 "complex" cases in about 30 hours, the equivalent of a staggering 81 years of human work.
No wonder the department is interested in the program: those numbers sound like a force multiplier on steroids, making the tool especially enticing to law enforcement agencies stretched thin on personnel and budget.
"You might have a cold case review that just looks impossible because of the amount of material there and feed it into a system like this which can just ingest it, then give you an assessment of it," the UK's National Police Chiefs' Council chairman Gavin Stephens told Sky. "I can see that being really, really helpful."
Minority Report
Another AI project Stephens referenced is a database of knives and swords, weapons that suspects in the United Kingdom have used to attack, maim, or kill victims.
Stephens seems optimistic about these AI tools being rolled out soon, but it'd make sense to validate that they're working properly first. AI, perhaps especially in law enforcement, is notoriously error-prone and can lead to false positives.
One model used to predict a suspect's likelihood of committing another crime in the future was both inaccurate and biased against Black people, a scenario straight out of Philip K. Dick's short story "The Minority Report," later adapted into the 2002 Steven Spielberg movie.
AI facial recognition can also lead to false arrests, with minorities repeatedly misidentified as perpetrators of crimes they didn't commit. These inaccuracies are concerning enough that the US Commission on Civil Rights recently criticized the use of AI in policing.
There's a perception that because machines are doing the analysis, they'll be infallible and accurate. But these systems are built on data collected by humans, who can be biased and flat-out wrong, so familiar problems are baked in from the start.