Minority Report is often a go-to sci-fi comparison for new technology, but retina scanners and motion controls are just side dishes to the film's central futuristic idea: a system to predict when and where crime may occur, in order to prevent it. Given that system's reliance on humans with precognitive abilities, we're unlikely to see it in the real world, but machine learning algorithms are already helping police forces around the world identify crime hotspots, with the goal of preventing crimes before they occur. The Dubai Police is the latest to have AI backup, in the form of Space Imaging Middle East's (SIME) new Crime Prediction software.
SIME's software is said to work like others already in use: machine learning algorithms are fed existing data and intelligence from police databases to identify patterns that human crime analysts might miss. The system can then use those patterns to determine where crime is likely to occur next, and inform a police force where best to deploy officers and other resources. The idea isn't to send officers out to arrest suspicious-looking people, but to use a larger police presence in an area to deter crime from happening in the first place.
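SIME hasn't published technical details, but systems of this kind are often described as scoring small geographic grid cells by risk, based on recent incident history. The snippet below is a minimal sketch of that general idea, using entirely synthetic data, made-up features (recent incident counts per cell) and an off-the-shelf logistic regression; it is illustrative only and not SIME's actual method.

```python
# Minimal sketch of grid-based hotspot scoring with synthetic data.
# All features, cell counts and the model choice are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic incident history: weekly incident counts for 400 grid cells over 52 weeks.
n_cells, n_weeks = 400, 52
counts = rng.poisson(1.5, size=(n_cells, n_weeks))

# Simple per-cell features: incidents last week, two weeks ago, and the long-run average.
X = np.column_stack([
    counts[:, -2],
    counts[:, -3],
    counts[:, :-2].mean(axis=1),
])
# Label: did the cell record any incident in the most recent week?
y = (counts[:, -1] > 0).astype(int)

# Fit a classifier to the historical pattern, then score every cell.
model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

# Surface the ten highest-risk cells as candidate areas for extra patrols.
hotspots = np.argsort(risk)[::-1][:10]
print("Highest-risk grid cells:", hotspots)
print("Predicted probabilities:", np.round(risk[hotspots], 2))
```

A real deployment would be far more involved (richer features, spatial smoothing, careful validation), but the basic loop of train on past incidents, score locations, and direct resources to the highest-scoring areas is the pattern such tools share.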
"This software is uniquely intelligent in its capability to accurately discern intricate patterns of criminal behavior in seemingly unconnected events and then predict the probability of reoccurrence," says Spandan Kar, Head of SIME's Geographic Information Systems (GIS) Division. "We are confident that these precise analytics, when combined with the knowledge and instincts of experienced police officers, will create a formidable force to deter crime."
It sounds impressive, but does it work? Police departments in various US cities have been using systems like PredPol, HunchLab and Series Finder for years, with mixed results and uneasy moral implications. After using HunchLab for 12 months, the St. Louis County Police Department expects a drop in this year's crime statistics, but it's difficult to attribute those results directly to predictive policing.
Not only are the benefits unclear, but organizations like the American Civil Liberties Union (ACLU) argue that the historical data these systems are trained on carries embedded racial biases. According to a statement released in August, the systems "are used primarily to further concentrate enforcement activities in communities that are already over-policed, rather than to meet human needs… Predictive policing tools threaten to provide a misleading and undeserved imprimatur of impartiality for an institution that desperately needs fundamental change."
SIME hasn't given many details on exactly how its system works or whether it's built to overcome some of these issues, but others, like HunchLab, are actively trying to be transparent about their inner workings. It will be interesting to see how predictive policing tools, and the laws around them, continue to evolve.
Source: Space Imaging Middle East via Businesswire