Algorithms have been used by the police to identify crime hot spots in Memphis, Tennessee since 2005. Under the code name Operation Blue Crush, crime dropped by 24% between 2005 and 2011.
Crush stands for “Criminal Reduction Utilising Statistical History”, a form of predictive policing in which officers are guided by algorithms. Criminologists and data scientists at the University of Memphis compiled crime statistics from across the city over time and overlaid them with other datasets, such as social housing maps and outside temperatures. They then instructed algorithms to search for correlations in the data to identify crime “hot spots”, and the police flooded those areas with targeted patrols.
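The actual Crush models are not public, but the basic idea of flagging geographic “hot spots” from incident data can be sketched in a few lines. The following is a simplified, hypothetical illustration (the function name, grid size and threshold are assumptions, not Crush's real parameters): incidents are bucketed into grid cells, and any cell whose count is well above the city-wide average is flagged.

```python
# Hypothetical sketch of grid-based hot-spot detection. This is NOT
# the Crush algorithm (which is not public); it only illustrates the
# general idea of flagging areas with unusually high incident counts.
from collections import Counter

def crime_hot_spots(incidents, cell_size=0.01, threshold=2.0):
    """Bucket (lat, lon) incident coordinates into grid cells and
    flag cells whose count exceeds `threshold` times the mean."""
    counts = Counter(
        (lat // cell_size, lon // cell_size)  # grid cell the point falls in
        for lat, lon in incidents
    )
    mean = sum(counts.values()) / len(counts)
    return {cell for cell, n in counts.items() if n > threshold * mean}

# Ten incidents in one cell and one each in two others: only the
# dense cell is flagged as a hot spot.
incidents = [(35.10, -90.00)] * 10 + [(35.20, -90.10), (35.30, -90.20)]
spots = crime_hot_spots(incidents)
```

A real system would of course weight by time, overlay demographic and environmental layers, and use far more sophisticated statistics; the point is only that the output is a set of map cells for patrols to target.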
According to the Guardian, Dr Ian Brown, the associate director of Oxford University’s Cyber Security Centre, raises concerns over the use of algorithms to aid policing, as seen in Memphis where Crush’s algorithms have reportedly linked some racial groups to particular crimes: “If you have a group that is disproportionately stopped by the police, such tactics could just magnify the perception they have of being targeted.”
Can this system work here? As the Home Secretary Theresa May stated yesterday in Parliament:
Out of one million stop and searches, only 9% resulted in an arrest. So should police authorities use this or similar systems to target areas and predict crime, or does it risk creating so-called crime “hot spots” from possibly out-of-date data? There are also issues to take into account such as fairness, community confidence and the wasting of police time.
Theresa May, 2 July 2013: http://www.parliamentlive.tv/Main/Player.aspx?meetingId=13391&player=smooth
Viktor Mayer-Schönberger, professor of internet governance and regulation at the Oxford Internet Institute, also warns against humans seeing causation when an algorithm identifies a correlation in vast swaths of data.
“This transformation presents an entirely new menace: penalties based on propensities, that is, the possibility of using big-data predictions about people to judge and punish them even before they’ve acted. Doing this negates ideas of fairness, justice and free will.”
“In addition to privacy and propensity, there is a third danger. We risk falling victim to a dictatorship of data, whereby we fetishise the information, the output of our analyses, and end up misusing it. Handled responsibly, big data is a useful tool of rational decision-making. Wielded unwisely, it can become an instrument of the powerful, who may turn it into a source of repression, either by simply frustrating customers and employees or, worse, by harming citizens.”