Several prominent academic mathematicians want to sever ties with police departments across the U.S., according to a letter submitted to Notices of the American Mathematical Society on June 15. The letter arrived weeks after widespread protests against police brutality, and has inspired over 1,500 other researchers to join the boycott.
These mathematicians are urging fellow researchers to stop all work related to predictive policing software, which broadly includes any data analytics tools that use historical data to help forecast future crime, potential offenders, and victims. The technology is supposed to use probability to help police departments tailor their neighborhood coverage so it puts officers in the right place at the right time.
"Given the structural racism and brutality in U.S. policing, we do not believe that mathematicians should be collaborating with police departments in this manner," the authors write in the letter. "It is simply too easy to create a 'scientific' veneer for racism. Please join us in committing to not collaborating with police. It is, at this moment, the very least we can do as a community."
I'm against police brutality in all forms. In fact, I've had many debates with a cop friend of mine--I don't even want my small Northern California city to have the military-surplus vehicle (an MRAP) it has, because of the potential for abuse. But when using data to deter crime is called a "'scientific' veneer for racism," you know that politics, not helping others, has taken over.
That can include statistical or machine learning algorithms that rely on police records detailing the time, location, and nature of past crimes in a bid to predict if, when, where, and by whom future infractions may be committed. In theory, this should help authorities use resources more wisely and spend more time policing the neighborhoods where they expect higher crime rates.
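The simplest version of the approach described above can be sketched in a few lines: count historical incidents per city grid cell and flag the most frequent cells as patrol "hotspots." The grid-cell names and records below are hypothetical, and real systems are far more elaborate, but the core logic is the same:

```python
from collections import Counter

def hotspot_scores(incidents):
    """Score each grid cell by its count of past incident records.

    incidents: list of (cell, hour) tuples from historical crime reports.
    """
    return Counter(cell for cell, _hour in incidents)

def top_hotspots(incidents, k=2):
    """Return the k cells with the most historical incidents."""
    return [cell for cell, _count in hotspot_scores(incidents).most_common(k)]

# Hypothetical historical records: (grid cell, hour of day)
history = [("A3", 22), ("A3", 23), ("B1", 2), ("A3", 21), ("B1", 1), ("C7", 14)]
print(top_hotspots(history))  # prints ['A3', 'B1']
```

Note that the model only sees *reported* incidents, which is exactly the critics' point: if historical enforcement was concentrated in certain neighborhoods, a frequency count like this sends patrols back to the same places.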
Predictive policing is not the same thing as facial recognition technology, which is more often used after a crime is committed to attempt to identify a perpetrator. Police may use these technologies together, but they are fundamentally different.
I'm all for algorithms that help prevent crime and catch perpetrators, but I'm against facial recognition technology. The latter's potential for abuse is too high for my liking.
If the accuracy of predictive policing models is low, then don't rely on them until they get better. But if the concern is not accuracy but rather the racial results, then the problem isn't the mathematics.
... "In general, there are lots of people, many whom I know personally, who wouldn't call the cops," he says, "because they're justifiably terrified about what might happen when the cops do arrive..."
No sweat. You get robbed or raped, call BLM.