Despite having a $12 billion budget and being located adjacent to Silicon Valley, San Francisco doesn’t always take advantage of the ways in which tech can improve civic life or the work of its city employees. (Case in point: the disaster that is the DMV.) But there is one office that is pushing the envelope and collaborating with programmers, nonprofits, and computer scientists with the vital goal of improving its criminal justice practices. Just last month District Attorney George Gascón announced that a partnership with Code for America had enabled his office to clear all old marijuana convictions made defunct with the passage of Proposition 64. And on Wednesday, he shared the news that a new collaboration with Stanford was in the works, to employ artificial intelligence as a means of mitigating implicit racial bias among his staff.
If the words “artificial intelligence” combined with “criminal justice system” give you goosebumps, you’re not alone. The system is already deeply flawed; is technology going to help or hurt us? Gascón is convinced it’s the former.
“Lady justice is depicted wearing a blindfold to signify impartiality of the law, but it is blindingly clear that the criminal justice system remains biased when it comes to race,” he says. “This technology will reduce the threat that implicit bias poses to the purity of decisions which have serious ramifications for the accused, and that will help make our system of justice more fair and just.”
The concept is fairly simple. When a district attorney receives a case and has to decide whether to move forward with charges, they get all sorts of information that could subconsciously be creating a racial bias. Often someone’s race is listed in police incident reports, but if not it can easily be deduced or assumed based on the hair and eye color listed, the neighborhood where the alleged crime took place, or even the name of the officer who made the arrest — it’s a small city, and DAs often know what districts individual police patrol.
Under this new AI system, all of that information would be redacted. An attorney would look at the basic evidence, decide whether or not to charge the case, and then move forward to an unredacted copy. If their decision then changes based on the disclosed information, they’ll have to justify it.
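In spirit, the redaction step described above amounts to masking any field that states or hints at race before the charging attorney reads the report. The sketch below illustrates that idea only; the field names and the `[REDACTED]` convention are hypothetical and are not drawn from the actual Stanford tool.

```python
# Illustrative sketch of report redaction, not the actual Stanford system.
# Fields that state race, or that could let a reader infer it (hair and eye
# color, neighborhood, arresting officer), are masked before review.

REDACTED_FIELDS = {"race", "hair_color", "eye_color", "neighborhood", "officer_name"}

def redact(report: dict) -> dict:
    """Return a copy of an incident report with bias-prone fields masked."""
    return {
        key: ("[REDACTED]" if key in REDACTED_FIELDS else value)
        for key, value in report.items()
    }

incident = {
    "charge": "burglary",
    "race": "X",
    "neighborhood": "Mission",
    "officer_name": "Ofc. Doe",
}
print(redact(incident))
```

The attorney would make an initial charging call on the redacted copy, then see the full report; the point of the two-pass design is that any reversal after unredaction has to be explained.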
It’s one small step toward mitigating our nation’s incredibly racist criminal justice system, and while it doesn’t prevent cops from arresting people of color, it could — in theory — prevent district attorneys from prosecuting them based on racial bias.[…]