Risk-assessment tools are on the rise in courts across the country, causing a fierce debate over whether justice should be meted out via algorithm.
On Sunday, The Marshall Project published a piece of commentary by Adam Neufeld, a senior fellow at Georgetown Law’s Institute for Technology Policy and Law and the Beeck Center for Social Impact & Innovation. In it, Neufeld comes down in favor of risk-assessment tools, arguing that algorithms can help the criminal justice system, “but only alongside thoughtful humans.”
To add to Neufeld’s thoughtful commentary, The Marshall Project solicited takes from other leading voices in the field. Here’s what they had to say.
Elizabeth Glazer, director of the New York City Mayor's Office of Criminal Justice.
New York City has achieved the lowest incarceration rate of any big city in the nation, even as we’ve kept our crime rates below national averages. Risk-assessment instruments have been a major factor in that achievement. In New York, they help judges make fair decisions while keeping the city safe. Together with researchers from across the country, prosecutors, defenders, judges, police, advocates, and others, we are now updating our risk-assessment tool to improve its accuracy. Through the smart use of data and technology, New York City hopes to continue its path to creating a criminal justice system that is smaller, safer, and fairer.
Hannah Jane Sassaman, Soros Justice Fellow and policy director at the Media Mobilizing Project.
Almost all risk-assessment tools use criminal justice data as a proxy for crime. Most forecast future arrest, which in effect predicts law enforcement behavior. We know that certain communities, especially communities of color, are disproportionately over-policed, more likely to be over-charged by prosecutors, and forced into pleas that result in convictions. Risk-assessment tools must empirically account for those disparities in the criminal justice system and its data. Instead of building that bias into risk-assessment tools, we should define and measure community risk and needs in partnership with the people most impacted by crime: crime survivors and individuals who live in over-policed communities. With community-led accountability, testing, and an oversight mechanism, we can help uncover bias and correct for it.
Jon Wool, director of the Vera Institute of Justice’s New Orleans office.
In criminal justice reform, one of the most critical questions one can ask is, “compared to what?” Critics of risk-based algorithms rightly point out that these tools, which depend heavily on prior convictions as a factor, can reify racial disparities that permeate the system. For that reason, many conclude that risk-based algorithms are undesirable. But, again, one must ask, “compared to what?” Most systems presently in use decide whether a person is held in jail or released to enjoy their right to pretrial liberty based on a single factor: whether they have the money to buy their freedom. That is an enormous, avoidable driver of racial disparity, one that can be mitigated by shifting judges’ focus toward risk-based decision-making and away from money-based detention.