In 2014, Eric Holder, then the U.S. attorney general, articulated the uncertainty swirling around risk assessment tools in a speech given to the National Association of Criminal Defense Lawyers’ 57th Annual Meeting, reported the ABA Journal.
“Although these [risk assessment] measures were crafted with the best of intentions, I am concerned that they may inadvertently undermine our efforts to ensure individualized and equal justice,” he said. “They may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”
Angel Ilarraza, director of consulting and business development at Northpointe Inc., the Michigan-based company that created Compas, thinks that this concern is ill-founded. “There’s no secret sauce to what we do; it’s just not clearly understood,” Ilarraza says.
Compas uses an algorithm (a term Ilarraza avoids because he thinks it is confusing) that assesses 137 questions answered by the charged person, supplemented by his or her criminal records. These inputs are plugged into the algorithm, a set order of operations like a math equation. Based on this process, the person’s likelihood of committing a future crime (the output) is pegged on a scale of 1 (low risk) to 10 (high risk). Beyond Wisconsin, Compas is also used in California, Michigan and New York, among other jurisdictions.
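Compas’s actual questions, weights and formula are proprietary, but the general shape of a questionnaire-driven score can be sketched: answers are encoded as numbers, combined by a fixed set of operations, and the result is ranked against a reference population to yield a 1 (low risk) to 10 (high risk) decile. The weights, answers and reference scores below are invented for illustration only.

```python
def risk_decile(answers, weights, population_scores):
    """Compute a raw weighted score from questionnaire answers, then rank
    it against a reference population's raw scores to get a 1-10 decile.
    (Illustrative only; Compas's real inputs and math are not public.)"""
    raw = sum(w * a for w, a in zip(weights, answers))
    # Decile: how much of the reference population scored below this raw score.
    below = sum(1 for s in population_scores if s < raw)
    decile = int(below / len(population_scores) * 10) + 1
    return min(decile, 10)

# Hypothetical example: five yes/no answers, hand-picked weights,
# and a small reference population of prior raw scores.
answers = [1, 0, 1, 1, 0]
weights = [2.0, 1.5, 3.0, 0.5, 1.0]
population = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
print(risk_decile(answers, weights, population))  # prints 6
```

The point of the ranking step is that the output is relative: the same raw score maps to a different decile depending on the population the tool was normed against.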
The questionnaire runs the gamut of a person’s criminal history and personal background as a way to decipher risk. Questions include whether an alleged offender experienced his or her parents’ divorce or has a telephone at home, and whether the screener thinks the defendant is a suspected or admitted gang member.
Ilarraza, supporting the Wisconsin Supreme Court view, is quick to point out that the tool is meant to inform decision-making. “It facilitates the implementation of evidence-based practices,” he says.
Christine Remington, the Wisconsin assistant attorney general who argued Loomis for the state in the supreme court, agrees. “I don’t think there’s any question that [Compas] is a good thing,” she says. It allows the corrections department to “tailor limited resources in the best way possible.”
Compas recently came under scrutiny by ProPublica, an investigative journalism organization. Assessing the tool’s outputs in Broward County, Florida, ProPublica found that it was 61 percent predictive of rearrest, “somewhat more accurate than a coin flip.” The algorithm was likely to indicate black defendants as “future criminals” at almost twice the rate as white defendants.
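The two figures in ProPublica’s analysis measure different things: overall predictive accuracy is one question, and how errors are distributed across groups is another. The toy numbers below are invented, not ProPublica’s data; they simply show the kind of group-by-group false-positive comparison at issue, where one group is wrongly flagged as high risk far more often among people who were never rearrested.

```python
def false_positive_rate(records):
    """records: list of (predicted_high_risk, actually_rearrested) pairs.
    Returns the share of people NOT rearrested who were flagged high risk."""
    fp = sum(1 for pred, actual in records if pred and not actual)
    negatives = sum(1 for _, actual in records if not actual)
    return fp / negatives

# Hypothetical groups: among people not rearrested, group A is flagged
# high-risk 4 times out of 10; group B, 2 times out of 10.
group_a = [(True, False)] * 4 + [(False, False)] * 6
group_b = [(True, False)] * 2 + [(False, False)] * 8
print(false_positive_rate(group_a))  # prints 0.4
print(false_positive_rate(group_b))  # prints 0.2
```

A disparity like this can coexist with identical headline accuracy for both groups, which is why the two sides of the ProPublica–Northpointe dispute can each point to different statistics.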
Northpointe disputes ProPublica’s findings. The back-and-forth can be read in full on ProPublica’s website.
This clash illustrates a newfound popular interest in these tools. But using math to guide decision-making in the criminal justice system is not new. According to Richard Berk, a professor of criminology and statistics at the University of Pennsylvania, an Illinois parole board started to use algorithms in the 1920s.
“In the ‘20s, parole boards were worried about what parole boards are worried about today: If I release somebody, are they going to commit a horrible act?” Berk explains. Back then, the tools were simple mathematical tabulations that assessed risk by comparing people up for parole to those previously released.
Since then, the math behind these tools has become more accurate, and technological advances allow statisticians to wrestle with bigger data sets on computers. But the point remains: U.S. criminal justice systems have used math to guide decision-making for about a century.
Even with this history, how these tools affect defendants’ rights to equal protection and due process remains unresolved.