Officials in Pennsylvania, which has been slowly preparing
to use risk assessment in sentencing for the past six years, are sensitive to
these potential pitfalls, Bloomberg reported. The state’s experience shows how tricky it is to
create an algorithm through the public policy process. To come up with a
politically palatable risk tool, Pennsylvania established a sentencing
commission. It quickly rejected commercial products like Compas, saying they
were too expensive and too mysterious, so the commission began creating its own
system.
To understand the algorithms being used all over the
country, it’s good to talk to Richard Berk. He’s been writing them for decades. Berk, a
professor at the University of Pennsylvania, is a shortish, bald guy whose
solid stature and I-dare-you-to-disagree-with-me demeanor might lead people to
mistake him for an ex-cop. In fact, he’s a career statistician.
Race was discarded immediately as an input. But every other
factor became a matter of debate. When the state initially wanted to include
location, which it determined to be statistically useful in predicting who
would re-offend, the Pennsylvania Association of Criminal Defense Lawyers
argued that it was a proxy for race, given patterns of housing segregation. The
commission eventually dropped the use of location. Also in question: the
system’s use of arrests, instead of convictions, since it seems to punish
people who live in communities that are policed more aggressively.
Berk argues that eliminating sensitive factors weakens the
predictive power of the algorithms. “If you want me to do a totally
race-neutral forecast, you’ve got to tell me what variables you’re going to
allow me to use, and nobody can, because everything is confounded with race and
gender,” he said.
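Berk’s point about confounding can be illustrated with a toy simulation. The sketch below is purely hypothetical, not Pennsylvania’s tool or Berk’s model: it generates synthetic data in which a “location” feature is correlated with race, trains a logistic regression that never sees race, and shows that the resulting risk scores still differ by race through the proxy. All variable names and numbers are invented for illustration.

```python
# Hypothetical sketch of proxy leakage: race is excluded from the model,
# but a correlated feature ("location") carries the signal anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Synthetic sensitive attribute, never shown to the model.
race = rng.integers(0, 2, size=n)

# Location index correlated with race (e.g., housing segregation).
location = 0.8 * race + rng.normal(0.0, 0.5, size=n)

# Outcome statistically tied to location (e.g., heavier policing there).
prior_record = rng.normal(0.0, 1.0, size=n)
p = 1.0 / (1.0 + np.exp(-(0.5 * prior_record + 0.7 * location)))
reoffend = rng.binomial(1, p)

# "Race-neutral" model: race is dropped, but the proxy stays in.
X = np.column_stack([prior_record, location])
scores = LogisticRegression().fit(X, reoffend).predict_proba(X)[:, 1]

# Average risk score still differs between groups the model never saw.
print(scores[race == 1].mean() - scores[race == 0].mean())
```

In this toy setup, dropping the proxy as well would remove the disparity but also discard predictive signal, which is the trade-off Berk describes.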
To read more CLICK HERE