A new set of algorithms, created by members of the American Civil Liberties Union (ACLU), Carnegie Mellon University (CMU), the Idaho Justice Project and the University of Pennsylvania, aims to assess the likelihood of defendants being mistreated in court, Government Technology reports, according to The Crime Report. The tool considers details that ought to be immaterial to the ruling, such as the judge’s and defendant’s gender and race, and then predicts how likely the judge is to impose an unusually long sentence.
The predictions can suggest when socio-demographic
details may sway judgments, resulting in especially punitive treatment. The
algorithms’ designers say the tool is the first to consider a defendant’s perspective.
In a recent report, the group also suggested that potentially
wronged defendants could use the second algorithm — the one assessing the
likelihood that bias played a role — to argue for reducing sentences that may
have been unfair. However, like other predictive algorithms, the tool draws on
historical data, which could limit how accurately it reflects current sentencing
practice.
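The report does not spell out the group’s methodology, but the general idea described above can be sketched in a few lines: fit a classifier that predicts whether a sentence was unusually long using only characteristics that should be legally irrelevant, then read its output as a probability for an individual case. The feature names, the choice of logistic regression, and the synthetic data below are illustrative assumptions, not the researchers’ actual model.

```python
# Hypothetical sketch of the idea described in the article: if legally
# irrelevant details (defendant/judge demographics) predict an "unusually
# long" sentence better than chance, that is a signal those details may be
# swaying outcomes. The features, labels, and model here are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Simulated case records; each column is a 0/1-encoded attribute that
# ought to be immaterial to sentencing (illustrative encoding only).
X = np.column_stack([
    rng.integers(0, 2, n),  # defendant race
    rng.integers(0, 2, n),  # defendant gender
    rng.integers(0, 2, n),  # judge race
    rng.integers(0, 2, n),  # judge gender
])
# Label: 1 if the sentence exceeded a chosen threshold for comparable
# cases ("unusually long"); simulated here rather than drawn from real data.
y = rng.integers(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Held-out accuracy meaningfully above chance would suggest the immaterial
# details carry predictive signal; per-case probabilities could then be
# cited, as the report suggests, when arguing a sentence may have been unfair.
print("held-out accuracy:", model.score(X_test, y_test))
print("P(unusually long) for one case:", model.predict_proba(X_test[:1])[0, 1])
```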