Digital justice tool found to have racial bias

Tags: governance
Categories: law

What? A study of an algorithmic tool used by many U.S. states to predict a violent offender's likelihood of reoffending found that it erroneously assigned disproportionately higher risk scores to Black defendants and disproportionately lower scores to white defendants. The study, by the investigative journalism non-profit ProPublica, analyzed results from a widely used digital risk assessment tool whose makers claim it can accurately predict the recidivism of violent offenders. ProPublica found that the tool accurately predicted violent recidivism in at most 61 percent of cases, only slightly better than chance.
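ProPublica's methodology article describes comparing error rates across racial groups: how often each group was falsely flagged as high risk (false positives) versus falsely rated low risk (false negatives). The sketch below illustrates that kind of comparison in Python. The group names and outcome records are hypothetical placeholders, not ProPublica's data.

    # A minimal sketch of the error-rate comparison ProPublica describes.
    # Each record is a (predicted_high_risk, reoffended) pair.
    # All data below is hypothetical, for illustration only.

    def error_rates(records):
        """Return (false positive rate, false negative rate) for a group."""
        fp = sum(1 for pred, actual in records if pred and not actual)
        fn = sum(1 for pred, actual in records if not pred and actual)
        negatives = sum(1 for _, actual in records if not actual)
        positives = sum(1 for _, actual in records if actual)
        return fp / negatives, fn / positives

    # Hypothetical outcomes for two groups of defendants.
    groups = {
        "group_a": [(True, False), (True, True), (False, False), (True, False)],
        "group_b": [(False, False), (True, True), (False, True), (False, False)],
    }

    for name, records in groups.items():
        fpr, fnr = error_rates(records)
        print(f"{name}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")

A large gap in false positive rates between groups, even when overall accuracy is similar, is the pattern of disparity the study reported.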

So what? Far from being a quick fix to systemic problems, the use of algorithmic decision aids in highly complex fields such as law and governance may create additional problems that echo long-standing issues these fields already grapple with, such as racial bias.

Sources: ProPublica - Machine Bias; ProPublica - How We Decided to Test Racial Bias in Algorithms


Date modified: 2017-03-29