LESSON 2.2: BIAS IN CRIMINAL LAW - THE CASE OF COMPAS
The COMPAS software (Correctional Offender Management Profiling for Alternative Sanctions) is a risk assessment tool used in U.S. courts to evaluate the likelihood that a defendant will reoffend. Its algorithm assesses general recidivism risk, violent recidivism potential, and the risk of pretrial misconduct, aiming to support judicial decisions around sentencing, parole, and bail. Although COMPAS does not directly consider race in its calculations, a 2016 ProPublica investigation revealed significant racial disparities in its predictions.
ProPublica’s analysis found that COMPAS’s errors fell unevenly across racial groups. Among defendants who did not go on to reoffend, 45% of Black defendants had been labeled high risk, nearly twice the rate for white defendants (23%). The opposite error was also skewed: among defendants who did reoffend, 48% of white defendants had been labeled low risk, compared with 28% of Black defendants. Even when controlling for factors such as prior crimes, age, and gender, Black defendants were 45% more likely than white defendants to be assigned a higher risk score, and 77% more likely to be flagged as at higher risk of violent recidivism. The sketch below shows how these group-wise error rates are computed.
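To make the comparison concrete, here is a minimal Python sketch using made-up example rows (not the actual COMPAS data) of the kind of group-wise error rates behind ProPublica’s figures: a false positive is a defendant labeled high risk who did not reoffend, and a false negative is a defendant labeled low risk who did.

# Illustrative sketch on hypothetical records, not the real COMPAS dataset.
records = [
    # (group, predicted_high_risk, reoffended) -- made-up example rows
    ("Black", True,  False),
    ("Black", True,  True),
    ("Black", False, False),
    ("white", False, True),
    ("white", True,  True),
    ("white", False, False),
]

def error_rates(records, group):
    rows = [r for r in records if r[0] == group]
    # False positive rate: P(labeled high risk | did not reoffend)
    non_reoffenders = [r for r in rows if not r[2]]
    fpr = sum(r[1] for r in non_reoffenders) / len(non_reoffenders)
    # False negative rate: P(labeled low risk | did reoffend)
    reoffenders = [r for r in rows if r[2]]
    fnr = sum(not r[1] for r in reoffenders) / len(reoffenders)
    return fpr, fnr

for g in ("Black", "white"):
    fpr, fnr = error_rates(records, g)
    print(f"{g}: false positive rate = {fpr:.0%}, false negative rate = {fnr:.0%}")

Run on the full dataset ProPublica released, this style of calculation produces the 45%/23% and 48%/28% disparities described above.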
Some critics later challenged ProPublica’s findings, arguing that the investigation misinterpreted COMPAS’s results and implied that all actuarial risk assessments are inherently biased. However, subsequent research found that COMPAS is no more accurate at predicting recidivism than untrained people recruited online, raising serious questions about the validity and reliability of using such algorithms in legal decision-making.
This example underscores the potential dangers of using biased algorithms in criminal law. Even if race is not an explicit variable in an algorithm, systemic biases embedded in other data points—like socioeconomic status, zip code, or arrest records—can lead to racially biased outcomes. In critical applications like criminal sentencing, the implications of algorithmic bias are profound, potentially leading to unfair treatment and perpetuating historical injustices. This case highlights the urgent need for transparency, rigorous testing, and ethical oversight in the deployment of AI systems in the justice system.
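The proxy-variable mechanism can be illustrated with a toy simulation. The numbers below are assumed purely for illustration: both groups have the same underlying offense behavior, but one group is policed more heavily, so its prior arrest counts run higher. A risk rule that never sees the group label and relies only on arrests then flags that group as high risk more often.

# Toy simulation with assumed parameters, illustrating proxy bias only.
import random

random.seed(0)

def simulate_defendant(group):
    # Identical underlying behavior for both groups...
    offenses = random.randint(0, 3)
    # ...but an assumed higher arrest rate per offense in over-policed areas.
    arrest_prob = 0.9 if group == "A" else 0.5
    arrests = sum(random.random() < arrest_prob for _ in range(offenses))
    return arrests

def high_risk(arrests):
    # A "race-blind" rule: the score never sees the group label.
    return arrests >= 2

for group in ("A", "B"):
    flagged = sum(high_risk(simulate_defendant(group)) for _ in range(10_000))
    print(f"group {group}: {flagged / 10_000:.0%} flagged high risk")

Because arrests measure enforcement as well as behavior, excluding race from the inputs does not by itself make the outputs race-neutral; this is the core point of the paragraph above.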
Hao, K., & Stray, J. (2019). Can you make AI fairer than a judge? Play our courtroom algorithm game. MIT Technology Review. Link
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2022). Machine bias. In Ethics of data and analytics (pp. 254-264). Auerbach Publications. Link
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica. Link
Yong, E. (2018). A popular algorithm is no better at predicting crimes than random people. The Atlantic, January 17, 2018. Link
Larson, J., Mattu, S., Kirchner, L., & Angwin, J. (2016). How we analyzed the COMPAS recidivism algorithm. ProPublica. Link
Park, A. L. (2019). Injustice ex machina: Predictive algorithms in criminal sentencing. UCLA Law Review, 19. Link