This document discusses the use of sentencing software called COMPAS in courts to assess the recidivism risk of criminal defendants. It notes that an investigation by ProPublica found COMPAS to be biased against black defendants by incorrectly classifying them as having a higher risk of reoffending compared to white defendants. While COMPAS claims its algorithms are not biased, several studies have found COMPAS to be no more accurate than human judges in predicting recidivism and that it has low success rates, especially for predicting violent crime recidivism. There are concerns about ethical issues arising from the use of algorithms to help determine criminal sentences.
John Philip Monera, Leonil Ocana

Sentencing Software: software used by judges in sentencing hearings to assess recidivism risk. COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) raises a number of ethical dilemmas.
• In May, the investigative news organization ProPublica claimed that COMPAS is biased against black defendants. Northpointe, the Michigan-based company that created the tool, released its own report questioning ProPublica's analysis. ProPublica rebutted the rebuttal, academic researchers entered the fray, this newspaper's Wonkblog weighed in, and even the Wisconsin Supreme Court cited the controversy in its recent ruling that upheld the use of COMPAS in sentencing.
• ProPublica points out that among defendants who ultimately did not reoffend, blacks were more than twice as likely as whites to be classified as medium or high risk (42 percent vs. 22 percent). Even though these defendants did not go on to commit a crime, they were nonetheless subjected to harsher treatment by the courts. ProPublica argues that a fair algorithm cannot make these serious errors more frequently for one racial group than for another.
[Figure: distribution of defendants across risk categories by race. Black defendants reoffended at a higher rate than whites, and accordingly a higher proportion of black defendants are deemed medium or high risk. As a result, blacks who do not reoffend are also more likely to be classified higher risk than whites who do not reoffend.]
• Using AI in investigations and sentencing could potentially save time and money.
• COMPAS's algorithms have been found to be no more effective overall than human judgment. A study conducted by a researcher at Dartmouth College determined that humans could predict whether a criminal would reoffend just as accurately as COMPAS. Another study, by Rutgers University, reaffirmed COMPAS's notably low success rates, especially when predicting whether someone who committed a violent crime would reoffend.
• COMPAS involves inputting the answers to over 100 questions about a person's history, covering a variety of subjects including prior offenses, family, and even social life.
• COMPAS is intended as a means to guide courts in their sentencing.
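The 42 percent vs. 22 percent disparity above is a comparison of false positive rates by group: among defendants who did not reoffend, what share were nonetheless labeled medium or high risk. A minimal sketch of that calculation, using made-up illustrative records rather than the real COMPAS data:

```python
def false_positive_rate(records):
    """Share of defendants who did NOT reoffend but were still labeled
    medium/high risk -- a 'false positive' in ProPublica's sense."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r["high_risk"])
    return flagged / len(non_reoffenders)

# Hypothetical records: (risk label assigned, actual reoffense outcome).
# These numbers are invented to show the mechanics, not drawn from COMPAS.
group_a = [{"high_risk": h, "reoffended": r}
           for h, r in [(True, False), (True, False), (False, False),
                        (True, True), (False, False)]]
group_b = [{"high_risk": h, "reoffended": r}
           for h, r in [(True, False), (False, False), (False, False),
                        (True, True), (False, False)]]

# A gap between these two rates is the disparity ProPublica flagged.
print(f"Group A FPR: {false_positive_rate(group_a):.2f}")  # 0.50
print(f"Group B FPR: {false_positive_rate(group_b):.2f}")  # 0.25
```

The metric deliberately conditions on the true outcome: it asks how the tool treats people who turned out to be low risk, which is why equal overall accuracy across groups can still coexist with unequal error rates.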
References:
• https://gcn.com/articles/2018/01/18/recidivism-prediction-software-flaws.aspx
• http://si410wiki.sites.uofmhosting.net/index.php/Criminal_sentencing_software
• https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-be-racist-our-analysis-is-more-cautious-than-propublicas/?noredirect=on