Criminal sentencing software

From SI410

Criminal sentencing software refers to non-human tools and algorithms used to determine the length and severity of a criminal’s sentence. The software typically takes a number of factors about the criminal, runs them through an algorithm, and presents the result as a recommendation to a judge. Increased reliance on criminal sentencing software over the last 20 years has raised ethical concerns: the algorithms in these products have been shown to exhibit biases against certain ethnic groups, which has become a source of controversy among judges and policymakers.


History

Risk assessment systems have been in place for many years, but computerized forms have only existed for the last 20-30 years. Early criminal sentencing software was developed starting in the mid-1990s, when police officials in Placer County, California worked with Miriam Liskin, a computer scientist, to create California CrimeTime. This software took input from the user about a criminal offense and displayed sentencing information from others who had committed similar crimes. Over the years, a number of features were added to enhance functionality, which led to the expansion of CrimeTime beyond California and across the United States; the New York Prosecutors Training Institute, for example, uses the software for educational purposes. While newer, more complex criminal sentencing software now exists, CrimeTime is still very prevalent and is used in approximately 90% of California’s counties.[1]


COMPAS

Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS, is a criminal sentencing software program that, unlike CrimeTime, uses a machine learning algorithm to estimate the risk of a criminal reoffending. It is owned by Equivant, a privately held company, as part of its Northpointe software suite.[2] COMPAS takes as input the answers to over 100 questions about a person’s history, drawn from a variety of subjects including offenses, family, and even social life.[3]

COMPAS was launched in 2009 in partnership with the California Department of Corrections and Rehabilitation as part of the Public Safety and Offender Rehabilitation Services Act of 2007.[4]

The COMPAS Assessment Tool Fact Sheet lists the following factors taken into consideration as a part of its algorithms:[5]

  1. Educational-vocational-financial deficits and achievement skills
  2. Anti-social attitudes and beliefs
  3. Anti-social and pro-criminal associates and isolation
  4. Temperament and impulsiveness (weak self-control) factors
  5. Familial-marital-dysfunctional relationship (lack of nurturance-caring and/or monitoring-supervision)
  6. Alcohol and other drug disorders
  7. Deviant sexual preferences and arousal patterns

Ethical implications


Equivant (Northpointe) has dismissed requests to reveal the inner workings of COMPAS, and its algorithms remain undisclosed to this day. Opponents of sentencing software argue that these algorithms must be properly explained for defendants’ due process rights to be upheld. In State v. Loomis, a Wisconsin man was sentenced to six years in prison;[6] his sentence was determined with the assistance of COMPAS.[7] He challenged the decision, arguing that the “black box” model of criminal sentencing is unfair. The Wisconsin Supreme Court rejected the challenge,[8] stating that judges should understand the shortcomings of the software but may still use it for guidance.[9]

Racial Bias

Another controversy involves the apparent racial bias that sentencing software algorithms exhibit. ProPublica conducted extensive research using a large sample of criminal defendants in Broward County, Florida, and showed that COMPAS consistently rated African Americans as notably higher risks to reoffend than white Americans, even when other factors were normalized.[10] ProPublica also found that the algorithm was wrong in different ways for different races: black defendants were more likely to be labeled high-risk yet not actually reoffend, while white defendants were more likely to be labeled low-risk yet go on to reoffend.[11] Northpointe responded to the ProPublica study and disputed these results.[12]


COMPAS’s algorithms have been found to be no more effective overall than human judgment. A study conducted at Dartmouth College determined that humans could predict whether a criminal would reoffend just as accurately as COMPAS.[13] Another study, from Rutgers University, reaffirmed the notably low success rates of COMPAS, especially when predicting whether someone who had committed a violent crime would reoffend.[14]

Recent developments

In response to the COMPAS controversies, there have been attempts to build criminal sentencing software on established machine learning techniques that perform well without showing evidence of racial bias. A collaboration between MIT and Stanford University produced the Supersparse Linear Integer Model, or SLIM, which reduces the assessment to only a few questions for efficiency while still performing as well as software like COMPAS.[15] Another method, developed at Cornell, sorts criminals into categories and estimates reoffending rates within each smaller pool, producing more accurate results.[16]

While applications of machine learning algorithms to criminal sentencing are still under research, they offer an early answer to the continuing controversies surrounding COMPAS.


References

  1. “Who Is PlacerGroup, What Is CrimeTime?” The Placer Group.
  2. “Case Management for Supervision.” Equivant.
  3. Weill, Kelly. “Computer Program That Calculates Prison Sentences Is Even More Racist Than Humans, Study Finds.” The Daily Beast, The Daily Beast Company, 21 Jan. 2018.
  6. “State v. Loomis.” Harvard Law Review.
  8. Israni, Ellora Thadaney. “When an Algorithm Helps Send You to Prison.” The New York Times, 26 Oct. 2017.
  9. Liptak, Adam. “Sent to Prison by a Software Program's Secret Algorithms.” The New York Times, 1 May 2017.
  10. Israni, Ellora Thadaney. “When an Algorithm Helps Send You to Prison.” The New York Times, 26 Oct. 2017.
  11. Spielkamp, Matthias. “We Need to Shine More Light on Algorithms so They Can Help Reduce Bias, Not Perpetuate It.” MIT Technology Review, 16 June 2017.
  12. Angwin, Julia, et al. “Machine Bias.” ProPublica.
  13. Matacic, Catherine, et al. “In the United States, Computers Help Decide Who Goes to Jail. But Their Judgment May Be No Better than Ours.” Science | AAAS, 18 Jan. 2018.
  14. Herrschaft, Bryn A. “Evaluating the Reliability and Validity of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) Tool: Implications for Community Corrections Policy.” Rutgers University Community Repository.