Arizona Criminal Law and Procedure Blog

Automated Risk Assessment Tools Gaining Traction in Court

Posted by Steven Sherick | Jun 16, 2017 | 0 Comments

As incarceration levels in Arizona and around the United States rise and courts struggle to cope with the number of defendants passing through their doors, many states and local governments have been turning to automated systems to streamline the processing of defendants. One controversial tool implemented in multiple states, including Arizona, is automated risk assessment.

Automated risk assessment typically takes the form of a computer algorithm that uses certain characteristics of a defendant or prisoner, such as age, sex, family background, employment status, and location, to estimate that individual's likelihood of continuing to engage in criminal behavior. In theory, these tools are designed to objectively measure a defendant's or prisoner's risk to the community. They are used by probation departments and the courts to help judges determine the length of a sentence and whether someone should receive a sentence of probation as opposed to prison.
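To illustrate the general idea, the sketch below shows how such a score might be produced: a handful of factors are each given a weight, the weights are summed, and the total is converted into a risk category. This is a simplified, hypothetical example written for this post; the factors, weights, and cutoffs are invented and do not reflect the actual formula used by COMPAS or any other commercial product.

    # Hypothetical risk-scoring sketch -- the factors, weights, and cutoffs
    # here are invented for illustration; real commercial tools keep theirs secret.
    def risk_score(age, prior_offenses, is_employed, has_stable_housing):
        score = 0
        score += 3 if age < 25 else 1            # younger defendants weighted as higher risk
        score += 2 * prior_offenses              # each prior offense adds to the score
        score += 0 if is_employed else 2         # unemployment treated as a risk factor
        score += 0 if has_stable_housing else 2  # unstable housing treated as a risk factor
        if score <= 3:
            return "low risk"
        elif score <= 7:
            return "medium risk"
        return "high risk"

    # A 22-year-old with two prior offenses, no job, and stable housing:
    print(risk_score(age=22, prior_offenses=2, is_employed=False, has_stable_housing=True))  # "high risk"

Even in this toy version, the output is driven entirely by which factors are chosen and how heavily each one is weighted, which is exactly the information the vendors of real tools decline to disclose.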

Automated risk assessment programs are sometimes provided by private “criminal justice” companies, sold for profit, and kept proprietary. In other words, “The developers tend to view their technologies as trade secrets. As a result, they often refuse to disclose details about how their tools work, even to criminal defendants and their attorneys, even under a protective order, even in the controlled context of a criminal proceeding or parole hearing.” This means that the mechanisms and reasoning behind the numerical representation of risk are typically not shared with the people who are being convicted, their lawyers, or even the judge. In fact, in 2015, a California appeals court upheld a ruling that the source code for TrueAllele, a proprietary forensic analysis program, could remain concealed from the defense.

Flaws in Automated Risk Assessment

A judge's assessment of a defendant's risk of repeating a crime has always been part of the sentencing process. This is why second or third offenses usually carry heavier penalties and longer jail time. However, with the introduction of automated processes, defendants and their lawyers have a harder time convincing a judge that other, perhaps unmeasurable or situational, factors should be considered. In addition, an analysis of the technology by the independent news organization ProPublica found that these risk assessment tools are both inaccurate and racially biased.

When ProPublica assessed a commonly used technology, COMPAS, in Florida convictions, it found that black offenders were almost twice as likely as white offenders to be flagged as future criminals when all other factors were equal. Furthermore, it found that the technology was not accurate when predicting an offender's likelihood of committing future violent crimes: “only 20 percent of the people predicted to commit violent crimes actually went on to do so.” Inaccuracy may be due to a failure to input certain information, the way the technology weighs different factors more or less heavily, or simple computer error. Whatever the origin of these inaccuracies, the practice of relying on computer-generated risk assessments may jeopardize a defendant's ability to stand before the court as a unique individual.
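To put the quoted 20 percent figure in concrete terms, the short calculation below uses hypothetical numbers, not ProPublica's underlying data, to show what that accuracy rate implies about false positives.

    # Hypothetical illustration of the quoted 20 percent figure.
    flagged_as_violent = 1000                             # defendants predicted to commit future violent crimes
    actually_reoffended = int(flagged_as_violent * 0.20)  # only 20 percent went on to do so
    false_positives = flagged_as_violent - actually_reoffended
    print(f"{false_positives} of {flagged_as_violent} flagged defendants committed no future violent crime")
    # -> 800 of 1000 flagged defendants committed no future violent crime

In other words, if the quoted rate held, four out of every five people labeled as likely violent reoffenders would carry that label into sentencing despite never going on to commit a violent crime.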

About the Author

Steven Sherick

Steve Sherick is a "near native" Arizonan, having lived most of his life in Tucson. He received his undergraduate and legal education at the University of Arizona. Since 1980 he has devoted all of his practice to criminal law, always in defense of those accused of crimes. He received his Board ...


