The Black Box of Bail Algorithms: One Sensible Solution
Edited by: Kelly Cullen

Much-needed legislation that reins in the use of algorithms to determine who stays in jail pending trial and who gets out recently passed the Idaho legislature. House Bill 118, sponsored by Representative Greg Chaney (R-Caldwell), is first-of-its-kind legislation that addresses inherent flaws in the criminal justice system. The bill calls for the algorithms to be transparent and certified free of bias. It's a good start toward addressing a problem that afflicts not only Idaho but the entire nation.

What are these algorithms, and how do they work? They start with data on individuals who have previously been involved in the criminal justice system and break it down into categories. A typical risk assessment tool uses around ten factors, such as the number of prior convictions in a person's criminal history, but some use as many as 200. Age, for example, correlates strongly with the probability that a defendant will flee or commit a new crime while out on bail. (Age is also one of the strongest predictors in criminal justice generally: individuals typically "age out" of crime as they get older.) Regression analysis is then performed on the data to reveal mathematical correlations between the factors and those outcomes. Once the analysis is complete, the tool produces an overall risk score, which is given to judges.
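To make the mechanics concrete, here is a minimal sketch in Python of the kind of regression-and-scoring pipeline described above. The three factors, the synthetic data, and the risk cutoffs are hypothetical illustrations only, not any vendor's actual factor list, weights, or score scale.

```python
# A minimal sketch of a pretrial risk tool, assuming a hypothetical
# three-factor model; real vendors' factors and weights are proprietary.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical historical records: each row is one past defendant.
X = np.column_stack([
    rng.integers(18, 70, size=500),  # age at arrest
    rng.integers(0, 10, size=500),   # prior convictions
    rng.integers(0, 5, size=500),    # prior failures to appear
])
# 1 = fled or re-offended while on bail, 0 = did not (synthetic labels).
y = rng.integers(0, 2, size=500)

# Regression over the historical data yields the correlations the
# article describes: each factor receives a weight.
model = LogisticRegression().fit(X, y)

# A new defendant's factors are scored against those weights, and the
# resulting probability is binned into a risk level handed to the judge.
defendant = np.array([[22, 3, 1]])
p = model.predict_proba(defendant)[0, 1]
risk_level = "high" if p > 0.66 else "moderate" if p > 0.33 else "low"
print(f"estimated probability {p:.2f} -> risk level: {risk_level}")
```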

There are a number of fundamental problems with the algorithms. Generally, their analyses are shared with neither the public nor criminal defendants, who never get to see the data used to build the tools. Nor are they allowed to examine the regression analysis or perform even basic checks on the data, such as confirming the math.

As to why this is the case, the reasons lie in the murky world of corporate self-interest. In many instances, the algorithms, including COMPAS and the tool developed by Arnold Ventures, LLC, are built by for-profit entities. These entities fear that others will steal their information and build competing algorithms, taking them out of the game. The algorithms are also often constructed from information gleaned from FBI crime files, as well as state and local criminal history databases, and their builders and users claim that existing laws prohibit disclosure of such information to the public. In addition, these same proprietors assert that the algorithms are trade secrets and thus protected from disclosure.

Another issue is that the majority of algorithms used around the country have never been tested for racial or other bias. Arnold Ventures had the research institute RTI International perform a study, which, unsurprisingly, concluded that its risk assessment tool was free of bias. Conversely, researchers from ProPublica found just the opposite for the COMPAS tool, damningly concluding that it was, in fact, "biased against blacks."
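For readers curious what such a test looks like in practice, here is a minimal sketch in Python of the kind of false-positive-rate comparison ProPublica performed: among defendants who did not re-offend, how often did the tool label each group high risk? The groups, labels, and records below are hypothetical, not ProPublica's actual data or code.

```python
# A minimal sketch of a disparity check, assuming hypothetical audit data.
import pandas as pd

# Each row: a defendant's group, the tool's label, and the observed outcome.
df = pd.DataFrame({
    "group":      ["A", "A", "A", "A", "B", "B", "B", "B"],
    "high_risk":  [1,   0,   0,   1,   1,   1,   0,   1],
    "reoffended": [1,   0,   0,   1,   0,   0,   0,   1],
})

# False positive rate per group: among defendants who did NOT re-offend,
# what share did the tool nevertheless label high risk?
fpr = (df[df["reoffended"] == 0]
       .groupby("group")["high_risk"]
       .mean())
print(fpr)  # group A: 0.00, group B: 0.67 -- the kind of gap ProPublica reported
```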

To my knowledge, few risk assessment tools have ever been tested for racial bias. Despite their deployment in dozens of jurisdictions across the country, from Virginia to California, rarely, if ever, has anyone bothered to check what should be a foundational property of these algorithms. Because of this systemic oversight, we cannot really know whether the tools are inherently biased, though there is evidence to suggest they are.

Yet another problem is that risk assessment algorithms don't seem to deliver on what they promise. They have been widely touted as effective at decreasing pretrial incarceration and reducing new crimes committed while on bail. However, after examining a variety of such tools, a paper published by the Minnesota Law Review concluded that they have little to no positive effect on the system and, in a substantial number of cases, even a negative one.

Idaho's House Bill 118 attempts to solve these problems by first requiring that the algorithms be free of bias before they are used. Even more important, it calls for making the data behind them open-source. This will have a profound effect across the country, as it will allow researchers to conduct bona fide analyses of whether these ubiquitous tools truly work and whether they address the glaring problem of racial bias in our criminal justice system.

HB 118's call for nationwide transparency echoes recommendations made more than two years ago by New York University's AI Now Institute, which declared, "Core public agencies, such as those responsible for criminal justice, healthcare, welfare, and education (e.g., 'high stakes' domains) should no longer use 'black box' AI and algorithmic systems."

As the debate on algorithms grows louder—even as their use grows disturbingly more widespread—policy-makers would be wise to take a page out of the Gem State’s playbook and carefully consider all the facts. If they do, they will also conclude that use of these black-box systems must come to an end, once and for all.

Jeff Clayton has worked on the bail issue in both the public and private spheres, for the Colorado State Courts and Probation Department and as General Counsel for the Professional Bail Agents of Colorado. Mr. Clayton currently serves as the Executive Director of the American Bail Coalition.


Suggested citation: Jeff Clayton, The Black Box of Bail Algorithms: One Sensible Solution, JURIST – Academic Commentary, Mar. 14, 2019, https://www.jurist.org/commentary/2019/03/jeff-clayton-bail-algorithm/.


Editor's Note: We have made some minor corrections: removing language suggesting that RTI International was not a legitimate research institution and adding a link to their home page, indicating that we have found additional risk assessment tools that have been tested for racial bias, and clarifying Arnold Ventures' status as an LLC, as opposed to the Arnold Foundation.



This article was prepared for publication by Kelly Cullen, JURIST Managing Editor. Please direct any questions or comments to him at commentary@jurist.org.


Opinions expressed in JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of JURIST's editors, staff, donors or the University of Pittsburgh.