Moral AI in the News: AI in the Legal System

Trust in predictive software has reached a level where the legal system has adopted it in various forms to assist in crime prevention. AI systems are being used to help decide bail before trial, determine the likelihood of individuals' recidivism, and design effective policing strategies to deter and accurately anticipate crime. These systems have been credited with reducing crime rates in America and internationally, but they have also been the subject of debate.

Questions of how strict courts should be in an effort to stop crime – at the risk of excessively punishing those who are falsely accused or will not reoffend – have always been relevant to legal systems, and are of particular interest as machines that can be programmed to make consistently lenient or severe decisions become increasingly common.

Further, both those who advocate for judicial AI and those who oppose its adoption worry about bias: humans are notorious for bringing prejudice into their decisions, which, especially in the law, can create unfair conditions that perpetuate disparities among different groups. AI has the potential to make fewer irrational errors than humans because it can be told not to take race, for example, into account in its decisions. However, the technology is also likely to learn and reinforce bias, because the software is trained on human decisions made in a world where past human decisions have shaped the criminal landscape. Researchers, reporters, police officers, and judges are all working to develop AI that can accurately predict and therefore reduce crime, and that can do so in a way that does not unfairly target particular demographics.

Official website and description of the Public Safety Assessment tool from the Laura and John Arnold Foundation, which is being used prominently to make bail decisions in numerous US jurisdictions:

Description from data-driven experts partnering with the Laura and John Arnold Foundation:

1/12 Paper explains various technical issues that will arise as AI judicial support systems develop:

11/1/13 Article gives an overview of how analytic predictive software can be used to prevent crime by anticipating recidivism:

2/21/14 Article discusses the Arnold Foundation PSA’s use and success in Camden:

8/15/14 Published dissertation gives a detailed description of bail decision-making strategies in the UK court system, as well as various possible systems that could be automated, or eventually made artificially intelligent, to determine a convict's likelihood of breaking bail:

9/9/15 Article describes some ways that AI recidivism-prediction systems operate transparently and discusses problems with black-box-style systems:

5/23/16 Article provides thorough investigation of the factors AI and related tools use to make criminal risk judgments and the biases these systems have:

6/7/16 Interview with the creators of the Risk Terrain Modeling Diagnostics Utility, in which they describe how their software can assess which urban spatial features coincide with crime rates, and how this can be used to reduce crime by modifying urban features and adopting selective policing strategies:

6/26/16 Article describes the Arnold Foundation PSA and discusses various places where it is being used and its success rates:

8/2/16 Article gives an overview of the goals of using AI in determining bail amounts, as well as where such AI is starting to be used and the existing shortcomings in the systems:

10/7/16 Article details ways in which AI comes to be racially biased in various contexts, from beauty assessment to criminal law:

12/28/16 Article reports on Oakland police opinions of computer-based predictive policing and how it affects public trust in the police force, especially with respect to issues of race:

2/25/17 Article discusses built-in sexism and racism in personal assistant software and crime-fighting AI respectively, and references the ProPublica article linked above:

3/6/17 Article addresses bail-determining AI in relation to racial inequities in the criminal justice system:

4/17 Article describes various ways in which AI could be used to replace lawyers, especially given the lack of legal assistance available to the poor:
