Artificial Intelligence in the Courtroom

The use of data and predictive algorithms to calculate prison sentences for convicted felons is the latest development in the criminal justice system. The practice has come to public attention thanks to a man named Eric L. Loomis. Mr. Loomis was accused of driving a vehicle that was used in a shooting in 2013. Though he ultimately pleaded guilty to eluding an officer and no contest to operating a vehicle without the owner’s consent, Mr. Loomis is appealing the six-year sentence imposed by the judge. The judge informed Mr. Loomis that he presented a “high risk” to the community and stated that the sentence was decided partly by an algorithm called the Compas assessment. Compas, a piece of software developed by Northpointe Inc., calculates the likelihood that an individual will reoffend and also suggests the type of supervision best suited to that person while incarcerated.

To use the program, the user enters information from a survey about the defendant’s personal characteristics along with facts about his past criminal conduct. The software then compares that information to reams of stored data and spits out a probability of a future offense. The parameters for Mr. Loomis might include his status as a registered sex offender, stemming from a prior third-degree sexual assault conviction, and the fact that he is a white 34-year-old male.
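Because Compas’s internals are secret, the best anyone outside Northpointe can do is sketch how a tool in this class plausibly works. Below is a minimal illustration assuming a logistic regression over survey answers; the feature names, the weights, and even the choice of model are my assumptions, not anything Northpointe has disclosed.

```python
import math

# Hypothetical feature weights. Compas's real factors and weights are
# proprietary; every name and number here is invented for illustration.
WEIGHTS = {
    "prior_convictions": 0.45,     # count of prior convictions
    "age_at_first_offense": -0.04, # older first offense -> lower risk
    "prior_sex_offense": 0.90,     # 1 if a prior sex offense, else 0
    "employment_stability": -0.30, # 1 if stably employed, else 0
}
BIAS = -1.2

def reoffense_probability(survey: dict) -> float:
    """Turn survey answers into a probability of reoffending (logistic model)."""
    score = BIAS + sum(w * survey.get(k, 0.0) for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-score))  # squash the raw score into [0, 1]

# A profile loosely modeled on the article's facts about Mr. Loomis.
defendant = {
    "prior_convictions": 2,
    "age_at_first_offense": 25,
    "prior_sex_offense": 1,
    "employment_stability": 0,
}
print(f"estimated reoffense probability: {reoffense_probability(defendant):.2f}")
```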

The issue being litigated by Loomis’s attorneys is the manner in which an offender’s score is calculated. Company officials at Northpointe say the algorithms are proprietary and will not disclose the factors considered or the weight assigned to each. The company acknowledges that the algorithm treats men differently from women and adult offenders differently from juveniles, but beyond that little is known.
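The one detail Northpointe does concede, separate treatment by sex and by age, could be implemented as simply as routing each defendant to a group-specific weight table. A sketch of that idea follows; the groups come from the article, but the mechanism and the numbers are purely hypothetical.

```python
# Northpointe concedes only that men and women, and adults and juveniles,
# are treated differently. One plausible (purely hypothetical) mechanism:
# fit and store a separate weight table per demographic group.
GROUP_WEIGHTS = {
    ("male", "adult"):      {"prior_convictions": 0.50, "bias": -1.0},
    ("female", "adult"):    {"prior_convictions": 0.35, "bias": -1.4},
    ("male", "juvenile"):   {"prior_convictions": 0.40, "bias": -1.6},
    ("female", "juvenile"): {"prior_convictions": 0.30, "bias": -1.8},
}

def pick_model(sex: str, age: int) -> dict:
    """Route a defendant to the weight table for their demographic group."""
    group = (sex, "adult" if age >= 18 else "juvenile")
    return GROUP_WEIGHTS[group]

print(pick_model("male", 34))  # the table used for a 34-year-old man
```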

Michael D. Rosenberg, counsel for Mr. Loomis, argues that his client should be allowed to peek under the hood. He says a defendant has a right to evaluate the algorithm so that he can challenge its validity as part of his defense. Mr. Rosenberg sees the black-box sentencing system as opaque and corrupt; in his brief, he writes that the Compas algorithm “is full of holes and violates the requirement that a sentence be individualized.” In response, Northpointe general manager Jeffrey Harmon says, “It’s not about looking at the algorithms. It’s about looking at the outcomes.”

Using an algorithm like Compas has obvious benefits, chief among them that it is not as prone to bias as humans are. Crucially, algorithms like these (and presumably Compas as well) do not take the defendant’s race into account. The algorithms are also efficient at weeding out very low-risk offenders, who can be removed from an already overcrowded prison system, while identifying those particularly atypical defendants for whom a more thorough pre-sentencing investigation is appropriate. In a way, the algorithms bring a degree of precision and reproducibility to a justice system that relies too heavily on human judgment. Although a judge is encouraged to use certain guidelines and recommended ranges when deciding on a sentence, there is no guarantee his thought process will be objective or even consistent. Research has shown that factors as seemingly insignificant as the time elapsed since a judge’s last meal can drastically affect parole approval rates. Unfortunately, humans are much less rational than previously thought.
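To make that triage point concrete, here is a toy example of how a risk probability (like the one sketched earlier) might be binned into supervision tiers. The cutoffs are placeholders of my own; nothing is known about the thresholds a real tool like Compas uses.

```python
def triage(probability: float) -> str:
    """Bin a reoffense probability into a coarse recommendation.

    The 0.15 and 0.85 cutoffs are arbitrary placeholders chosen for
    illustration; a real system would need to justify and validate them.
    """
    if probability < 0.15:
        return "very low risk: candidate for release or minimal supervision"
    if probability > 0.85:
        return "atypical high risk: flag for a full pre-sentencing investigation"
    return "standard pre-sentencing review"

print(triage(0.07))  # -> very low risk: candidate for release or minimal supervision
```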

The downside to these predictive algorithms is that they pigeonhole the defendant. Perhaps a convict who has finally learned his lesson won’t be given a chance at a lesser prison term because of his record. The software also has the effect of over-generalizing in a way that could unfairly punish certain types of defendants. For example, perhaps younger offenders are penalized more harshly, and a young convict is therefore unjustly scored simply because other young convicts have displayed a penchant for recidivism. Already, it is known that Compas discriminates between men and women. This means men are given higher risk scores simply for having a Y chromosome, an affliction that cannot be cured (the alleged sexism is another point raised by Loomis).

The ultimate fear is that secret algorithms could be tampered with by agents of the state. This worry might be too Orwellian to fit reality, but because the Compas source code cannot be viewed by anyone outside the company, there is no way for the court to ensure that it hasn’t been hacked or otherwise compromised.

That anxiety could be eliminated by publishing the software’s source code. This is already the case with sentencing algorithms used in the state of Pennsylvania. And officials in Utah have stated that their algorithms weigh four main factors: history of behaviors that harm others, antisocial personality patterns, attitudes and beliefs that favor crime, and association with pro-criminal peers.
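Utah’s four published factors make it easy to picture what a transparent score could look like: a simple weighted sum whose inputs anyone can audit and recompute. In the sketch below the four factors come from the article, but the 0–10 rating scale and the equal weights are placeholders of my own.

```python
# Utah's four published factors, per the article. The 0-10 rating scale and
# the equal weights are my placeholders; the point is that a published model
# lets anyone recompute and audit a score.
UTAH_FACTORS = [
    "history_of_harmful_behavior",
    "antisocial_personality_pattern",
    "pro_crime_attitudes_and_beliefs",
    "association_with_pro_criminal_peers",
]

def transparent_score(ratings: dict, weights: dict) -> float:
    """Weighted sum over the four published factors."""
    return sum(weights[f] * ratings[f] for f in UTAH_FACTORS)

ratings = {f: 5 for f in UTAH_FACTORS}     # example mid-range ratings (0-10)
weights = {f: 0.25 for f in UTAH_FACTORS}  # equal placeholder weights
print(transparent_score(ratings, weights)) # -> 5.0
```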

It’s now clear that the use of predictive algorithms can bring rigor to the punishment phase of a trial as well as to subsequent classification and parole hearings. Creating a formal system would allow judicial decisions to be made impartially, giving all convicts a fair shake. But great care must be taken to ensure that the algorithms are calibrated to minimize bias and that meaningful oversight is in place to ensure they are used properly.

The use of data science in the courtroom is the first step in systematizing a criminal justice system already prone to errors in human judgment, and a ruling by the Wisconsin Supreme Court in Mr. Loomis’s case could affirm that step. Perhaps one day, computer algorithms will help identify the cases with the highest probability of a wrongful conviction (those that rely exclusively on eyewitness accounts or problematic confessions) and even assist the courts in the adjudication process itself.
