Secret sauce and sentencing? Say it isn’t so!

Could you spend a long time in prison due to a software bug and not have the right to examine the software? Possibly.

Those of us in technology understand that we make mistakes, a truth we don't like to admit to customers. What happens, though, when a mistake can lead to tragic consequences?

Yesterday's New York Times reports on a case that the U.S. Supreme Court may soon hear, involving a man who received a six-year prison sentence, in part due to a computer program. The software, known as Compas, was developed by Northpointe Inc. (though a search for the company now redirects to Equivant) to assess the risk a person poses upon reentry into society. Such data-driven analysis is vaguely reminiscent of the movie Minority Report. In this case, the defendant, Eric L. Loomis, was not allowed to examine the software that assessed him to be a significant risk to the community, even though at least one analysis has shown that the software may embed some form of racial bias. The company argues that the algorithm used to make the sentencing recommendation is proprietary and therefore should not be subject to review: releasing it to scrutiny would essentially give away the business model. They may have a point. Patents on such technology may be flimsy, and in any event they eventually expire. To protect themselves, companies like this one turn to another legal tool, the trade secret, which has no fixed term of protection.
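To see why source access matters, consider a minimal, hypothetical sketch in Python. Compas's actual features, weights, and model form are trade secrets, so everything below is invented; the point is only that a single weight on an innocuous-looking proxy input could skew scores in a way that no amount of staring at the output would reveal.

```python
# A deliberately simplified, hypothetical risk-score sketch.
# Compas's real features and weights are trade secrets; everything
# here is invented to illustrate why source access matters.
import math

# Hypothetical feature weights. A single weight on a proxy variable
# (here, a neighborhood code that may correlate with race) could skew
# scores, and nothing in the program's *output* would reveal it.
WEIGHTS = {
    "prior_arrests": 0.45,
    "age_at_first_offense": -0.03,
    "employed": -0.60,
    "neighborhood_code": 0.80,  # innocuous-looking proxy input
}
BIAS = -1.5

def risk_score(features: dict) -> float:
    """Return a 0-1 'risk of reoffending' from a logistic model."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

defendant = {
    "prior_arrests": 2,
    "age_at_first_offense": 19,
    "employed": 1,
    "neighborhood_code": 3,
}
print(f"risk: {risk_score(defendant):.2f}")  # the only thing a court sees
```

A court sees only the final number; whether the neighborhood weight belongs there at all is exactly the kind of question that can only be answered by reading the source.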

One can't say that a mistake was made in Mr. Loomis's case, but neither can one authoritatively state that the program is formally correct. The Wisconsin Supreme Court creatively argued that, much as in college admissions, the software may be used so long as it is just one input weighed alongside others. Is it, then, any different from a potentially flawed witness giving evidence? The questions here are whether those who wrote the software can be cross-examined, to what extent they may be questioned, and whether the software itself can be examined. Mr. Loomis argues that denying his legal team access to the source violates his 14th Amendment right to due process.

We know from recent experience that blind trust in technology, and more precisely in those who create and maintain it, can lead to bad outcomes. Take, for instance, the more than 20,000 people whose convictions were thrown out because a state crime lab chemist falsified drug test results, or the cases in which the FBI Crime Lab just flat got it wrong. Even Brad D. Schimel, the Wisconsin attorney general, conceded before the appeals court that “The use of risk assessments by sentencing courts is a novel issue, which needs time for further percolation.” But what about Mr. Loomis, and others who may suffer tainted results if there is a software problem?

While the Supreme Court could rule on the matter soon, it has only limited avenues available, such as permitting or prohibiting the software's use. Congress may need to get involved to provide other alternatives. One possibility would be to grant the company some new form of intellectual property protection, such as an extended patent with additional means of enforcement (e.g., higher penalties for infringement or lower thresholds for discovery), in exchange for releasing the source. Even then, one question is whether defendants could game the system so as to score better at sentencing. How great that risk is we can't know without knowing what the inputs to the algorithm are; a sketch of the concern follows.
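As a purely hypothetical illustration of that gaming risk, suppose the questionnaire weights were published. A defendant could then search the self-reported answers for the combination that minimizes the score. The question names and weights below are invented for the sketch; fixed court records, like prior arrests, are assumed to be outside the defendant's control.

```python
# Hypothetical sketch of "gaming" a published model: search the
# self-reported questionnaire answers for the combination that
# minimizes the risk score. All names and weights are invented.
from itertools import product

# Answers the defendant controls (0 = no, 1 = yes); records such as
# prior arrests are fixed by the court and excluded here.
SELF_REPORTED = {
    "feels_anger_often": [0, 1],
    "friends_with_offenders": [0, 1],
    "stable_housing": [0, 1],
}
WEIGHTS = {
    "feels_anger_often": 0.7,
    "friends_with_offenders": 0.5,
    "stable_housing": -0.6,
}

def partial_score(answers: dict) -> float:
    # Contribution of the self-reported answers to a linear score.
    return sum(WEIGHTS[k] * v for k, v in answers.items())

names = list(SELF_REPORTED)
best = min(
    (dict(zip(names, combo)) for combo in product(*SELF_REPORTED.values())),
    key=partial_score,
)
print("lowest-risk answers:", best)  # unverifiable self-reports win
```

Notice that only the unverifiable self-reports can be gamed, which suggests the real danger lies in how heavily a model leans on answers no one can check.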

It is probably not sufficient for the defendant and his legal team alone to have access to the source, precisely because more research is needed in this field to validate the models that software like Compas uses. That research can't happen unless researchers have the same access.
