Brian's Blog

The longest-running television crime drama was Law & Order, which aired for twenty consecutive seasons. I would argue that the effectiveness of the series had a lot to do with the balance of airtime between the “law” and the “order” sides of the criminal justice system. So, as a continuation of my blog series, Conversations on Big Data, I am picking up a theme from prior blogs where I discussed how real-life police – the “Order” half of the legal system – use quantitative analytics as key evidence to convince a jury to send the crooks away for a long, long time.

Today’s blog addresses the other half of the system – the “Law” – expanding on a conversation that I had with Gerald Ray, the Deputy Executive Director of the Office of Appellate Operations at the Social Security Administration. Gerald’s interest in applying analytics to the law started when he was in law school. He explains that his professors “would have us read cases and then extract some maxims, … as if there was logic to it all.” He went on to say, “but what I also saw is … divergent opinions from different judges.” A sitting judge has demonstrated expertise built on years of training and experience. Yet different judges, at similar levels of the judicial system and with very similar training and experience, can and do render different outcomes.

Gerald discovered that different outcomes sometimes happen because of “process errors.” He concluded that “if we could be a little more systematic, we could … give better feedback and change the behavior.” Acting on this hypothesis, Gerald spearheaded a program of continuous improvement at SSA, applied specifically to policy compliance. Performance metrics improved as the team’s process improvements enabled them to do more work. I provide more details in a prior blog post, “Applying Analytics to Continuous Process Improvement.”

This demonstrated program success paved the way for expanding analytics as a legal tool beyond policy compliance. Gerald explored techniques such as natural language processing (NLP), text mining, and predictive analytics to leverage the SSA’s other, non-structured data types. He initiated a program that applies K-means clustering to mine claims data and look for relationships among process errors. This early initiative has already given Gerald enough evidence to cite a correlation between the expression of “homicidal ideation” (remember, this was a conversation with a lawyer) and cases associated with drug or alcohol problems. Gerald says this “small example” is important because the legal issues generated by the threatening behavior might be prevented. “We’d rather not push them down that path because they really can’t control necessarily what they’re saying.”
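To make the approach concrete, here is a minimal sketch of that kind of K-means text clustering in Python with scikit-learn. The claim notes and the cluster count are hypothetical stand-ins; nothing here reflects the SSA’s actual data or pipeline.

```python
# Hypothetical sketch: cluster free-text claim notes with K-means and
# inspect which terms characterize each cluster. The sample notes and
# the choice of k are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

claim_notes = [
    "claimant expressed homicidal ideation during the hearing",
    "history of alcohol dependence, missed two appointments",
    "back injury claim, no behavioral health notes",
    "drug use reported, threatening statements toward staff",
    "routine disability review, paperwork incomplete",
]

# Basic NLP / text-mining step: convert free text to TF-IDF features.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(claim_notes)

# Group similar claims; k=2 is arbitrary for this toy example.
kmeans = KMeans(n_clusters=2, random_state=0, n_init=10)
labels = kmeans.fit_predict(X)

# List the top terms nearest each cluster center: this is where a pattern
# such as "homicidal ideation" co-occurring with drug or alcohol language
# could surface for a human reviewer to investigate.
terms = vectorizer.get_feature_names_out()
for i, center in enumerate(kmeans.cluster_centers_):
    top_terms = [terms[j] for j in center.argsort()[::-1][:5]]
    print(f"cluster {i}: {top_terms}")
```

In practice one would tune the number of clusters, clean and de-identify the text, and treat any correlation that surfaces as a lead for human review rather than a conclusion.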

Gerald says that there are three important components needed to leverage quantitative analytics in the legal profession:

1. Problem
2. Data scientists and subject matter experts
3. Solution

The solution, “which is often in the data,” typically requires a change in process and behavior. Gerald says that “it’s easier for me to find the problem than to get people to change their behavior.” Successful programs work with staff so that the numbers help make the “solution” easier to understand and more intuitive.

For example, visualization tools are incredibly useful for demonstrating “that it’s better to do it this way than that way,” Gerald says. He uses heat maps to publish policy compliance monitoring reports, noting that “when people see rows and columns and numbers, their eyes glaze over unless they’re accountants.” “But if I can show patterns from the visualizations … if something is different and it jumps out in the visualizations – using color and things to make it pop – then people can see that instantly.” This results in a higher degree of buy-in. Sounds like a session in the situation room of your favorite legal drama, doesn’t it?
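As a hedged illustration of that kind of compliance heat map, here is a short Python/matplotlib sketch. The offices, policy categories, and error rates are invented for the example; they are not SSA figures.

```python
# Hypothetical sketch: render a policy-compliance heat map so outliers
# "pop" visually instead of hiding in rows and columns of numbers.
# All offices, categories, and rates below are made up for illustration.
import numpy as np
import matplotlib.pyplot as plt

offices = ["Region A", "Region B", "Region C", "Region D"]
policy_areas = ["Evidence", "Notice", "Hearings", "Timeliness"]

# Fraction of cases with a process error, by office and policy area.
error_rates = np.array([
    [0.02, 0.05, 0.01, 0.08],
    [0.04, 0.03, 0.07, 0.02],
    [0.01, 0.09, 0.02, 0.03],
    [0.06, 0.02, 0.04, 0.05],
])

fig, ax = plt.subplots()
im = ax.imshow(error_rates, cmap="Reds")  # darker cells = more errors
ax.set_xticks(range(len(policy_areas)))
ax.set_xticklabels(policy_areas)
ax.set_yticks(range(len(offices)))
ax.set_yticklabels(offices)
fig.colorbar(im, ax=ax, label="Process error rate")
ax.set_title("Policy compliance monitoring (illustrative data)")
plt.tight_layout()
plt.show()
```

The point is the design choice Gerald describes: the same numbers could be published as a table, but color mapping lets an unusually high cell jump out at a glance.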

In my prior blog on my conversation with Gerald, he talked about the importance of feedback for “Applying Analytics to Continuous Process Improvement.” By sharing that feedback with stakeholders through visualizations, one can drive real change by encouraging conformance to an optimal process or outcome.

Another influence on Gerald is historian James Burke, who teaches that all events are ultimately interconnected. For example, the steam engine was developed for the mining industry, and its success there helped trigger the Industrial Revolution. Gerald sees the use of data analytics and mathematical tools in the legal system as another Burke-like interconnection, one we also see in the growing use of video, audio, and image data in criminal investigations and prosecutions.

For the complete audio interview, please visit: http://ourpublicservice.org/OPS/events/bigdata/

___

This article represents the views of the author only, and the information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.
