Legal Risk Assessment Tools

In much of the United States, bail is "pay to play", and a lack of sufficient funds can mean that those who can't afford bail lose their jobs, their homes, their lives - all while they wait for trial. There are a variety of alternative systems being deployed across the country right now; risk assessment tools are one of them.

Risk assessment tools are quite controversial. These tools are algorithms, created by private companies, that attempt to assess the flight risk of someone who has been accused of a crime. In California, Governor Jerry Brown signed a law that will make risk assessment tools the law of the land starting in 2019. These algorithms are pitched as being better than judges at determining flight risk without implicit bias; the thinking is that machines can't see race or class, so they can assess defendants with less subconscious judgement than a human judge - the supposition being that they will err less.

The problem with this thinking is quite simple - machines don't exist in a vacuum. These algorithms were created by human beings, using data generated by machines programmed by human beings - everything always comes back to us. When the data you're using to predict risk is already biased against a particular group, that bias ends up baked into your algorithm - as happened with the COMPAS risk assessment tool, which unfairly discriminated against black defendants. This may seem bizarre - after all, you wouldn't tell your risk assessment tool the race of the defendant. Systemic bias, however, persisted through the data.

For a fuller treatment of the potential problems with risk assessment tools, EFF's article on pre-trial risk assessment is a great start. The focus there is on how demographic data, among other types of data, can cause machines to unfairly target a particular group, even if that group is never part of the data fed into the algorithm. In other words, the machine can figure out that you're black, and unfairly target you, even though it's never told that you're black. The EFF also implores us to consider what "success" with risk assessment tools would mean. After all, if you simply deny bail to 100% of defendants, there's no flight risk at all - but that certainly can't be considered a success.
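The proxy effect described above can be sketched in a few lines of Python. This is a toy model with made-up numbers, not a reconstruction of any real tool: two groups have identical true reoffense rates, but historical over-policing concentrated one group in one neighborhood, and the "algorithm" only ever sees the neighborhood.

```python
# Hypothetical illustration: a classifier that never sees the protected
# attribute can still discriminate through a correlated proxy feature.

def flag_high_risk(record):
    # The "algorithm" sees only the neighborhood, never the group.
    return record["neighborhood"] == "heavily_policed"

# Synthetic population: both groups reoffend at the same 10% rate in
# every neighborhood, but group A mostly lives where policing is heaviest.
population = []
for group, n_policed, n_other in [("A", 800, 200), ("B", 200, 800)]:
    for neighborhood, count in [("heavily_policed", n_policed),
                                ("other", n_other)]:
        for i in range(count):
            population.append({"group": group,
                               "neighborhood": neighborhood,
                               "reoffends": i % 10 == 0})  # 10% everywhere

def false_positive_rate(group):
    """Share of people in `group` who do NOT reoffend yet get flagged."""
    flagged = innocent = 0
    for r in population:
        if r["group"] == group and not r["reoffends"]:
            innocent += 1
            flagged += flag_high_risk(r)
    return flagged / innocent

print(false_positive_rate("A"))  # 0.8 - flagged four times as often...
print(false_positive_rate("B"))  # 0.2 - ...despite identical true risk
```

Even though `group` never enters the decision, group A's innocent members are flagged at four times the rate of group B's, purely because neighborhood encodes group membership - which is the mechanism the EFF warns about.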

It's easy to envision a nightmarish dystopia in which risk assessment tools are used in a Minority Report-style police state - we all hope it doesn't come to that. These tools, and the prospect of asking judges to reassess your client's flight risk, are among the reasons it's so important to make all of your court appearances. Should an emergency render you unable to attend a hearing, hiring an appearance attorney has never been easier.