Welcome to Episode 116 of the Digital Life, a show about our adventures in the world of design and technology. I’m your host Jon Follett, and with me is founder and co-host, Dirk Knemeyer.
For our podcast today, Dirk, I thought we could discuss a little bit about the design of one particular aspect of our criminal justice system, which is the growing use of risk assessments across the country. For our listeners who may not be familiar with risk assessments, they essentially use statistical data to figure out whether someone who has committed a crime is likely to do it again. There are a variety of programs and software that do this, and it’s used in various aspects of our criminal justice system.
Right now it’s primarily used after somebody is sentenced, and they’re trying to figure out if that person can go into a rehabilitation-type program, or if they’re going to go into prison proper. It helps to streamline less violent offenders who may not need to do hard prison time and can be rehabilitated. Now there is quite a bit of controversy being kicked up, because there is a consideration in Pennsylvania for using risk assessments during the sentencing phase of the criminal system.
What that would mean is that your sentence, your prison sentence itself, could be determined by statistical data about other people who are similar to you. That has caused all kinds of hubbub, and there is a wonderful article on the FiveThirtyEight blog, done in collaboration with the Marshall Project. The title of that article is, “Should Prison Sentences Be Based On Crimes That Haven’t Been Committed Yet?” That’s a very Minority Report-style title, very provocative. I loved the Minority Report movie, but it’s a terrifying premise, locking people up for crimes they haven’t even committed yet. We’re not quite there with risk assessments, but boy, the specter of that sort of hangs over this discussion. Dirk, what’s your take on this?
I think it’s a really interesting and touchy topic. I first started to talk about some of these issues about five years ago in my first TEDx talk. On the science side, we are coming to understand human behavior and the human condition so quickly, so deeply, so thoroughly. The reality is that even going back to when we were relatively dumb about these things, it was already predictable who would commit crimes. Again, I researched this in the context of my TEDx talks, so I’m pretty well versed on it.
For example, the number one predictor of whether an individual will commit a crime is their gender. If someone is male, the chance that they will commit, and I’ll get the terms wrong because it’s been a few years since I did this research, not just any crime, but certain classes of violent crimes, crimes that involve physical violence and imposing one’s physical will over others. If you’re male, that’s the top predictor; it’s orders of magnitude more likely, just because you’re male, that you’re going to commit those crimes than if you’re female.
There are other predictors as well, a whole series of them. Even going back to when we didn’t understand the human condition nearly as well as we do now, they were able to say that this class of people, this type of people, is far more likely to commit crimes based on socioeconomic status, at the more macro level. Then even at the micro level, there are things like testosterone, the chemicals in our body, which I think were known more abstractly before, but which we’re understanding better and better now.
Say we’re born, and, well, I actually don’t know how testosterone evolves in our bodies as we grow, so maybe birth isn’t the right time to do it. But say at a certain milepost in our life we go into the doctor’s office, or, listen, in the future it’s just on our device, and we have our testosterone level checked. If it’s over a certain level, there’s a substantially better chance of certain crimes being committed by that individual than by someone whose testosterone is much lower.
What do we do with that information? Historically, we wouldn’t do anything with it. The belief is that each person, at the end of the day, has free will; they’re going to do or not do what they do, and then they’ll be punished if they do something wrong. What that ignores are the rights of the victim. Suppose we can begin to figure out the chances for certain types of people, for certain people, even certain individuals, and be able to say this person has a 62% chance of “N”. Is it still acceptable to leave that 62% chance out there as a free radical, punish it if it happens, and hope that it doesn’t? Meanwhile there’s a victim on the other end of it whose life is potentially permanently altered.
It opens a door into all kinds of moral questions, all kinds of social engineering questions, biological intrusiveness. Should males above a certain level of testosterone be allowed to continue with that, or should they take a pill, or have a surgery, because we can tell there’s a much greater chance they’ll commit a crime? It’s a huge can of worms, and it’s one we’re coming to really soon, but nobody sees it coming. I did think that article was very interesting, and it starts down that path a little bit, albeit more on the sentencing, the aftermath side, as opposed to the predictive side.
I think there are a number of elements at play in how risk assessments are being used currently. Part of the data being collected is gender; as you pointed out, being male obviously increases the disposition toward crime. So does being a previous offender, depending on the type of crime; violent crimes, perhaps, dispose you more to recidivism, things like that. Ultimately, those are all elements in a database somewhere, and they’re not necessarily tied to the individual who’s going to be receiving the sentence.
It becomes a really tough nut to crack when you’re trying to decide somebody’s fate for the next “X” number of years. You have statistical data, but you can’t see inside that person and know whether or not they’re going to commit another crime. The reason the states like risk assessments is that there are so many people incarcerated right now in the American prison systems that, frankly, we don’t have beds to house all of them. Additionally, the budget continues to increase year over year. I think Pennsylvania spends two billion dollars locking people up. America, generally speaking, likes to lock people up, and we’re spending an awful lot of money on that.
Risk assessments are seen from both the conservative and the liberal side as a possible way out of this mess. Conservatives see it as a way to lock up the truly dangerous criminals and let the folks who can be rehabilitated be rehabilitated. Whereas the liberal argument is, hey, we’re using data to remove some of the institutional bias that has become such a lightning rod in this country over the past couple of years. What’s interesting is that the intentions are good, but we can quite possibly see coming scenarios where individuals’ rights are going to be trampled on.
I think we’re going to have to go through a reframing of what individuals’ rights are. Up to this point, it has been, what’s the best way to put this? It’s been the rights of the criminal over the rights of the victim. Using murder as an example: if you murder someone, or murder a lot of people, you’re given a sentence, and when the sentence is over you’re on your way.
You have to acknowledge that you’re a felon, and there are some limits, but for the most part, if you murdered somebody in Maine you can move to New Mexico and start over, and nobody knows you’re a murderer. You may be a murderer who’s compelled to try it again, but your rights are protected. You don’t have an “M” carved into your forehead to show that you’re a murderer. That leaves the people in free society surrounding you exposed to different degrees, depending on the individual murderer. That’s always been the default.
What’s interesting is that over the last, I don’t know how recent, certainly within the 20th Century, though I don’t know how late into it, the one exception to this has been sex offenders. Sex offenders essentially do have an “S” carved into their forehead. They have to register as sex offenders whenever they move somewhere. We have, as a society, arbitrarily chosen that that class of criminal is going to be branded and additionally punished, in order to protect the rights of a potential future victim. This person, who may or may not commit a future crime, is going to be branded in ways that make it nearly impossible for them to live the rest of their life.
I don’t have particular sympathy for sex offenders, but it’s instructive that we have culturally chosen to focus on just that specific microcosm: not murderers, but sex offenders. The question is whether that type of thinking will bloom and blossom out into many more instantiations of crime and predictability. If I’m over a certain threshold of markers as a repeat offender, do I need to register? Do I need to staple a criminal flag to every document I have with a city, a state, an employer, or if I’m going to get married? I don’t know, but it’s possible. There are going to be some really interesting moments, some really interesting working through of this, as the predictability gets higher and higher.
Big data is such a cliché; it was popular five years ago and it’s still really popular. But this is a great example of where big data is so powerful. If you have the science, if you have an infrastructure you can pour a lot of data into, and it has some validity to it, suddenly you can determine some really interesting and powerful things. One of those, potentially, is the likelihood of people committing crimes again.
If you know that I have an 82% chance of committing another violent crime, then for me as a citizen, considering human rights and the rights of citizens, I want me, the 82% future violator, to be controlled. I don’t want the 82% future violator to be given that 82% chance to harm and destroy other people. That doesn’t seem equitable to me. Saying that is probably very controversial to a number of people listening to this show, which gets to the complexity of the issue.
I think it’s really interesting stuff, and we’re heading toward a whole new moral battleground that we haven’t had to deal with in our culture in the past. It’s coming fast, and it has the potential of doing much more good for society and for our citizens, but probably at the expense of individual rights that we’ve become accustomed to taking for granted. I’m fascinated to see how this all plays out.
It’s probably worth mentioning at this point that risk assessment is the name given to this type of vetting system; it’s systematized, it’s ordered, and it’s done in a certain way. But risk assessment in a more general sense has always been practiced by parole officers, and judges, and lawyers, and police officers. They do it based on expert knowledge, right? Parole boards are making decisions about whether or not to release somebody. Police officers are making decisions about whether or not to arrest somebody, or pull somebody over. Judges are making decisions about sentencing based on circumstances they’re evaluating with their expert opinion.
None of that, in particular, is transparent to the public. The reasons that feed into all of those decisions, the ones that lead to someone being incarcerated or let out, are not auditable; they’re not visible. One of the things the risk assessment tool does, whether it’s paper or software, is expose some of the decision making around this process, and it provides a numerical value, of course. It does pull from statistical information, so in that respect it might not be as nuanced as a judge’s assessment of an individual’s family situation, or things they’ve done in the past.
But it does provide a trail that we, as public citizens, can easily see and begin to evaluate: why are we incarcerating certain people and not others, or releasing certain people and not others? I think an interesting byproduct of this discussion is the idea that, hopefully, some additional transparency comes along with risk assessment.
One other point I think is worth exploring is that in light of this additional transparency, there’s a possibility of exposing certain trends that may very well be biased within our criminal justice system. If minorities are more likely to be arrested at a traffic stop, and that arrest record is used statistically to inform sentencing, you hope that we would begin to see a trend in the way the risk assessments are used. Say, aha, there is a disproportionate number of minority individuals being stopped at traffic stops, and that has an effect on the risk assessments generally speaking, so we need to correct for that aberration in the risk assessment.
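The kind of disparity check Jon describes, spotting a disproportionate stop rate before it feeds into sentencing data, could be sketched very roughly. This toy example invents all of its numbers and group names purely for illustration; it is not drawn from any real dataset or actual risk-assessment tool:

```python
# Toy disparity check: compare traffic-stop rates across groups.
# All populations, counts, and group names below are hypothetical.
population = {"group_a": 70_000, "group_b": 30_000}
stops = {"group_a": 3_500, "group_b": 3_000}

# Per-group stop rate: stops divided by group population.
rates = {g: stops[g] / population[g] for g in population}

# Overall stop rate across the whole population.
overall = sum(stops.values()) / sum(population.values())

# Flag any group whose stop rate exceeds 1.5x the overall rate,
# a (made-up) threshold suggesting the data may need correction.
flagged = [g for g, r in rates.items() if r > 1.5 * overall]

print(rates)    # {'group_a': 0.05, 'group_b': 0.1}
print(flagged)  # ['group_b']
```

The point is only that once the inputs are systematized, a check like this becomes possible at all, which is exactly the transparency argument above; the threshold and the right statistical test would be real policy and methodology questions.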
I do think that scientifically and mathematically, and from a software design perspective, there are some interesting good things that could come out of this. Like you, I’m fascinated to see where this goes, because as we all know, software can be full of bugs, and systems can be full of problems. I don’t know if we’re just creating more problems by introducing this risk assessment, but I’m anxious to find out.
I’ll tell you, the one opportunity we can take is to shift away from the analog model of permanence to the more digital model of fluidity. Going back to the sex offender example: once you’re registered as a sex offender, you’re always registered as a sex offender; for the rest of your life, that’s your thing. In this more general criminal model, if my score for committing a future violent crime is 82 at a certain point in time, then if it’s a good model, a valid model, that will change. If I get out of prison and don’t commit another crime over time, that should significantly reduce my score. As my score goes down, so should the way society views me, and so should the way I’m allowed to act and participate within that system.
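The fluid-score idea could be sketched as a toy model in which a score decays with each crime-free year after release. Everything here is hypothetical: the class, the baseline of 82 (borrowed from the example in the discussion), and the decay rate are invented for illustration and do not reflect how any real risk-assessment instrument works:

```python
from dataclasses import dataclass


@dataclass
class RiskScore:
    """Toy fluid risk score: starts at a baseline on a 0-100 scale
    and decays linearly for each crime-free year after release."""
    baseline: float        # initial score at release
    decay_per_year: float  # points subtracted per crime-free year

    def score_after(self, crime_free_years: int) -> float:
        # The score never drops below zero.
        return max(0.0, self.baseline - self.decay_per_year * crime_free_years)


# The "82" from the discussion, with a made-up decay of 6 points
# per crime-free year after release.
r = RiskScore(baseline=82.0, decay_per_year=6.0)
print(r.score_after(0))   # 82.0
print(r.score_after(5))   # 52.0
print(r.score_after(20))  # 0.0
```

A real model would presumably be re-estimated from data rather than decayed on a fixed schedule, but the sketch captures the contrast being drawn: a score that is recomputed over time versus a permanent, analog brand.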
One of the neat things is that a digital system, a big data approach, can enable that sort of consideration. The high risk individuals might be limited, but they can be limited knowing that those limits can go away over time. Just changing our thinking around how all of this stuff is done I think will help to lubricate the best possible solutions.
Listeners, remember that while you’re listening to the show you can follow along with the things that we’re mentioning here in real time. Head over to thedigitalife.com, that’s just one “L” in the digitalife, and go to the page for this episode. We’ve included links to pretty much everything mentioned by everybody, so it’s a rich information resource to take advantage of while you’re listening, or afterward if you’re trying to remember something that you liked. If you want to follow us outside of the show, you can find me on Twitter @jonfollett, that’s J-O-N-F-O-L-L-E-T-T, and of course, the whole show is brought to you by Involution Studios, which you can check out at goinvo.com, that’s G-O-I-N-V-O dot com. Dirk?
You can follow me @dknemeyer, that’s at D-K-N-E-M-E-Y-E-R, or e-mail me, firstname.lastname@example.org.
That’s it for Episode 116 of the Digital Life. For Dirk Knemeyer, I’m Jon Follett, and we’ll see you next time.