Science fiction has always had that annoying way of predicting this sort of thing.
To prove my point, I need go no further than directing you to a post I read this morning about computer algorithms being used to predict which parolees are most likely to re-offend once they are out of prison, and thus to decide which parolees receive extra supervision.
Don’t get me wrong, I’m not an advocate of people incarcerated for violent crimes being treated with delicacy. Nor, though, am I an advocate for anyone being stripped of their basic human dignity, and that includes those behind bars. They are still human, and I find it tragic that our system doesn’t treat them as such.
I think, in fact, that that’s the core of my problem with this: the removal of the human factor. I love technology. I love what it does for us. I’ve said before, though, that there’s a line where it stops doing things for us, and starts to make us do things for it. If the tool assumes the role of the person using the tool, where does this leave the person?
I don’t think that human behavior can be reduced and quantified into mathematical formulae. We are way, way too complex for that. Our unpredictability, in fact, is part of what makes us endearingly human.
And what really concerns me about the topic at hand is that the software’s predictions are being used in place of the human parole officer’s instincts. Parole officers have been doing this work for a long time; they develop good instincts, the same as any of us do in our respective fields. Those instincts, I would argue, are far more valuable than a computer’s prediction.
To say nothing of the fact that I would rather our country’s parolees not have their re-integration into society supervised by computer software in lieu of a person.
Talk about recidivism…