There are no naked pre-cogs inside glowing jacuzzis yet, but the Florida Department of Juvenile Justice will use analysis software to predict crime by young delinquents, placing potential offenders in specific prevention and education programs. Goodbye, human rights!
The software will assess juvenile delinquents against a series of variables to estimate how likely each one is to commit another crime. Depending on that probability, they will be placed in specific re-education programs. Deepak Advani, vice president of predictive analytics at IBM, says the system gives "reliable projections" so governments can take "action in real time" to "prevent criminal activities".
Really? "Reliable projections"? "Action in real time"? "Preventing criminal activities"? I don't know how reliable your system is, IBM, but have you ever heard of the 5th, 6th and 14th Amendments to the United States Constitution? What about Article 11 of the Universal Declaration of Human Rights? No? Let's make this easy then: didn't you watch that Scientology nutcase in Minority Report?
Sure. Some will argue that these juvenile delinquents were already convicted of other crimes, so hey, there's no harm. This software will help prevent further crimes. It will make all of us safer. But will it? Where's the guarantee of that? Why does the state have to assume that criminal behaviour is a given? And why should the government decide who goes into a specific prevention program and who doesn't based on what a computer says? The fact is that, even if the software were 99.99 per cent accurate, there will always be an innocent person who gets fucked. And that is exactly why we have something called due process and the presumption of innocence. That's why those things are not only in the United States Constitution, but in the Universal Declaration of Human Rights too.
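To make that arithmetic concrete, here's a toy sketch with entirely made-up numbers (the population size and accuracy figure are hypothetical, not from IBM): even a system that is wrong only 0.01 per cent of the time still flags innocent people once you run it at scale.

```python
# Toy illustration with hypothetical numbers: a 99.99%-accurate
# screening system applied to a large population.
population = 1_000_000        # hypothetical number of people screened
accuracy = 0.9999             # the "99.99 per cent accurate" claim
error_rate = 1 - accuracy     # 0.01% of decisions are wrong

# Number of people the system gets wrong, in expectation.
wrongly_flagged = population * error_rate
print(int(round(wrongly_flagged)))  # → 100
```

One hundred people, misjudged by a machine, for every million screened. That is what due process exists to catch.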
Other people will say that government officials already make these decisions based on reports and their own judgment. True. And it seems that a computer program may be fairer than a human, right? Maybe. But in the end, the interpretation of the data is always in the hands of humans (and the program itself is written by humans).
But what really worries me is that this is a first big step towards something larger and darker. Actually, it's the second: IBM says that the Ministry of Justice in the United Kingdom, which has an impeccable record on not pre-judging its citizens, already uses this system to prevent criminal activities. Actually, it may be the third big step, because there's already software in place to blacklist people as potential terrorists, though it's probably not as sophisticated as this one.
IBM clearly wants this to go big. It has spent a whopping $US12 billion beefing up its analytics division. Again, here's the full quote from Deepak Advani:
Predictive analytics gives government organizations worldwide a highly-sophisticated and intelligent source to create safer communities by identifying, predicting, responding to and preventing criminal activities. It gives the criminal justice system the ability to draw upon the wealth of data available to detect patterns, make reliable projections and then take the appropriate action in real time to combat crime and protect citizens.
If that sounds scary to you, that's because it is. First it's the convicted-but-potentially-recidivist criminals. Then it's the potential terrorists. Then it's every one of us, in a big database, getting flagged because some combination of factors, such as travel patterns, credit card activity, relationships, messaging and social activity, indicates that we may be thinking about doing something that someone doesn't want us to do.
You know, the last time IBM got into the people-database business for a similar kind of control was in the 1930s. They shipped a lot of cataloguing machines to a certain government in Europe. It didn't end well for more than 11 million people. Yes, this comparison is a gross exaggeration, but one thing is clear no matter how you look at it: cataloguing people, any people, based on statistical predictive software, and then taking pre-emptive actions against them based on the results, is the wrong way to improve our society. [Yahoo!]