
This article was published on August 14, 2020

Horrific AI surveillance experiment uses convicted felons as human guinea pigs



When a convicted felon in the US completes their court-mandated sentence, they’re considered a free citizen with limited rights. Felons’ voting rights vary by state, and only two states never revoke them; felons cannot legally own a firearm, and they may not serve on a jury. Otherwise, the articles of the US Constitution and the Bill of Rights apply to those who’ve done their time the same as to those who’ve never been criminally convicted.

One of the biggest impediments to felons reintegrating into society is recidivism: according to a 2018 update from the US Department of Justice, more than 80% of state prisoners are arrested for a crime within nine years of their release.

A pair of computer scientists from Purdue University, Marcus Rogers and Umit Karabiyik, recently concocted a study to address this problem.

According to a Purdue press release titled “Artificial intelligence examines best ways to keep parolees from recommitting crimes,” the duo intends to outfit 125 released felons with wearable devices that will collect myriad data:

Artificial intelligence will give researchers a window into the parolee’s daily life through bracelets that collect health information, including stress and heart rate.

Smart phones carried by each person also will collect information, ranging from where they are at any given time to the photos they may take.


The study will last four years, with the devices worn during the third year, and an additional 125 parolees will serve as an unmonitored control group.

Quick take: There’s a lot going on here. Let’s do a longer take, starting with what VentureBeat’s Kyle Wiggers wrote about the study earlier today:

University of Washington AI researcher Os Keyes takes issue with the study’s premise, noting that the reasons for high recidivism are already well-understood. “When low-income housing prohibits parolees, even parolees as guests or housemates, when there’s a longstanding series of legal and practical forms of discrimination against parolees for employment, when there is social stigma against people with criminal convictions, and when you have to go in once a week to get checked and tagged like a chunk of meat — you’re not welcome.”

Keyes argues this sort of monitoring reinforces “dangerous ideas” by presuming a lack of bodily autonomy and self-control and overlooking the individualized and internal nature of recidivism. Moreover, it is premised on paternalism, rendering convicts’ parole status even more precarious, he says.

There’s almost no conceivable impetus for designing this AI system. Recidivism isn’t a murky area of study, so there’s no clear line of reasoning that leads to building a surveillance system to monitor parolees. When we talk about an 80% arrest rate in the first nine years after release, criminal behavior experts can point to numerous contributing factors, but nobody is confused about why felons commit crimes.

Parolees are among the most vulnerable members of society. Unlike those who’ve served their entire sentence, they’re still subject to monitoring by the US government and can have their freedom revoked for non-criminal violations of their parole conditions. Many parolees, for example, are required to maintain housing and employment during their parole period.

The prospect of participating in a government study from a prestigious university could be seen as a positive step towards eventual freedom from the justice system. This incentive, when offered to vulnerable members of society who might not understand the value of their privacy and data, makes this study appear predatory.

The program is voluntary, and every participant will be fully aware that the US government will have their biometric, health, location, and smartphone data at all times (it’s unclear whether the researchers will have only file access or camera and microphone access as well). There are no two ways about it: the data gleaned from this experiment will be utterly worthless for anything related to recidivism. Even physics tells us that an object under direct observation behaves differently from one unobserved.

The researchers appear to claim that the system is designed to identify warning signs of recidivism so that, in the future, early interventions can be developed. This is horrifying. Either the researchers are claiming they’re collecting myriad data on people just so they can attempt to answer the basic question of “what makes criminals do crimes,” or they’re claiming that this system will help develop targeted interventions.

What the hell would that look like? An Android notification? Little shocks through a wearable? Installing an off switch inside their brains? Having a therapist call the parolee whenever their blood pressure goes up? Auto-dialing 911 whenever they get mad?

No matter how obscure the perceived benefits are, the dangers are crystal clear. That level of surveillance is far beyond the pale. The very fact that this study exists will normalize the use of surveillance as a government and academic tool.

There was a time when people were told that lobotomies and shock therapy would cure them of homosexuality and other “mental illnesses,” and many voluntarily underwent the treatments in hopes of finding relief. Using vulnerable members of society to test individualized surveillance systems may not have the visceral physical implications of invasive brain surgery, but it’s just as predatory at the social scale.

The Constitution and Bill of Rights are clear on US citizens’ right to privacy. Asking a parolee to trade their privacy away even further for the purpose of government information gathering is an unconscionable act. This is reminiscent of Google exploiting homeless Black people so it could develop better facial recognition algorithms.

It’s difficult to imagine any good that could come out of this study, but it’s easy to see the harm: the normalization of surveillance as a solution to societal problems, and the further subjugation of one of the country’s most vulnerable populations (parolees), is a direct threat to democracy and to the inalienable human rights of free citizens.

It’s impossible for me to imagine a scenario where 125 felons are capable of giving informed consent as to exactly how much of their privacy they’re giving up. Worse, I highly doubt the general public, including college graduates, has more than a very basic understanding of the sheer amount of data the Purdue researchers will be able to gather and what they and the US government could do with it.

