Using AI to uncover “risky” behaviors among parolees is problematic on many levels. Nevertheless, researchers will soon embark on an ill-conceived effort to do so at Tippecanoe County Community Corrections in Indiana. Funded by a grant from the Justice Department, and in partnership with the Tippecanoe County Sheriff’s Department, Florida State University, and the University of Alabama in Huntsville, researchers at the Purdue Polytechnic Institute plan to spend the next four years collecting data from the bracelets of released prisoners. The team aims to algorithmically identify “stressful situations and other behavioral and physiological factors correlated with those individuals at risk of returning to their criminal behavior.”
The researchers say their goal is to identify opportunities for intervention that help parolees rejoin society at large. But the study fails to acknowledge machine learning’s history of biased decision-making, exemplified by the systems already used in the justice system to predict recidivism.
A 2016 ProPublica analysis, for instance, found that Northpointe’s COMPAS algorithm was twice as likely to misclassify Black defendants as presenting a high risk of violent recidivism as it was to misclassify white defendants. And in the nonprofit Partnership on AI’s first-ever research report, published last April, the coauthors deemed the AI tools now in use unfit to automate the pretrial bail process, to label some people high risk, or to declare others low risk and fit for release from prison.
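For context on what “misclassify” means here: ProPublica’s finding concerned false positive rates, the share of people who did not go on to reoffend but were labeled high risk anyway. A minimal sketch of that calculation, using made-up numbers rather than ProPublica’s data, might look like this:

```python
# Sketch of the disparity ProPublica measured: the false positive rate
# (people labeled high risk who did NOT reoffend), computed per group.
# All numbers below are hypothetical, for illustration only.

def false_positive_rate(predicted_high_risk, reoffended):
    """Share of non-reoffenders who were incorrectly labeled high risk."""
    flags = [p for p, r in zip(predicted_high_risk, reoffended) if r == 0]
    return sum(flags) / len(flags)

# Hypothetical labels per person: 1 = flagged high risk / did reoffend.
predictions = {"group_a": [1, 1, 0, 1, 0, 0, 1, 0],
               "group_b": [0, 1, 0, 0, 1, 0, 0, 0]}
outcomes    = {"group_a": [1, 0, 0, 1, 0, 0, 0, 0],
               "group_b": [0, 1, 0, 0, 0, 0, 0, 0]}

for group in predictions:
    fpr = false_positive_rate(predictions[group], outcomes[group])
    print(f"{group}: false positive rate = {fpr:.0%}")  # a: 33%, b: 14%
```

The point of comparing error rates group by group is that a model can post respectable accuracy overall while concentrating its mistakes on one population.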
According to Purdue University press materials, the researchers’ pilot program will recruit 250 parolees as they are released, half of whom will serve as a control group. (All will be volunteers who consent to participate and whose family members will be notified at sign-up time, but it’s not unreasonable to assume some subjects might feel pressured to enroll.) At intervals, parolees’ bracelets will collect real-time information like stress biomarkers and heart rate, while the parolees’ smartphones will record a swath of personal data, ranging from locations to the photos parolees take. The combined data will be fed into an AI system that makes individual behavioral predictions over time.
The monitoring infrastructure is still being developed and isn’t expected to be deployed until the third year of the study. But the researchers are already sketching out potential uses for the system, such as recommending communities, life skills, coping mechanisms, and jobs to parolees.
“Our goal is to utilize and develop AI to better understand the data collected from the given devices to help the participants in various ways of their life,” Umit Karabiyik, a Purdue assistant professor and a lead researcher on the study, told VentureBeat via email. “The AI system will not report any conclusions from the participants’ actions to Tippecanoe County Community Corrections … Data collection will be anonymized from our research perspective. We (as researchers) will not have access to personally identifiable information from the participants. Participants will be given a random ID by our partnering agency, and we will only know that ID, not the individuals in person. As for the surveillance aspect of this work, our goal is not policing the participants for any of their actions.”
The research is seemingly well-intentioned: the coauthors cite a Justice Department study that found more than 80% of people in state prisons were arrested at least once in the nine years following their release, with almost half of those arrests coming in the first year. But experts like Liz O’Sullivan, cofounder and technology director of the Surveillance Technology Oversight Project, say the study is misguided.
“AI has some potential to contribute to reducing recidivism, if done correctly. But strapping universal surveillance devices to people as though they were animals in the wild is not the way to go about it,” O’Sullivan told VentureBeat via email. “There’s little evidence that AI can infer emotional state from biometrics. And even more, unless the end goal is to equip all future parolees with universal tracking devices, I’m not convinced that this study will inform much outside of how invasive, complete surveillance impacts a willingness to commit crime.”
Other ill-fated experiments to predict outcomes like GPA, grit, eviction, job training, layoffs, and material hardship reveal the prejudicial nature of AI algorithms. Even within large data sets, historical biases are compounded. A recent study that attempted to use AI to predict which college students might fail physics classes found that its accuracy tended to be lower for women. Many fear such bias could reinforce societal inequities, for instance by funneling disadvantaged or underrepresented people into lower-paying career paths.
University of Washington AI researcher Os Keyes takes issue with the study’s premise, noting that the reasons for high recidivism are already well understood. “When low-income housing prohibits parolees, even parolees as guests or housemates, when there’s a longstanding series of legal and practical forms of discrimination against parolees for employment, when there is social stigma against people with criminal convictions, and when you have to go in once a week to get checked and tagged like a chunk of meat — you’re not welcome.”
Keyes argues this sort of monitoring reinforces “dangerous ideas” by presuming a lack of bodily autonomy and self-control, and by treating recidivism as an individual, internal failing rather than a structural one. Moreover, it is premised on paternalism and renders parolees’ status even more precarious, he says.
“Lord knows what this is going to be used for if it ‘works,’ but I doubt it’s good: ‘The computer said you’re stressed so back in jail you go in case you commit a crime again,’” Keyes said. “Imagine if the researchers spoke to their participants, asked them to journal at their own pace and comfort level, as many studies do. Or if the researchers spoke to existing prison organizations, who would tell them quite rightly that the issue is structural. But no. They don’t appear to have considered prison abolitionism, structural discrimination, or actual liberation.”
Author: Kyle Wiggers.
Source: VentureBeat