AI & Robotics News

Oregon is shutting down its controversial child welfare AI in June

In 2018, Oregon’s Department of Human Services implemented its Safety at Screening Tool, an algorithm that generates a “risk score” for abuse hotline workers, recommending whether a social worker needs to further investigate the contents of a call. This AI was based on the lauded Allegheny Family Screening Tool, designed to predict the risk of a child ending up in foster care based on a number of socioeconomic factors. 

But after the Allegheny tool was found to flag a disproportionate number of Black children for "mandatory" neglect investigations, a finding detailed in a subsequent AP investigative report, Oregon officials now plan to shutter their derivative AI by the end of June in favor of an entirely new, and specifically less automated, review system.

The department’s own analysis predicts that the decision will help reduce some of the racial disparities endemic to Oregon’s child welfare system. “We are committed to continuous quality improvement and equity,” Lacey Andresen, the agency’s deputy director, said in a May 19 email to staff obtained by the AP.

A number of states across the country have already implemented, or are considering, similar algorithms in their child welfare agencies. But as with Northpointe’s COMPAS before them, their implementation has raised concerns about the transparency and reliability of the process, as well as a clear tendency toward racial bias. The Allegheny developers did note, however, that their tool was just that: a tool, never intended to operate on its own without direct human oversight.

“Making decisions about what should happen to children and families is far too important a task to give untested algorithms,” Senator Ron Wyden (D-OR) said in a statement. “I’m glad the Oregon Department of Human Services is taking the concerns I raised about racial bias seriously and is pausing the use of its screening tool.”

In its place, the Oregon DHS will implement a Structured Decision Making model already used by California, Texas and New Jersey. Oregon’s other child welfare AI, which generates a score indicating whether a foster child should be reunited with their family, remains on hiatus.


Author: A. Tarantola
Source: Engadget

