Over the past seven years, employees at the Allegheny County Department of Human Services have frequently used an AI-based risk prediction program to help assess children's risk of entering foster care in the Pittsburgh area. In recent months, however, the algorithms underlying the Allegheny Family Screening Tool (AFST) have come under increased scrutiny for their opaque design, amid longstanding concerns about the racial, class, and gender biases of predictive AI tools.
Previous reporting by the Associated Press on the Allegheny Family Screening Tool's algorithm found that certain data points could serve as proxies for racial groups. Now it appears the AFST could also affect families in the disability community, as well as families with a history of mental illness. And the Justice Department has taken note.
[Related: The White House’s new ‘AI Bill of Rights’ plans to tackle racist and biased algorithms.]
According to a new report released today by the Associated Press, several formal complaints regarding the AFST have been filed with the Justice Department's Civil Rights Division, citing the AP's previous research into its potential problems. Anonymous sources within the Justice Department say officials are concerned that the AFST's over-reliance on potentially skewed historical data risks "automating past inequalities," particularly long-standing prejudices against people with disabilities and mental health problems.
The AP explains that the Allegheny Family Screening Tool uses a "groundbreaking" AI program that its makers say is designed to help overworked social workers in the Pittsburgh area identify which families warrant further investigation into child neglect claims. More specifically, the tool was developed to help predict the risk of a child being placed in foster care within two years of a family background assessment.
The AFST's black-box design reportedly weighs numerous case factors, including "personal information and birth, Medicaid, substance abuse, mental health, prison and probation records, and other government records," when determining whether families merit further neglect investigations. Although human social workers ultimately decide whether to follow up on cases based on the AFST algorithm's results, critics argue that the program's potentially erroneous judgments could sway staff decisions.
[Related: The racist history behind using biology in criminology.]
A spokesperson for the Allegheny County Department of Human Services told the AP that the agency was not aware of any complaints from the Justice Department, and declined to discuss the broader criticism of the screening tool.
Child protection systems have long faced criticism over both their overall effectiveness and their disproportionate impact on Black, disabled, poor, and otherwise marginalized families. The AFST's official website lists numerous third-party studies, reports, and articles that it says attest to the program's reliability and usefulness.