
Virginia Eubanks

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor

Nonfiction | Book | Adult | Published in 2018


Index of Terms

AFST (Allegheny Family Screening Tool)

Launched in August 2016, the AFST is a predictive risk model designed by New Zealand researchers and implemented through independent corporate strategists in Allegheny County, Pennsylvania. It replaced a system that relied on human decision-making with one that rigidly adheres to algorithmic rules and sidelines social work professionals. The AFST uses data graded along a risk/severity continuum to determine whether a family will be investigated for child abuse by a county social worker. Since its implementation, the AFST has proved inaccurate at best and biased at worst. Its algorithm equates symptoms of poverty with those of child abuse: “A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance” (156). This framework builds bias directly into the model’s own dataset.
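As a rough illustration of the mechanism this entry describes (and not the actual AFST model, whose variables and weights are not reproduced here), the sketch below uses invented feature names and weights to show how a risk score that counts means-tested-program usage among its predictive variables rises on poverty alone:

```python
# Toy illustration only: the feature names and weights below are invented,
# not the actual AFST model. It shows how a risk score that treats
# means-tested-program usage as predictive variables rises with poverty alone.

TOY_WEIGHTS = {
    "prior_referrals": 0.9,            # history with the child welfare system
    "receives_tanf": 0.5,              # poverty proxy: cash assistance
    "receives_snap": 0.4,              # poverty proxy: food assistance
    "receives_ssi": 0.4,               # poverty proxy: disability income
    "county_medical_assistance": 0.3,  # poverty proxy: public health coverage
}

def toy_risk_score(family):
    """Sum weighted 0/1 indicators into a single screening score."""
    return sum(weight * family.get(feature, 0)
               for feature, weight in TOY_WEIGHTS.items())

# A family with no child-welfare history but heavy reliance on public
# benefits still accumulates a substantial score from poverty proxies alone.
poor_family = {"receives_tanf": 1, "receives_snap": 1,
               "receives_ssi": 1, "county_medical_assistance": 1}
print(toy_risk_score(poor_family))  # 1.6, driven entirely by poverty measures
```

In this toy version, the score climbs without any non-poverty variable firing at all, which is the bias the quoted passage identifies.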

Coordinated Entry System (CES)

Los Angeles’s automated welfare system collects, stores, shares, catalogs, classifies, and numerically ranks information about the unhoused. Designed to triage those in immediate need of housing, the CES should be “a standardized intake process to reduce waste, redundancy, and double-dipping across agencies” (85), but in practice it often falls short and is difficult to navigate. Moreover, the CES poses a threat to the very people it claims to help because it so deeply invades individuals’ privacy.
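As a minimal sketch of the numerical ranking this entry describes (the fields and scoring rule are hypothetical, not the actual CES intake instrument), a triage system of this kind collapses each intake record into one number and sorts so the highest-need cases surface first:

```python
# Hypothetical sketch of numerical triage ranking; the fields and scoring
# rule are invented for illustration, not the actual CES instrument.
from dataclasses import dataclass

@dataclass
class IntakeRecord:
    name: str
    months_unhoused: int
    chronic_illness: bool

def vulnerability_score(record):
    """Collapse an intake record into one number for ranking."""
    return record.months_unhoused + (10 if record.chronic_illness else 0)

def triage(records):
    """Order records so the highest-scoring (highest-need) come first."""
    return sorted(records, key=vulnerability_score, reverse=True)

queue = triage([
    IntakeRecord("A", months_unhoused=3, chronic_illness=False),
    IntakeRecord("B", months_unhoused=18, chronic_illness=True),
])
print([r.name for r in queue])  # ['B', 'A']
```

Even this tiny schema must collect and store sensitive personal details to produce a single ranking number, which is the privacy concern the entry raises.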