
Ruha Benjamin

Race After Technology: Abolitionist Tools for the New Jim Code

Nonfiction | Book | Adult | Published in 2019


Chapter Summaries & Analyses

Chapter 2 Summary: “Default Discrimination: Is the Glitch Systemic?”

When programmers design databases, they project their own worldviews. What we often call glitches can function to exclude certain demographics, as with the Google Maps voice that reads the "X" in "Malcolm X Boulevard" as a Roman numeral ten. While glitches seem like fleeting errors, they reveal the inner workings of our social biases.

Predictive policing software aims to "predict" where criminal activity will occur and who is likely to reoffend, but it has been shown to make false predictions and to overrepresent Black criminality. Crime-prediction algorithms lead law enforcement to over-criminalize certain neighborhoods. In The Matrix, the Oracle predicts that Neo will knock over a vase, which catches him off guard and causes him to knock it over. Just as her prediction is self-fulfilling, predictive algorithms create the very conditions in which police find crime. Glitches are not lapses in an otherwise benign system but signs of a flawed process.

"Defensive" architecture, such as armrests on public benches that discourage lying down, abounds in stratified societies. Structures can be engineered to reinforce hierarchies. We see this with Robert Moses's overpasses, which are rumored to have been built deliberately too low for buses to pass under, limiting the mobility of the Black working class. When a school bus carrying affluent white children crashed at one such overpass, it demonstrated how discriminatory design can ultimately affect everyone.