Diane Vaughan

The Challenger Launch Decision

The Challenger Launch Decision Summary

This one-page guide includes a plot summary and brief analysis of The Challenger Launch Decision by Diane Vaughan.

In her sociological work The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA (1996), Diane Vaughan, a professor of sociology at Columbia University, seeks to explain the Challenger disaster by analyzing the corporate culture at NASA. Vaughan’s analysis is regarded as a significant contribution to the sociology of complex organizations, in particular for its development of the concept of the “normalization of deviance”: a process by which the very rules of an organization generate a culture in which deviance from those rules becomes acceptable and even expected. The Challenger Launch Decision was the inaugural winner of the Rachel Carson Prize and received nominations for the National Book Award and the Pulitzer Prize.

Vaughan introduces her book by explaining how she came to examine the Challenger disaster. Her background is in the study of corporate misconduct, and based on the early press coverage of the Rogers Commission (which investigated the disaster), she assumed that corporate misconduct had played a role in the decision to launch Challenger despite the presence of known safety flaws. The Commission’s report confirmed prevailing ideas about the causes of the disaster—that managers, driven by financial and political concerns, overruled the warnings of engineers.

However, as she investigated, Vaughan found that the real picture was more complex and more interesting. In reality, both managers and engineers at NASA and at Morton Thiokol—the company which produced the crucially malfunctioning component—were acting within a well-established corporate culture, in which unacceptable risks had become acceptable. “I discovered that what I thought were rule violations were actions completely in accordance with NASA rules!”

Forced to abandon the hypothesis that rule-breaking caused the Challenger disaster, she develops the alternative hypothesis that “controversial decisions were not calculated deviance and wrongdoing, but normative to NASA insiders.”

Vaughan describes her analytical approach as “an archaeological revisit” or a “historical ethnography.” As well as the Rogers Commission report and the 160 interviews conducted by the Commission, Vaughan’s book is based on many interviews of her own and 122,000 pages of NASA records.

On 27 January 1986, more than 30 people in three different locations assembled for a telephone conference. The forecast predicted that the temperature in Florida would drop below freezing overnight, and a decision needed to be taken: Should the launch of the space shuttle Challenger take place, as planned, at 09:38 the next day?

Engineers at NASA and at contractor Morton Thiokol were concerned about the effect of low temperatures on the rubber O-rings in Challenger’s solid rocket boosters. These rings needed to move quickly to seal the joints between the boosters’ segments, and the engineers were concerned that if the rings stiffened in the cold, they would not react quickly enough. Hot gases could then escape and ignite the shuttle’s huge external fuel tank.

Due to these concerns, engineers at Morton Thiokol issued an unprecedented “no launch” recommendation. Engineers and managers at NASA pushed back, asking for an explanation. Morton Thiokol faxed back the same data it had previously offered in support of its “launch” recommendations. This data showed that launching at low temperatures carried a small but significant risk; however, Morton Thiokol had characterized this data as an anomaly, to be expected with a new technology. NASA, in turn, had studied the action of the boosters under extreme conditions, and “evidence initially interpreted as a deviation from expected performance was reinterpreted as within the bounds of acceptable risk.”

In short, both teams had identified the problem, but as they “recurrently observed the problem with no consequence they got to the point that flying with the flaw was normal and acceptable.”

Pushed by the NASA team, the Morton Thiokol team asked for some time to think. Managers at Morton Thiokol, concerned about securing future NASA and other government contracts, decided to reverse their recommendation. One member of the team was told to “take off your engineering hat and put on your management one.”

The Challenger launched the next day as scheduled. At an altitude of ten miles, the shuttle exploded, killing the seven astronauts on board.

Vaughan develops three conceptual tools to analyze this chain of events. The first is the “Normalization of Deviance.” This is a process whereby deviations from safety rules are rationalized; these rationalizations become part of the culture of the company or working group, and soon they are all but invisible. Vaughan suggests that this rationalization occurs through several processes. Commonly, rules are deemed stupid or cumbersome and set aside. In other cases, those whose job it is to enforce the rules within the company culture are too afraid to do so.

In many cases, deviance is normalized as a result of financial or other pressures, and Vaughan argues that this was the case at NASA, leading to the development of her second conceptual tool, the “Culture of Production.” At NASA, Vaughan argues, the tight launch schedule, together with political and budgetary pressures, drove both managers and engineers to compromise on safety: “Within the culture of production, cost/schedule/safety compromises were normal and non-deviant for managers and engineers alike.”

Vaughan’s third tool is “Structural Secrecy.” She argues that NASA’s decision-making processes were hindered by obstructions in the flow of information between work groups, and particularly up the organizational hierarchy. These obstructions made it harder for decision-makers to recognize the dangers of the O-ring flaw, even though to outsiders with the benefit of hindsight these dangers were tragically obvious.