Being Wrong Summary

Kathryn Schulz


SuperSummary, a modern alternative to SparkNotes and CliffsNotes, offers high-quality study guides that feature detailed chapter summaries and analysis of major themes, characters, quotes, and essay topics. This one-page guide includes a plot summary and brief analysis of Being Wrong by Kathryn Schulz.

Kathryn Schulz’s 2010 book Being Wrong: Adventures in the Margin of Error is a meditation on what it means to err and why we, as humans, tend to implicitly assume that we are correct about almost everything. The author asserts that error is a fundamental part of the human condition. Taking the reader on a journey through the history and psychology of mistakes, Schulz offers a new perspective on how we perceive erring, both on a grand scale and in everyday life.

Schulz opens the book with a general discussion of what it means to be wrong. Even scientific theories that once seemed firmly established have eventually been proved inaccurate. Thus, the author questions whether truth can be known at all and whether objective reality exists outside of our perceptions. We use “wrong” to refer to a deviation from external reality, or to reject as false something once thought to be true. Furthermore, while “wrong” describes both errors of fact and failures of morality, the word is also applied to matters of opinion or taste.

The author describes two models of wrongness: a pessimistic one, which paints errors as humiliating, and an optimistic one, which allows for surprise and even delight. Schulz also makes the intriguing assertion that insanity is itself a kind of wrongness: if madness is a state of radical wrongness, then wrongness is a state of minor madness.

Schulz then introduces the notion of perception: we first experience sensation and then interpretation, and it is interpretation that can introduce error. Optical illusions, for example, deceive us even when we understand them; knowing the secret behind the way they operate does not prevent the illusion from working. What differentiates illusions from other errors is our consent to be deceived.

Governments and religions, says the author, have long exploited such perceptual distortions. For example, David Brewster’s 1833 treatise Letters on Natural Magic explains how concave mirrors can be used to project human figures onto smoke. Another example the author mentions is Napoleon III sending the magician Jean Eugène Robert-Houdin to Algeria to perform magic tricks, as a way of convincing the Algerians that the French were superior to the region’s Islamic holy men.

The author then focuses on the unreliable nature of human memory. In one example, Ulric Neisser, a psychologist who researched memory and perception, recalled listening to a baseball game on the radio when an announcer interrupted the broadcast with a bulletin on the Pearl Harbor bombing. Forty years later, however, he realized the memory could not be accurate: professional baseball is not played in December.

In another example, Neisser surveyed students about the Space Shuttle Challenger disaster of 1986 the day after it occurred, then surveyed them again three years later. Fewer than 7 percent of the later reports were consistent with the initial ones; 50 percent of the students were incorrect about two-thirds of their claims, and 25 percent were incorrect about every major detail of the event.

Researchers have even conducted false memory studies, in which they have been able to convince subjects that they experienced an event as a child that they actually did not, such as becoming lost in a store or going on a hot air balloon ride. In fact, one in four subjects, on average, will accept a false memory. Thus, research suggests that memory is not a singular function, but an amalgamation of several processes; memories are reassembled by the brain each time we recall them.

Next, Schulz asserts that humans are programmed to draw conclusions from meager evidence, and that this is actually an advantage: it is inductive reasoning, our ability to determine what is probable, that makes our brains so powerful. When we modify a conclusion in light of additional evidence, we reason inductively with success; when we instead cling to a conclusion and reject information that contradicts our beliefs, we fall prey to confirmation bias.

While the notion of doing something simply because everyone else does might be considered dangerous, fully independent thought is not possible: whether we read a newspaper, research a topic on the internet, or listen to our parents, we are receiving second-hand information.

The author also discusses the influence of communities, which are often insulated. They expose us to a disproportionate amount of information that supports our already well-established positions, guard us against disagreement from outsiders, and suppress dissent from within the community.

Schulz then touches on certainty versus doubt. When we feel certain, our knowledge and our picture of the world feel complete; doubt, by contrast, feels uncomfortable. Whereas certainty provides us with the reassurance of answers, doubt confronts us with questions. Error challenges not only what we know but also who we are.

Such mistakes are also bound up with emotion: being wrong can make us feel, for example, foolish. Freud claimed that the denial of contrary evidence is a defense mechanism unconsciously employed to protect oneself from distress and anxiety. To be in denial, however, is to deceive oneself.

When considering person-to-person connections, says the author, we can never truly know others from the outside, though we assume we can; conversely, we feel we can only be known from the inside. We see others as transparent while feeling that our own inner selves are submerged. Even our self-knowledge is limited: the author claims that our selves are neither static nor reliably correct about themselves. To accept that we are wrong, we must also accept the confounding notion that a gap exists between that which is represented in our mind and that which is doing the representing, which is also our mind.

The author closes by returning to the optimistic model of wrongness, in which error and change are natural, ongoing processes and error serves as a mechanism for learning. It takes courage, says Schulz, to accept and leave behind our mistaken past selves. To counter our predisposition to deny our errors, we must acknowledge their likelihood and be open about our mistakes.