The Information

James Gleick

Nonfiction | Book | Adult | Published in 2011

Plot Summary

James Gleick traces the concept of information from its earliest manifestations in human culture to the digital age, arguing that information has always been central to civilization but was not recognized as a distinct, measurable quantity until the twentieth century. The book moves roughly chronologically, weaving together the stories of scientists, inventors, and thinkers who shaped humanity's relationship with information.


Gleick opens by identifying 1948 as the pivotal year. At Bell Telephone Laboratories in New York, Claude Shannon, a thirty-two-year-old mathematician and engineer, published "A Mathematical Theory of Communication," a paper that introduced the bit as a fundamental unit of information. Shannon had worked in near isolation, drawing on wartime cryptography, George Boole's algebra of logic, and eclectic interests in codes and puzzles. His theory stripped information of its everyday meaning, defining it instead as a measure of uncertainty, surprise, and choice. Gleick argues that this act of abstraction, comparable to Isaac Newton's formalization of force and mass, set the stage for the information age, with consequences reaching from compact discs and computers to genetics and quantum physics.


Before turning to Shannon's work in detail, Gleick reaches back to the talking drums of sub-Saharan Africa. European explorers long failed to recognize that drums could convey detailed messages. John F. Carrington, an English missionary living in the Belgian Congo from the late 1930s, discovered that because tonal African languages like Kele encode meaning partly through pitch, the drums transmit only pitch contours, stripping away consonants and vowels. To overcome the resulting ambiguity, drummers add stereotyped phrases that supply context. Gleick identifies this technique as redundancy, a concept central to Shannon's later theory, and draws a parallel to Samuel Morse's telegraphic code, where shorter dot-and-dash sequences were assigned to more common letters. Both the drums and the telegraph faced the same fundamental challenge: mapping an entire language onto a stream of simple signals.
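
The economy Morse achieved can be sketched in a few lines of Python. This is an illustration added for this guide, not material from the book, and the letter frequencies are rough English estimates chosen only to make the point:

    # Morse assigns shorter dot-and-dash sequences to more common letters,
    # which lowers the average signal length per letter.
    morse = {"e": ".", "t": "-", "a": ".-", "o": "---", "q": "--.-", "z": "--.."}
    freq = {"e": 0.127, "t": 0.091, "a": 0.082, "o": 0.075, "q": 0.001, "z": 0.001}

    total = sum(freq.values())
    avg_len = sum(freq[c] / total * len(morse[c]) for c in morse)
    fixed_len = max(len(code) for code in morse.values())

    print(f"average Morse symbols per letter: {avg_len:.2f}")  # about 1.6
    print(f"a fixed-length code would need:   {fixed_len}")    # 4 for every letter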


Gleick then examines writing as the first great information technology. Drawing on the work of Walter J. Ong, a Jesuit philosopher and cultural historian, he argues that writing transformed human consciousness in ways difficult to recover imaginatively. The alphabet, invented only once near the eastern Mediterranean around 1500 BCE by Semitic peoples, reduced language to a small set of abstract symbols. Its spread catalyzed logic, philosophy, and science. Gleick traces a path from Homer's oral culture, where meter and formulaic redundancy served as memory aids, through the rise of literacy, to the invention of the dictionary. Robert Cawdrey, a village schoolmaster and priest, published the first English dictionary in 1604. Four centuries later, the Oxford English Dictionary encompasses over a million words in a continuously revised online edition.


The book's central narrative follows the lineage of machines built to process information. Charles Babbage, a nineteenth-century English polymath, designed the Difference Engine and the Analytical Engine, anticipating the modern computer by more than a century. The Difference Engine, funded by the British government beginning in 1823, was meant to automate the computation of mathematical tables but was never completed. The Analytical Engine, inspired by the Jacquard loom's use of punched cards to encode weaving patterns, went further: it separated storage from processing, employed conditional branching, and could in principle perform any computation. Ada Lovelace, daughter of the poet Lord Byron and Babbage's intellectual companion, wrote what is recognized as the first computer program and declared that the engine "might act upon other things besides number" (116), including composing music.
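
The Difference Engine's trick, the method of finite differences, is easy to sketch in modern code (our illustration, not Babbage's notation): because a polynomial's higher-order differences are eventually constant, every new table entry can be produced by addition alone.

    # Sketch of the method of differences the Difference Engine mechanized.
    # For f(n) = n^2 the second difference is the constant 2, so each new
    # value needs only additions -- no multiplication required.
    def squares_table(count):
        value, first_diff, second_diff = 0, 1, 2  # f(0), f(1)-f(0), constant
        table = []
        for _ in range(count):
            table.append(value)
            value += first_diff        # next value by addition
            first_diff += second_diff  # next first difference by addition
        return table

    print(squares_table(6))  # [0, 1, 4, 9, 16, 25]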


Gleick traces the electric telegraph as the technology that first made information visible as a commodity. Claude Chappe's optical semaphore system in revolutionary France served as a precursor. The shift to electrical telegraphy in the 1840s transformed perceptions of time, space, and knowledge: weather reports, standardized time zones, and a new sense of simultaneity all followed. Gleick connects this history to the emerging science of information, noting that Boole's 1854 symbolic logic, which proposed an algebra using only zero and one, provided the tools that would later underpin digital circuits.
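
Boole's reduction of logic to an algebra of zero and one can be shown in a brief sketch (an illustration for this guide, not an example from Gleick): conjunction behaves like multiplication, and the other operations stay within {0, 1}.

    # Boole's two-valued algebra: logic as arithmetic on 0 and 1.
    def AND(x, y): return x * y          # true only when both are 1
    def OR(x, y):  return x + y - x * y  # stays 0 or 1
    def NOT(x):    return 1 - x

    for x in (0, 1):
        for y in (0, 1):
            print(x, y, "->", AND(x, y), OR(x, y), NOT(x))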


Shannon enters the narrative as a child in Gaylord, Michigan, rigging a barbed-wire telegraph and reading Edgar Allan Poe. At MIT, he wrote a master's thesis demonstrating that Boole's algebra could describe electrical switching circuits, an insight foundational to all digital computing. During World War II, he analyzed cryptographic systems for military communications. His public theory of information appeared in 1948. Shannon defined the fundamental problem of communication as reproducing at one point a message selected at another, deliberately setting meaning aside. His entropy formula measures information as the average surprise per symbol, and a bit, the binary digit, represents the information content of a single fair coin flip. He proved that any noisy channel has a definite maximum capacity for error-free communication and estimated that English is about 50 percent redundant.
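
In standard notation (a textbook statement of Shannon's result, not a quotation from the book), the entropy of a source whose symbols occur with probabilities p_i is

    H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol}

    \text{Fair coin: } H = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}

A single fair coin flip therefore carries exactly one bit, and a language is redundant to the extent that its per-symbol entropy falls short of the maximum its alphabet allows.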


The reception of Shannon's theory intersected with Norbert Wiener's cybernetics, a framework the MIT mathematician developed from wartime work on feedback and purposeful machine behavior. The Macy Conferences on Cybernetics, held from 1946 to 1953, brought together researchers from diverse fields to debate the implications of information theory for understanding brains, behavior, and society. Alan Turing proposed his Imitation Game as a practical test for machine intelligence. Gleick argues that information theory catalyzed a cognitive revolution in psychology, giving researchers tools to study memory, attention, and pattern recognition as information-processing tasks, supplanting the behaviorist model that treated the mind as a black box.


The book's middle chapters trace the concept of entropy from thermodynamics to information theory and then to biology. Rudolf Clausius, the German physicist who coined the term in 1865, defined entropy as the unavailability of energy for work. James Clerk Maxwell's thought experiment of a demon who could sort molecules by speed exposed a deep connection between information and energy, later formalized by Hungarian physicist Leó Szilárd. Szilárd showed in 1929 that the demon's measurements incur an entropy cost that preserves the second law of thermodynamics. Shannon's entropy formula and the thermodynamic entropy formula share the same mathematical form, a correspondence Gleick presents as fundamental rather than coincidental.
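
The formal correspondence Gleick points to can be written out directly (standard textbook forms, not quotations from the book):

    H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon entropy, in bits)}

    S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy, thermodynamics)}

The two expressions differ only by Boltzmann's constant k_B and the base of the logarithm, which is the identity of mathematical form the chapter turns on.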


This link between information and life deepened with James Watson and Francis Crick's 1953 discovery of DNA's double helix. The genetic code became an exercise in information theory: mapping a four-letter nucleotide alphabet onto 20 amino acids. Crick's Central Dogma stated that information flows from nucleic acid to protein but never in reverse. Richard Dawkins, the evolutionary biologist, extended the information-centered view of biology in 1976, arguing that genes are the true units of natural selection and proposing the meme as a unit of cultural transmission that propagates by imitation, spreading from brain to brain.
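
The arithmetic that makes the code a triplet code is worth spelling out (our gloss on the mapping, not Gleick's wording):

    4^1 = 4 < 20, \qquad 4^2 = 16 < 20, \qquad 4^3 = 64 \geq 20

Code words one or two nucleotides long cannot address all 20 amino acids, so nature's code words are three letters long, with the surplus of 64 combinations absorbed as redundancy.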


Later chapters explore algorithmic information theory, developed independently by Gregory Chaitin, Andrei Kolmogorov, and Ray Solomonoff, which defines the complexity of a number as the length of the shortest computer program that can generate it. A truly random string cannot be compressed at all, linking randomness, complexity, and incompleteness. Gleick also traces the rise of quantum information science, from the black hole information paradox through quantum cryptography and quantum computing. Physicist John Archibald Wheeler's declaration, "It from Bit," held that every particle and field of force derives its existence from information.
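
The incompressibility of randomness can be demonstrated with an ordinary compressor (a sketch added for this guide; zlib is a practical stand-in for the "shortest program," not part of the theory itself):

    import os, zlib

    # A patterned string has a short description; random bytes do not.
    patterned = b"abc" * 10_000        # 30,000 bytes of pure repetition
    random_bytes = os.urandom(30_000)  # 30,000 bytes with no pattern

    print(len(zlib.compress(patterned)))     # a tiny fraction of 30,000
    print(len(zlib.compress(random_bytes)))  # about 30,000, often slightly more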


The final chapters survey the consequences of information abundance, invoking Jorge Luis Borges's "Library of Babel," a mythical library containing every possible book, where knowledge is indistinguishable from noise. Gleick describes Wikipedia's collaborative growth, the exponential increase in data storage, and the recurring human experience of information overload, which he traces from Robert Burton's 1621 lament through Gottfried Wilhelm Leibniz's fears to the anxieties of the digital age. He closes by arguing that although Shannon's deliberate sacrifice of meaning was necessary to found the science of information, meaning inevitably returns. We walk the corridors of the information age, "looking for lines of meaning amid leagues of cacophony and incoherence" (426).
