Plot Summary

Ten Arguments for Deleting Your Social Media Accounts Right Now

Jaron Lanier

Nonfiction | Book | Adult | Published in 2018

Jaron Lanier, a tech industry insider, presents a sustained case that commercial social media platforms cause serious harm to individuals and society, and that the most effective response available to ordinary people is to delete their accounts. In a March 2018 author's note addressing the Cambridge Analytica scandal, Lanier argues that those with the privilege to quit have a responsibility to do so, both to stop reinforcing a harmful system and to demonstrate alternatives.

Lanier opens with a metaphor comparing cats and dogs. Cats, which partly domesticated themselves and remain self-directed, represent what people want to be online: autonomous agents who participate in the modern world without surrendering control. The fear, he contends, is that social media is turning people into obedient dogs, responsive to algorithmic cues they cannot see.

The first argument establishes Lanier's foundational claim: Social media erodes users' free will. Algorithms collect enormous quantities of data and correlate it across millions of users to predict and modify behavior. Unlike traditional advertising, which was fleeting and identical for all viewers, social media delivers individualized, continuously optimized stimuli. Lanier compares this process to B. F. Skinner's behaviorist experiments, in which caged animals were trained through mechanical rewards and punishments. He cites confessions from Facebook insiders, including the company's first president, Sean Parker, who acknowledged that Facebook deliberately exploited psychological vulnerabilities to create addictive feedback loops. Because negative emotions like fear and anger are cheaper tools for engagement than positive ones, the platforms carry an inherent bias toward amplifying negativity. Combined with network effects that make switching platforms practically impossible, this addiction traps users.

The second argument narrows the target. The problem is not the internet or smartphones but a specific business model. Lanier coins the acronym BUMMER, standing for "Behaviors of Users Modified, and Made into an Empire for Rent," and breaks it into six components: attention systems that reward abrasive behavior; pervasive surveillance; algorithmic feeds that personalize content; optimization of those feeds to maximize engagement; financial incentives that force journalism and other industries to reformulate around clickbait; and vast populations of bots and fake accounts that resist reform. He argues that outside of China, only Facebook and Google fully depend on this model, making the problem specific enough to contain.

The third argument contends that BUMMER makes users into worse people. Lanier recounts his own experiences, dating to the late 1970s, of being drawn into hostile arguments on early online platforms and later writing things he did not believe while blogging for the Huffington Post simply to provoke reactions. He uses Donald Trump as another example, observing that Trump's compulsive reactivity on social media illustrates the personality changes BUMMER induces. Lanier proposes the "Solitary/Pack switch," a deep setting in human personality. In the Solitary Wolf position, people think independently. In the Pack position, they become consumed by social hierarchy and lose sight of broader reality. BUMMER flips this switch to Pack by removing users' direct contact with the world beyond social posturing.

The fourth argument addresses truth, focusing on BUMMER's fake populations. Users routinely interact with fake accounts without realizing it: Fake reviews shape purchases, fake links boost search rankings, and armies of bots amplify tweets. These fake entities make the system resistant to reform because they can route around any regulation. Lanier uses the anti-vaccination movement as a concrete example: Educated parents refuse to vaccinate their children, fed by algorithmically promoted memes and scare stories. No one at any tech company chose to promote this rhetoric; paranoia is simply an efficient way to capture attention, so BUMMER reinforces it automatically.

The fifth argument holds that BUMMER strips meaning from communication by destroying context. Users become numbers measured by follower counts and likes rather than individuals. Newsrooms operate as components of the machine, with writers monitored by real-time engagement statistics. Lanier points to podcasting as a counter-example: a medium that maintains a person-to-person structure through stores and subscriptions rather than algorithmic feeds.

The sixth argument concerns empathy. Because BUMMER delivers different curated content to each person, no one can know what others are seeing. Lanier introduces "theory of mind," the ability to model someone else's experience, and argues that BUMMER is destroying it. He cites the Pizzagate incident, in which a person fired a shot in a pizza restaurant based on an online conspiracy theory, as an example of false social perception becoming physical violence. Dark ads, targeted posts shown only to selected users and never publicly published, along with subtle algorithmic tuning of feeds, make older forms of propaganda seem transparent by comparison.

The seventh argument presents evidence that social media makes users unhappy. Lanier cites research linking use to increased isolation, anxiety, and risk of self-harm, particularly among young women. Facebook's own researchers demonstrated they could make users unhappy without their awareness. The deeper source of unhappiness, he argues, is structural humiliation: being constantly ranked by opaque algorithms, judged in competitions one never entered, and subordinated to tech insiders who hold disproportionate power.

The eighth argument addresses economics. Lanier traces BUMMER's business model to the collision between the free-software movement, which insisted that software must be free and open to preserve democratic transparency, and Silicon Valley's worship of entrepreneurial wealth. Advertising was the only reconciliation, and it inevitably morphed into mass behavior modification. He proposes an alternative in which users pay a low monthly fee and earn money when their data proves valuable, a concept he calls "Data as Labor." His central example is language translation: Automatic translation depends on millions of fresh phrase translations gathered daily from real bilingual people, yet those people are told they are becoming obsolete. He points to Netflix and HBO as proof that people will pay for quality digital services.

The ninth argument examines politics. Lanier argues that BUMMER has reversed the long trend toward broader justice, citing authoritarian-leaning leaders in Turkey, the United States, India, and elsewhere. In developing regions, the effects have been more acute: The crisis facing the Rohingya, a persecuted Muslim minority in Myanmar, corresponded to Facebook being flooded with hateful posts targeting them. WhatsApp-fueled lies destabilized parts of India, and a United Nations report documented social media as a deadly weapon in South Sudan. Lanier traces a consistent pattern through several movements. During the Arab Spring, young protesters toppled a government with social media's help, but no coherent governance followed. Gamergate's extreme harassment of women in gaming became a feeder for the alt-right, an online far-right movement. LGBTQ rights and Black Lives Matter both saw early gains overtaken by BUMMER's amplification of hostility. Russian operatives exploited this infrastructure in the 2016 U.S. presidential election, running fake activist accounts to suppress Black voter turnout. Lanier emphasizes that BUMMER is neither liberal nor conservative but "pro-paranoia, pro-irritability, and pro-general assholeness" (120).

The tenth argument reframes the case in spiritual terms. Lanier contends that BUMMER functions as a de facto religion. He examines its explicit pretensions: Google's mission to organize all information amounts to organizing reality; Facebook promised to give every person purpose; Google funded a project aimed at conquering death. He critiques the concept of "memes" as coined by evolutionary biologist Richard Dawkins, arguing that this framework, which reduces culture to units competing in a pseudo-Darwinian process, undergirds BUMMER's design. Lanier calls AI "a fantasy, nothing but a story we tell about our code" (141) and argues that BUMMER suppresses belief in the exceptional nature of personhood, encouraging users to see themselves as interchangeable components of a larger system.

In his conclusion, Lanier frames deleting accounts not as opposition to Silicon Valley but as a form of help that can redirect the industry toward better models. He offers practical alternatives: email friends through non-surveilling providers, read news websites directly, watch YouTube without a Google account. He recommends quitting all BUMMER platforms for at least six months as an experiment in self-knowledge.

An afterword adds three reflections. The first recounts a visit to high school students who asked why their parents had them if AI will make humans obsolete. The second connects social media's harms to the broader AI narrative, speculating that the constant message that humans will be replaced may fuel the fear of replacement driving white nationalist violence. The third notes that an estimated 10 percent of U.S. Facebook users deleted accounts in 2018, podcasting surged, and a growing culture of technology criticism emerged. Lanier acknowledges that BUMMER politics continues gaining ground globally but expresses optimism that understanding the business model as the root problem will drive meaningful change.
