Aware of the negative publicity that Facebook’s China policies might produce, management provided talking points for Zuckerberg’s testimony before the US Senate. Zuckerberg was coached to sidestep issues and be evasive, and he stretched the truth to the point of misrepresentation. For example, he claimed that Facebook was using the same technology in China as elsewhere, which was not true. He deflected concerns about China crossing red lines, saying that the issue would be dealt with if and when it arose. When a senator compared Facebook’s conduct to complicity with the Nazis, Zuckerberg called the comparison unfair. He also lied about decisions the company had already made to comply with Chinese censorship. After his testimony, “Facebook’s stock price [rose]” (320).
China took advantage of the negotiations with Facebook by requiring it “to shut down things they d[id]n’t like—including the free speech of activists living abroad” (324). For example, in April 2017, Facebook blocked critic Guo Wengui’s account. That September, a week after China blocked WhatsApp, Facebook permanently removed Guo’s account. Zuckerberg made this decision, caving in to the Chinese government.
In May 2017, Facebook launched two apps in China through shell companies. In doing so, Facebook violated Chinese law by storing data on servers outside of China, and it made no disclosures to investors or Congress about its operations there. Smith, the chief financial officer, noted that China was fine with Facebook concealing its name; he was more concerned about bad press coverage in the West. Facebook was also “required to have a local subsidiary to employ anyone in China” (331), but it did not have one. Wynn-Williams concludes that Facebook is “completely indifferent to the rules” (332).
In April 2017, a leaked document showed that Facebook was targeting “thirteen-to-seventeen-year-olds across its platforms, including Instagram, during moments of psychological vulnerability” (333). Advertisers valued this targeting because people are more likely to buy products when they feel emotionally fragile. Facebook also flagged the emotional states of young mothers and racial and ethnic minorities, and a team at the company was developing a tool that would allow advertisers to do this targeting themselves.
Since the optics were bad, Facebook fired a junior researcher in Australia in a feeble attempt to claim that the practice had not been authorized. The company also lied about its activities, denying that it offered tools to target people based on their emotional states.
Given concerns over Facebook’s role in electing Trump, issues of sexual harassment, and silence about the harm that Facebook was causing globally, the staff no longer trusted management. To address discontent, brown-bag sessions were set up. However, conversations about sensitive topics were shut down, and it was clear that nothing would change.
Facebook has done almost nothing to stop hate groups from using its platform. For example, the page for the racist rally in Charlottesville in 2017 was removed only the day before the event. A divide emerged at Facebook between management, who argued that critics unfairly targeted the company, and employees, who wanted the company to “right its wrongs” (342). In response to its impact on the US election, Facebook installed only “window-dressing” as a public relations measure.
In Myanmar, Internet.org had taken off, and Facebook was effectively equivalent to the Internet there. The platform had been used to inflame hatred toward the Muslim minority, the Rohingya, and Facebook did nothing to stop this use of its site. There were clear violations of Facebook’s standards, such as a false story about a Buddhist woman being raped by a Muslim man, yet the content operations team, based in Dublin, Ireland, did not take down these posts. Only one person in Dublin was responsible for monitoring the entire country, and only one member of Facebook’s operations team spoke Burmese, which was woefully insufficient. The community standards had not even been published in Burmese; it is impossible to moderate content in a language moderators cannot read.
Unofficial Facebook apps without a reporting function were also in use, and Facebook posts were triggering real-world violence. In contrast, the company was spending a great deal of money to help China censor its platform. While an election in Myanmar passed peacefully, hate speech and fake news continued to be posted on Facebook afterward. Wynn-Williams notes that Facebook was “in over [their] heads” (353), with no understanding of the effect it was having, and the leadership team did not care.
Wynn-Williams found an outstanding candidate to focus on Myanmar, a Harvard graduate and friend of Schrage’s; she understood that Facebook liked to hire people who resembled its top management. Kaplan ultimately quashed the hire, allowing “hate speech and inflammatory posts” to increase in Myanmar (355).
In August 2017, the military in Myanmar attacked the Muslim population, committing crimes of genocide. Children were killed, women were raped publicly, and elderly people were burned. Later, it was learned that the military had used Facebook in a “massive operation” (358), spreading hate and misinformation. Facebook was complicit in this tragedy, yet its top management did not care: “It wasn’t the things they did; it was the things they didn’t do” (360).
With Trump in power, Kaplan’s influence increased, and he grew closer to Zuckerberg. At a work party, Kaplan made inappropriate comments to Wynn-Williams and then “grind[ed] into” her from behind (363). She fled to stand beside someone from Human Resources.
Later, she asked Olivan if she could join his department, and he agreed, but Schrage did not approve the transfer. She told Schrage that things had gotten worse with Kaplan. After this awful meeting, she drove to Sonoma to meet her family and was attacked by wasps; she told Tom that the wasps were not the worst part of her day.
The company’s employment lawyer investigated her complaint against Kaplan but cleared him. Meanwhile, Schrage reported concerns about her performance, complaining that she had not expanded her team quickly enough, even though Kaplan had blocked her hires. Soon after Kaplan was cleared, Wynn-Williams was fired. Stunned, she was escorted off the premises by security.
Wynn-Williams explains that she “wasn’t silent enough” about the harassment (372). Facebook has become a machine that turns people against one another, manipulates them, and monitors them. It did not have to turn out this way, as the company had the opportunity to “make different choices” on multiple occasions (373). Management simply did not care about the consequences of their decisions: They built software for China to conduct surveillance, and they helped the Trump campaign in its “war of misinformation” (373). Despite their already enormous wealth, Facebook’s leaders were “happy to get richer” (374), regardless of the public consequences.
Kaplan sat behind Brett Kavanaugh on “company time” in a show of support for his controversial nomination to the Supreme Court (375). He remains at Facebook, while Schrage left after reports that Facebook used a Republican opposition research firm to dig up dirt on the company’s critics. Sandberg, who was accused of using Facebook’s resources for her personal benefit, left as well. Zuckerberg is now preoccupied with the Metaverse, or virtual reality.
Wynn-Williams had another baby despite the problems with her second pregnancy. Her friend Ifeoma Ozoma “cowrote and cosponsored California’s Silenced No More Act” (377), which protects employees who publicize harassment and discrimination after they have signed a nondisclosure agreement.
Drawn to artificial intelligence (AI), Wynn-Williams is now working on “unofficial negotiations between China and the US on AI weapons” (377). Such weapons have the potential to kill large numbers of people. There is a battle over whether closed or open models of AI should be used: China and Facebook want an open model, while the West wants a closed one. It is therefore imperative to understand the relationship between Facebook and China. The same careless people are now working on AI, with Meta being “one of the world’s most powerful companies” (380).
The genocide in Myanmar is the most significant example of The Influence of Technology on Politics and People’s Lives in the memoir. The military used Facebook in 2017 to stir hatred of the Muslim minority and encourage violence against them. As a result, thousands of Muslims, including children and the elderly, were brutally assaulted and killed. Instead of reevaluating its policies and taking responsibility, Facebook’s top management simply ignored the criticism and evaded all accountability. The genocide thus illustrates both the power of social media to foster unchecked violence and the lack of political will to hold Facebook to account for its insufficient moderation.
Facebook also assisted advertisers in identifying young users in states of emotional distress, and it did the same for other groups, such as minorities. Because people are more receptive to advertising at these times, this tool allowed Facebook to make more money from advertisers. When the practice came to light, however, Facebook denied it completely and again took no responsibility for the damage it was doing to people’s lives. Wynn-Williams thus suggests that the company has benefited from its immense influence while dodging the responsibilities that come with it.
Wynn-Williams also continues to emphasize The Problem of Corporate Greed by stressing Facebook’s increasingly close ties with China. Zuckerberg lied to Congress about the company’s policies in China, claiming that they were no different from those elsewhere. Facebook’s relationship with China was kept obscure, and Zuckerberg was evasive in his testimony. In reality, Facebook was doing almost anything the Chinese government wanted, including shutting down the account of a dissident who lived outside of China. As Wynn-Williams contemplates her new position in the field of AI, she worries that the same careless people are now working on that frontier and argues that the potential dystopian consequences are even more frightening.
Finally, the handling of Wynn-Williams’s harassment complaints demonstrates the power dynamics at Facebook, illustrating Gender and Power Dynamics in High-Tech Industries. Kaplan, who had become politically valuable because of his close ties to the Trump administration, was cleared in a sham investigation. Wynn-Williams was fired for reasons that did not make sense: She was told that she had not expanded her team fast enough with new hires, yet it was Kaplan who had blocked those hires. Wynn-Williams thus presents her termination as retaliation for her complaints, suggesting that women in the tech industry are both vulnerable to harassment and unlikely to gain adequate redress for the bad behavior of their superiors.


