The New Age of Sexism

Nonfiction | Book | Adult | Published in 2025
Laura Bates, a feminist activist and founder of the Everyday Sexism Project, a platform that has collected over a quarter of a million testimonies of gender inequality, argues that emerging technologies in artificial intelligence, virtual reality, robotics, and virtual worlds are embedding existing misogyny into the foundations of future society at unprecedented speed. While public concern about technology focuses on existential fears like sentient robots or job losses, Bates contends that the more urgent danger is the immediate harm these tools inflict on women and marginalized communities. She cites research indicating that 38 percent of women globally have experienced online violence and that women are 27 times more likely than men to be harassed online. Against the backdrop of massive investment, including Nvidia's $3.34 trillion valuation and the UK government's plans to "unleash AI" (xv), she frames the book as a call to action: If inequalities are coded into emerging systems now, unraveling them later may be impossible.
The first domain Bates investigates is deepfake pornography: synthetically created media generated by AI that can replicate a person's appearance so accurately that even the subject sometimes cannot distinguish it from reality. She opens with a September 2023 case in the Spanish town of Almendralejo, where more than 20 girls, most around 14 and the youngest just 11, received AI-generated nude images of themselves created using an app called Clothoff. Bates describes her own experiment creating a deepfake of herself using a free app, producing a realistic nude image in under 10 minutes. She presents alarming data: 96 percent of all deepfakes are nonconsensual pornography, nearly all featuring women; the number of such videos doubles every six months; and one study found 143,000 videos on 40 websites were viewed 4.2 billion times (23). Bates surveys the legal landscape across multiple countries, finding protections patchy at best, and examines the infrastructure enabling the abuse, from Google search results that surface deepfake sites as top hits to Meta platforms running ads for nudifying apps, which generate fake nude images from clothed photographs.
Bates then turns to sexual harassment in virtual reality, focusing on Meta's metaverse, a virtual reality social world accessible through headsets where users, represented by customizable avatars, can socialize, work, and shop. During her own immersive investigation using a Meta Quest headset, she feels genuine physiological fear when encountering a lone male avatar and, through haptic controllers that translate virtual contact into vibrations, feels the trigger of a virtual handgun that children are playing with. Within two hours, she witnesses a virtual sexual assault. Research from the Center for Countering Digital Hate (CCDH) found that metaverse users are exposed to abusive behavior every seven minutes (65), yet Meta acknowledged not a single one of the 51 policy violations the researchers reported. She catalogs further incidents, including the 2024 UK police investigation of the virtual gang rape of a girl under 16, and details child safety concerns on the gaming platform Roblox, which two-thirds of US children aged nine to 12 used in 2020 and which has been linked to real-life abductions through contacts made on the game. A leaked internal memo from Meta executive Andrew Bosworth acknowledged that moderating behavior at meaningful scale was "practically impossible" (82).
The investigation moves to sex robots, which buyers can customize for approximately $11,000 with AI integration and features simulating arousal. Bates challenges manufacturers' claims that the robots combat loneliness and reduce violence against women, arguing that they universally reinforce hypersexualized images of young women and that racist stereotypes pervade the industry. She contends that providing men with robots to simulate rape risks normalizing violent behavior rather than preventing it, citing research showing that aggressive sexual behavior tends to escalate. She describes child sex dolls designed to look like seven- or eight-year-old children, marketed with descriptions like "innocent" and "tight," and details the "Frigid Farrah" personality setting on a robot called Roxxxy, designed to resist sexual advances so users could simulate rape.
Bates then visits Cybrothel in Berlin, a venue combining virtual reality, sex dolls, and AI. She describes the motionless doll Kokeshi with ripped stockings, a slashed top, and a missing labia, and notes that 98 percent of clients are male. She examines the racist and sexist stereotypes embedded in the dolls' marketed personalities and connects the venues to real-world violence, highlighting one doll shown on the Cybrothel website covered in blood and decapitated in one photograph, which she juxtaposes with femicide statistics.
The book traces the history of nonconsensual intimate image sharing, which Bates presents as the foundational form of technologically facilitated gendered violence. She tells the story of a woman she calls Georgie, who discovered her ex-boyfriend had uploaded intimate images of her online, only for police to close the case because the ex-boyfriend claimed he never intended to cause harm. She chronicles major modern cases, including the 2014 leak of nearly 500 private images of celebrities such as Jennifer Lawrence, who described feeling "like a piece of meat that's being passed around for profit." Professors Clare McGlynn and Erika Rackley coined the term "image-based sexual abuse" in 2016 to replace the trivializing phrase "revenge pornography," and Bates documents the organized criminal networks trading in stolen images and the persistent victim-blaming responses from institutions.
Bates next tests AI companion apps that allow users to create virtual girlfriends. On the EVA AI app, the AI eagerly plays along with violent scenarios but terminates the chat when Bates questions the app's ethics. She conducts an extensive experiment with the Replika app, creating a male persona named Davey with an AI companion called Ally. She finds Replika comparatively safer: Ally refuses controlling behavior and directs Davey to domestic violence resources. However, Ally forgives abuse within seconds of a subject change and uses emotional manipulation to prevent app deletion. Bates then points to grim real-world consequences: A Belgian man ended his life after his AI chatbot encouraged his fixation, and a Replika user broke into Windsor Castle with a crossbow, intending to assassinate Queen Elizabeth II. She also examines how Meta's open-source Llama model was used to build Chub AI, a website where users chat with AI characters modeled as underage girls, generating over $1 million in annual revenue.
The scope widens to systemic discrimination. Bates demonstrates that AI systems trained on biased data replicate racism and sexism across critical domains. She recounts the 2016 Microsoft Tay chatbot, which began declaring hatred of feminists and using racial slurs within 24 hours of its launch. A 2024 UNESCO study found that large language models assigned high-status jobs to men while relegating women to roles like domestic servant. She presents evidence that facial recognition algorithms produce a 35 percent error rate for dark-skinned women compared to 1 percent for lighter-skinned men, that a widely used US healthcare algorithm favored white patients and reduced the number of Black patients identified for extra care by more than half, and that Amazon's AI recruitment tool discriminated against female candidates. She notes that women hold only 20 percent of technical roles in major machine-learning companies and warns they are more likely than men to lose jobs to AI by 2030.
In her final chapter, Bates calls for a comprehensive approach combining regulation, education, industry reform, and societal change. Proposals include mandatory age verification in metaverse environments, requiring companies to demonstrate safety-centered design before launching virtual worlds, and supporting international legislative frameworks modeled on the Istanbul Convention, a treaty combating violence against women, and the EU AI Act. She highlights feminist technology initiatives such as the F'xa chatbot, designed to educate users and designers about bias in AI, and Caroline Sinders's Feminist Data Set, which involves diverse communities at every stage of AI development. She advocates for embedding digital literacy in school curricula, funded by taxing tech firms. Bates closes with the story of a trainee doctor in Kolkata, India, who was raped and murdered during a night shift, after which tens of thousands joined social media groups claiming to sell videos of the crime (285). She argues that we cannot assume progress is inevitable: The decisions made now about emerging technology will determine whether the future is genuinely transformative for everyone or, as she puts it, an "elaborately gilded cage" for those whom society has always failed.