
Science and Human Behavior

Nonfiction | Reference/Textbook | Adult | Published in 1953


Part 2: Chapter Summaries & Analyses

Part 2: “The Analysis of Behavior”

Part 2, Chapter 4 Summary: “Reflexes and Conditioned Reflexes”

Skinner traces the historical origins of the reflex concept, noting that early mechanistic analogies—such as René Descartes’s comparison between hydraulics and muscle movement—challenged the idea that behavior was purely spontaneous. Over time, experimental findings, such as a salamander’s tail moving when stimulated, demonstrated that some behavior could be explained as a response to external events rather than inner will. This led to the formalization of the stimulus-response relationship as a “reflex,” with measurable properties like latency and magnitude. As scientific understanding expanded, the scope of reflex action increased, gradually displacing inner-cause explanations.


While reflexes allow for precise prediction in certain cases—like the pupil contracting in response to light—they account for only a small fraction of total behavior. The principle gained broader significance with Ivan Pavlov’s discovery of the conditioned reflex, in which a previously neutral stimulus comes to elicit a response through pairing with a stimulus that already does so. Pavlov replaced vague explanations like “psychic secretion” with precise, controllable conditions. He also identified the process of extinction, in which a conditioned stimulus loses its effect when reinforcement stops.


Skinner discusses the evolutionary “survival value” of both unconditioned and conditioned reflexes, while noting that not all conditioned responses are adaptive—some, like phobias or superstitions, result from accidental pairings of stimuli. Conditioned reflexes have wide practical application in controlling behavior, from eliciting emotional reactions in art and advertising to shaping attitudes in political and military contexts. They can also be used therapeutically, as in counterconditioning to reduce anxiety or treat substance dependency. Techniques such as graded exposure—gradually increasing the intensity of a conditioned stimulus—can effectively reduce strong emotional reactions.

Part 2, Chapter 5 Summary: “Operant Behavior”

Skinner distinguishes operant behavior from reflexes by emphasizing its effect on the environment and the role of consequences in changing its probability of recurrence. Building on E.L. Thorndike’s “Law of Effect,” he recounts experiments in which cats learned to escape from puzzle boxes more quickly over repeated trials, not because of reasoning but because successful behaviors were “stamped in” by the outcome. While Thorndike’s learning curves offered an early quantitative account, Skinner notes that such curves often reflect experimental apparatus as much as the behavioral process itself.


To clarify the underlying mechanism, Skinner describes operant conditioning, in which a consequence—such as food for a hungry pigeon—is made contingent on a specific behavior. This increases the frequency of that behavior without requiring an identifiable eliciting stimulus, distinguishing operant from Pavlovian conditioning.


Skinner examines related processes, including operant extinction, in which behavior decreases when reinforcement is withheld. Extinction curves reveal orderly declines in response rates, with the rate and persistence of behavior depending on reinforcement history. Intermittent reinforcement, for instance, often produces greater resistance to extinction than continuous reinforcement—a finding with practical applications in education, industry, and therapy. Extinction can also evoke emotional responses, such as frustration or aggression, which may wax and wane in cycles before responding finally ceases.


The chapter also explores reinforcers—stimuli defined solely by their ability to increase response probability. These may be positive (adding a stimulus) or negative (removing an aversive stimulus), with conditioned and generalized reinforcers (e.g., money, attention, approval) acquiring reinforcing power through association with primary reinforcers. Generalized reinforcers are especially effective because they are linked to multiple deprivations, making behavior less dependent on a single motivational state.


Skinner addresses the origins of superstitious behavior, noting that accidental pairings of behavior and reinforcement can strengthen nonfunctional actions in both animals and humans. He critiques explanations of behavior in terms of “goals” or “purposes,” reframing these as shorthand for past reinforcement histories: “Instead of saying that a man behaves because of the consequences which are to follow his behavior, we simply say that he behaves because of the consequences which have followed similar behavior in the past” (87). By replacing mentalist constructs with observable contingencies between behavior and consequences, Skinner positions operant conditioning as a fundamental process for understanding, predicting, and shaping behavior.

Part 2, Chapter 6 Summary: “Shaping and Maintaining Operant Behavior”

Skinner compares shaping behavior through operant conditioning to a sculptor working on a lump of clay—both involve gradual, continuous modification. Behavior is rarely an entirely new “unit”; rather, it develops from undifferentiated activity through successive approximations reinforced at each step. For example, in training a pigeon to peck a spot, reinforcement begins with any movement toward the target and becomes progressively more selective until only direct contact is rewarded. This procedure can establish responses that would otherwise have a low probability of occurring. Even seemingly discrete acts, like pecking, may have preformed components shaped by prior history or genetic predisposition.


This continuity means reinforcement can affect other, related behaviors, a phenomenon sometimes called “response generalization” or “transfer.” Skinner argues this is not mysterious but reflects shared elements between those responses. Reinforcement strengthens all behaviors containing those elements, suggesting that the true functional unit of behavior may be smaller “behavioral atoms” rather than whole acts. Verbal behavior offers clear examples, as many speech responses share muscular elements and can vary together under a single controlling variable.


Skinner distinguishes between acquiring new behaviors and refining existing ones. The latter, termed differential reinforcement, selects for responses with particular properties, thereby improving skill or precision. This process often requires immediate reinforcement to preserve fine distinctions, as in sports or craftsmanship. Environmental contingencies—including social ones—can unintentionally shape undesirable behaviors, such as a child’s whining when only louder or more emotive vocalizations receive attention.


He also addresses the maintenance of behavior, emphasizing that ongoing reinforcement is necessary to sustain learned responses. Without it, extinction occurs. Many behaviors in daily life are maintained by intermittent reinforcement, which can yield highly persistent responding. Skinner details several schedule types, including fixed-interval (FI), variable-interval (VI), fixed-ratio (FR), and variable-ratio (VR). Combinations of ratio and interval schedules, or schedules contingent on response rate, can further manipulate behavior. High rates are encouraged when only rapid sequences are reinforced, while low rates are maintained when reinforcement follows slower responding.


Skinner notes the practical applications of these principles in areas such as gambling, industry, and education, and cautions that while optimized schedules can increase productivity and morale, ethical considerations must guide their use. As he concludes, informed application depends on “clear-cut information regarding the nature and effect of the devices responsible for the maintenance of behavior and strength” (106).

Part 2, Chapter 7 Summary: “Operant Discrimination”

Skinner explains that while operant behavior is emitted rather than elicited, it often comes under the control of external stimuli through processes of discrimination. For example, a pigeon’s neck-stretching may be reinforced when a light is on but extinguished when the light is off. Over time, the behavior occurs only in the presence of light. Unlike reflexes, this three-term contingency requires specifying the stimulus or discriminative cue, the response, and the reinforcement.


The discriminative stimulus does not compel the behavior but alters its probability, allowing for precise, immediate control over when responses are likely to occur. Such discriminative repertoires ensure behavior occurs in situations where it is more likely to be reinforced. Everyday life offers many examples, from children learning to respond to ringing telephones to adults adjusting social behavior in response to smiles or frowns. Verbal behavior also follows this pattern, as naming, reading, or answering exam questions rely on discriminative stimuli that determine when a particular response will be reinforced. Education and training systems deliberately create these repertoires so that individuals respond appropriately under future circumstances.


Skinner contrasts discriminative control with reflexive control in his discussion of voluntary and involuntary behavior. Reflexes are elicited and appear more coercive, while discriminative stimuli alter response probabilities in subtler, more flexible ways; however, operant responses can be predicted with the same inevitability as reflexes. This undermines the distinction between voluntary and involuntary action, as well as the explanatory usefulness of “will” or “personal responsibility.” As Skinner argues, “When all relevant variables have been arranged, an organism will or will not respond. If it does not, it cannot. If it can, it will” (112).


The chapter examines discriminative repertoires, in which organisms acquire coherent sets of responses governed by stimulus fields. Examples include reaching toward objects across the visual field, drawing from copy, singing or playing music by ear, and imitation. Imitation is not instinctive but arises when reinforcement strengthens responses that correspond to observed behavior. Even complex repertoires in dance, acting, or sport are explained through discriminative contingencies rather than innate mechanisms.


Skinner further explores the concept of attention, framing it as a relation in which stimuli exert discriminative control over behavior. To “attend” is simply for behavior to be under the influence of certain stimuli. He then turns to the temporal relations that shape discriminative behavior. Responses may be reinforced only when they occur immediately after a stimulus (as in answering a ringing phone), after a delay (as in deliberate decision-making), or within an interval defined by environmental contingencies. These patterns generate characteristic behaviors, including anticipation, expectancy, and preparatory sets such as the tension before a race. Together, these concepts and examples show how discrimination extends operant conditioning.

Part 2, Chapter 8 Summary: “The Controlling Environment”

Human behavior, Skinner argues, is inseparable from environmental control. Clinical psychology often underestimates this role, treating external events casually as “facts” in case histories rather than specifying how stimuli affect behavior. A more systematic analysis of organism-environment interaction clarifies questions of perception, misinterpretation, and symbolic substitution.


The study of stimuli begins with their physical description: Light, sound, chemical substances, and other measurable energies. Stimuli are effective only within certain limits, and variations in perception across individuals (e.g., colorblindness or deafness) illustrate the boundaries of sensory systems.


Skinner next describes induction, or generalization, as the spread of control from one stimulus to others that share properties. If a pigeon is conditioned to peck a red spot, similar responses may occur to orange or yellow spots. He argues that this process explains human responses to resemblance and the evocative power of metaphor. Generalization gradients, which measure the degree of responding across varying stimulus values, demonstrate which properties exert the strongest control.


Discrimination occurs when reinforcement strengthens behavior under one stimulus condition while extinction weakens it under another. This sharpening of stimulus control allows organisms to respond precisely to relevant differences. Similarly, abstraction arises when responses are conditioned to a single property, such as redness, across varied objects. True abstraction is rare and usually requires verbal mediation. The development of abstract terms like “chance” illustrates how verbal communities progressively refine stimulus control over time.


Skinner addresses the traditional problems of stimulus control. Cross-modal induction, in which stimuli from different sensory fields evoke similar responses, may be explained by common mediating behavior. Relational responding, such as choosing larger rather than smaller objects, depends on conditioning histories that emphasize relative properties. Finally, apparent discrepancies between perception and reality—such as mistaking smoke for fog—reflect differences in behavioral responses rather than access to a distinct “perceptual world.”

Part 2, Chapter 9 Summary: “Deprivation and Satiation”

Skinner examines deprivation and satiation as variables impacting the probability of behavior. While often confused with stimulus effects, these operations function differently: Deprivation increases the likelihood of behavior that restores balance, and satiation reduces it. For example, water deprivation heightens reflexive and operant behaviors leading to drinking, while satiation lowers the probability. These processes have clear adaptive significance, ensuring that organisms act to maintain survival needs.


Deprivation extends beyond material exchanges. Preventing activity, such as restricting exercise, increases later activity levels when opportunities arise. Similarly, sexual satiation may result from both the act itself and its physiological consequences. In each case, the biological economy of the organism determines whether behavior is strengthened or weakened.


Traditional psychology often frames these effects in terms of needs and drives, attributing behavior to internal causes like hunger, thirst, or desire. A functional approach, according to Skinner, instead focuses on observable operations rather than inferred inner states.


Practical examples illustrate how deprivation and satiation are deliberately arranged to influence behavior. Restricting water intake makes a child more likely to drink milk; delaying meals increases appetite; solitary confinement increases the probability of talking; and rationing food heightens cooperation. Conversely, satiation is used to suppress behavior, as when serving bread before a meal reduces complaints about portion sizes or when abundant attention reduces undesirable actions. These effects are best explained by the direct action of deprivation and satiation rather than by positing inner drives.


Skinner addresses questions about the number and interaction of drives. Instead of asking whether hunger is stronger than sex, or whether one drive reduces another, he argues that it is more precise to analyze the specific deprivations and satiations involved. Conditioning further demonstrates the link between reinforcement and motivation: Operant responses, such as a pigeon’s neck-stretching reinforced by food, ultimately come under the control of food deprivation. Generalized reinforcers complicate matters, since they acquire power across multiple deprivation systems, but they do not justify positing separate drives without independent operations of deprivation and satiation. The text also considers time as a variable. Periodic cycles such as sleep and activity follow patterns of deprivation and satiation, while longer rhythms—menstrual cycles, seasonal migrations, aging—introduce additional layers of predictability.


Skinner situates deprivation and satiation within broader species and individual differences. Species-typical behaviors are sometimes mislabeled as “instincts,” but like drives, the term explains nothing beyond observed tendencies. Individual variation in sensitivity to deprivation, satiation, and reinforcement also shapes behavioral probabilities. Skinner concludes with a checklist of variables that must be considered in accounting for behaviors, ranging from species membership and age to current cycles, recent histories of deprivation and satiation, and other factors.

Part 2, Chapter 10 Summary: “Emotion”

Skinner refers to emotions as explanatory fictions. Classical theories, such as the James-Lange theory, suggest that emotions arise from the perception of physiological changes: “we feel sorry because we cry, angry because we strike, afraid because we tremble” (161). While physiological responses often accompany emotion, they cannot consistently distinguish one emotion from another. Instead, Skinner argues, classifications rely less on physiology and more on observable patterns of predisposition and action. From this perspective, emotions are best understood as predispositions to act. To say that someone is angry means their probability of damaging behavior has increased while prosocial behaviors have decreased. Adopting adjectival forms—fearful, affectionate, timid—avoids the misleading reification of emotions as entities.


Emotional responses may vary together because of shared consequences. In anger, for example, unconditioned acts such as biting or striking inflict damage, and these consequences reinforce other conditioned behaviors such as verbal aggression or destructive acts. In other cases, emotional groupings are shaped by evolutionary contingencies, as with unconditioned fighting or escape behaviors. More subtle emotions, such as embarrassment or loneliness, are harder to define precisely, and their forms differ across circumstances. Skinner stresses that everyday categories like “frustration” often group together disparate effects under a single label, masking important distinctions.


Like motivation, emotions are tied to environmental operations. Sudden loud noises induce fear, physical restraint generates rage, and the interruption of established behaviors produces frustration. Overlapping categories such as nostalgia illustrate the difficulty of separating motivational from emotional conditions.


Skinner defines the “total emotion” as the combined changes in an individual’s behavioral repertoire produced by a given circumstance. A phobia illustrates this: The sight of a feared stimulus produces conditioned reflexes (like sweating), operants of escape (running away, calling for help), and general changes such as reduced appetite or loss of interest in ongoing activities. All these together comprise the emotional effect.


Crucially, emotions are not causes of behavior. To say a man neglects his work “because of anxiety” merely classifies the behavior rather than explains it. Skinner believes that the true causes are the external circumstances that produce both the neglect and the accompanying emotional pattern. Skinner also considers the practical use of emotions. Emotional behavior is deliberately manipulated in entertainment, politics, advertising, and social influence. He argues that these practices demonstrate that emotions are best understood as predictable, manipulable effects of environmental operations, not as inner causes.

Part 2, Chapter 11 Summary: “Aversion, Avoidance, Anxiety”

Skinner distinguishes aversive control from deprivation, saying that a stimulus is aversive if escape from it is reinforcing. Aversive stimuli vary in form and intensity, from pain to conditioned cues, and cannot be defined by intrinsic physical properties. Escape behavior includes actions such as covering the ears to reduce noise or turning away from bright light. Importantly, aversive stimuli are defined functionally, not by their subjective unpleasantness.


Conditioned aversive stimuli emerge when neutral events are paired with established aversive conditions, transferring their aversive functions. For example, pairing alcohol with nausea leads to avoidance of alcohol. This conditioning underlies practices in ethics, religion, and government, where acts are branded as wrong or sinful to ensure that escape or avoidance behavior is reinforced. Withdrawal of positive reinforcers functions similarly to the presentation of negative reinforcers, such as when privileges are withheld until certain behaviors are performed.


Avoidance differs from escape in that the aversive stimulus never directly occurs. Instead, a conditioned stimulus predicts an aversive event, and behavior that reduces the conditioned cue is reinforced. For example, the sound of a drill preceding dental pain becomes aversive, and turning away from the drill is reinforced by reducing the threatening sound. Avoidance behavior is thus maintained by escape from conditioned threats, not the avoidance of future events per se. Over time, if the aversive event is consistently avoided, the conditioned threat undergoes extinction until renewed by re-exposure. Threats, therefore, serve as conditioned negative reinforcers that sustain avoidance behavior.


Anxiety arises when stimuli characteristically precede strong aversive events, producing both conditioned avoidance behavior and emotional by-products. A person who has been seasick may avoid ships and also display diffuse emotional effects such as preoccupation, withdrawal from normal activities, and physiological responses resembling fear. Anxiety thus represents a complex predisposition involving both operants and reflexes. It can be conditioned by single pairings, such as the sudden death of a loved one, where incidental daily stimuli acquire aversive properties. While anxiety may prompt avoidance of dangerous situations, its emotional aspects often interfere with effective behavior, making it a central concern in psychotherapy.


Skinner contrasts anxiety with anticipation. Just as stimuli predicting aversive events generate anxiety, those predicting positive reinforcement generate excitement or elation. The unopened envelope may evoke dread if associated with bad news or joy if associated with good news. Anticipatory emotions differ in effect but share the same functional basis in conditioned reinforcement.


Skinner also emphasizes that anxiety is not a cause. Like other emotions, it classifies patterns of predisposition produced by environmental conditions. Effective intervention must address the external contingencies that generate anxious behavior, not an intervening “state.”

Part 2, Chapter 12 Summary: “Punishment”

Punishment is one of the most common techniques of social control, appearing in family discipline, education, religion, law, and politics. While it is intended to reduce unwanted behavior, its effects are complex and often counterproductive. Punishment relies on the presentation of aversive stimuli or the withdrawal of positive reinforcers, but unlike reinforcement, it frequently generates undesirable emotional by-products such as fear, rage, anxiety, or retaliation.


Research has shown that punishment produces only temporary suppression of behavior rather than permanent elimination. For example, experiments demonstrate that while punishment reduces immediate responding, the behavior typically returns once punishment is discontinued. This challenges earlier theories, such as Thorndike’s original “stamping out” model, and aligns with observations of repressed but persistent tendencies in psychoanalysis.


Skinner identifies three main effects of punishment. First, punishment produces an immediate suppression of behavior by eliciting incompatible responses, such as fear or pain reflexes, that temporarily block the punished act. Second, behaviors that have been punished often generate conditioned emotional responses, evoking feelings of fear, guilt, or shame that further suppress behavior in the future. Finally, punishment establishes avoidance behavior, in which individuals engage in alternative actions—sometimes little more than “doing nothing”—to escape or avoid the aversive conditions associated with punishment.


Since punishment depends largely on social administration, it is often applied intermittently. This unpredictability fosters conflict and disorganized behavior, producing inhibition, timidity, or oscillation between punished and avoidance responses. Additionally, punishment can lead to chronic emotional disturbances, psychosomatic illness, and maladaptive guilt. Alternatives to punishment include extinction, satiation, developmental changes, and positive reinforcement of incompatible behaviors. These methods are generally more effective and produce fewer damaging side effects. While some progress has been made in shifting from punitive systems toward positive reinforcement, society still relies heavily on punishment. A more complete scientific analysis, Skinner suggests, is needed to design effective alternatives.

Part 2, Chapter 13 Summary: “Function Versus Aspect”

Behavior is often described in terms of traits or aspects—adjectives such as “cordial,” “timid,” or “intelligent”—rather than by specifying discrete actions. These trait-names provide convenient shorthand, but they rarely specify actual behavior or the variables that determine it.


Traits can be examined functionally in two main ways. Some represent differences in exposure to variables, such as reinforcement history, punishment, deprivation, heredity, age, or emotional circumstances. Others reflect differences in rates of behavioral processes, such as speed of conditioning, discrimination, or extinction. In both cases, traits, in principle, can be reduced to measurable differences in repertoires or behavioral processes. However, common methods of measurement, such as intelligence tests, often rely on arbitrary group comparisons rather than direct observation of behavior, making their results less scientifically meaningful.


Tests and surveys can predict behavior in certain contexts, but such predictions are typically from effect to effect, rather than from variables to outcomes. For example, a test score may correlate with job performance, but it does not reveal the controlling conditions that shape both. Trait descriptions thus fail to advance the practical control of behavior, limiting their value in clinical, educational, or organizational settings.


Skinner emphasizes that traits are not causes. Linguistic habits transform adjectives into nouns—such as “intelligent” into “intelligence” or “narcissistic” into “narcissism”—which then appear to explain behavior. In reality, he argues, traits are inferred from behavior and cannot be manipulated independently to produce change. Even when statistical methods identify overlapping traits or distill them into minimal sets, these remain abstractions from observed behavior rather than explanatory variables. Functional analysis, by contrast, identifies the external conditions that control responses, offering both predictive and practical utility.

Part 2, Chapter 14 Summary: “The Analysis of Complex Cases”

Skinner addresses the charge that behavioral science is “oversimplified.” Like other sciences, behavior analysis begins with simple cases under controlled conditions, later expanding to account for complexity.


One variable can produce multiple effects. For example, punishment elicits reflexive emotional responses, conditions avoidance behavior, and alters future predispositions. Similarly, reinforcement may strengthen behavior while simultaneously generating satiation, temporarily reducing responding. Seemingly contradictory phenomena—like a child’s misbehavior after receiving candy—can be explained as the combined effects of reinforcement, satiation, and discriminative stimuli.


Complexity also arises from multiple causes, where different operations converge on a single response. A behavior may be reinforced in several ways, or be simultaneously influenced by motivational, emotional, and discriminative variables. Verbal behavior is especially multiply determined, often shaped by overlapping repertoires and strengthened by supplementary factors. This helps explain phenomena such as wit, slips of the tongue, or literary style, which emerge from the convergence of variables.


Skinner explores the practical use of multiple causation in techniques such as suggestion and prompting, as well as in “projective” or “associative” methods used in clinical psychology. These procedures reveal latent behavior by supplementing existing responses, though he argues that they are best understood as manipulations of probability rather than expressions of hidden traits. Relatedly, projection and identification involve imitative or supplementary responses, which can be verbal or nonverbal, and are often shaped by prior reinforcement histories. Perception illustrates multiple determination, since stimuli interact with emotional and motivational conditions. Expectancy and deprivation can broaden stimulus control, as when someone mistakes a stranger for an acquaintance, or a faint sound is taken for an anticipated signal.


Skinner then turns to incompatible variables and conflict. When opposing responses are simultaneously strong, outcomes may take the form of “algebraic summation” (intermediate actions), prepotency, or oscillation. These dynamics appear in choices, hesitation, and Freudian notions such as “forgetting,” which Skinner reframes as the displacement of one response by an incompatible alternative.


Lastly, he introduces chaining, where one response produces variables that control subsequent responses. Chains may be loosely organized, as in wandering, or highly structured, as in problem-solving or performing music. Chains can also alter motivational conditions, such as drinking to relieve thirst, and in humans may extend to behavior that modifies the strength of other behaviors, laying the foundation for advanced analysis.

Part 2 Analysis

Part 2 of Science and Human Behavior shifts from Skinner’s broad methodological framing in Part 1 to a more detailed exploration of the principles that underpin behavioral science. Here, Skinner defines key terms, demonstrates how conditioning works, and begins to push against what he regards as psychology’s “explanatory fictions.”


One of Skinner’s central moves in Part 2 is to challenge traditional ideas of free will and inner causes by reframing behavior in environmental terms, deepening his exploration of Behavior as a Product of Environmental Conditioning Rather Than Inner Will. An example is his reinterpretation of voluntary control: “We need not say that the sneezing must have been voluntary ‘because he could stop it when he wanted to.’ A more acceptable translation reads, ‘He stopped sneezing when variables were introduced which strengthened competing behavior’” (115). Here, Skinner employs rhetorical reframing to show how everyday explanations rooted in will or desire can be replaced with a scientific vocabulary grounded in observable contingencies.


Similarly, he critiques what he considers the explanatory fiction of “instinct,” arguing, “The concept of ‘instinct’ has been used to account for them […] This is a flagrant example of an explanatory fiction” (157). By labeling instinct as a placeholder rather than a true explanation, Skinner redirects attention to the measurable variables—deprivation, reinforcement, extinction—that he believes better account for behavioral patterns. This insistence on environmental causation aligns with his overarching claim that behavior is not the product of inner essence, but of history and context.


While Part 2 remains largely focused on the mechanics of conditioning, Skinner begins to move toward ethical questions, once more gesturing towards The Ethical Implications of Control and Reinforcement. For instance, he notes, “Whether these improvements should be permitted is a matter to be discussed later” (106, emphasis added). This aside acknowledges the moral weight of behavioral technology, particularly reinforcement schedules, which can shape behavior with extraordinary precision. Such remarks foreshadow later debates about the limits of control, inviting readers to consider whether every possible manipulation ought to be applied.


Punishment is a particularly fertile site for ethical reflection. Skinner argues that “punishment does not actually eliminate behavior from a repertoire, and its temporary achievement is obtained at tremendous cost” (190). Here, he highlights not only the practical inefficacy of punishment but also its social consequences. The passage illustrates how ethical concerns are not external to behavioral science but emerge from its findings: Punishment generates fear, conflict, and inefficiency, making it less humane and less effective than reinforcement-based alternatives.


Definitions, too, carry ethical weight. When Skinner redefines “attention” as a “controlling relation—the relation between a response and a discriminative stimulus” (124), he strips the concept of moralistic overtones and reframes it as a neutral tool of analysis. This kind of linguistic precision forces a reconsideration of how power operates in relationships such as parent-child or teacher-student. By showing that attention functions as reinforcement, Skinner encourages a more self-conscious and responsible deployment of it.


Throughout Part 2, Skinner hints at the wider applications of behavioral principles, raising the specter of The Potential for Social Engineering Through Behavioral Science. Early in the section, he observes, “machines have become more lifelike, and living organisms have been found to be more like machines” (46). This simile collapses boundaries between natural and mechanical systems, preparing the ground for his vision of society as a system that can be engineered through the same principles of conditioning.


The second section of Science and Human Behavior is where Skinner lays the foundation for understanding behavior as lawful, predictable, and manipulable. By arguing that behavior is environmentally conditioned rather than freely chosen, by exposing the ethical stakes of reinforcement and punishment, and by hinting at the possibility of large-scale social engineering, Skinner prepares the reader for the practical and philosophical debates to come.

