
Why Everyone (Else) Is a Hypocrite
Evolution and the Modular Mind
Categories
Nonfiction, Psychology, Philosophy, Science, Economics, Buddhism, Sociology, Biology, Evolution, Neuroscience
Content Type
Book
Binding
Hardcover
Year
2011
Publisher
Princeton University Press
Language
English
ASIN
0691146748
ISBN
0691146748
ISBN13
9780691146744
Why Everyone (Else) Is a Hypocrite Book Summary
Introduction
Why do we often say one thing but do another? How can we hold contradictory beliefs without noticing the inconsistency? These questions point to a fundamental puzzle about human nature that traditional psychology struggles to explain. The modular mind theory offers a revolutionary framework for understanding these contradictions by proposing that our brains aren't unified entities but collections of specialized modules that evolved to solve different adaptive problems. This perspective transforms our understanding of human behavior by revealing that inconsistency isn't a flaw but an inevitable consequence of our mental architecture. Rather than viewing hypocrisy as a moral failing, we can recognize it as the natural result of different brain systems operating according to their own internal logic. The modular mind explains phenomena ranging from self-deception and moral hypocrisy to inconsistent preferences and strategic ignorance, providing a coherent framework for understanding the puzzling contradictions that characterize human psychology.
Chapter 1: The Fragmented Brain: Mental Modules and Inconsistency
The human mind is not a unified entity but rather a collection of specialized mental modules, each designed to perform specific functions. This modular architecture explains why we often hold contradictory beliefs and exhibit inconsistent behaviors without experiencing cognitive dissonance. Much like a Swiss Army knife with different tools for different purposes, our brains contain numerous specialized mechanisms that evolved to solve specific adaptive problems faced by our ancestors.

These mental modules operate according to their own internal logic, often independently of one another. This independence allows different parts of the mind to simultaneously hold contradictory information without reconciliation. The concept of "informational encapsulation" means that information processed in one module isn't necessarily available to others. Consider the Müller-Lyer illusion, where two lines of equal length appear different because of arrow-like figures at their ends. Even when we measure the lines and confirm they're identical, our visual system continues to perceive them as different lengths. One part of your brain knows the lines are equal, while another insists they differ.

Split-brain patients provide striking evidence of this modularity. When the corpus callosum connecting the brain's hemispheres is severed, the two sides can operate independently, sometimes with contradictory goals. The left hand might perform actions the right hand tries to stop, and when asked to explain these contradictory behaviors, patients confabulate plausible but incorrect explanations. While these cases are extreme, they reveal the fundamentally modular nature of all human minds.

The modular structure explains many everyday contradictions in human behavior. Consider why people lock their refrigerators at night - they're essentially preventing their future selves from acting against their current intentions. This seemingly irrational behavior makes perfect sense when we understand that different modules activate at different times. The evening module that values health and restraint locks the door to prevent the midnight module that values immediate gratification from accessing the cake.

This specialized architecture provides tremendous evolutionary advantages. Modules can solve specific problems efficiently without needing to integrate all available information. When our ancestors needed to quickly determine if a rustling bush concealed a predator, waiting to integrate all possible information could prove fatal. A fast, specialized threat-detection module that sometimes generates false positives was more adaptive than a slower, more accurate system. Understanding the modular mind helps explain puzzling human behaviors, from religious beliefs that contradict scientific knowledge to moral judgments that defy logical consistency.
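To make informational encapsulation concrete, here is a minimal toy sketch in Python (the module classes and numbers are invented for illustration; this is not a model from the book). It mirrors the Müller-Lyer case: the perception module keeps reporting unequal lengths even after the measurement module has recorded that the lines are identical, because neither module reads the other's state.

```python
# Toy illustration of informational encapsulation (hypothetical classes
# and numbers, not the book's model): each module keeps private state
# and answers queries using only that state.

class PerceptionModule:
    """Fast, automatic module: judges length from the arrow-shaped fins."""
    def judge(self, line):
        # Outward fins make a line look longer; this bias cannot be
        # corrected by knowledge stored in any other module.
        return line["length"] + (5 if line["fins"] == "outward" else -5)

class MeasurementModule:
    """Slow, deliberate module: stores the result of using a ruler."""
    def __init__(self):
        self.known_lengths = {}
    def measure(self, name, line):
        self.known_lengths[name] = line["length"]

line_a = {"length": 100, "fins": "outward"}
line_b = {"length": 100, "fins": "inward"}

perception, measurement = PerceptionModule(), MeasurementModule()
measurement.measure("a", line_a)  # the mind now "knows" both are 100 units
measurement.measure("b", line_b)

# Perception is unchanged by what measurement knows: the illusion persists.
print(perception.judge(line_a), perception.judge(line_b))  # 105 95
print(measurement.known_lengths)                           # {'a': 100, 'b': 100}
```

The point of the toy is only that contradiction requires no malfunction: each module behaves correctly given the information it can access.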
Chapter 2: The Press Secretary: Our Social Presentation System
The human mind contains a specialized module that functions much like a White House press secretary - it communicates with the outside world while carefully managing the information it presents. This "press secretary module" doesn't necessarily have access to all information in your brain; rather, it presents a carefully curated version of reality designed to portray you in the most favorable light possible to others. Unlike actual press secretaries who knowingly spin information, your mental press secretary genuinely believes the explanations it creates, even when they're inaccurate.

This module evolved because humans are intensely social creatures whose survival and reproductive success depended on maintaining a positive reputation. Throughout our evolutionary history, those who could effectively manage others' impressions of them gained significant advantages in securing allies, mates, and resources. The press secretary doesn't need to know why you actually did something; it only needs to construct a socially acceptable explanation that others will believe and that maintains your social standing.

The press secretary operates with limited information. It doesn't have access to the true motivations behind many of your actions, which are often determined by unconscious modules. When asked why you chose one potential partner over another, your press secretary might cite personality compatibility, when in fact unconscious modules were responding to fertility cues or status indicators. The press secretary isn't lying; it simply doesn't have access to these unconscious processes and must construct plausible explanations from available information.

Research demonstrates this phenomenon clearly in "moral dumbfounding" experiments. When presented with scenarios involving taboo violations that cause no harm - such as consensual incest between adult siblings using contraception - people typically condemn the behavior immediately. When asked to explain their judgment, they struggle to provide coherent reasons, often citing potential harms that were explicitly excluded in the scenario. Their emotional moral modules had reached a conclusion, but their press secretary module lacked access to the underlying reasoning, forcing it to confabulate plausible-sounding but ultimately unsatisfactory explanations.

In everyday life, the press secretary module manifests when we explain our choices and behaviors to others. When subjects in a study were asked to choose between identical pairs of pantyhose, they confidently explained their choice based on supposed quality differences, unaware that position alone influenced their decision. The press secretary had no access to the true cause of the choice but created a plausible narrative anyway. This isn't deception but rather a natural consequence of our modular mental architecture, where the system responsible for verbal explanation lacks access to the processes that actually drive our behavior.
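A toy sketch can make this architecture vivid. The code below is a hypothetical caricature (the function names, the position bias, and the menu of reasons are invented; the attributes loosely echo those reported in the pantyhose study): the module that drives the choice and the module that explains it never share state.

```python
# Toy confabulation sketch (hypothetical, not the book's model): the
# choice is driven purely by position, but the explainer module can only
# see its own store of plausible reasons, so it sincerely reports one.

import random

def choice_module(items):
    # Unconscious driver: a position bias -- pick the rightmost item.
    return len(items) - 1

def press_secretary(chosen_index):
    # Has no access to choice_module's logic; constructs a socially
    # acceptable reason from a menu of plausible-sounding attributes.
    reasons = ["better sheerness", "nicer knit", "higher elasticity"]
    return f"I chose item {chosen_index + 1} because of its {random.choice(reasons)}."

stockings = ["pair 1", "pair 2", "pair 3", "pair 4"]  # actually identical
picked = choice_module(stockings)
print(press_secretary(picked))
```

The explanation is sincere in the only sense available to the explainer: it is the best story constructible from the information it can reach.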
Chapter 3: Strategic Ignorance: When Not Knowing Benefits Us
Sometimes, not knowing certain information can be more advantageous than knowing it. This counterintuitive idea - strategic ignorance - explains many puzzling human behaviors. While knowledge generally helps us make better decisions, there are specific circumstances where ignorance serves a strategic purpose, particularly in social contexts where others' perceptions of what we know influence their behavior toward us. Strategic ignorance isn't about random gaps in knowledge but about systematically avoiding information that would create inconvenient obligations.

Strategic ignorance operates in numerous real-world scenarios. Consider crossing a busy street in Philadelphia - pedestrians sometimes deliberately avoid making eye contact with approaching drivers. Why? Because once a driver sees that you see them coming, they know you'll likely stay out of their way, giving them license to continue. By appearing oblivious, you force drivers to slow down, as they can't assume you'll yield. Similarly, military policies like "don't ask, don't tell" institutionalized strategic ignorance, allowing commanders to avoid knowledge that would require action.

Our brains appear designed to sometimes actively avoid acquiring certain types of information. In laboratory experiments, researchers found that many participants deliberately chose not to learn how their decisions might harm others when such knowledge would create a moral obligation. When given the option to know whether choosing a higher payoff for themselves would reduce another person's earnings, approximately half the participants declined to look at this information, allowing them to pursue self-interest while maintaining their moral self-image.

This strategic avoidance extends to medical information as well. People often resist testing for certain medical conditions when knowledge would create difficult choices or obligations. Someone who suspects they might have a sexually transmitted infection might avoid testing because confirming this knowledge would create a moral obligation to inform partners or abstain from sexual activity. The ignorance allows them to continue behaviors that knowledge would morally prohibit, while still maintaining their self-image as a responsible person.

The value of strategic ignorance is particularly evident in situations involving plausible deniability. People in authority positions often implicitly prefer not knowing about certain violations, as knowledge creates an obligation to respond. This explains why parents sometimes pretend not to notice minor infractions by their children, or why teachers might strategically avoid seeing a student discreetly doing a crossword puzzle during class. By maintaining ignorance, they preserve flexibility in their response options and avoid the social costs of either enforcing rules or explicitly permitting violations.
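The laboratory finding described above can be caricatured in a few lines of code. This is a hedged sketch with assumed payoffs and a stylized "guilt" cost (the experiment's actual parameters and the participants' utility functions are not given in the summary): it shows why a chooser who can switch guilt off by staying uninformed does better, by their own lights, by not looking.

```python
# A minimal sketch of strategic ignorance: the payoffs and the guilt
# term are assumptions for illustration, not the experiment's actual
# parameters.

GUILT = 3  # psychic cost of *knowingly* lowering the other person's payoff

# Two hidden states of the world. Option A always pays the chooser more;
# in the "conflict" state it also hurts the other person.
PAYOFFS = {
    "conflict":    {"A": (6, 1), "B": (5, 5)},  # (chooser, other)
    "no_conflict": {"A": (6, 5), "B": (5, 1)},
}

def utility(own, other, informed):
    # Guilt applies only if the chooser knows the choice shortchanged
    # the other person -- staying ignorant switches the guilt term off.
    guilt = GUILT if informed and other < 5 else 0
    return own - guilt

def best_choice(state, informed):
    return max("AB", key=lambda c: utility(*PAYOFFS[state][c], informed))

print(best_choice("conflict", informed=True))   # 'B' -- knowledge forces fairness
print(best_choice("conflict", informed=False))  # 'A' -- ignorance permits self-interest
```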
Chapter 4: Self-Deception: Psychological Propaganda for Social Advantage
Humans systematically maintain overly positive beliefs about themselves, their abilities, and their futures - a phenomenon that serves as a form of psychological propaganda. These "positive illusions" aren't random errors but strategically advantageous misrepresentations designed to convince others of our value as social partners, mates, and allies. The key insight is that if others believed these enhanced self-views, we would indeed be better off socially. Unlike conscious deception, self-deception eliminates the cognitive burden of maintaining two contradictory representations and the behavioral tells that might expose deliberate lies.

These strategic misrepresentations appear across numerous domains. Studies consistently show that people rate themselves as above average on virtually all desirable traits - from driving ability to intelligence to morality. In one famous study, college instructors were asked to rate their teaching abilities, and an astonishing 94% considered themselves above average, with 68% placing themselves in the top quarter. Similarly, hospitalized drivers who had recently caused serious accidents rated their driving skills just as highly as control subjects who hadn't been in accidents. These aren't simply errors in judgment but strategic representations designed to convince others of our competence.

Our psychological propaganda extends beyond self-assessment to beliefs about control and causality. People systematically overestimate their control over random events, as seen in gamblers who believe they can influence dice by throwing them harder for higher numbers. We also show systematic biases in attributing causality - taking credit for positive outcomes while blaming external factors for negative ones. When given false feedback about performance on tests, subjects attribute success to their abilities and failure to external factors like test difficulty, even when outcomes are entirely random.

Perhaps most striking is our unjustified optimism about the future. People consistently believe positive events are more likely to happen to them than to others, while negative events are less likely to affect them. College students estimate they're 35% more likely to travel to Europe and 58% less likely to develop drinking problems compared to their classmates. Even individuals with objectively high risk factors - such as people with numerous sexual partners attending STD clinics - believe their risk of infection is no higher than average. This optimism serves a strategic function: people who appear likely to succeed attract more social investment.

These enhanced self-views aren't conscious lies but genuinely held beliefs, maintained in modules designed for social presentation. The evidence suggests they're constrained by plausibility - people adjust their self-assessments downward when they know they'll have to justify them to others. This supports the idea that these beliefs function primarily as social tools rather than private comforts. By maintaining and projecting them, we increase our value in the social marketplace, even at the cost of accuracy in our self-understanding.
Chapter 5: Inconsistent Preferences: Why Choices Lack Coherence
The human mind doesn't contain a single, unified set of preferences that guide all decisions. Instead, different mental modules have different, often conflicting preferences that activate under different circumstances. This explains why we might prefer coffee in the morning but wine in the evening, or why we lock our refrigerator doors at night despite having "decided" not to eat cake after midnight. Our preferences aren't stable entities recorded in our heads but emergent properties of which modules happen to be active at decision time.

This modular view explains numerous puzzles in decision-making research. Studies show that preferences can reverse depending on how choices are presented. People will pay more for a lottery with a small chance of a large payoff than for one with a high chance of a small payoff, yet when choosing directly between the two, they select the latter. Similarly, when presented with identical options framed differently - saving 200 lives versus letting 400 die - people make contradictory choices. These aren't errors but reflections of different modules responding to different aspects of the decision context.

The conflict between patient and impatient modules explains many everyday struggles with self-control. Some modules are designed to pursue immediate rewards - high-calorie foods, sexual opportunities, comfort - while others prioritize long-term benefits like health, reputation, and resource accumulation. These modules have different "discount rates" - impatient modules value immediate rewards highly and future rewards minimally, while patient modules maintain the value of delayed rewards. Which set of modules controls behavior depends on context, current state, and recent history.

Context dramatically influences which modules control behavior. When hungry, food-seeking modules gain influence; when sexually aroused, modules related to sexual pursuit dominate decision-making. Research shows that men viewing attractive women become more financially impatient, preferring smaller immediate rewards over larger delayed ones. Similarly, men make riskier financial decisions (but not medical ones) when being evaluated by peers of equal status. These aren't general shifts in risk preference but specific activations of modules designed for particular adaptive challenges.

We can recognize these module conflicts in our everyday experience. Activities where we think "I really like to do X, but afterwards I wish I hadn't" (eating ice cream, staying up late) reflect impatient module control, while those where we think "I don't like to do X, but afterwards I'm glad I did" (exercising, studying) reflect patient module control. The struggle between these systems explains why we employ commitment devices - alarm clocks placed across the room, automatic savings plans, or websites that let us forfeit money if we fail to meet goals. These aren't irrational restrictions on future choices but rational strategies for allowing patient modules to influence behavior even when they're not active.
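The notion of module-specific "discount rates" can be made concrete with the hyperbolic discounting formula V = A/(1 + kD) that is standard in behavioral economics; the formula and the k values below are a textbook convention, not parameters from the book.

```python
# Sketch of competing "patient" and "impatient" valuation modules using
# the standard hyperbolic discount function V = A / (1 + k * D).
# The k values are illustrative, not empirical estimates.

def discounted_value(amount, delay_days, k):
    return amount / (1 + k * delay_days)

def choose(module_k, options):
    """Whichever module is currently active sets the effective k."""
    return max(options,
               key=lambda o: discounted_value(o["amount"], o["delay"], module_k))

options = [
    {"name": "cake now",     "amount": 80,  "delay": 0},
    {"name": "health later", "amount": 100, "delay": 30},
]

PATIENT_K, IMPATIENT_K = 0.001, 0.10

print(choose(PATIENT_K, options)["name"])    # health later: 100/1.03 ~ 97 > 80
print(choose(IMPATIENT_K, options)["name"])  # cake now: 100/4 = 25 < 80
```

On this toy reading, a commitment device simply arranges for the patient module's valuation to be the one in force at the moment the choice becomes binding.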
Chapter 6: Moral Hypocrisy: The Inevitable Contradiction
Moral hypocrisy - condemning behaviors in others that we engage in ourselves - emerges naturally from our modular mental architecture. Different mental modules handle moral judgment versus behavioral guidance, allowing us to sincerely condemn actions while engaging in them ourselves. Unlike an idealized moral agent who applies consistent principles to both judgment and behavior, humans possess separate systems that can operate according to different rules without communicating fully with each other.

The modular mind explains why we often apply stricter moral standards to others than to ourselves. When judging others, modules specialized for detecting cheating, free-riding, and norm violations activate strongly. These modules evolved to protect us from exploitation and maintain group cooperation. However, when guiding our own behavior, modules designed to pursue reproductive success, resource acquisition, and status often take precedence. This creates a situation where we can genuinely believe something is wrong while doing it anyway, without experiencing the cognitive dissonance we might expect.

Our moral modules also respond differently to abstract versus concrete scenarios. When considering hypothetical moral dilemmas or distant events, modules specialized for rule-based reasoning dominate. However, when facing immediate temptations or opportunities, modules designed for self-interest gain influence. This explains why people might sincerely advocate ethical principles in the abstract but fail to apply them in personal situations where following those principles would be costly. The disconnect isn't conscious hypocrisy but the result of different specialized systems operating in different contexts.

The press secretary module further complicates moral consistency by generating justifications for our behavior that preserve our moral self-image. When we act in ways that contradict our moral beliefs, this module creates explanations that frame our actions as exceptions to general rules: "I'm only doing this because of special circumstances," or "This situation is different because..." These explanations aren't conscious lies but genuine beliefs generated by modules designed to maintain our reputation and self-concept as moral individuals.

Research demonstrates these moral inconsistencies in numerous contexts. Studies show that people judge others more harshly than they judge themselves for identical moral violations, attribute others' immoral actions to character flaws while explaining their own as responses to situations, and hold others to higher standards of fairness in resource distribution. In economic games, participants who cheat for personal gain still condemn cheating in others with undiminished moral outrage. These aren't simple hypocrisies but reflections of different mental modules operating according to different adaptive logics - one set designed to protect us from exploitation by others, another set designed to advance our interests when opportunities arise.
Chapter 7: Living with Our Modular Architecture
Understanding the modular nature of our minds provides a powerful framework for making sense of human inconsistency and contradiction. Rather than viewing these inconsistencies as failures of rationality or character, we can recognize them as inevitable consequences of our evolved mental architecture. This perspective doesn't excuse hypocrisy but helps explain why it's so pervasive and difficult to eliminate, even among those who sincerely value consistency and integrity.

The modular mind has profound implications for self-understanding. The feeling of having a unified "self" that makes decisions and holds beliefs is largely an illusion created by our press secretary module. In reality, different mental systems activate in different contexts, creating the experience of internal conflict and inconsistency. Recognizing this can help us be more compassionate toward ourselves and others when we inevitably fall short of perfect consistency in beliefs, preferences, and behaviors. It also explains why simple willpower often fails to resolve internal conflicts - these aren't failures of a unitary self but competitions between specialized modules with different priorities.

This understanding offers practical strategies for managing our modular minds more effectively. By recognizing which contexts activate particular modules, we can structure our environments to support desired behaviors. If patient modules dominate in the evening but impatient modules take control at midnight, locking the refrigerator door isn't irrational but a sensible strategy for allowing the patient modules to influence behavior even when they're not active. Similarly, commitment devices like automatic savings plans or public pledges harness our modular architecture rather than fighting against it.

The modular perspective challenges fundamental assumptions in fields from economics to philosophy. Traditional economic models assume stable preferences and rational decision-making, but modularity explains why these assumptions often fail to predict behavior. Philosophical questions about belief, desire, and self-deception take on new dimensions when we recognize that different mental modules can simultaneously hold contradictory representations without reconciliation. Even our understanding of consciousness shifts when we see it not as the central executive but as one specialized system among many.

Perhaps most importantly, recognizing our modular nature can foster greater tolerance and understanding in social interactions. When we see that everyone struggles with internal contradictions - between what they know and what they feel, between what they believe and what they do - we can approach human inconsistency not as a moral failing but as an inevitable feature of our shared psychology. In this light, hypocrisy isn't something that afflicts everyone else while sparing us; it's the universal condition of minds built from specialized parts serving different functions.
Summary
The modular mind theory fundamentally transforms our understanding of human psychology by revealing that our brains consist not of a unified self but of numerous specialized modules that often work independently and sometimes at cross-purposes. This architecture explains why we simultaneously hold contradictory beliefs, why our preferences shift with context, and why we struggle with self-control and moral consistency. The press secretary module creates a coherent narrative about our thoughts and behaviors while lacking access to many of the mental processes that actually drive them, leading to the illusion of consistency where none exists.

By recognizing that our inconsistencies stem from our fundamental mental architecture rather than moral weakness, we can approach human behavior with greater compassion and design more effective strategies for personal and social improvement. The modular mind doesn't just explain why everyone is inconsistent - it reveals that inconsistency is an inevitable feature of minds evolved to solve diverse adaptive problems in a complex social world. Rather than fighting against this architecture, we can work with it by creating environments and institutions that account for these predictable patterns of inconsistency, ultimately leading to more realistic expectations and more effective approaches to human behavior.
Best Quote
“This might be one reason that politicians appear to be such hypocrites. My guess is that—and maybe I'm just naïve—politicians, despite appearances, aren't actually all that much more hypocritical than the rest of us. It's just that the rest of us skate by without anyone noticing.” ― Robert Kurzban, Why Everyone (Else) Is a Hypocrite: Evolution and the Modular Mind
Review Summary
Strengths: The review highlights the book's humor and intelligence, suggesting it is both entertaining and thought-provoking. The reviewer appreciates the book's exploration of modularity theory and its explanation of human hypocrisy.
Overall Sentiment: Enthusiastic
Key Takeaway: The book provides an insightful examination of human behavior through the lens of modularity theory, explaining how conflicting mental modules can lead to hypocritical actions. The reviewer finds the book enlightening and believes it is a valuable read for understanding human contradictions.