
Think Again
The Power of Knowing What You Don't Know
Categories
Business, Nonfiction, Self Help, Psychology, Philosophy, Science, Leadership, Audiobook, Personal Development, Book Club
Content Type
Book
Binding
Hardcover
Year
2021
Publisher
Viking Books
Language
English
ASIN
1984878107
ISBN
1984878107
ISBN13
9781984878106
Think Again Book Summary
Synopsis
Introduction
When the Mann Gulch wildfire erupted in Montana in 1949, fifteen smokejumpers parachuted in to contain the blaze. As the fire suddenly changed direction and raced toward them, the team's foreman, Wagner Dodge, made an unexpected decision. Instead of trying to outrun the inferno, he stopped, lit a match, and set the grass in front of him on fire. Then he stepped into the burned area as the main fire swept around him. His bewildered crew thought he had lost his mind—why would anyone start another fire when fleeing from one? Tragically, most ignored his urgent calls to join him, and twelve perished. Dodge survived because in that critical moment, he was able to rethink conventional wisdom about wildfire escape tactics.

This ability to rethink—to question our beliefs, unlearn outdated knowledge, and embrace new perspectives—is at the heart of Adam Grant's exploration in "Think Again." In a rapidly changing world, our success depends less on how much we know and more on how well we can update that knowledge. Yet most of us pride ourselves on consistency and conviction, clinging to old views even when evidence suggests we should reconsider.

Through fascinating stories and compelling research, Grant reveals how we can cultivate the intellectual humility to know what we don't know, the doubt to question our own certainties, and the curiosity to explore new territory. Whether in our careers, relationships, or personal growth, rethinking isn't just a skill—it's a mindset that can help us navigate an increasingly complex world.
Chapter 1: Survival by Rethinking: The Mann Gulch Fire Story
The smoke was thickening around the firefighters as they raced uphill, desperate to escape the approaching wall of flame. Wagner Dodge, their foreman, suddenly realized they couldn't outrun the fire. In that moment of crisis, he did something none of them had been trained to do: he lit a match and burned a patch of grass, creating a safe zone where the main fire couldn't reach. "Up here!" he shouted to his crew, motioning toward his escape fire. But his men, confused by this counterintuitive action, continued running. Dodge survived by lying down in his burned area as the main fire passed around him; of the men who kept running, only two outran the flames to safety.

This tragic story from the 1949 Mann Gulch fire illustrates a profound truth about human thinking. When faced with danger, we typically revert to well-learned responses—in this case, running from fire rather than creating one. The smokejumpers couldn't rethink their assumptions quickly enough to save themselves. They were carrying heavy equipment as they fled, and even when Dodge ordered them to drop their tools, many hesitated. As organizational psychologist Karl Weick observed, "Dropping one's tools creates an existential crisis. Without my tools, who am I?"

The same cognitive inflexibility that doomed those firefighters appears in less dramatic but equally consequential ways in our everyday lives. We cling to outdated beliefs about our careers, relationships, and identities. We expect our squeaky brakes to keep working until they fail on the freeway. We believe the stock market will keep rising despite warnings of a bubble. We assume our marriage is fine despite growing emotional distance.

What makes rethinking so difficult is that it often requires us to admit we were wrong. Our beliefs become intertwined with our identities, making it painful to reconsider them. We'd rather preach our perspectives with conviction, prosecute alternative views, or politick for approval than adopt the mindset of a scientist—someone who forms hypotheses, tests them, and revises based on new evidence.

The story of Mann Gulch reminds us that sometimes the greatest danger isn't ignorance but the illusion of knowledge. The ability to unlearn, to question our assumptions, and to rethink our approaches isn't just valuable—it can be lifesaving. In our rapidly changing world, the skill of rethinking may be the most crucial tool in our mental toolkit.
Chapter 2: The Confidence Trap: Armchair Quarterbacks and Impostors
When Halla Tómasdóttir received an unexpected call in December 2015, her house roof had just collapsed under heavy snow and ice. But the call wasn't about that—someone had started a petition for her to run for the presidency of Iceland. Her immediate thought was, "Who am I to be president?" Despite her success as an entrepreneur who had guided her investment firm through Iceland's devastating 2008 financial crisis, Halla felt like an impostor. She had no political background and no experience in public service. The pit in her stomach was familiar—she'd felt it before giving piano recitals as a child, that sense of not being worthy of the honor.

Meanwhile, on the opposite end of the confidence spectrum stood Davíð Oddsson, who was also running for president. As Iceland's former prime minister and central bank governor, he had actually played a significant role in the country's economic collapse. Yet he announced his candidacy with supreme confidence, declaring, "My experience and knowledge, which is considerable, could go well with this office." Despite being identified by Time magazine as one of the twenty-five people to blame for the global financial crisis, Oddsson showed no self-doubt whatsoever.

These contrasting candidates illustrate what psychologists call the armchair quarterback syndrome and the impostor syndrome. Armchair quarterbacks like Oddsson overestimate their abilities—they're convinced they know more than the coaches on the sidelines. Those with impostor syndrome, like Halla, underestimate themselves, feeling they don't deserve their success. Research shows this pattern is common: in a meta-analysis of ninety-five studies, women typically underestimated their leadership abilities while men overestimated theirs.

The irony is that both extremes can prevent rethinking. The overconfident become blind to their weaknesses, while the underconfident may be too paralyzed to act. Psychologists David Dunning and Justin Kruger found that the least knowledgeable people are often the most confident—a phenomenon now known as the Dunning-Kruger effect. When we lack competence, we lack the metacognitive ability to recognize our incompetence.

What we need instead is confident humility—having faith in our capabilities while appreciating that we may not have the right solution or even be addressing the right problem. This mindset gives us enough doubt to reexamine our old knowledge and enough confidence to pursue new insights. As Halla eventually discovered, her impostor feelings could be fuel for growth: "I've learned to use it to my advantage. I actually thrive on the growth that comes from self-doubt." She ended up placing second in the election with 28 percent of the vote, trouncing the overconfident Oddsson, who finished fourth.

The sweet spot of confidence isn't about eliminating self-doubt—it's about transforming it from a paralyzing force into a motivating one. When we embrace our doubts rather than deny them, we open ourselves to the possibility of rethinking and growth.
Chapter 3: The Joy of Being Wrong: Embracing Fallibility
Daniel Kahneman, the Nobel Prize-winning psychologist, was attending a conference when he heard a speaker present findings that contradicted one of his own theories. Rather than becoming defensive, Kahneman's eyes lit up and a huge grin spread across his face. "That was wonderful," he said. "I was wrong!" Later, when asked about his unusual reaction, Kahneman explained that in his eighty-five years, no one had pointed out his enthusiasm for being wrong before. "Being wrong is the only way I feel sure I've learned anything," he said.

This joy in discovering error stands in stark contrast to how most of us react when our beliefs are challenged. In a famous Harvard study, students were invited to write out their personal philosophies, only to have them aggressively attacked by law students working with the researchers. While many participants found this experience agonizing, others were surprisingly enthusiastic. One described it as "highly agreeable," and another even called it "fun." What made these students different wasn't their intelligence or knowledge, but their relationship to being wrong.

For most of us, being wrong feels threatening. When our core beliefs are challenged, it triggers the amygdala, activating a fight-or-flight response. What psychologists call the "totalitarian ego" steps in like a dictator, controlling the flow of information to protect our self-image. We preach our convictions more deeply, prosecute opposing views more passionately, and politick harder for our side. This is why, when presented with evidence that contradicts our views, we often double down rather than reconsider.

Jean-Pierre Beugoms, one of the world's best election forecasters, takes a different approach. In 2015, he predicted Donald Trump had a 68% chance of winning the Republican nomination when most experts gave Trump only a 6% chance. What made Beugoms so accurate wasn't just his analysis, but his willingness to update his beliefs. "Accept the fact that you're going to be wrong," he advises. "Try to disprove yourself. When you're wrong, it's not something to be depressed about. Say, 'Hey, I discovered something!'"

To find this joy in being wrong, we need to detach our opinions from our identity. As Kahneman put it, "My attachment to my ideas is provisional. There's no unconditional love for them." When our sense of self isn't tied to our current beliefs, we can update them without feeling threatened. We can see our views as hypotheses to be tested rather than treasures to be guarded.

The paradox is that the most confident thinkers are often the most willing to reconsider their views. They're secure enough to accept that they might be wrong now because they're confident in their ability to get closer to the truth later. As Jeff Bezos says, "People who are right a lot listen a lot, and they change their mind a lot. If you don't change your mind frequently, you're going to be wrong a lot."
Chapter 4: Constructive Conflict: How Arguments Propel Progress
The Wright brothers were inseparable. As the two youngest boys in their family, Wilbur and Orville did everything together. They launched a newspaper, built a printing press, opened a bicycle shop, and ultimately invented the first successful airplane. People often assumed their success came from their harmony and brotherly bond. Wilbur even said they "thought together." But this common perception misses something crucial about their relationship: they fought constantly.

"I like scrapping with Orv," Wilbur once reflected. Their arguments weren't signs of dysfunction—they were the engine of their innovation. When they tackled the seemingly impossible problem of human flight, their disagreements forced them to question assumptions, test alternatives, and refine their thinking. Their father, a bishop, had encouraged intellectual debate from an early age, including books by atheists in his library and inviting the children to challenge those ideas. The Wright brothers developed both the courage to fight for their ideas and the resilience to accept when they were wrong.

This pattern of productive conflict appears across domains of excellence. Psychologist Karen "Etty" Jehn distinguishes between two types of conflict: relationship conflict, which involves personal, emotional clashes filled with animosity, and task conflict, which centers on ideas and opinions. While relationship conflict typically harms performance, moderate task conflict can enhance creativity and decision-making. Studies show that teams with early task conflict generate more original ideas in Chinese technology companies, innovate more in Dutch delivery services, and make better decisions in American hospitals.

At Pixar Animation Studios, director Brad Bird deliberately recruited "pirates"—disagreeable misfits who challenged conventional thinking—for his film The Incredibles. When technical leaders said his vision would require ten years and $500 million, Bird gathered these rebels and told them no one believed they could succeed. The pirates rose to the occasion, finding innovative workarounds that allowed them to create Pixar's most complex film on a reasonable budget. The movie went on to gross over $631 million worldwide and win an Oscar.

What makes task conflict productive is that it focuses on ideas rather than personalities. The Wright brothers might have shouted for hours about propeller designs, but they weren't attacking each other's character. Brad Bird and his producer John Walker had legendary arguments about character details in The Incredibles, but as John explained, "I love working with Brad because he'll give me the bad news straight to my face. It's good that we disagree. It makes the stuff stronger."

The lesson is clear: progress often requires friction. As one research team concluded, "The absence of conflict is not harmony, it's apathy." By cultivating environments where ideas can be challenged without threatening relationships, we create the conditions for breakthrough thinking and continuous rethinking.
Chapter 5: Changing Minds: Debates, Persuasion, and the Art of Listening
Harish Natarajan is a world champion debater who has won three dozen international tournaments. In February 2019, he faced off against an unusual opponent—Project Debater, an artificial intelligence developed by IBM. The topic: whether preschools should be subsidized by the government. Going into the debate, 79% of the audience supported subsidies. After the debate, that number dropped to 62%, while opposition more than doubled from 13% to 30%. Despite Project Debater presenting more data and evidence, Harish had changed more minds.

How did he do it? Unlike most of us who approach debates as battles to be won, Harish treated it more like a dance. When he took the stage, he immediately acknowledged areas of agreement: "I think we disagree on far less than it may seem." He recognized the validity of some studies his opponent cited before raising his concerns about subsidies as a solution. Rather than overwhelming the audience with dozens of arguments, he focused on just two key points, developing them thoroughly. And instead of making declarations, he posed thoughtful questions that invited the audience to think for themselves.

This approach stands in stark contrast to how most of us try to persuade others. Grant learned this firsthand: when his former student Jamie called him for advice about business school, he bombarded her with rational arguments about why it was a waste of time and money. After listening to his lecture, she shot back: "You're a logic bully!" By overwhelming her with reasons, he had actually made her more resistant to changing her mind. The harder he attacked, the harder she fought back.

Research on expert negotiators reveals a better way. In a classic study, researchers found that skilled negotiators spent more than a third of their planning identifying potential common ground. During negotiations, they presented fewer reasons to support their case, avoiding the trap of diluting their strongest points with weaker ones. They rarely went on offense or defense, instead expressing curiosity through questions. Of every five comments they made, at least one ended in a question mark.

Michele Hansen embodied this approach when applying for a job she wasn't qualified for. Instead of hiding her shortcomings, she opened her cover letter with: "I'm probably not the candidate you've been envisioning. I don't have a decade of experience as a Product Manager nor am I a Certified Financial Planner." After acknowledging these limitations, she highlighted her strengths and got the job. By showing confident humility, she disarmed potential objections before they could be raised.

The most powerful tool for changing minds may be genuine listening. As psychologists have discovered, people are more likely to reconsider their views when they feel heard. When we listen deeply, we give others the space to reflect on their own thinking and potentially discover flaws in their reasoning. As the writer E.M. Forster put it, "How can I tell what I think till I see what I say?" By creating that opportunity for self-expression, we often help others persuade themselves.
Chapter 6: Breaking Stereotypes: From Rivalries to Reconciliation
Daryl Davis, a Black musician, was playing piano at a country music gig when an older white man approached him after the show. The man was astonished that a Black musician could play like Jerry Lee Lewis. When Daryl explained that Lewis himself had acknowledged being influenced by Black musicians, the man was skeptical. As they continued talking over drinks, the man admitted he'd never shared a drink with a Black person before—because he was a member of the Ku Klux Klan.

Most people would have walked away from this encounter, but Daryl was driven by a question that had puzzled him since childhood: "How can you hate me when you don't even know me?" This curiosity led him to engage with the Klansman rather than dismiss him. Over time, a friendship developed, and the man eventually left the KKK. Since then, Daryl has convinced over 200 white supremacists to abandon their robes and their hatred.

Our minds naturally form stereotypes—simplified beliefs about groups that often spill over into prejudice. Once these views calcify, they become remarkably difficult to change. When Boston Red Sox fans were asked what it would take for them to root for the New York Yankees, one fan replied, "If they were playing Al Qaeda... maybe." This intense animosity isn't unique to sports—we see it in political divisions, corporate rivalries, and cultural conflicts around the world.

To understand how to break these entrenched stereotypes, researchers conducted experiments with passionate baseball fans. Simply asking them to list positive qualities of their rivals didn't work. Neither did having them read humanizing stories about individual fans from the opposing team. What finally made a difference was asking them to reflect on the arbitrariness of their animosity: "If you were born into a family of fans of the rival team, you would likely also be a fan of them today." This approach—known as counterfactual thinking—helped fans realize that their hatred wasn't based on rational assessment but on an accident of birth. After this reflection, they became less hostile toward their rivals and more willing to engage with them. As one participant noted, "If someone hated me because of the team that I loved, it would feel unfair. Almost like a form of prejudice because they are judging me based on one thing about me and hating me for that reason."

Daryl Davis's approach embodies this principle. He doesn't try to convert white supremacists through preaching or condemnation. Instead, he listens to them, asks questions, and helps them see the arbitrariness of their beliefs. When one Klan leader claimed Black people were genetically predisposed to violence, Daryl pointed out that he himself had never shot anyone or stolen a car. When the man insisted Daryl's "criminal gene" must be latent, Daryl turned the argument around, listing famous white serial killers and suggesting the man's own "serial-killer gene" must be dormant.

Breaking stereotypes requires more than just exposure to different groups—it requires helping people recognize the arbitrary foundations of their beliefs and imagine how easily they could have held different views. As Daryl reflects, "We are living in space-age times, yet there are still so many of us thinking with stone-age minds. Our ideology needs to catch up to our technology."
Chapter 7: Building Cultures of Learning: Rethinking in Schools and Workplaces
When Luca Parmitano stepped out of the International Space Station for his second spacewalk in July 2013, he had no idea he was about to face a life-threatening crisis. Forty-four minutes into the mission, he felt water collecting in his helmet. As the liquid increased, it covered his eyes, filled his nostrils, and threatened to reach his mouth, where it could have drowned him. With his communications failing and vision obscured, Luca somehow managed to navigate back to the airlock by memory and the tension in his safety tether.

The investigation revealed that a week earlier, after his first spacewalk, Luca had noticed water droplets in his helmet. He and his colleagues assumed they came from his drinking bag, which they replaced. No one asked the critical question: "How do you know the drink bag leaked?" This normalization of a warning sign nearly cost an astronaut his life—and it wasn't NASA's first such failure. Similar patterns of overlooking red flags had contributed to both the Challenger and Columbia space shuttle disasters.

These incidents highlight the difference between a performance culture and a learning culture. In performance cultures, excellence of execution is valued above all else. People become attached to "best practices" and standard operating procedures. When something goes wrong, they focus on who's to blame rather than how to improve. In learning cultures, by contrast, people know what they don't know, doubt their existing practices, and stay curious about new approaches. They create psychological safety—an environment where people can take risks without fear of punishment.

Ellen Ochoa, who became deputy director of flight crew operations at NASA just before the Columbia disaster, recognized the need to transform NASA's culture. She started carrying a note card with questions like "What leads you to that assumption?" and "What are the uncertainties in your analysis?" These simple inquiries helped shift the organization from a performance mindset to a learning mindset. Years later, in a senior leadership role at the Johnson Space Center, she voted against a launch over sensor concerns despite overwhelming consensus from her team. In the emerging learning culture, "it's not just that we're encouraged to speak up," she explained. "It's our responsibility to speak up."

In classrooms, similar transformations are possible. Ron Berger, an elementary school teacher in rural Massachusetts, created a culture of excellence through continuous rethinking. Instead of teaching established knowledge, he presented students with problems to solve together. When he required them to do at least four drafts of their work, other teachers warned that young students would become discouraged. Instead, his students embraced the process, often asking to do eight or ten versions. "Quality means rethinking, reworking, and polishing," Ron explains. "They need to feel they will be celebrated, not ridiculed, for going back to the drawing board."

Building a culture of learning requires both psychological safety and accountability. Research shows that when psychological safety exists without accountability, people stay in their comfort zones. When there's accountability without safety, people remain silent out of anxiety. But combining the two creates a learning zone where people feel free to experiment and challenge each other's thinking. Organizations like Amazon implement process accountability by evaluating the quality of decision-making rather than just outcomes. They ask: Even if a decision led to success, was the process thoughtful? If it led to failure but followed a rigorous analysis, was it a valuable experiment?

The most powerful learning cultures don't just tolerate rethinking—they celebrate it. They recognize that in a complex, changing world, the ability to question assumptions, update knowledge, and revise approaches isn't just nice to have—it's essential for survival and growth.
Chapter 8: Escaping Tunnel Vision: Rethinking Life and Career Plans
When Grant's cousin Ryan was in kindergarten, he already knew exactly what he wanted to be when he grew up: a doctor. It wasn't just his dream—it was the family's dream. Their great-grandparents had emigrated from Russia and struggled to make ends meet. None of their children had gone to medical school, so the grandparents hoped the next generation would bring the prestige of a Dr. Grant to the family. Ryan, the ninth grandchild, seemed destined to fulfill that expectation.

Ryan diligently followed the prescribed path. He sailed through the premed curriculum, got accepted to an Ivy League medical school, and secured a neurosurgery residency at Yale. Yet at each step, he experienced flickers of doubt. In college, he expressed interest in economics but abandoned it to stay on track. In medical school, he considered a joint MD-MBA program but feared dividing his attention. During his residency, the grueling hours and intense focus took their toll. He felt that if he died, no one in the system would care or even notice. Still, he told himself he had gone too far to change course.

After completing his seven-year neurosurgery residency and an additional year of fellowship, Ryan finally became a neurosurgeon at a major medical center. Despite helping patients and enjoying aspects of his work, he remained unfulfilled. The long hours and bureaucratic frustrations drained his enthusiasm. In his mid-thirties, still in debt from student loans, he told Grant that if he could do it over, he would have chosen a different path.

Ryan's story illustrates what psychologists call identity foreclosure—settling prematurely on a sense of self without enough exploration. When we commit to a path too early, whether in careers, relationships, or life plans, we can become trapped by escalation of commitment. We sink more resources into our chosen direction even when it's not working, rationalizing our choices to protect our egos and validate our past decisions. Ironically, qualities like grit and determination, which are usually beneficial, can sometimes keep us locked in unfulfilling paths.

The problem often begins with that seemingly innocent question adults ask children: "What do you want to be when you grow up?" This query encourages a fixed mindset about work and identity. As Michelle Obama writes, "It's one of the most useless questions an adult can ask a child. As if growing up is finite. As if at some point you become something and that's the end." Instead of helping children identify as future doctors, artists, or engineers, we might better serve them by encouraging them to think about careers as actions to take rather than identities to claim.

For adults already on a particular path, periodic career checkups can prevent tunnel vision. Questions like "When did you form the aspirations you're currently pursuing, and how have you changed since then?" and "Have you reached a learning plateau in your role or workplace?" can activate rethinking cycles. Rather than planning our entire lives in advance, we might adopt E.L. Doctorow's approach to writing: "It's like driving at night in the fog. You can only see as far as your headlights, but you can make the whole trip that way."

This mindset extends beyond careers to all aspects of life. When pursuing happiness becomes our focus, research shows we often become less happy. We're better off seeking meaning through contribution and connection, letting joy arrive as a by-product. As John Stuart Mill observed, "Those only are happy who have their minds fixed on some object other than their own happiness; on the happiness of others, on the improvement of mankind, even on some art or pursuit, followed not as a means, but as itself an ideal end."
Summary
Think Again illuminates the transformative power of intellectual humility in a world that rewards certainty and conviction. Through stories ranging from smokejumpers in burning forests to astronauts in space, from debating champions to reformed white supremacists, Grant shows us that our greatest strength may be our willingness to recognize when we're wrong. The book's central insight is that rethinking is not a sign of weakness but the ultimate form of growth—it requires the courage to doubt what we know, the curiosity to explore what we don't, and the humility to recognize the difference.

The path to more effective rethinking begins with adopting a scientist's mindset—treating our beliefs as hypotheses to be tested rather than treasures to be protected. We can practice this by building challenge networks of people who question our thinking, embracing the joy of being wrong, and creating cultures where psychological safety and accountability coexist. In our personal lives, we might schedule regular checkups to reconsider our career paths and relationship patterns, asking not "What do I want to be?" but "How do I want to grow?"

As Grant reminds us, the most successful thinkers aren't those who are right all the time—they're those who are willing to admit when they're wrong and change their minds accordingly. In a world of constant change, the ability to rethink may be our most essential skill.
Best Quote
“If knowledge is power, knowing what we don’t know is wisdom.” ― Adam M. Grant, Think Again: The Power of Knowing What You Don't Know
Review Summary
Strengths: Reviewers praise the book for challenging them to reconsider their beliefs and open their minds to new perspectives, and for its emphasis on critical thinking and self-reflection. Weaknesses: Some reviews offer little detail about the book's content, structure, or writing style, making it hard to gauge its quality or relevance from them alone. Overall: Readers come away encouraged to question their assumptions and consider alternative viewpoints, and the book is recommended for those willing to engage with diverse perspectives.
Think Again
By Adam M. Grant