
Extra Life
A Short History of Living Longer
Categories
Nonfiction, Health, Science, History, Technology, Audiobook, Sociology, Medicine, Medical, Historical
Content Type
Book
Binding
Hardcover
Year
2021
Publisher
Riverhead Books
Language
English
ASIN
0525538852
ISBN
0525538852
ISBN13
9780525538851
Extra Life Summary
Introduction
Imagine standing on a battlefield after the First World War, surveying the carnage left by both combat and the devastating Spanish Flu pandemic that followed. Life expectancy had plummeted, wiping out decades of progress. If someone told you that within a century, humans would routinely live twice as long as they did in 1920, you might have dismissed it as fantasy. Yet this miraculous transformation occurred, giving us what author Steven Johnson calls an "extra life" - approximately 20,000 additional days that the average person now enjoys compared to their ancestors. This remarkable doubling of human life expectancy represents perhaps the most important development in modern history, yet we rarely celebrate or even acknowledge it. The stories behind this achievement aren't typically featured in history textbooks, which tend to focus on wars, political revolutions, and technological breakthroughs. The heroes of this narrative often worked in obscurity: statisticians who tracked mortality rates, engineers who designed sewer systems, physicians who developed vaccines, and public health officials who implemented safety regulations. Through examining these underappreciated turning points, the author helps us understand not just how we gained those extra decades of life, but also what this achievement teaches us about solving large-scale problems in our own time. This exploration speaks to anyone interested in public health or scientific progress, and to anyone simply curious about how modern life came to be so fundamentally different from that of our great-grandparents.
Chapter 1: The Great Pandemic of 1918 and Its Aftermath
The story of humanity's extra life begins, ironically, with a devastating event. In early March 1918, a mess cook named Albert Gitchell reported to the infirmary at Camp Funston, Kansas, with fever and muscle pain. Within days, hundreds of other soldiers fell ill with similar symptoms. This marked the beginning of what would become known as the Spanish Flu pandemic, which would eventually kill between 50 and 100 million people worldwide - far more than the First World War that coincided with it. The pandemic spread with frightening speed, carried by troops shipping out to the European front. By May 1918, it had reached Europe, India, China, and beyond. What made this influenza outbreak particularly terrifying was how it disproportionately killed young, healthy adults rather than the very young and elderly who typically succumb to seasonal flu. In the United States, life expectancy plunged by an entire decade virtually overnight. The army scientist Victor Vaughan speculated that if the epidemic continued at its rate of acceleration, "civilization could easily disappear from the face of the earth within a matter of a few more weeks." Yet despite this grim forecast, what followed was not societal collapse but an unprecedented century of health progress. The Spanish Flu marked the last major reversal in global life expectancy. From that point forward, human lifespans began an astonishing upward trajectory that continues to this day. Children born in England in 1920 could expect to live about 41 years; their descendants today enjoy life expectancies in the eighties. Developing nations, particularly China and India, have seen even more dramatic improvements in recent decades. 
This transformation was not the result of a single miracle cure or technological breakthrough, but rather what Johnson describes as an "invisible shield" constructed piece by piece over decades: chlorination of drinking water, vaccination campaigns, data-driven public health initiatives, food safety regulations, and countless other interventions. What makes this achievement particularly remarkable is how it unfolded largely outside the spotlight of public attention, through incremental improvements that took decades to reveal their true magnitude. The 1918 pandemic, devastating as it was, ultimately served as a turning point that galvanized medical research and public health infrastructure, setting the stage for the dramatic health improvements that followed.
Chapter 2: Vital Statistics: How Data Saved Lives
In the summer of 1866, a deadly cholera outbreak erupted in London's East End after a couple named Hedges died from the disease. Within weeks, the epidemic claimed nearly a thousand lives weekly. While this outbreak seemed like just another chapter in London's long history of disease, it actually marked a crucial turning point in humanity's battle against infectious disease - one driven not by miracle drugs but by the revolutionary power of data. At the center of this story stands William Farr, a doctor and statistician who transformed public health by developing systems to track disease outbreaks with unprecedented precision. Beginning in 1837 as a "Compiler of Abstracts" at England's newly created General Register Office, Farr pioneered what became known as "vital statistics" - the systematic collection and analysis of birth and death data. Rather than viewing each death as an isolated tragedy, Farr saw patterns in mortality that revealed crucial insights about public health. Farr's most influential innovation was his development of the life table, which broke down mortality rates by age group across different communities. In 1843, he published a groundbreaking study comparing life expectancy in metropolitan London, industrial Liverpool, and rural Surrey. The results painted a shocking picture: while rural Surrey residents enjoyed life expectancies approaching fifty years, industrial Liverpool's citizens died at an average age of just twenty-five - one of the lowest life expectancies ever recorded in a large human population. These comparisons made visible what had previously been invisible: the deadly toll of industrial urbanization. What made Farr's approach revolutionary was how it transformed the understanding of disease from something mysterious and inevitable to something that could be tracked, analyzed, and potentially prevented. He developed what became known as "Farr's Law," the first attempt to mathematically describe the rise and fall of epidemics.
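The core logic of Farr's life table can be sketched in a few lines of code. The age intervals and death probabilities below are invented for illustration - they are not Farr's 1843 figures - but the method, converting age-specific probabilities of dying into an expected lifespan, is the same one his tables embodied.

```python
# Sketch of the life-table calculation Farr pioneered: follow a cohort
# through successive age intervals, applying each interval's probability
# of dying, and total up the person-years lived.

def life_expectancy(qx, width):
    """qx: probability of dying within each age interval;
    width: length of each interval in years."""
    survivors = 1.0       # start with a full (normalized) cohort
    person_years = 0.0
    for q, w in zip(qx, width):
        deaths = survivors * q
        # Assume deaths fall, on average, halfway through the interval.
        person_years += w * (survivors - deaths / 2)
        survivors -= deaths
    return person_years

# Illustrative abridged intervals (0-1, 1-5, then ten-year bands) with
# made-up mortality rates roughly evocative of a 19th-century city:
qx    = [0.15, 0.10, 0.05, 0.10, 0.20, 0.35, 0.60, 1.0]
width = [1, 4, 10, 10, 10, 10, 10, 10]
print(round(life_expectancy(qx, width), 1))  # → 31.8 with these rates
```

Changing the infant-mortality entry alone shifts the result by years, which is exactly why Farr's comparisons between Surrey and Liverpool were so stark.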
This pioneering work laid the foundation for modern epidemiology and public health surveillance systems that continue to save millions of lives today. Farr's statistical approach worked in tandem with the famous investigations of physician John Snow, who used data mapping to identify a contaminated water pump as the source of a cholera outbreak in 1854. Together, these pioneers demonstrated that disease was not merely a matter of bad air or divine punishment, but followed discernible patterns that could be interrupted through targeted interventions. Their work led directly to the construction of London's revolutionary sewer system, designed by engineer Joseph Bazalgette, which effectively ended cholera's reign in the city. The legacy of these Victorian data pioneers extends far beyond their immediate achievements. Their methodologies created the template for modern public health systems worldwide, establishing the principle that measuring and analyzing health outcomes is itself a powerful intervention. Today's sophisticated disease surveillance networks, epidemiological models, and public health dashboards all descend from the humble mortality tables that Farr first compiled in the 1830s. As we've seen during recent pandemics, the ability to track and visualize disease outbreaks remains one of humanity's most powerful tools for saving lives.
Chapter 3: From Miracle Cures to Modern Medicine
For most of human history, medical treatments were more likely to harm than heal. In the early 1900s, respected pharmaceutical companies like Parke, Davis & Company proudly sold products containing cocaine, mercury, arsenic, and strychnine. As the medical historian William Rosen noted, "Virtually every page in the catalog of Parke, Davis medications included a compound as hazardous as dynamite, though far less useful." This era of "heroic medicine" involved treatments that seem barbaric today: bloodletting, mercury poisoning, and dangerous laxatives were standard care for everything from fevers to mental illness. The turning point came through two tragic events that transformed how drugs were regulated. In 1937, the S.E. Massengill Company rushed a liquid form of a sulfa drug to market, dissolving it in diethylene glycol - a toxic chemical similar to antifreeze. The resulting "Elixir Sulfanilamide" killed 105 people, mostly children, before it could be recalled. Yet remarkably, the government had no legal authority to prevent such tragedies; the FDA could only ensure that product labels correctly listed ingredients, not that those ingredients were safe. Public outrage over the sulfanilamide tragedy led to the 1938 Food, Drug, and Cosmetic Act, which finally empowered the FDA to evaluate drug safety. But a second crisis would prove equally transformative. In 1960, a new FDA medical reviewer named Frances Oldham Kelsey was assigned to evaluate thalidomide, a sedative widely used in Europe to treat morning sickness in pregnant women. Despite pressure from the manufacturer, Kelsey refused approval, insisting on more evidence of safety. Her stubbornness proved prescient when thalidomide was revealed to cause severe birth defects in thousands of children across Europe. This second crisis prompted the 1962 Kefauver-Harris Amendments, which required drug manufacturers to provide proof not just of safety but of efficacy - meaning drugs had to actually work as claimed. 
This seemingly obvious requirement had never before been mandated, largely because medicine lacked a reliable method for determining whether treatments were effective. The key innovation that made this possible was the randomized controlled trial (RCT), pioneered by statistician Austin Bradford Hill in the 1940s. Hill's methodology transformed medicine by providing a systematic way to separate effective treatments from placebos. His first landmark study in 1948 tested streptomycin for tuberculosis, establishing a template that would become the gold standard for medical research. Hill and his colleague Richard Doll then used similar methods to conduct the first major study linking smoking to lung cancer in 1950, research that eventually saved millions of lives despite fierce resistance from tobacco companies. The revolution in drug regulation and testing represented a fundamental shift in medicine's relationship with evidence. For the first time, treatments had to demonstrate their value through rigorous scientific testing rather than tradition, authority, or marketing claims. This transformation helped medicine finally break free from centuries of ineffective or harmful practices, establishing the foundation for the remarkable advances in treatment that would follow. While earlier public health measures like clean water and vaccines had already begun extending lifespans, the mid-20th century revolution in drug regulation and testing ensured that medicine itself would finally become a net positive force in human health.
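The logic of Hill's randomized design can be illustrated with a toy simulation. The recovery probabilities here are invented, and real trials add blinding and formal significance tests; the sketch shows only the central idea - random assignment makes the two arms comparable, so a difference in outcomes can be credited to the treatment.

```python
import random

# Toy randomized-trial simulation: each patient is randomly assigned to
# the control or treated arm, then "recovers" with that arm's (invented)
# probability. Randomization, not matching or judgment, balances the arms.

random.seed(42)

def run_trial(n_patients, p_control, p_treated):
    recoveries = {"control": 0, "treated": 0}
    counts = {"control": 0, "treated": 0}
    for _ in range(n_patients):
        arm = random.choice(["control", "treated"])  # the randomization step
        counts[arm] += 1
        p = p_treated if arm == "treated" else p_control
        if random.random() < p:
            recoveries[arm] += 1
    return {arm: recoveries[arm] / counts[arm] for arm in counts}

rates = run_trial(10_000, p_control=0.30, p_treated=0.55)
print(rates)  # treated recovery rate lands well above the control rate
```

With thousands of patients the observed gap between the arms reliably tracks the true effect, which is what let Hill's 1948 streptomycin trial separate a genuine cure from wishful thinking.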
Chapter 4: Public Health Innovations: Chlorination and Pasteurization
In May 1858, a progressive journalist in Brooklyn named Frank Leslie published a shocking exposé denouncing what he called "the wholesale slaughter of the innocents." His target wasn't mobsters or drug dealers, but milk producers who were selling what he termed "liquid poison" to unwitting families. This crusade against contaminated milk revealed one of humanity's most underappreciated life-saving innovations: the purification of everyday liquids through chemistry. The deadly danger of milk seems surprising today, but in mid-19th century American cities, contaminated milk was a major killer of children. The problem had worsened as cities expanded, pushing dairy farms further away. To maintain urban milk supplies, dairy producers established "swill milk" operations, keeping cows in squalid conditions and feeding them waste products from whiskey distilleries. This produced a bluish, unhealthy milk that was often adulterated with chalk or flour to improve its appearance. The result was catastrophic: in New York, more than half of all reported deaths were infants and young children, many from milk-borne diseases. Leslie's journalistic crusade, complete with Thomas Nast's shocking illustrations of diseased cows, eventually led to modest reforms. But the true transformation came through pasteurization - the process of heating milk to kill harmful bacteria, named after Louis Pasteur who developed the technique in the 1860s. Yet despite its scientific validity, pasteurization wasn't widely adopted until Nathan Straus, a department store magnate who had lost two children to disease, became its champion. Starting in 1892, Straus established "milk depots" throughout New York that sold pasteurized milk at below cost to poor families. He also funded a pasteurization plant at an orphanage where mortality rates plummeted after children began receiving sterilized milk. 
Parallel to the milk revolution, another chemical innovation was quietly transforming public health: water chlorination. Dr. John Leal, medical officer of Jersey City, secretly began adding small amounts of chlorine to the municipal water supply in 1908. Though he was nearly sent to prison for this unauthorized intervention, chlorination proved remarkably effective at killing waterborne pathogens without harming humans. The innovation spread rapidly to other cities, dramatically reducing deaths from cholera, typhoid, and other waterborne diseases. The combined impact of pasteurization and chlorination was astonishing. Between 1900 and 1930, infant mortality in the United States dropped by 62 percent. Harvard researchers David Cutler and Grant Miller determined that clean-water techniques like filtration and chlorination were responsible for more than two-thirds of this improvement. These chemical interventions transformed liquids that had once been major vectors of disease into safe, life-sustaining necessities. What makes these innovations particularly notable is how they emerged not primarily through laboratory breakthroughs but through persistent advocacy and public campaigns. Science provided the understanding, but implementation required champions like Frank Leslie, Nathan Straus, and John Leal who were willing to challenge convention and battle entrenched interests. Their stories remind us that major public health advances often depend as much on persuasion and political will as on scientific discovery itself.
Chapter 5: Safety Regulations: The Silent Protectors
On August 31, 1869, an Irish scientist and aristocrat named Mary Ward took a ride in an experimental steam-powered vehicle with her husband and cousin through the Irish countryside. While rounding a corner at less than four miles per hour, she was thrown from the carriage and crushed by its rear wheels, dying almost instantly. This tragic accident earned Ward a morbid distinction: she is believed to be the first person killed in a motor vehicle accident, marking the beginning of humanity's complicated relationship with machine-related death. As the Industrial Revolution accelerated, mechanical dangers proliferated. Railways were particularly deadly; in the United States of the 1880s, railway workers had a 1-in-117 chance of dying in an industrial accident each year. These statistics prompted the passage of the 1893 Safety Appliance Act, requiring railroads to install power brakes and automatic couplers - marking the first American law focused primarily on workplace safety. But this was merely the opening chapter in humanity's battle against the machines we ourselves created. The automobile would prove to be the deadliest machine of all. Since 1913, more than four million Americans have died in car accidents - three times the number killed in all American military conflicts combined. At its peak, automobile travel was astonishingly dangerous, with fifteen fatalities for every 100 million miles driven in 1935. Yet the automobile industry largely dismissed these deaths as inevitable consequences of physics, claiming that the forces in a crash were simply too great for the human body to survive. This fatalistic view was challenged by two remarkable innovators. Hugh DeHaven, a pilot who had miraculously survived a plane crash, conducted pioneering "egg drop" experiments in the 1930s, demonstrating that proper "packaging" could protect fragile objects - including humans - from tremendous impacts.
Colonel John Stapp took this research further, subjecting himself to extreme deceleration forces on rocket sleds to prove that the human body could withstand much greater g-forces than previously believed. In one famous 1954 test, Stapp decelerated from 632 miles per hour to zero in just 1.4 seconds, suffering only temporary vision loss and bruising. Despite this evidence that crashes could be survivable, most automakers resisted safety innovations. The breakthrough came from an unexpected source: Volvo hired aviation engineer Nils Bohlin, who in 1959 developed the three-point seat belt. This elegant design reduced fatalities by 75 percent, yet Volvo chose not to enforce the patent, making it freely available to all manufacturers. Even then, American car companies balked until consumer advocate Ralph Nader published his 1965 exposé "Unsafe at Any Speed," sparking public outrage that led to the 1966 National Traffic and Motor Vehicle Safety Act, which mandated seat belts in all new cars. The results of these safety innovations have been stunning. Today, the fatality rate per mile driven is less than one-tenth what it was in the 1930s. Similar improvements have occurred across all domains of mechanical safety, from industrial machinery to consumer products. These advances didn't emerge naturally from market forces; they required persistent advocacy, government regulation, and a fundamental shift in how we viewed machine-related deaths - not as inevitable tragedies but as preventable events. The story of safety regulation demonstrates that progress often requires challenging fatalistic assumptions. Just as public health pioneers refused to accept that cities were inevitably disease-ridden, safety advocates refused to accept that mechanical interactions had to be deadly. Their success reminds us that many "inevitable" risks in our world today might similarly yield to human ingenuity and social action.
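The figures from Stapp's 1954 run can be sanity-checked with basic kinematics. This computes only the average deceleration over the stop; the instantaneous peak Stapp endured was considerably higher.

```python
# Back-of-envelope check of the Stapp rocket-sled run: 632 mph to rest
# in 1.4 seconds, expressed as an average g-load.

MPH_TO_MS = 0.44704   # metres per second per mile per hour
G = 9.80665           # standard gravity, m/s^2

v0 = 632 * MPH_TO_MS          # initial speed, ~282.5 m/s
avg_decel = v0 / 1.4          # constant-deceleration assumption, m/s^2
print(round(avg_decel / G, 1))  # → 20.6 average g
```

Even the average works out to roughly twenty times gravity, far beyond what the fatalistic "physics says crashes are unsurvivable" view assumed a human could endure.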
Chapter 6: The Quest for Food Security and Nutrition
For most of human history, famine represented one of humanity's most relentless enemies. In May 1915, as World War I raged across Europe, a devastating famine began to take hold in Persia (modern-day Iran). Within two years, food prices had skyrocketed - wheat costing up to twenty dollars per bushel, an astronomical sum at the time. Eyewitness accounts described people eating dead animals and women abandoning their infants, unable to feed them. When the British major general L.C. Dunsterville arrived in Persia, he encountered a country on the verge of collapse, with bodies lying unburied in the streets. The Great Persian Famine of 1916-18 was just the beginning of a wave of catastrophic food shortages that swept across the world in the 1920s, killing upwards of fifty million people. These tragedies were not unusual by historical standards; famine had been a regular feature of agricultural societies for millennia. The Irish Potato Famine of the 1840s had killed approximately one-eighth of Ireland's population and forced another quarter to emigrate. Even in relatively prosperous 18th-century Europe, diets were comparable to those in the poorest countries of the late 20th century. The transformation began with a scientific breakthrough that emerged from an unlikely source: munitions manufacturing. In the early 1900s, a German chemist named Fritz Haber developed a process for synthesizing ammonia from atmospheric nitrogen. His colleague Carl Bosch then designed a system to produce this artificial nitrogen at industrial scale. While initially developed to produce explosives for the German war effort, the Haber-Bosch process had a world-changing secondary effect: it enabled the mass production of artificial fertilizer. For the first time in history, farmers could supplement soil fertility without relying on natural sources like manure or guano (bird droppings). The impact was revolutionary. 
Agricultural yields soared, allowing far more food to be produced on the same amount of land. This single innovation is estimated to have made it possible to feed an additional 4 billion people - roughly half the current world population. The transformation continued with the industrialization of meat production, particularly chicken. In the 1920s, Cecile Steele of Sussex County, Delaware, accidentally received 500 chicks instead of the 50 she had ordered. Rather than return them, she raised them for meat, pioneering what would become the modern broiler industry. By the 1950s, factory farming had revolutionized protein production through specialized feeds (often supplemented with antibiotics) and confined animal feeding operations. Today, there are approximately 23 billion chickens on earth - more than all other bird species combined - providing affordable protein that was unimaginable a century ago. The results of these agricultural revolutions have been astonishing. Famine deaths have plummeted from approximately 50 million between 1940 and 1980 to just 5 million in the four decades since, despite significant population growth. In developing countries, the prevalence of undernourishment has dropped from over one-third of the population in 1970 to about 10 percent today. This remarkable achievement came with significant environmental costs - from fertilizer runoff creating dead zones in oceans to greenhouse gas emissions from intensive agriculture - but it fundamentally transformed human health and society by ending the ancient scourge of mass starvation.
Chapter 7: Global Cooperation: Fighting Diseases Together
In 1851, an unusual gathering took place in Paris: physicians and diplomats from twelve European nations met to discuss ways to collaborate on public health issues. This modest meeting, known as the International Sanitary Conference, marked one of the first times in history that countries formally joined forces to combat disease. While initially focused on standardizing quarantine procedures for cholera, these conferences ultimately led to the formation of the Office International d'Hygiene Publique in 1907, the precursor to today's World Health Organization (WHO). This spirit of international cooperation would achieve its greatest triumph in the global campaign to eradicate smallpox, a disease that had plagued humanity for thousands of years. The effort began in earnest after a 1958 speech by Soviet physician Victor Zhdanov at a WHO meeting, where he called for member nations to commit to the ambitious goal of eliminating the virus entirely. Despite the Cold War tensions that dominated the era, the United States and Soviet Union found a way to collaborate on this monumental project. The eradication campaign faced enormous challenges. It required tracking down cases in remote regions, developing new vaccination techniques, and overcoming cultural barriers to immunization. A key breakthrough came when epidemiologist William Foege, working in Nigeria in 1966, developed what became known as "ring vaccination" - a targeted approach that focused on vaccinating people in a protective circle around outbreak zones rather than attempting to vaccinate entire populations. This strategy proved remarkably effective at containing the virus with limited vaccine supplies. On October 26, 1977, the last naturally occurring case of smallpox was recorded in Somalia. A few years later, on May 8, 1980, the World Health Assembly officially declared smallpox eradicated - the first and still only human disease ever completely eliminated from the wild. 
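Foege's ring strategy can be sketched abstractly as a graph traversal: from a known case, vaccinate everyone within a fixed number of contact "hops" rather than the whole population. The contact network and radius below are invented for illustration; real campaigns relied on field surveillance to discover these links.

```python
from collections import deque

# Breadth-first search out to `radius` hops from an index case: the set
# of people reached is the protective "ring" to vaccinate.

def ring_to_vaccinate(contacts, case, radius=2):
    ring, seen = set(), {case}
    frontier = deque([(case, 0)])
    while frontier:
        person, dist = frontier.popleft()
        if dist == radius:
            continue  # don't expand beyond the ring's edge
        for neighbor in contacts.get(person, []):
            if neighbor not in seen:
                seen.add(neighbor)
                ring.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return ring

contacts = {
    "case": ["a", "b"],
    "a": ["case", "c"],
    "b": ["case", "d"],
    "c": ["a", "e"],   # "e" is three hops out, beyond the ring
}
print(sorted(ring_to_vaccinate(contacts, "case")))  # → ['a', 'b', 'c', 'd']
```

The appeal of the approach is visible even in this toy: the ring covers everyone the virus can plausibly reach next while leaving distant individuals untouched, which is how scarce vaccine stocks were made to go so far.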
This achievement stands as perhaps humanity's greatest collaborative triumph, saving an estimated 200 million lives since eradication. The smallpox victory inspired similar global campaigns against other diseases. The Global Polio Eradication Initiative, launched in 1988, has reduced polio cases by more than 99.9 percent worldwide. The international response to the HIV/AIDS pandemic, though initially slow, eventually mobilized unprecedented resources through organizations like the Global Fund to Fight AIDS, Tuberculosis and Malaria, providing life-saving treatment to millions. More recently, the Medicines for Malaria Venture has coordinated efforts between pharmaceutical companies, academic institutions, and philanthropic organizations to develop new treatments for a disease that still kills over 400,000 people annually. These cooperative efforts extend beyond infectious diseases. The World Food Programme coordinates responses to food emergencies, preventing localized shortages from developing into full-blown famines. International health regulations help monitor and contain potential outbreaks before they become pandemics. Research consortiums share data and resources to accelerate medical breakthroughs. What makes these collaborative achievements particularly remarkable is how they transcend political, economic, and cultural differences. Nations that find themselves in conflict on virtually every other issue have nevertheless found ways to work together on health challenges that threaten all humanity. This tradition of cooperation, though often strained by nationalism and competing interests, represents one of our most valuable resources for addressing future threats, from emerging infectious diseases to the health impacts of climate change.
Summary
The doubling of human life expectancy over the past century represents perhaps the most significant improvement in the human condition ever achieved. This extraordinary feat was not the result of a single breakthrough or heroic genius, but rather the culmination of countless innovations working in concert: vaccines that trained our immune systems to fight deadly pathogens; data systems that revealed previously invisible patterns of disease; infrastructure that delivered clean water and removed waste; regulations that ensured food and drug safety; agricultural techniques that ended mass starvation. Each innovation built upon others, creating a virtuous cycle that transformed human health and society. The stories behind this transformation offer powerful lessons for our present challenges. First, many of humanity's greatest achievements have emerged not from market forces or individual brilliance alone, but from effective collaboration between scientists, activists, government officials, and ordinary citizens. Second, progress often requires challenging fatalistic assumptions about what problems are "inevitable" versus solvable. Third, our most successful interventions typically work by amplifying the body's natural defenses or designing environments that prevent harm, rather than heroic interventions after illness strikes. As we face new threats from emerging diseases, climate change, and technological risks, the history of our "extra life" reminds us of our remarkable capacity to overcome seemingly intractable problems through collective action guided by scientific understanding. The past century's achievements in extending human life were not inevitable; they were the result of deliberate choices and sustained effort - a pattern we would do well to remember and replicate as we confront the health challenges of our own time.
Best Quote
“That is often how new ideas come into the world: someone perceives a signal where others would instinctively perceive noise.” ― Steven Johnson, Extra Life: A Short History of Living Longer
Review Summary
Strengths: The review highlights the book's comprehensive exploration of the history and complexity behind life-extending innovations, emphasizing the involvement of a diverse range of contributors beyond popular narratives. It also appreciates the book's structured ranking of innovations based on their impact on saving lives.
Overall Sentiment: The sentiment appears to be positive, appreciating the depth and breadth of the book's analysis of life-extending innovations and their historical context.
Key Takeaway: The book underscores that significant life-extending innovations often faced initial resistance and were developed through the efforts of a wide array of contributors, including activists and academics, rather than solely by corporations or start-ups.
