
Cannibalism
A Perfectly Natural History
Categories
Nonfiction, Psychology, Science, History, Animals, Nature, Anthropology, Audiobook, Biology, Natural History
Content Type
Book
Binding
Hardcover
Year
2017
Publisher
Algonquin Books
Language
English
ISBN13
9781616204624
Cannibalism Book Summary
Introduction
When most people hear the word "cannibalism," they likely picture horrific scenes from horror movies or sensationalized accounts of isolated tribal practices. Yet the reality of consuming one's own species is far more complex, widespread, and scientifically fascinating than popular culture suggests. Cannibalism occurs naturally in more than 1,500 animal species, from insects to mammals, serving various evolutionary functions that help individuals and populations survive. Even in human societies, the practice has appeared in contexts ranging from survival emergencies to medicinal treatments that were once mainstream in European pharmacies.

This exploration into cannibalism reveals surprising truths that challenge our assumptions about nature and human behavior. You'll discover how consuming conspecifics can actually be an adaptive strategy rather than a pathological behavior, why certain species have evolved specific cannibalistic adaptations, and how Western civilization maintained a remarkable double standard—condemning cannibalism among indigenous peoples while simultaneously consuming human-derived medicines. By examining this taboo subject through a scientific lens, we gain insights not just about an unusual behavior, but about resource management, cultural hypocrisy, and the extraordinary lengths to which all species will go to ensure their survival.
Chapter 1: The Natural Prevalence of Cannibalism in the Animal Kingdom
Cannibalism, far from being an aberration in nature, occurs in over 1,500 species across virtually every major animal group. This widespread distribution suggests that consuming members of one's own species represents a normal ecological strategy rather than deviant behavior. Among insects, female praying mantises famously consume their mates after copulation, providing nutrients that help produce healthy offspring carrying the male's genes. Similarly, certain spiders engage in sexual cannibalism, with males sometimes actively positioning themselves to be consumed during mating—a seemingly self-destructive behavior that actually increases reproductive success by allowing longer copulation.

The vertebrate world offers equally compelling examples of cannibalistic adaptations. Sand tiger sharks practice intrauterine cannibalism, where the first embryo to develop teeth consumes its siblings while still in the mother's womb. This ensures that only the strongest offspring survives, emerging as an experienced predator from birth. Amphibians like spadefoot toads can transform into cannibalistic morphs with specialized jaw structures when environmental conditions deteriorate, allowing them to grow faster and metamorphose sooner to escape drying ponds. Even typically social animals like chimpanzees occasionally practice infanticide and cannibalism, particularly against infants from neighboring communities.

Scientists have identified several evolutionary benefits that explain why cannibalism persists despite its apparent costs. First, it provides nutrition during food shortages, converting competitors into calories. Second, it eliminates rivals for limited resources like food, territory, or mates. Third, it can remove weak or injured individuals from the population, potentially improving overall genetic fitness. Fourth, in some species, it increases reproductive success through mechanisms like sexual cannibalism or filial cannibalism (consuming one's own offspring to reclaim energy for future reproduction).

The prevalence of cannibalism often fluctuates with environmental conditions, revealing its role as an adaptive response to ecological pressure. Under normal circumstances, many species avoid consuming conspecifics due to risks like disease transmission, injury from fighting similarly sized prey, or reducing inclusive fitness by eating relatives. However, when resources become scarce or population densities reach critical thresholds, these costs are outweighed by the benefits of obtaining nutrition and eliminating competition. This pattern appears consistently across diverse taxonomic groups, from insects to mammals.

Cannibalism in nature follows predictable patterns that help scientists understand its ecological function. It occurs more frequently in species with high reproductive rates, significant size differences between age classes, and unstable environments. Many cannibalistic species have evolved specific adaptations to facilitate the behavior, such as specialized morphology (like the enlarged jaws of certain salamander larvae) or behavioral mechanisms to recognize and avoid consuming close relatives. These adaptations highlight how natural selection has shaped cannibalism as a legitimate ecological strategy rather than a random occurrence.

Understanding the natural prevalence of cannibalism challenges our human tendency to view nature through a moralistic lens. What appears horrific from a human perspective often represents a practical solution to ecological challenges from an evolutionary standpoint. By recognizing cannibalism as a normal part of nature's repertoire, we gain insights into how resource limitations shape behavior and the complex strategies that organisms employ to maximize their survival and reproductive success in challenging environments.
Chapter 2: Ecological Drivers: When Resources Determine Who Eats Whom
Resource scarcity stands as the primary ecological driver of cannibalism across the animal kingdom. When essential resources like food, water, or space become limited, the benefits of consuming conspecifics often outweigh the costs. During droughts, certain amphibian species develop cannibalistic morphs with enlarged heads and specialized teeth specifically adapted for consuming smaller members of their own species. This transformation allows some individuals to survive when typical food sources disappear, essentially converting competition into sustenance. Similarly, in crowded aquarium conditions, many fish species begin consuming their own young when other food sources are depleted, a behavior rarely observed in uncrowded natural habitats.

Population density interacts with resource availability to trigger cannibalistic episodes in numerous species. The relationship between these factors is dramatically illustrated by Mormon crickets, which form massive migratory swarms when population densities reach critical thresholds. These swarms become moving cannibalistic fronts, with each cricket pursuing the one ahead while simultaneously being pursued from behind. Individuals that slow down or become injured are quickly consumed by their fellows. Research has revealed that protein and salt deficiencies drive this behavior, with each cricket seeking these nutrients from the nearest available source—other crickets. This creates a macabre "forced march" driven by nutritional needs rather than predator avoidance as previously thought.

Environmental unpredictability selects for cannibalistic traits, especially in species inhabiting temporary habitats. Spadefoot toad tadpoles develop into either omnivorous or cannibalistic forms depending on pond conditions. When rainfall is sparse and ponds are likely to dry quickly, the cannibalistic morph provides a faster growth rate, allowing metamorphosis before the habitat disappears completely. This represents a form of phenotypic plasticity—the ability of a single genotype to produce different physical characteristics in response to environmental conditions. Similar adaptations appear in many insects, amphibians, and fish that inhabit ephemeral environments, suggesting that cannibalism often functions as an evolutionary response to unpredictability.

Cannibalism can serve as a population regulation mechanism, preventing boom-and-bust cycles that might otherwise lead to ecosystem collapse. In crowded conditions, many species increase their cannibalistic behavior, effectively reducing population density to sustainable levels. This self-regulating function appears in diverse taxa from insects to mammals. For instance, in captive situations, chickens confined in overcrowded conditions may begin pecking each other to death and consuming the victims, a behavior so common that the poultry industry developed specialized "chicken glasses" with red lenses to prevent birds from seeing blood that might trigger cannibalistic frenzies. In nature, such density-dependent cannibalism helps maintain populations within the carrying capacity of their environments.

The ecological benefits of cannibalism must outweigh its costs for the behavior to persist. Potential costs include disease transmission (consuming conspecifics increases the risk of contracting species-specific pathogens), reduced inclusive fitness (eating relatives removes shared genes from the population), and injury risk (attempting to consume similarly sized conspecifics can be dangerous). Species that regularly practice cannibalism have evolved mechanisms to mitigate these costs, such as the ability to recognize and avoid consuming close relatives, as seen in many cannibalistic amphibians that preferentially eat non-siblings. These adaptations suggest that cannibalism has been subject to natural selection over evolutionary time.

Understanding the ecological drivers of cannibalism provides insights into how resource limitations shape behavior across the animal kingdom. Rather than representing aberrant behavior, cannibalism emerges as a predictable response to specific environmental conditions—a strategy that allows individuals and populations to survive when resources become scarce or unpredictable. This ecological perspective helps explain why cannibalism has evolved independently in so many different lineages and why it persists despite its apparent costs. For many species, consuming conspecifics represents not a failure of natural systems but an adaptation to their inherent constraints.
Chapter 3: Human Survival Cannibalism: The Breaking Point of Taboos
Human survival cannibalism emerges in situations of extreme deprivation where normal food sources have been exhausted and death by starvation becomes imminent. Unlike ritual or cultural cannibalism, survival cannibalism represents a last-resort response to dire circumstances rather than a preferred practice. Historical records document numerous cases occurring during famines, sieges, shipwrecks, and expeditions gone wrong. The Donner Party tragedy of 1846-1847, where pioneers trapped in the Sierra Nevada mountains by winter storms resorted to consuming the dead, stands as perhaps the most infamous American example. Similar situations occurred during the Siege of Leningrad in World War II, where approximately 2,000 people were arrested for cannibalism during the 872-day German blockade that caused massive starvation.

The physiological progression of starvation follows a predictable pattern that helps explain why people eventually resort to cannibalism. As the body depletes its reserves, it first burns through carbohydrates stored in the liver and muscles, then moves on to fat reserves, and finally begins breaking down proteins from muscles and organs. This internal self-consumption mirrors the external cannibalism that may follow. Physical symptoms progress from lethargy and weakness to the characteristic "mask of famine"—sunken cheeks, protruding bones, and an apathetic stare. Cognitive function deteriorates, making complex moral decisions increasingly difficult as the brain itself suffers from inadequate nutrition. This physiological decline helps explain how normally taboo behaviors become acceptable when survival is at stake.

Social dynamics during starvation crises follow recognizable patterns that anthropologists have documented across different cultures and time periods. Initially, groups facing food shortages often show increased cooperation and resource sharing. As starvation progresses, however, social bonds begin to fray, with groups fragmenting along family lines and altruistic behavior declining. In the terminal phase, social order may collapse entirely, with individuals focusing solely on personal survival. This progression helps explain how a taboo like cannibalism can eventually become acceptable within a starving group. Survivors of the 1972 Andes flight disaster, where members of a Uruguayan rugby team consumed deceased passengers after their plane crashed high in the mountains, reported that their initial horror at the idea gradually gave way to a pragmatic acceptance as starvation advanced.

The psychological aftermath for survivors of cannibalism episodes reveals complex responses. Many experience profound guilt, shame, and trauma that persists long after the crisis has passed. Others develop psychological mechanisms to rationalize their actions, often framing the consumption of the dead as a way of honoring them or ensuring their sacrifice was not in vain. The Uruguayan rugby team survivors initially concealed their actions but later acknowledged them publicly, justifying their decision through what became known as the "communion defense," likening their consumption of the dead to the Catholic sacrament where believers symbolically consume Christ's body. This religious framing helped them reconcile their actions with their moral framework.

Gender and family dynamics influence survival outcomes in cannibalism situations in surprising ways. Women typically outlast men in starvation conditions due to physiological advantages: they metabolize protein more slowly, require fewer daily calories, and possess greater fat reserves that provide both nutrition and insulation. Family members fare better than individuals, as social bonds provide emotional support that helps mitigate stress hormones like cortisol, which can impair immune function and healing. These patterns appear consistently across historical cases, from Arctic expeditions to concentration camps, suggesting biological rather than cultural foundations.

From an evolutionary perspective, survival cannibalism can be understood as an adaptive response that increases individual fitness in life-threatening situations. The strong taboo against cannibalism in most human societies likely evolved because, under normal circumstances, the risks of consuming human flesh (including disease transmission and social conflict) outweigh the benefits. However, when starvation becomes the greater immediate threat, this cost-benefit calculation shifts dramatically. This evolutionary framework helps explain why survival cannibalism emerges so predictably across different cultures and historical periods when humans face extreme food scarcity, despite being universally taboo under normal conditions.
Chapter 4: Kuru and Prions: The Deadly Disease Spread Through Ritual Consumption
Kuru, a fatal neurodegenerative disease that devastated the Fore people of Papua New Guinea, represents the most direct link between cannibalism and human disease transmission. First documented by Western researchers in the 1950s, kuru manifested through progressive symptoms including tremors, loss of coordination, uncontrolled emotional outbursts (including pathological laughter that led to its nickname "the laughing death"), and eventually total incapacitation and death. The disease predominantly affected women and children, with very few adult men succumbing to it—a pattern that would prove crucial to understanding its transmission mechanism.

The breakthrough in understanding kuru came when researchers connected the disease to the Fore's funerary practices, which involved ritual consumption of deceased relatives. Anthropological investigations revealed that women and children primarily participated in these mortuary feasts, with women handling and preparing the bodies and sharing the meal with their children. Men rarely participated, explaining the skewed gender distribution of the disease. The practice was considered an act of respect and love, allowing the deceased to nourish their family rather than being consumed by worms. Most significantly, the Fore consumed all parts of the body, including the brain, which contained the infectious agent responsible for kuru.

The scientific understanding of kuru took a dramatic turn when Dr. Carleton Gajdusek successfully transmitted the disease to chimpanzees by injecting them with brain tissue from kuru victims. This experiment, which earned Gajdusek a Nobel Prize, proved kuru was transmissible but raised perplexing questions about the infectious agent. Unlike typical pathogens, the kuru agent produced no immune response, survived treatments that would destroy viruses and bacteria, and had an extraordinarily long incubation period—sometimes decades between exposure and symptom onset. These unusual properties challenged conventional understanding of infectious diseases and led to one of the most revolutionary discoveries in modern medicine.

The mystery of kuru's causative agent was eventually solved by Dr. Stanley Prusiner, who proposed a revolutionary concept: the prion. Unlike all other known infectious agents, prions contain no genetic material (DNA or RNA). Instead, they are misfolded proteins that can induce normal proteins to adopt their abnormal configuration, creating a cascade of protein misfolding that gradually destroys brain tissue. This explained why kuru-infected tissue triggered no immune response—the prions weren't foreign invaders but corrupted versions of the body's own proteins. Prusiner's prion theory, which initially faced significant skepticism from the scientific community, eventually earned him a Nobel Prize as evidence accumulated supporting this unprecedented mechanism of disease transmission.

Kuru belongs to a family of diseases called transmissible spongiform encephalopathies (TSEs), which includes Creutzfeldt-Jakob disease in humans, scrapie in sheep, and bovine spongiform encephalopathy (BSE, or "mad cow disease") in cattle. The connection between kuru and these other diseases became alarmingly relevant in the 1980s and 1990s when a variant of Creutzfeldt-Jakob disease emerged in the United Kingdom, linked to consumption of beef from cattle infected with BSE. This modern epidemic had striking parallels to kuru, as it involved prions jumping between species through consumption of infected neural tissue—essentially, a form of cross-species cannibalism facilitated by industrial food production practices that included feeding cattle protein supplements derived from other cattle.

The kuru story illustrates how cultural practices can interact with biological agents to create devastating disease outbreaks. When Australian authorities prohibited cannibalism among the Fore in the late 1950s, new cases of kuru gradually declined, though some individuals who had been exposed decades earlier continued to develop the disease into the early 2000s due to the extraordinarily long incubation period. The scientific investigation of kuru not only saved the Fore from extinction but revolutionized our understanding of infectious disease, demonstrating that proteins alone, without genetic material, can transmit deadly conditions. This discovery continues to influence research on more common neurodegenerative disorders like Alzheimer's and Parkinson's diseases, which also involve protein misfolding, though not necessarily through infectious transmission.
Chapter 5: Cultural Cannibalism: Ritual, Medicine, and Funeral Practices
Ritual cannibalism—the culturally sanctioned consumption of human flesh or body parts—has existed across diverse societies for reasons ranging from religious reverence to expressions of dominance. Anthropologists distinguish between endocannibalism (consuming deceased members of one's own community) and exocannibalism (consuming enemies or outsiders). Endocannibalism typically served as a form of mortuary ritual rather than a nutritional practice. The Wari' people of the western Amazon, for instance, consumed portions of their dead relatives until the 1960s. Anthropologist Beth Conklin documented how this practice represented an act of compassion and respect. By consuming the deceased, family members helped release the spirit from its physical form while literally incorporating the loved one into themselves. For the Wari', burying bodies in the ground was considered disrespectful and distressing, while cannibalism provided proper closure for grieving relatives.

Funerary cannibalism followed specific protocols that reflected complex cultural beliefs about death, the body, and spiritual transformation. Among the Fore people of Papua New Guinea, mortuary cannibalism involved elaborate preparation and distribution of the deceased's body. Different parts were allocated according to kinship relations and social status, with the brain—ironically the source of kuru infection—often reserved for women and children as a mark of respect. These practices were not haphazard but followed strict cultural rules that maintained social order during the traumatic period of bereavement. Far from being evidence of "savagery," as colonial observers often claimed, these rituals represented sophisticated systems for managing grief and reaffirming community bonds in the face of loss.

Warfare-related cannibalism served different cultural functions, often connected to beliefs about power, dominance, and the absorption of enemies' qualities. Some Amazonian groups reportedly consumed parts of slain enemies to acquire their strength, courage, or other desirable qualities. The Aztec practice of heart extraction and partial cannibalism of sacrificial victims was embedded within a complex cosmological system where human sacrifice was believed necessary to sustain the universe. However, many historical accounts of such practices were exaggerated or fabricated by colonial powers to justify conquest and subjugation. Christopher Columbus and subsequent European explorers frequently labeled indigenous peoples as "cannibals" to classify them as subhuman, thereby legitimizing enslavement and genocide. This pattern repeated across the Americas, Africa, and the Pacific islands, making it difficult to separate historical truth from colonial propaganda.

Medicinal cannibalism represents perhaps the most surprising cultural context, particularly because it was widely practiced in Europe until the early 20th century. European physicians routinely prescribed "mummy powder" (ground-up Egyptian mummies) for ailments ranging from headaches to stomach ulcers. Human blood, fat, and bone were standard ingredients in pharmacies across Europe. King Charles II of England reportedly drank a mixture called "King's Drops" containing human skull powder. This practice extended to China as well, where medical texts dating back thousands of years detailed the therapeutic uses of various human tissues. The Chinese practice of consuming placenta after childbirth, which has recently gained popularity in Western countries, represents a continuation of this tradition of medicinal cannibalism.

Religious contexts have also incorporated cannibalistic elements, though interpretations vary widely. The Christian Eucharist, in which believers consume bread and wine representing (or in some denominations, literally transformed into) the body and blood of Christ, has been interpreted by some scholars as a symbolic form of cannibalism. This ritual commemorates Jesus's instruction at the Last Supper to "take, eat; this is my body" and "drink from it, all of you; for this is my blood." The Catholic doctrine of transubstantiation holds that the bread and wine physically become Christ's body and blood, though they retain their sensory properties. This belief led to accusations that Christians practiced a form of ritual cannibalism, charges that early Christians vigorously denied. Ironically, Christians later used similar accusations against Jews, claiming they desecrated communion hosts and used Christian blood in rituals—false allegations that justified centuries of persecution.

Understanding ritual cannibalism requires suspending our cultural biases to recognize that practices we find abhorrent may have held profound meaning and purpose in other societies. These customs often reflected sophisticated belief systems about the relationship between the physical and spiritual worlds, the proper treatment of the dead, and the transmission of vital essence between individuals. Rather than indicating savagery, ritual cannibalism typically occurred within highly structured cultural frameworks with specific rules governing who could participate and under what circumstances. This anthropological perspective helps us recognize the cultural relativity of taboos and the diverse ways humans have conceptualized death, the body, and community across different societies.
Chapter 6: Western Hypocrisy: Europe's Hidden History of Medicinal Cannibalism
Medicinal cannibalism, the consumption of human body parts for therapeutic purposes, was widely practiced in Europe from the Middle Ages through the 18th century. This surprising aspect of Western medical history has been largely erased from popular memory, creating the false impression that cannibalism was exclusively a practice of "primitive" non-European cultures. In reality, European physicians, apothecaries, and even royalty regularly prescribed and consumed remedies containing human blood, fat, bone, skull, and other tissues. These practices were not conducted in secret but openly documented in medical texts and pharmacopoeias of the era, including the influential works of Nicholas Culpeper and the Royal Pharmacopoeia of London.

The theoretical foundation for medicinal cannibalism in Europe stemmed largely from the work of Paracelsus (1493-1541), the influential Swiss physician who promoted the idea that "like cures like." According to this principle, consuming human body parts could heal corresponding parts in the living patient. This belief system was further supported by the doctrine of the four bodily humors, which suggested that consuming certain human tissues could restore proper humoral balance. These ideas gained widespread acceptance among the educated classes, with human-derived medicines prescribed by the most respected physicians of the day to patients of all social ranks, including European nobility and royalty.

The range of human-derived remedies was extensive and specific. Human skull, often ground into a powder called "cranium humanum," was prescribed for epilepsy and other neurological conditions. Human fat, obtained from executed criminals or battlefield casualties, was applied topically for joint pain and rheumatism. Blood was consumed fresh to treat epilepsy, while mummified human flesh (known as "mumia") was prescribed for everything from headaches to stomach ulcers. Even human skin, bones, and organs found their way into the European pharmacopoeia, each assigned specific therapeutic properties. These remedies were available at standard apothecaries and listed in official medical compendia alongside plant- and animal-based medicines.

Perhaps the most sought-after human-derived medicine was "mumia," originally referring to bitumen-preserved Egyptian mummies. As demand outstripped the limited supply of actual Egyptian mummies, European entrepreneurs created "artificial mummies" by preparing recently deceased bodies with herbs and spices. Medical texts provided detailed instructions for creating these substitutes, specifying that the ideal source was "the cadaver of a redheaded man, age 24, who had been hanged" and had died a violent death, as such bodies were believed to contain more potent vital energy. This macabre industry continued well into the 18th century, with "mumia" listed in pharmaceutical catalogs as late as 1908.

The social acceptance of medicinal cannibalism in Europe created a bizarre double standard in colonial encounters. Europeans expressed horror at indigenous practices that involved consuming human flesh, while simultaneously trafficking in human body parts for their own medicinal use. When Europeans encountered the medicinal use of human remains in other cultures, such as in China, they often failed to recognize the similarity to their own practices. This cognitive dissonance reveals how cultural framing can transform essentially identical behaviors into either acceptable medical practice or evidence of barbarism. The European distinction between "civilized" medicinal cannibalism and "savage" ritual cannibalism had more to do with cultural prejudice than with any substantive difference in the practices themselves.

The decline of medicinal cannibalism in Europe coincided with the Enlightenment and the rise of modern scientific medicine. As understanding of anatomy, physiology, and disease mechanisms improved, the theoretical justifications for consuming human tissues lost credibility. Changing attitudes toward the human body, increasing disgust sensitivity, and new concepts of hygiene also contributed to the practice's demise. By the early 19th century, medicinal cannibalism had largely disappeared from mainstream European medicine, though some folk remedies incorporating human materials persisted in rural areas. This history was subsequently minimized or omitted from medical histories, creating the false impression that Western culture had always maintained a strict taboo against all forms of cannibalism.

This hidden history of European medicinal cannibalism challenges the narrative of Western moral superiority that often framed colonial encounters. It reveals that the taboo against cannibalism has been selectively applied, with similar practices labeled as either medicine or savagery depending on who performed them and in what cultural context. Understanding this history helps us recognize how cultural biases shape our perception of practices across different societies and how easily we can overlook contradictions in our own cultural traditions while judging those of others.
Chapter 7: Modern Manifestations: From Placentophagy to Lab-Grown Human Tissue
Placentophagy, the consumption of the placenta after childbirth, represents the most widespread form of cannibalism in contemporary Western society. This practice has gained significant popularity in recent decades, particularly among middle-class women in the United States, Australia, and parts of Europe. Proponents claim numerous benefits, including reduced postpartum depression, increased energy, improved lactation, and faster recovery after childbirth. The placenta may be consumed raw, cooked in meals, blended into smoothies, or, more commonly, dehydrated, ground into powder, and encapsulated into pills that the mother takes in the weeks following birth. Celebrity endorsements from figures like Kim Kardashian and January Jones have helped mainstream this practice, which was virtually unknown in Western culture before the 1970s.

The scientific evidence for placentophagy's benefits remains limited and inconclusive. While almost all non-human mammals consume their placentas after giving birth, humans have no documented tradition of this practice in any culture studied by anthropologists. The current trend appears to have emerged in the 1970s as part of the natural childbirth movement and gained momentum through social media and celebrity endorsements in the 2000s. The few scientific studies conducted thus far have found little evidence supporting the claimed benefits, though many women report positive subjective experiences. From a biological perspective, the placenta does contain hormones and nutrients, but whether these survive preparation methods and digestion in meaningful amounts remains questionable.

The ethical dimensions of placentophagy highlight interesting contradictions in our attitudes toward cannibalism. Because the placenta develops from the fertilized egg (making it genetically identical to the baby) and not from the mother's tissue, consuming it technically constitutes cannibalism. However, most people who practice placentophagy do not conceptualize it as such, instead viewing the placenta as a personal organ or even as a form of self-consumption. This cognitive framing allows the practice to avoid the cultural taboo associated with cannibalism, illustrating how context and intention shape ethical judgments about consuming human tissue. The fact that placentophagy has gained acceptance while other forms of cannibalism remain strongly taboo demonstrates the flexibility of seemingly rigid cultural prohibitions.

Beyond placentophagy, modern medicine continues to utilize human tissues in ways that blur the boundaries of cannibalism. Organ transplantation involves placing one person's tissues inside another's body, while blood transfusions transfer bodily fluids between individuals. More controversially, some medical treatments involve direct consumption of human-derived materials. For instance, some stem cell therapies use materials derived from human embryos or umbilical cords. While these practices are not typically framed as cannibalistic, they involve the therapeutic use of human tissues by other humans, conceptually linking them to historical medicinal cannibalism. The key distinction lies in purpose (healing rather than nutrition) and the absence of oral consumption, allowing us to maintain the cannibalism taboo while embracing the life-saving benefits of tissue transfer.

The development of laboratory-grown human tissues for consumption represents an emerging ethical frontier. Scientists have successfully created cultured meat using animal cells, and the same technology could theoretically produce meat from human cells. This raises profound questions: Would consuming lab-grown human meat constitute cannibalism? Would it carry the same ethical implications as consuming tissue from a deceased person? Some philosophers argue that the taboo against cannibalism primarily concerns the harm and disrespect to the person whose body is consumed, issues that might not apply to cultured tissue with no connection to a specific individual. Others maintain that, regardless of source, consuming human flesh crosses a fundamental moral boundary. As biotechnology advances, these questions will likely move from theoretical to practical concerns.

These modern instances reveal that cannibalism exists on a spectrum rather than as a binary category. Contemporary Western society maintains strong taboos against consuming human flesh in most contexts while simultaneously allowing exceptions based on source (self versus other), intention (therapeutic versus nutritional), and preparation (processed versus recognizable). This inconsistency suggests that our aversion to cannibalism is not simply about the act of consuming human tissue but about the social meanings we attach to different forms of consumption. As biotechnology continues to advance, these boundaries will likely face further challenges, forcing us to reconsider what exactly makes cannibalism taboo and whether those reasons apply to emerging practices that incorporate human tissues in novel ways.
Summary
Cannibalism emerges not as an aberration but as a widespread adaptive behavior across the natural world, shaped by environmental pressures and evolutionary advantages. From spiders that consume their mates to gain reproductive benefits, to stressed rodent mothers eating their young when resources are scarce, to humans in desperate survival situations—the consumption of conspecifics follows predictable patterns tied to ecological conditions. Even cultural forms of cannibalism, whether funerary practices or medicinal consumption, reveal how human societies have integrated this behavior into meaningful systems. Perhaps most surprising is how the supposedly strict Western taboo against cannibalism has historically been more flexible than acknowledged, with European medicinal cannibalism persisting into the early 20th century and modern practices like placentophagy continuing today.

The study of cannibalism challenges us to reconsider fundamental assumptions about nature and human behavior. How do we reconcile our moral revulsion toward cannibalism with its natural occurrence throughout the animal kingdom? What does our selective memory about historical cannibalistic practices reveal about cultural bias and self-perception? As biotechnology advances, creating lab-grown human tissues and blurring the boundaries between therapeutic use and consumption, we face new ethical questions about what constitutes cannibalism and why we prohibit it. These questions invite us to look beyond simplistic moral judgments to understand cannibalism as a complex behavior that, while disturbing to contemplate, provides profound insights into how all species—humans included—respond to environmental pressures and the fundamental drive to survive.
Best Quote
“In all likelihood, the most significant of these is a heightened chance of acquiring harmful parasites or diseases from a conspecific. Both parasites and pathogens are often species-specific and many of them have evolved mechanisms to defeat their host’s immune defenses. As a result, predators that consume their own kind run a greater risk of picking up a disease or a parasite than do predators that feed solely on other species.” ― Bill Schutt, Cannibalism: A Perfectly Natural History
Review Summary
Strengths: The review highlights the book's sensitive handling of the topic of cannibalism, emphasizing the author's gentle approach due to the real-life impact on victims' relatives. It praises Schutt's decision to write from a scientific perspective rather than an entertaining one, and notes the inclusion of examples of cannibalism in nature as a positive aspect.
Overall Sentiment: The overall sentiment of the review is positive, appreciating the author's respectful and scientific treatment of a sensitive subject.
Key Takeaway: The book offers a thoughtful and scientifically grounded exploration of cannibalism, urging readers to approach the topic with care and sensitivity, and broadening the discussion beyond human instances to include natural examples.

Cannibalism
By Bill Schutt