
In Pursuit of the Unknown

17 Equations That Changed the World

4.0 (3,248 ratings)
26-minute read | Text | 9 key ideas
Equations—those enigmatic strings of symbols—are the unsung heroes shaping the tapestry of our world, and Ian Stewart's "In Pursuit of the Unknown" reveals their hidden power. With the precision of a master storyteller, Stewart unravels the historical journeys of 17 pivotal equations, each a stepping stone in humanity's ascent. From the ancient wisdom of Pythagoras to the revolutionary insights of Einstein, Stewart breathes life into mathematical history, showing how these equations catalyzed leaps in technology, science, and philosophy. Imagine a world where logarithms unlocked the secrets of the cosmos or where Shannon's Information Theory became the backbone of digital communication. This book is not just a guide to mathematical marvels but a profound reminder of how the abstract becomes tangible, forever altering the course of human thought and achievement. Let Stewart be your guide to the hidden forces that have shaped our modern existence.

Categories

Nonfiction, Philosophy, Science, History, Physics, Mathematics, Engineering, Popular Science, History Of Science

Content Type

Book

Binding

Hardcover

Year

2012

Publisher

Basic Books

Language

English

ASIN

0465029736

ISBN

0465029736

ISBN13

9780465029730


In Pursuit of the Unknown Plot Summary

Introduction

Throughout history, certain mathematical breakthroughs have altered the course of human civilization as profoundly as any war, political revolution, or natural disaster. When Pythagoras discovered the relationship between mathematics and musical harmony in the 6th century BCE, he revealed something fundamental about reality itself—that the universe speaks in the language of mathematics. This revelation would eventually lead to Maxwell's electromagnetic equations that made modern telecommunications possible, and to Schrödinger's wave function that unlocked the quantum world. Each mathematical revolution began with abstract symbols on parchment, clay, or chalkboard, yet ended by transforming how we live, communicate, and understand our place in the cosmos. This historical journey explores how mathematical equations have repeatedly reshaped our world—from ancient geometry that enabled architectural marvels to calculus that powered the Industrial Revolution, from information theory that birthed the digital age to the algorithms now driving artificial intelligence. We'll meet the brilliant minds who, often working against the intellectual currents of their time, developed new mathematical languages to describe previously incomprehensible aspects of reality. For curious readers interested in how abstract ideas become world-changing technologies, this exploration reveals the hidden mathematical foundations beneath our modern civilization and offers glimpses of the revolutions still to come.

Chapter 1: Ancient Foundations: From Pythagoras to Euclid (500 BCE-300 BCE)

The first mathematical revolution emerged in the ancient Mediterranean world, where several civilizations began developing systematic approaches to numbers and geometry. Around 500 BCE, Pythagoras and his followers in Greece made a profound discovery: musical harmony corresponded to simple numerical ratios. When a string was divided in specific proportions, it produced pleasing sounds that formed the basis of musical scales. This revelation suggested something deeper—that mathematics wasn't merely a human invention but somehow reflected the underlying structure of reality itself. While the Pythagoreans explored these numerical harmonies, practical mathematics had already been developing for millennia in Egypt and Mesopotamia. Babylonian clay tablets from as early as 1800 BCE show sophisticated algebraic problem-solving techniques, including methods for solving quadratic equations. Egyptian papyri reveal formulas for calculating areas and volumes needed for construction and agriculture. These civilizations developed mathematics primarily as a practical tool, but the Greeks transformed it into something more profound—a system of logical reasoning based on proof. Euclid's Elements, compiled around 300 BCE in Alexandria, represented the culmination of early Greek mathematics. This monumental work organized geometric knowledge into a logical system derived from a small set of axioms and postulates. Euclid's approach—starting with self-evident truths and deriving complex results through rigorous proof—established a template for mathematical thinking that remains influential today. The Elements contained not just geometry but also number theory and a method for finding greatest common divisors that remains in use over two millennia later. While Greek mathematics excelled in geometry, other mathematical traditions developed different strengths. 
In India, mathematicians made crucial advances in the concept of zero and the place-value decimal system that would eventually transform calculation worldwide. Chinese scholars created sophisticated methods for solving systems of linear equations centuries before similar techniques appeared in Europe. During Europe's Dark Ages, Islamic mathematicians preserved Greek knowledge while making original contributions in algebra—the word itself comes from the Arabic "al-jabr" in al-Khwarizmi's influential 9th-century treatise. The transmission of mathematical knowledge across cultures often coincided with periods of commercial and intellectual exchange. The Hindu-Arabic numeral system reached Europe primarily through trading contacts with the Islamic world, with Leonardo of Pisa (Fibonacci) publishing an influential account in 1202. These numerals, combined with the place-value system, dramatically improved computational ability compared to the cumbersome Roman numerals previously used in Europe. By 1500, these mathematical tools—combined with recovered ancient texts and the new technology of printing—set the stage for an explosion of mathematical innovation during the Renaissance. This ancient mathematical foundation established core principles that remain essential today: the power of abstraction, the importance of proof, and the remarkable ability of mathematics to describe the physical world. From engineering to astronomy, commerce to music, mathematics had already demonstrated its practical value. Yet the most profound mathematical revolutions—those that would transform our understanding of motion, energy, and the fundamental nature of reality—still lay in the future.
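The greatest-common-divisor method from Euclid's Elements, mentioned above, does indeed survive essentially unchanged after two millennia. A minimal Python sketch (the function name `euclid_gcd` is illustrative, not from the book):

```python
def euclid_gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b);
    when the remainder reaches zero, the last nonzero value is the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(euclid_gcd(1071, 462))  # 21
```

The same iteration, framed as repeated subtraction of line segments, is how the Elements presents it; modern standard libraries (e.g. Python's `math.gcd`) still implement this idea.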

Chapter 2: Renaissance Breakthroughs: Algebra and Calculus (1500-1700)

The period from 1500 to 1700 witnessed a mathematical revolution that paralleled the Renaissance's artistic and cultural flowering. As Europe emerged from the Middle Ages, mathematics transformed from a largely practical tool into a sophisticated language capable of describing the natural world with unprecedented precision. This transformation began in Italy, where commercial prosperity created demand for better mathematical methods in banking, navigation, and engineering. A pivotal breakthrough came in 1545 when Girolamo Cardano published "Ars Magna" (The Great Art), revealing solutions to cubic and quartic equations that had eluded mathematicians for centuries. These solutions required a conceptual leap—working with square roots of negative numbers, which seemed nonsensical but yielded correct answers. Though Cardano himself remained uncomfortable with these "imaginary" quantities, his work represented the first steps toward complex numbers, which would eventually become essential to modern physics and engineering. The development of symbolic algebra marked another crucial advancement. François Viète introduced the use of letters to represent unknown quantities around 1590, while René Descartes further refined algebraic notation in his 1637 work "La Géométrie." These seemingly simple innovations allowed mathematicians to express general relationships rather than just solving specific numerical problems. Descartes also united algebra with geometry through his coordinate system, creating analytical geometry and demonstrating how equations could represent curves and surfaces—a unification that would prove essential for the development of calculus. Meanwhile, practical problems were driving mathematical innovation in other directions. Navigation required accurate astronomical calculations, leading to the development of logarithms by John Napier in 1614. 
This computational tool dramatically simplified the multiplication and division needed for trigonometric calculations, making it possible to create more accurate navigational tables. Henry Briggs soon refined Napier's work into the base-10 logarithms that would remain essential computational tools for over three centuries. The period closed with the independent development of calculus by Isaac Newton and Gottfried Leibniz in the late 1600s. This powerful new mathematical approach could handle problems involving continuous change and infinitesimal quantities, providing tools to describe motion, forces, and other dynamic phenomena. Though their notation and philosophical approaches differed, both men recognized that differentiation and integration were inverse operations—a unifying insight that would prove fundamental to all subsequent mathematical physics. The mathematical innovations of this period transformed science itself. The new mathematics made it possible to move beyond qualitative descriptions to precise quantitative predictions, establishing the foundation for the Scientific Revolution. Galileo had already begun applying mathematical reasoning to physics, and Newton would soon use calculus to develop his laws of motion and universal gravitation. By 1700, mathematics had become the essential language of natural philosophy, enabling a new understanding of the universe as a system governed by mathematical laws—a perspective that continues to guide scientific inquiry today.
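The computational trick behind Napier's and Briggs's tables is the identity log(xy) = log x + log y: a multiplication becomes an addition plus two table lookups. A small sketch of the principle (using the standard library in place of a printed table):

```python
import math

# log10(x * y) == log10(x) + log10(y): with a table of base-10 logarithms,
# multiplying two numbers reduces to adding their logs and taking the antilog.
x, y = 123.0, 456.0
sum_of_logs = math.log10(x) + math.log10(y)
product = 10 ** sum_of_logs  # the "antilog" step, a reverse table lookup
print(product)   # ~56088.0, matching x * y up to rounding
```

For the long trigonometric products of navigational astronomy, replacing each multiplication with an addition cut the labor of calculation enormously, which is why log tables stayed in use for three centuries.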

Chapter 3: Newtonian Mechanics: The Mathematical Universe (1687-1850)

In 1687, Isaac Newton published his monumental work "Philosophiæ Naturalis Principia Mathematica" (Mathematical Principles of Natural Philosophy), forever changing humanity's understanding of the physical world. Newton's achievement was not merely discovering new physical laws but expressing them in precise mathematical form, demonstrating that the same equations could describe both terrestrial and celestial motion. This mathematical unification suggested a profound truth: the universe operates according to consistent, knowable laws that can be expressed in the language of mathematics. Newton's three laws of motion and his law of universal gravitation provided a comprehensive framework for understanding physical phenomena. The mathematical form of these laws—particularly his second law, F=ma, relating force, mass, and acceleration—made it possible to predict the motion of objects with unprecedented accuracy. Newton's work represented the culmination of the Scientific Revolution begun by Copernicus, Kepler, and Galileo, establishing a mechanistic worldview in which the universe operated like a perfect clockwork mechanism governed by mathematical principles. The calculus that Newton had developed (simultaneously with Leibniz) proved essential to applying his physical laws to real-world problems. This new mathematics could handle the continuous changes and rates of change that characterize motion, providing tools to analyze everything from the swing of a pendulum to the orbit of a planet. Throughout the 18th century, mathematicians like Leonhard Euler, Joseph-Louis Lagrange, and Pierre-Simon Laplace refined and extended Newton's methods, developing more powerful mathematical techniques for solving mechanical problems. The industrial revolution provided both practical applications and new challenges for Newtonian mechanics. Steam engines, the transformative technology of this era, converted heat energy into mechanical work. 
The mathematics of thermodynamics developed to understand these engines, with Sadi Carnot establishing theoretical limits on their efficiency in 1824. His work led to the formulation of the laws of thermodynamics, which described how energy flows and transforms. These principles not only improved engine design but established fundamental constraints on all energy conversion processes. By the mid-19th century, Newtonian mechanics had been applied to an ever-widening range of phenomena. The wave theory of light, developed by Thomas Young and Augustin-Jean Fresnel, used mathematical analysis to explain optical phenomena like interference and diffraction. The kinetic theory of gases, advanced by James Clerk Maxwell and Ludwig Boltzmann, applied statistical methods to the motion of countless molecules, connecting microscopic behavior to macroscopic properties like temperature and pressure. These developments demonstrated the remarkable flexibility of Newtonian principles when combined with appropriate mathematical techniques. The success of Newtonian mechanics fostered a deterministic worldview that dominated scientific thinking until the early 20th century. Laplace famously suggested that a sufficiently powerful intellect, knowing the position and velocity of every particle in the universe, could predict all future states with perfect certainty. This mathematical determinism reflected the Enlightenment's confidence in human reason and the power of science. Though this classical paradigm would eventually be challenged by relativity and quantum mechanics, Newtonian mathematics remains remarkably accurate for describing the macroscopic world and continues to form the foundation of engineering and much of applied science.
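Newton's second law, F = ma, rearranged as a = F/m, is exactly what lets one step a body's motion forward in time; this is the core of how his laws yield predictions. A toy numerical sketch (the Euler stepping scheme and all numbers are illustrative, not from the book):

```python
def simulate(force: float, mass: float, steps: int, dt: float):
    """Integrate F = ma for a constant force with simple Euler steps:
    a = F/m, then v += a*dt and x += v*dt at each time step."""
    x, v = 0.0, 0.0
    a = force / mass  # Newton's second law, solved for acceleration
    for _ in range(steps):
        v += a * dt
        x += v * dt
    return x, v

x, v = simulate(force=10.0, mass=2.0, steps=100, dt=0.01)
print(v)  # ~5.0 m/s after 1 s at a = 5 m/s^2
```

Newton and his successors did this analytically with calculus rather than numerically, but the logic is the same: given forces, the equations determine the motion.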

Chapter 4: Electromagnetic Unification: Maxwell's Field Theory (1850-1900)

The mid-19th century witnessed a revolution in physics as profound as Newton's work two centuries earlier. While Newtonian mechanics had successfully described the motion of visible objects, a new mathematical framework was needed to understand electricity and magnetism—phenomena that would ultimately transform human civilization even more dramatically than the mechanical technologies of the industrial revolution. The story begins with experimental discoveries: in 1820, Hans Christian Ørsted observed that an electric current deflected a nearby compass needle, demonstrating a connection between electricity and magnetism. Michael Faraday, working in London during the 1830s and 1840s, conducted extensive experiments showing that changing magnetic fields could generate electric currents and vice versa. Lacking formal mathematical training, Faraday developed the intuitive concept of "fields"—regions of space where electric and magnetic forces act—to explain his observations. This conceptual innovation represented a significant departure from the Newtonian tradition of forces acting instantaneously across empty space. James Clerk Maxwell, a Scottish physicist and mathematician, transformed Faraday's intuitive ideas into precise mathematical equations in the 1860s. Maxwell's four equations elegantly described how electric and magnetic fields interact and propagate through space. The mathematics revealed something unexpected: these fields could travel through empty space as waves, moving at precisely the speed of light. Maxwell concluded that light itself must be an electromagnetic wave—a stunning unification of optics with electromagnetism that demonstrated the power of mathematical analysis to reveal hidden connections in nature. Maxwell's theory predicted that electromagnetic waves could exist with wavelengths far beyond the visible spectrum. This prediction was confirmed in 1887 when Heinrich Hertz generated and detected radio waves in his laboratory. 
Though Hertz saw his work as merely confirming Maxwell's theory, others quickly recognized its practical implications. By the early 1900s, Guglielmo Marconi and others had developed wireless telegraphy, transmitting messages across oceans without physical connections. This technology, based directly on Maxwell's equations, would eventually evolve into radio, television, and all modern wireless communications. The implications of Maxwell's equations extended far beyond practical applications. They represented the first successful field theory in physics, challenging the Newtonian paradigm of particles interacting through forces. Maxwell's work demonstrated that fields were not merely mathematical conveniences but physical realities in their own right. This conceptual shift would later influence Einstein's development of general relativity, which reimagined gravity itself as a manifestation of curved spacetime rather than a force acting across distance. By 1900, Maxwell's electromagnetic theory stood alongside Newtonian mechanics as a pillar of classical physics. Together, they seemed to offer a nearly complete description of the physical world. Yet subtle inconsistencies between these theories—particularly regarding the speed of light in different reference frames—would soon lead Einstein to develop special relativity, beginning another revolutionary chapter in mathematical physics. Maxwell's equations themselves remained valid, however, and continue to govern our understanding of electromagnetic phenomena from radio broadcasting to fiber optic communications, from electric motors to smartphone screens.
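Maxwell's identification of light with electromagnetic waves rests on a concrete calculation: the wave speed predicted by his equations is 1/√(μ₀ε₀), and plugging in the measured electric and magnetic constants of the vacuum recovers the speed of light. A quick check (constant values are the standard modern ones, not figures quoted in the book):

```python
import math

# Maxwell's equations predict waves travelling at c = 1 / sqrt(mu0 * eps0).
mu0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m (classical defined value)
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
c = 1 / math.sqrt(mu0 * eps0)
print(c)  # ~2.998e8 m/s, the measured speed of light
```

That two constants measured in tabletop experiments with charges and magnets combine to give the astronomically measured speed of light was the striking evidence for the unification.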

Chapter 5: Quantum Revolution: Schrödinger's Probabilistic Reality (1900-1930)

The dawn of the 20th century brought physics to a crisis point. Classical theories had achieved remarkable success in explaining macroscopic phenomena, but they failed dramatically when applied to the atomic realm. Between 1900 and 1930, a revolutionary new framework emerged—quantum mechanics—that would fundamentally change our understanding of reality at its smallest scales and eventually enable technologies from lasers to transistors that define our modern world. The quantum revolution began with Max Planck's reluctant introduction of energy quanta in 1900 to explain blackbody radiation. Planck discovered that heated objects emit electromagnetic radiation in discrete energy packets rather than continuously, contradicting classical physics. Five years later, Albert Einstein extended this quantum concept to light itself, proposing that light consists of discrete particles (later called photons) to explain the photoelectric effect. These early quantum ideas, while successful in explaining specific phenomena, remained disconnected fragments rather than a coherent theory. Niels Bohr's 1913 model of the hydrogen atom represented the next crucial step. Bohr proposed that electrons could only orbit the nucleus at certain fixed distances, corresponding to specific energy levels. When electrons jumped between these levels, they emitted or absorbed photons with precisely defined energies. This model successfully explained the discrete spectral lines observed in hydrogen but lacked a deeper theoretical foundation and failed for more complex atoms. The breakthrough came in 1925-1926 through two seemingly different approaches. Werner Heisenberg developed matrix mechanics, representing quantum properties as mathematical matrices whose multiplication didn't commute (A×B≠B×A)—a radical departure from classical variables. Almost simultaneously, Erwin Schrödinger formulated wave mechanics, centered on his famous equation describing how quantum waves evolve over time. 
Though appearing distinct, these approaches were soon proven mathematically equivalent by John von Neumann and others. Schrödinger's wave equation introduced a profound conceptual shift: particles were now described by wave functions that evolved deterministically, but these waves represented probability amplitudes rather than physical waves. When measured, quantum systems yield only discrete, probabilistic outcomes. This probabilistic interpretation, championed by Max Born, led to Heisenberg's uncertainty principle—the fundamental limit on how precisely complementary properties like position and momentum can be simultaneously known. The philosophical implications were staggering. The Copenhagen interpretation, developed primarily by Bohr and Heisenberg, suggested that quantum systems exist in superpositions of states until measured, at which point they "collapse" to definite values. This view challenged fundamental notions of determinism and objective reality that had underpinned science since Newton. Einstein famously resisted these implications, declaring "God does not play dice with the universe," but the mathematical formalism of quantum mechanics consistently produced correct experimental predictions despite its counterintuitive nature. By 1930, quantum mechanics had matured into a powerful mathematical framework that would eventually explain phenomena from superconductivity to chemical bonding. Though debates about its interpretation continue to this day, Schrödinger's equation and the quantum formalism it represents have enabled technologies that define the modern world—from nuclear magnetic resonance imaging to laser surgery, from microprocessors to quantum cryptography. What began as an attempt to understand puzzling laboratory results became the foundation for a technological revolution that continues to unfold.
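Heisenberg's non-commuting matrices (A×B ≠ B×A) are easy to exhibit concretely. A small demonstration using two of the Pauli spin matrices, a standard quantum-mechanical example chosen here for illustration (not a computation from the book):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Two of the Pauli spin matrices (a real-valued pair, for simplicity)
sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]

print(matmul(sigma_x, sigma_z))  # [[0, -1], [1, 0]]
print(matmul(sigma_z, sigma_x))  # [[0, 1], [-1, 0]]  -- order matters
```

In quantum mechanics this failure to commute is not a curiosity but the mathematical root of the uncertainty principle: observables represented by non-commuting matrices cannot be simultaneously sharp.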

Chapter 6: Information Age: Shannon to Silicon (1948-2000)

The second half of the 20th century witnessed an unprecedented technological transformation driven by mathematical innovations in information theory, computing, and semiconductor physics. This period saw abstract equations translated into practical technologies that would fundamentally reshape human civilization, creating the interconnected digital world we now inhabit. Claude Shannon's landmark 1948 paper "A Mathematical Theory of Communication" established information theory as a rigorous mathematical discipline. Working at Bell Laboratories, Shannon defined information in terms of uncertainty reduction and introduced the concept of entropy as a measure of information content. His work established fundamental limits on data compression and error-free transmission through noisy channels. Shannon's equations showed that digital communication could be made arbitrarily reliable through proper encoding, providing the theoretical foundation for all modern digital communications from cellular networks to the internet. The development of the transistor in 1947 by John Bardeen, Walter Brattain, and William Shockley represented another crucial breakthrough. These solid-state devices could amplify electrical signals and act as electronic switches, functions previously performed by bulky vacuum tubes. The physics behind transistors required quantum mechanics to explain how electrons move through semiconductor materials—a perfect example of how abstract quantum equations led to practical technology. Transistors would eventually replace vacuum tubes, enabling the miniaturization of electronic equipment and setting the stage for the computer revolution. The integrated circuit, developed independently by Jack Kilby and Robert Noyce in the late 1950s, represented the next transformative step. By fabricating multiple transistors on a single piece of semiconductor material, integrated circuits dramatically reduced the size, cost, and power consumption of electronic systems. 
Gordon Moore observed in 1965 that the number of transistors on a chip was doubling roughly every year, a rate he revised in 1975 to every two years—a pattern that became known as Moore's Law and continued for over five decades, driving exponential improvements in computing power. The mathematical foundations of computing had been established earlier by Alan Turing, whose 1936 paper introduced the concept of a universal computing machine capable of implementing any algorithm. John von Neumann's architecture for stored-program computers, developed in the 1940s, translated these theoretical concepts into practical designs. Programming languages evolved from machine code to higher-level languages like FORTRAN (1957) and COBOL (1959), making computers increasingly accessible to non-specialists. The development of structured programming, object-oriented languages, and graphical user interfaces gradually transformed computers from specialized scientific instruments to universal tools. The growth of computer networks represented another crucial development. ARPANET, created in 1969, pioneered packet-switching technology—breaking messages into discrete packets that could travel independently through a network. This approach, combined with the standardized protocols of TCP/IP developed in the 1970s, formed the technical foundation of the internet. Tim Berners-Lee's creation of the World Wide Web in 1989 added a user-friendly interface that would make the internet accessible to the general public in the 1990s. By 2000, these converging technologies had transformed nearly every aspect of human life. The mathematical equations of information theory, quantum mechanics, and computer science had enabled a digital revolution that democratized access to information, reshaped economic structures, and created new forms of social interaction.
Computing power once available only to governments now fit in our pockets, connecting billions of people to a global information network unimaginable just decades earlier—all made possible by mathematical innovations that translated abstract theory into world-changing technology.
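Shannon's entropy, the measure of information content introduced in his 1948 paper, has a compact closed form: H = −Σ p log₂ p, in bits per symbol. A minimal sketch (the function name is illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely symbols
```

The entropy sets the hard limits Shannon proved: no lossless code can compress a source below H bits per symbol on average, and reliable transmission is possible at any rate below a channel's capacity.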

Chapter 7: Modern Complexity: Chaos, Networks and AI (1970-Present)

Since the 1970s, mathematics has increasingly focused on modeling complex systems characterized by nonlinearity, emergence, and adaptation. These new mathematical approaches have transformed our understanding of phenomena from weather patterns to financial markets, from biological ecosystems to artificial intelligence, revealing order within apparent chaos and patterns within complexity. Chaos theory emerged from Edward Lorenz's work on weather prediction in the 1960s. While running computer simulations, Lorenz discovered that tiny differences in initial conditions could lead to vastly different outcomes—the famous "butterfly effect" where a butterfly flapping its wings in Brazil might theoretically trigger a tornado in Texas. This sensitivity to initial conditions characterized many natural systems, from turbulent fluids to population dynamics. Chaos theory revealed that deterministic systems could generate apparently random behavior, blurring the line between order and disorder. Benoit Mandelbrot's work on fractals further demonstrated how complex, self-similar patterns could arise from simple mathematical rules, providing new tools for describing irregular natural forms from coastlines to cloud formations. Network theory has emerged as a powerful mathematical approach for analyzing complex interconnected systems. Building on Leonhard Euler's 18th-century work on graph theory, modern network mathematics provides tools for understanding systems from the internet to social networks, from neural connections to transportation infrastructure. Researchers like Duncan Watts and Steven Strogatz discovered that many real-world networks share common properties, such as the "small-world" phenomenon where most nodes can be reached from any other node in surprisingly few steps. These mathematical insights have practical applications from designing robust communication networks to tracking disease spread through social connections. 
Financial mathematics underwent a revolution with the development of the Black-Scholes equation in 1973. This partial differential equation provided a method for pricing options and other derivatives, transforming financial markets by creating a theoretical foundation for risk management. The equation's assumptions about market behavior would later be questioned, particularly after the 2008 financial crisis, highlighting the dangers of oversimplified mathematical models in complex social systems. More sophisticated models incorporating fat-tailed distributions and nonlinear feedback have since been developed, though the fundamental challenge of modeling human behavior remains. Climate modeling represents one of the most challenging applications of modern mathematics. These models combine fluid dynamics, thermodynamics, and numerous other physical processes into massive computational simulations. The Navier-Stokes equations, which describe fluid flow, form the core of these models but must be supplemented with equations for radiation, cloud formation, ocean circulation, and many other factors. Despite their complexity, these models have successfully captured the fundamental physics of climate change, providing essential insights for environmental policy and planning. Perhaps the most transformative mathematical development in recent decades has been the rise of machine learning algorithms. These computational methods, based on statistical learning theory and optimization, can discover patterns in vast datasets without explicit programming. Deep neural networks, inspired by the structure of the human brain, have achieved remarkable success in tasks from image recognition to language translation. The backpropagation algorithm, which efficiently calculates how to adjust network parameters to minimize errors, has been particularly crucial to these advances. 
These systems don't follow explicitly programmed rules but instead learn mathematical representations from examples, raising profound questions about the nature of intelligence and the future relationship between humans and machines. As we move further into the 21st century, these mathematical approaches to complexity continue to evolve and converge. Quantum computing promises to solve certain problems exponentially faster than classical computers, while topological data analysis provides new tools for finding structure in high-dimensional data. The mathematics of complexity, networks, and artificial intelligence is transforming our understanding of the world and our ability to interact with it, continuing the long tradition of mathematical revolutions that reshape human civilization.
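The sensitivity to initial conditions described above—Lorenz's butterfly effect—shows up even in one-line systems. The logistic map is the standard textbook toy model for this (used here as an illustrative stand-in for Lorenz's weather equations, which require solving three coupled differential equations):

```python
def logistic_orbit(x0: float, r: float = 4.0, steps: int = 50) -> float:
    """Iterate the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturb the tenth decimal place
print(abs(a - b))  # the tiny perturbation typically grows to order 1
```

The rule is fully deterministic, yet a disturbance in the tenth decimal place is amplified at every step until the two trajectories bear no resemblance to each other—precisely the behavior that places hard limits on long-range weather forecasting.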

Summary

Throughout human history, mathematical equations have served as powerful tools for understanding and reshaping our world. From Pythagoras's theorem that enabled precise measurement and construction to Maxwell's equations that unified electricity and magnetism, mathematical innovations have repeatedly transformed our understanding of reality and our technological capabilities. Each breakthrough built upon previous knowledge while opening entirely new domains of inquiry. Newton's laws provided a deterministic framework for the macroscopic world, while Schrödinger's equation revealed the probabilistic nature of quantum reality. Information theory and computational models translated these fundamental insights into practical technologies that now permeate every aspect of modern life. These mathematical revolutions share common patterns despite their diverse applications. Each emerged from attempts to solve specific problems but yielded insights far beyond their original contexts. Each required not just technical innovation but conceptual breakthroughs—new ways of thinking about space, time, energy, or information. And each transformed not just science but society itself, from navigation and industrialization to telecommunications and computing. As we face contemporary challenges from climate change to artificial intelligence, this historical perspective reminds us that mathematical innovation remains essential to human progress. The equations that will shape our future may already be taking form in laboratories and universities around the world, waiting to unleash the next wave of scientific and technological transformation.

Best Quote

“IQ is a statistical method for quantifying specific kinds of problem-solving ability, mathematically convenient but not necessarily corresponding to a real attribute of the human brain, and not necessarily representing whatever it is that we mean by ‘intelligence’.” ― Ian Stewart, In Pursuit of the Unknown: 17 Equations That Changed the World

Review Summary

Strengths: The topics in the book are described as very well-written and extremely interesting. The reviewer also mentions gaining many ideas for their YouTube channel from the book.
Weaknesses: The book's attempt at "popularization" is criticized for being poor, with the mathematics not sufficiently simplified for non-specialists. The reviewer notes that the latter chapters lose excitement and become a chore to finish. The book is described as dense and potentially problematic for those without a math or science background.
Overall Sentiment: Critical
Key Takeaway: While the book presents interesting topics, its failure to effectively simplify complex mathematical concepts limits its accessibility, making it a challenging read for those outside of math or science fields. The excitement diminishes towards the end, leading to a less engaging experience.

About Author


Ian Stewart

Ian Nicholas Stewart is an Emeritus Professor and Digital Media Fellow in the Mathematics Department at Warwick University, with special responsibility for public awareness of mathematics and science. He is best known for his popular science writing on mathematical themes. —from the author's website


