
The Health Issue

How Humanity Gave Itself an Extra Life

Between 1920 and 2020, the average human life span doubled. How did we do it? Science mattered — but so did activism.

In September 1918, a flu virus began spreading through Camp Devens, an overcrowded military base just outside Boston. By the end of the second week of the outbreak, one in five soldiers at the base had come down with the illness. But the speed with which it spread through the camp was not nearly as shocking as the lethality. “It is only a matter of a few hours then until death comes,” a camp physician wrote. “It is horrible. One can stand it to see one, two or 20 men die, but to see these poor devils dropping like flies sort of gets on your nerves. We have been averaging about 100 deaths per day.”

The devastation at Camp Devens would soon be followed by even more catastrophic outbreaks, as the so-called Spanish flu — a strain of influenza virus that science now identifies as H1N1 — spread around the world. In the United States, it would cause nearly half of all deaths over the next year. In what was already a time of murderous war, the disease killed millions more on the front lines and in military hospitals in Europe; in some populations in India, the mortality rate for those infected approached 20 percent. The best estimates suggest that as many as 100 million people died from the Great Influenza outbreak that eventually circled the globe. To put that in perspective, roughly three million people have died from Covid-19 over the past year, on a planet with four times as many people.

Image: Red Cross volunteers assembling gauze masks for use at Camp Devens, near Boston, during the 1918-19 influenza pandemic. Credit: Vintage_Space/Alamy

There was another key difference between these two pandemics. The H1N1 outbreak of 1918-19 was unusually lethal among young adults, normally the most resilient cohort during ordinary flu seasons. Younger people experienced a precipitous drop in expected life during the H1N1 outbreak, while the life expectancies of much older people were unaffected. In the United States, practically overnight, average life expectancy plunged to 47 from 54; in England and Wales, it fell more than a decade, from a historic height of 54 to an Elizabethan-era 41. India experienced average life expectancies below 30 years.

Imagine you were there at Camp Devens in late 1918, surveying the bodies stacked in a makeshift morgue. Or you were roaming the streets of Bombay, where more than 5 percent of the population died of influenza in a matter of months. Imagine touring the military hospitals of Europe, seeing the bodies of so many young men simultaneously mutilated by the new technologies of warfare — machine guns and tanks and aerial bombers — and the respiratory violence of H1N1. Imagine knowing the toll this carnage would take on global life expectancy, with the entire planet lurching backward to numbers more suited to the 17th century, not the 20th. What forecast would you have made for the next hundred years? Was the progress of the past half-century merely a fluke, easily overturned by military violence and the increased risk of pandemics in an age of global connection? Or was the Spanish flu a preview of an even darker future, in which some rogue virus could cause a collapse of civilization itself?

Both grim scenarios seemed within the bounds of possibility. And yet, amazingly, neither came to pass. Instead, what followed was a century of unexpected life.

The period from 1916 to 1920 marked the last time a major reversal in global life expectancy was recorded. (During World War II, life expectancy did briefly decline, but with nowhere near the severity of the collapse during the Great Influenza.) The descendants of English and Welsh babies born in 1918, who on average lived just 41 years, today enjoy life expectancies in the 80s. And while Western nations surged far ahead in average life span during the first half of the last century, other nations have caught up in recent decades, with China and India having recorded what almost certainly rank as the fastest gains of any society in history. A hundred years ago, an impoverished resident of Bombay or Delhi would beat the odds simply by surviving into his or her late 20s. Today average life expectancy in India is roughly 70 years.

In effect, during the century since the end of the Great Influenza outbreak, the average human life span has doubled. There are few measures of human progress more astonishing than this. If you were to publish a newspaper that came out just once a century, the banner headline surely would — or should — be the declaration of this incredible feat. But of course, the story of our extra life span almost never appears on the front page of our actual daily newspapers, because the drama and heroism that have given us those additional years are far more evident in hindsight than they are in the moment. That is, the story of our extra life is a story of progress in its usual form: brilliant ideas and collaborations unfolding far from the spotlight of public attention, setting in motion incremental improvements that take decades to display their true magnitude.

Image: Preparing to treat a patient suffering from smallpox, from a 17th-century Ottoman manuscript. Credit: DeAgostini/Getty Images

Another reason we have a hard time recognizing this kind of progress is that it tends to be measured not in events but in nonevents: the smallpox infection that didn’t kill you at age 2; the accidental scrape that didn’t give you a lethal bacterial infection; the drinking water that didn’t poison you with cholera. In a sense, human beings have been increasingly protected by an invisible shield, one that has been built, piece by piece, over the last few centuries, keeping us ever safer and further from death. It protects us through countless interventions, big and small: the chlorine in our drinking water, the ring vaccinations that rid the world of smallpox, the data centers mapping new outbreaks all around the planet. A crisis like the global pandemic of 2020-21 gives us a new perspective on all that progress. Pandemics have an interesting tendency to make that invisible shield suddenly, briefly visible. For once, we’re reminded of how dependent everyday life is on medical science, hospitals, public-health authorities, drug supply chains and more. And an event like the Covid-19 crisis does something else as well: It helps us perceive the holes in that shield, the vulnerabilities, the places where we need new scientific breakthroughs, new systems, new ways of protecting ourselves from emergent threats.

How did this great doubling of the human life span happen? When the history textbooks do touch on the subject of improving health, they often nod to three critical breakthroughs, all of them presented as triumphs of the scientific method: vaccines, germ theory and antibiotics. But the real story is far more complicated. Those breakthroughs might have been initiated by scientists, but it took the work of activists and public intellectuals and legal reformers to bring their benefits to everyday people. From this perspective, the doubling of human life span is an achievement that is closer to something like universal suffrage or the abolition of slavery: progress that required new social movements, new forms of persuasion and new kinds of public institutions to take root. And it required lifestyle changes that ran throughout all echelons of society: washing hands, quitting smoking, getting vaccinated, wearing masks during a pandemic.

It is not always easy to perceive the cumulative impact of all that work, all that cultural transformation. The end result is not one of those visible icons of modernity: a skyscraper, a moon landing, a fighter jet, a smartphone. Instead, it manifests in countless achievements, often quickly forgotten, sometimes literally invisible: the drinking water that’s free of microorganisms, or the vaccine received in early childhood and never thought about again. The fact that these achievements are so myriad and subtle — and thus underrepresented in the stories we tell ourselves about modern progress — should not be an excuse to keep our focus on the astronauts and fighter pilots. Instead, it should inspire us to correct our vision.

The first life-expectancy tables were calculated in the late 1600s, during the dawn of modern statistics and probability. It turned out to be one of those advances in measurement that transform the thing being measured: By following changes in life expectancy over time, and comparing expected life among different populations, it became easier to detect inequalities in outcomes, perceive long-term threats and track the effects of promising health interventions more accurately. Demographers now distinguish between life expectancies at different ages. In a society with very high infant mortality, life expectancy at birth might be 20, because so many people die in the first days of life, pulling the overall number down, while life expectancy at 20 might easily be in the 60s. The doubling of life expectancy over the past century is a result of progress at both ends of the age spectrum: Children are dying far less frequently, and the elderly are living much longer. Centenarians are projected to be the fastest-growing age group worldwide.
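
To see how the arithmetic works, consider a deliberately crude sketch. The cohort below is invented for illustration, not drawn from any real life table: two-thirds of those born die in infancy and the rest die at 60, which is enough to drag life expectancy at birth down to 20 even though anyone who reaches 20 can expect to die in old age.

```python
# Toy cohort, invented for illustration: (age at death, share of cohort).
toy_cohort = [(0, 2 / 3), (60, 1 / 3)]

def expected_age_at_death(cohort, alive_at=0):
    """Expected age at death for cohort members who survive to `alive_at`."""
    survivors = [(age, share) for age, share in cohort if age >= alive_at]
    total = sum(share for _, share in survivors)
    return sum(age * share for age, share in survivors) / total

print(expected_age_at_death(toy_cohort))      # 20.0 -> "life expectancy at birth"
print(expected_age_at_death(toy_cohort, 20))  # 60.0 -> expected age at death at 20
# Demographers usually report life expectancy at 20 as remaining years (40 here),
# but the point is the same: surviving infancy transforms the outlook.
```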

One strange thing about the story of global life expectancy is how steady the number was for almost the entirety of human history. Until the middle of the 18th century, the figure appears to have rarely exceeded a ceiling of about 35 years, rising or falling with a good harvest or a disease outbreak but never showing long-term signs of improvement. A key factor keeping average life expectancy low was the shockingly high rates of infant and childhood mortality: Two in five children perished before reaching adulthood. Human beings had spent 10,000 years inventing agriculture, gunpowder, double-entry accounting, perspective in painting — but these undeniable advances in collective human knowledge failed to move the needle in one critical category: how long the average person could expect to live.

The first hint that this ceiling might be breached appeared in Britain during the middle decades of the 18th century, just as the Enlightenment and industrialization were combining to transform European and North American societies. The change was subtle at first and largely imperceptible to contemporary observers. In fact, it was not properly documented until the 1960s, when a historical demographer named T.H. Hollingsworth analyzed records dating back to 1550 and discovered a startling pattern. Right around 1750, after two centuries of stasis, the average life expectancy of a British aristocrat began to increase at a steady rate, year after year, creating a measurable gap between the elites and the rest of the population. By the 1770s, the British elite were living on average into their mid-40s; by the middle of Queen Victoria’s reign, they were approaching a life expectancy at birth of 60.

Those aristocrats constituted a vanishingly small proportion of humanity. But the demographic transformation they experienced offered a glimpse of the future. The endless bobbing of the previous 10,000 years had not only taken on a new shape — a more or less straight line, steadily slanting upward. It also marked the beginning of a measurable gap in health outcomes. Before 1750, it didn’t matter whether you were a baron or a haberdasher or a hunter-gatherer: Your life expectancy at birth was going to be in the 30s. All their wealth and privilege gave European elites no advantage whatsoever at the elemental task of keeping themselves — and their children most of all — alive.

The best way to appreciate the lack of health inequalities before 1750 is to contemplate the list of European royalty killed by the deadly smallpox virus in the preceding decades. During the outbreak of 1711 alone, smallpox killed the Holy Roman emperor Joseph I; three siblings of the future Holy Roman emperor Francis I; and the heir to the French throne, the grand dauphin Louis. Smallpox would go on to take the lives of King Louis I of Spain; Emperor Peter II of Russia; Louise Hippolyte, sovereign princess of Monaco; King Louis XV of France; and Maximilian III Joseph, elector of Bavaria.

How, then, did the British elite manage that first sustained extension in average life span? The classic story of health progress from the age is Edward Jenner’s invention of the smallpox vaccine, which ranks alongside Newton’s apple and Franklin’s kite among the most familiar narratives in the history of science. After noticing that exposure to a related illness called cowpox — often contracted by dairy workers — seemed to prevent more dangerous smallpox infections, Jenner scraped some pus from the cowpox blisters of a milkmaid and then inserted the material, via incisions made with a lancet, into the arms of an 8-year-old boy. After developing a light fever, the boy soon proved to be immune to variola, the virus that causes smallpox. As the first true vaccination, Jenner’s experiment was indeed a watershed moment in the history of medicine and in the ancient interaction between humans and microorganisms. But Jenner’s triumph did not occur until May 1796, well after the initial takeoff in life expectancy among the British elite. The timing suggests that an earlier innovation was most likely driving much of the initial progress, one that originated far from the centers of Western science and medicine: variolation.

No one knows exactly when and where variolation, a kind of proto-vaccination that involves direct exposure to small amounts of the virus itself, was first practiced. Some accounts suggest it may have originated in the Indian subcontinent thousands of years ago. The historian Joseph Needham described a 10th-century variolater, possibly a Taoist hermit, from Sichuan who brought the technique to the royal court after a Chinese minister’s son died of smallpox. Whatever its origins, the historical record is clear that the practice had spread throughout China, India and Persia by the 1600s. Enslaved Africans brought the technique to the American colonies. Like many great ideas, it may have been independently discovered multiple times in unconnected regions of the world. In fact, the adoption of variolation may have temporarily increased life expectancies in those regions as well, but the lack of health records makes this impossible to determine. All we can say for certain is that whatever increase might have happened had disappeared by the time countries like China or India began keeping accurate data on life span.

Image: Early inoculation methods in China, as documented in an illustration from a 1913 history of vaccination. Credit: The Historical Medical Library of the College of Physicians of Philadelphia

Variolation made it to Britain thanks to an unlikely advocate: a well-bred and erudite young woman named Lady Mary Wortley Montagu. A smallpox survivor herself, Montagu was the daughter of the Duke of Kingston-Upon-Hull and wife of the grandson of the first Earl of Sandwich. As a teenager, she wrote poetry and an epistolary novel; in her early 20s, she struck up a correspondence with the poet Alexander Pope. She crossed paths with variolation thanks to an accident of history: Shortly after her successful recovery from smallpox, her husband, Edward Wortley Montagu, was appointed ambassador to the Ottoman Empire. In 1716, after spending her entire life in London and the English countryside, Mary Montagu moved her growing family to Constantinople, living there for two years.

Montagu immersed herself in the culture of the city, visiting the famous baths and studying Turkish. In her explorations, she came across the practice of variolation and described it in enthusiastic letters back to her friends and family in England: “The Small Pox — so fatal and so general amongst us — is here rendered entirely harmless, by the invention of engrafting.” In March 1718, she had her young son engrafted. After a few days of fever and an outbreak of pustules on both arms, Montagu’s son made a full recovery. He would go on to live into his 60s, seemingly immune to smallpox for the rest of his life. He is generally considered the first British citizen to have been inoculated. His sister was successfully inoculated in 1721, after Montagu and her family returned to London. Over the next few years, inspired by Montagu’s success, the Princess of Wales inoculated three of her children, including her son Frederick, the heir to the British throne. Frederick would survive his childhood untouched by smallpox, and while he died before ascending to the throne, he did live long enough to produce an heir: George William Frederick, who would eventually become King George III.

Image: Lady Mary Wortley Montagu helped popularize the practice of variolation in Britain. Credit: Bettmann/Getty Images

Thanks in large part to Mary Montagu’s advocacy, variolation spread through the upper echelons of British society over the subsequent decades. It remained a controversial procedure throughout the century; many of its practitioners worked outside the official medical establishment of the age. But the adoption of variolation by the British elite left an indelible mark in the history of human life expectancy: that first upward spike that began to appear in the middle of the 1700s, as a whole generation of British aristocrats survived their childhoods thanks at least in part to their increased levels of immunity to variola. Crucially, one Englishman inoculated during that period was Edward Jenner himself, who received the treatment as a young child in 1757; decades later, as a local doctor, he regularly inoculated his own patients. Without a lifelong familiarity with variolation, it is unlikely that Jenner would have hit upon the idea of injecting pus from a less virulent but related disease.

As Jenner would later demonstrate, vaccination greatly reduced the mortality rate of the procedure; patients were significantly more likely to die from variolation than from vaccination. But undeniably, a defining element of the intervention lay in the idea of triggering an immune response by exposing a patient to a small quantity of infected material. That idea had emerged elsewhere, not in the fertile mind of the country doctor, musing on the strange immunity of the milkmaids, but rather in the minds of pre-Enlightenment healers in China and India and Africa hundreds of years before. Vaccination was a truly global idea from the beginning.

The positive trend in life expectancy among the British elite in the late 1700s would not become a mass phenomenon for another century. Variolation and vaccination did spread through the rural poor and the industrial working classes over that century, in part thanks to political and legal campaigns that led to mandatory vaccination programs. But the decline of smallpox was overwhelmed by the man-made threats of industrialization. For much of the 19th century, the overall balance sheet of scientific and technological advances was a net negative in terms of human health: The life-span benefits of one technological advance (variolation and vaccines) were quickly wiped out by the costs of another (industrialization).

In 1843, the British statistician William Farr compared life expectancies in three parts of England: rural Surrey, metropolitan London and industrial Liverpool. Farr found that people in Surrey were enjoying life expectancies close to 50, a significant improvement over the long ceiling of the mid-30s. The national average was 41. London, for all its grandeur and wealth, was still stuck at 35. But Liverpool — a city that had undergone staggering explosions in population density, because of industrialization — was the true shocker. The average Liverpudlian died at 25.

The mortality trends in the United States during the first half of the 19th century were equally stark. Despite the widespread adoption of vaccination, overall life expectancy in the United States declined by 13 years between 1800 and 1850. In 1815, about 30 percent of all reported deaths in New York were of children under 5. By the middle of the century, that figure was more than 60 percent.

One culprit was increasingly clear. In May 1858, a progressive journalist in New York named Frank Leslie published a 5,000-word exposé denouncing a brutal killer in the metropolis. Malevolent figures, Leslie wrote, were responsible for what he called “the wholesale slaughter of the innocents.” He went on, “For the midnight assassin, we have the rope and the gallows; for the robber the penitentiary; but for those who murder our children by the thousands we have neither reprobation nor punishment.” Leslie was railing not against mobsters or drug peddlers but rather a more surprising nemesis: milk.

Drinking animal milk — a practice as old as animal domestication itself — has always presented health risks, whether from spoilage or from infections passed on by the animal. But the density of industrial cities like New York had made cow’s milk far deadlier than it was in earlier times. In an age without refrigeration, milk would spoil in summer months if it was brought in from far-flung pastures in New Jersey or upstate New York. Increased participation by women in the industrial labor force meant that more infants and young children were drinking cow’s milk, even though a significant portion of dairy cows suffered from bovine tuberculosis, and unprocessed milk from these cows could transmit the bacterium that causes the disease to human beings. Other potentially fatal illnesses were also linked to milk, including diphtheria, typhoid and scarlet fever.

How did milk go from being a “liquid poison” — as Frank Leslie called it — to the icon of health and vitality that it became in the 20th century? The obvious answer begins in 1854, when a young Louis Pasteur took a job at the University of Lille in the northern corner of France, just west of the French-Belgian border. Sparked by conversations with winemakers and distillery managers in the region, Pasteur became interested in the question of why certain foods and liquids spoiled. Examining samples of a spoiled beetroot alcohol under a microscope, Pasteur was able to detect not only the yeast organisms responsible for fermentation but also a rod-shaped entity — a bacterium now called Acetobacter aceti — that converts ethanol into acetic acid, the ingredient that gives vinegar its sour taste. These initial observations convinced Pasteur that the mysterious changes of both fermentation and spoilage were not a result of spontaneous generation but rather a byproduct of living microbes. That insight, which would eventually help provide the foundation of the germ theory of disease, led Pasteur to experiment with different techniques for killing those microbes before they could cause any harm. By 1865, Pasteur, now a professor at the École Normale Supérieure in Paris, had hit upon the technique that would ultimately bear his name: By heating wine to around 130 degrees Fahrenheit and then quickly cooling it, he could kill many of the bacteria within, and in doing so prevent the wine from spoiling without substantially affecting its flavor. And it is that technique, applied to milk all around the world, that now saves countless people from dying of disease every single day.

Image: Louis Pasteur in his laboratory in Paris, circa 1880. Credit: Hulton Archive/Getty Images

Understanding that last achievement as a triumph of chemistry is not so much wrong as it is incomplete. One simple measure of why it is incomplete is how long it took for pasteurization to actually have a meaningful effect on the safety of milk: In the United States, it would not become standard practice in the milk industry until a half century after Pasteur conceived it. That’s because progress is never a result of scientific discovery alone. It also requires other forces: crusading journalism, activism, politics. Pasteurization as an idea was first developed in the mind of a chemist. But in the United States, it would finally make a difference thanks to a much wider cast of characters, most memorably a department-store impresario named Nathan Straus.

Born in the kingdom of Bavaria in 1848, Straus moved with his family to the American South, where his father had established a profitable general store. By the 1880s, Straus and his brother Isidor had become part owners of Macy’s department store in Manhattan. Straus had long been concerned about the childhood mortality rates in the city — he had lost two children to disease. Conversations with another German immigrant, the political radical and physician Abraham Jacobi, introduced him to the pasteurization technique, which was finally being applied to milk almost a quarter of a century after Pasteur developed it. Straus saw that pasteurization offered a comparatively simple intervention that could make a meaningful difference in keeping children alive.

One major impediment to pasteurization came from milk consumers themselves. Pasteurized milk was widely considered to be less flavorful than regular milk; the process was also believed to remove the nutritious elements of milk — a belief that has re-emerged in the 21st century among “natural milk” adherents. Dairy producers resisted pasteurization not just because it added an additional cost to the production process but also because they were convinced, with good reason, that it would hurt their sales. And so Straus recognized that changing popular attitudes toward pasteurized milk was an essential step. In 1892, he created a milk laboratory where sterilized milk could be produced at scale. The next year, he began opening what he called milk depots in low-income neighborhoods around the city, which sold the milk below cost. Straus also funded a pasteurization plant on Randall’s Island that supplied sterilized milk to an orphanage there where almost half the children had perished in only three years. Nothing else in their diet or living conditions was altered other than drinking pasteurized milk. Almost immediately, the mortality rate dropped by 14 percent.

Image: Picking up pasteurized milk at one of Nathan Straus’s milk depots in New York. Credit: George Grantham Bain Collection, via Library of Congress

Emboldened by the results of these early interventions, Straus started an extended campaign to outlaw unpasteurized milk, an effort that was ferociously opposed by the milk industry and its representatives in statehouses around the country. Quoting an English doctor at a rally in 1907, Straus told an assembled mass of protesters, “The reckless use of raw, unpasteurized milk is little short of a national crime.” Straus’s advocacy attracted the attention of President Theodore Roosevelt, who ordered an investigation into the health benefits of pasteurization. Twenty government experts came to the resounding conclusion that pasteurization “prevents much sickness and saves many lives.” New York still wavered, and in 1909, it was instead Chicago that became the first major American city to require pasteurization. The city’s commissioner of health specifically cited the demonstrations of the “philanthropist Nathan Straus” in making the case for sterilized milk. New York finally followed suit in 1912. By the early 1920s, three decades after Straus opened his first milk depot on the Lower East Side — more than half a century after Pasteur made his namesake breakthrough — unpasteurized milk had been outlawed in almost every major American city.

Image: Straus’s first milk depot in 1893. Credit: Augustus C. Long Health Sciences Library, Columbia University

The fight for pasteurized milk was one of a number of mass interventions — originating in 19th-century science but not implemented at scale until the early 20th century — that triggered the first truly egalitarian rise in life expectancy. By the first decade of the 20th century, average life spans in England and the United States had passed 50 years. Millions of people in industrialized nations found themselves in a genuinely new cycle of positive health trends — what the Nobel-laureate economist Angus Deaton has called “the great escape” — finally breaking through the ceiling that had limited Homo sapiens for the life of the species. The upward trend continued after the brief but terrifying firestorm of the Spanish flu, driven by unprecedented declines in infant and childhood mortality, particularly among working-class populations. From 1915 to 1935, infant-mortality rates in the United States were cut in half, one of the most significant declines in the history of that most critical of measures. For every hundred human beings born in New York City for most of the 19th century, fewer than 60 would make it to adulthood. Today 99 of them do.

One reason the great escape was so egalitarian in scope is that it was propelled by infrastructure advances that benefited the entire population, not just the elites. Starting in the first decades of the 20th century, human beings in cities all around the world began consuming microscopic amounts of chlorine in their drinking water. In sufficient doses, chlorine is a poison. But in very small doses, it is harmless to humans but lethal to the bacteria that cause diseases like cholera. Thanks to the same advances in microscopy and lens making that allowed Louis Pasteur to see microbes in wine and milk, scientists could now perceive and measure the amount of microbial life in a given supply of drinking water, which made it possible by the end of the 19th century to test the efficacy of different chemicals, chlorine above all else, in killing off those dangerous microbes. After conducting a number of these experiments, a pioneering sanitary adviser named John Leal quietly added chlorine to the public reservoirs in Jersey City — an audacious act that got Leal sued by the city, which said he had failed to supply “pure and wholesome” water as his contract had stipulated.

After Leal’s successful experiment, city after city began implementing chlorine disinfectant systems in their waterworks: Chicago in 1912, Detroit in 1913, Cincinnati in 1918. By 1914, more than 50 percent of public-water customers were drinking disinfected water. These interventions turned out to be a lifesaver on an astonishing scale. In 1908, when Leal first started experimenting with chlorine delivery in Jersey City, typhoid was responsible for 30 deaths per 100,000 people. Three decades later, the death rate had been reduced by a factor of 10.

The rise of chlorination, like the rise of pasteurization, could be seen solely as another triumph of applied chemistry. But acting on those new ideas from chemistry — the painstaking effort of turning them into lifesaving interventions — was the work of thousands of people in professions far afield of chemistry: sanitation reformers, local health boards, waterworks engineers. Those were the men and women who quietly labored to transform America’s drinking water from one of the great killers of modern life to a safe and reliable form of hydration.

The increase in life expectancy was also enhanced by the explosion of vaccine development during this period — and the public-health reforms that actually got those vaccines into people’s arms. The whooping-cough vaccine was developed in 1914, the tuberculosis vaccine in 1921 and the diphtheria vaccine in 1923 — followed, most famously, by Jonas Salk’s polio vaccine in the early 1950s.

The curious, almost counterintuitive thing about the first stage of the great escape is that it was not meaningfully propelled by medical drugs. Vaccines could protect you from future infections, but if you actually got sick — or developed an infection from a cut or surgical procedure — there was very little that medical science could do for you. There was no shortage of pills and potions to take, of course. It’s just that a vast majority were ineffective at best. The historian John Barry notes that “the 1889 edition of the Merck Manual of Medical Information recommended one hundred treatments for bronchitis, each one with its fervent believers, yet the current editor of the manual recognizes that ‘none of them worked.’” If a pharmacist in 1900 was looking to stock his shelves with medicinal cures for various ailments — gout, perhaps, or indigestion — he would be likely to consult the extensive catalog of Parke, Davis & Company, now Parke-Davis, one of the most successful and well-regarded drug companies in the United States. In the pages of that catalog, he would have seen products like Damiana et Phosphorus cum Nux, which combined a psychedelic shrub and strychnine to create a product designed to “revive sexual existence.” Another elixir by the name of Duffield’s Concentrated Medicinal Fluid Extracts contained belladonna, arsenic and mercury. Cocaine was sold in an injectable form, as well as in powders and cigarettes. The catalog proudly announced that the drug would take “the place of food, make the coward brave, the silent eloquent” and “render the sufferer insensitive to pain.”

Today, of course, we think of medicine as one of the pillars of modern progress, but until quite recently, drug development was a scattershot and largely unscientific endeavor. One critical factor was the lack of any legal prohibition on selling junk medicine. In fact, in the United States, the pharmaceutical industry was almost entirely unregulated for the first decades of the 20th century. Technically speaking, there was an organization known as the Bureau of Chemistry, created in 1901 to oversee the industry. But this initial rendition of what ultimately became the U.S. Food and Drug Administration was toothless in terms of its ability to ensure that customers were receiving effective medical treatments. Its only responsibility was to ensure that the chemical ingredients listed on the bottle were actually present in the medicine itself. If a company wanted to put mercury or cocaine in its miracle drug, the Bureau of Chemistry had no problem with that — so long as it was mentioned on the label.

Image: Cultured penicillin mold, circa 1943. Credit: Fritz Goro/The LIFE Picture Collection, via Getty Images

Medical drugs finally began to have a material impact on life expectancy in the middle of the 20th century, led by the most famous “magic bullet” treatment of all: penicillin. Just as in the case of Jenner and the smallpox vaccine, the story of penicillin traditionally centers on a lone genius and a moment of surprising discovery. On a fateful day in September 1928, the Scottish scientist Alexander Fleming accidentally left a petri dish of Staphylococcus bacteria next to an open window before departing for a two-week vacation. When he returned to find a blue-green mold growing in the petri dish, he was about to throw it away, when he noticed something strange: The mold appeared to have stopped the bacteria’s growth. Looking at the mold under a microscope, Fleming saw that it was literally breaking down the cell walls of the bacteria, effectively destroying them. Seventeen years later, after the true magnitude of his discovery had become apparent, he was awarded the Nobel Prize in Medicine.

Like many stories of scientific breakthroughs, though, the tale of the petri dish and the open window cartoonishly simplifies and compresses the real narrative of how penicillin — and the other antibiotics that quickly followed in its wake — came to transform the world. Far from being the story of a lone genius, the triumph of penicillin is actually one of the great stories of international, multidisciplinary collaboration in the history of science. It also represents perhaps the most undersung triumph of the Allied nations during World War II. Ask most people to name a top-secret military project from that era involving an international team of brilliant scientists, and what most likely would spring to mind is the Manhattan Project. In fact, the race to produce penicillin at scale involved all the same elements — only it was a race to build a genuinely new way to keep people alive, not kill them.

For all Fleming’s perceptiveness in noting the antibacterial properties of the mold, he seems not to have fully grasped the true potential of what he had stumbled upon. He failed to set up even the most basic of experimental trials to test its efficacy at killing bacteria outside the petri dish. It took two Oxford scientists — Howard Florey and Ernst Boris Chain — to turn penicillin from a curiosity to a lifesaver, and their work didn’t begin until more than a decade after Fleming’s original discovery. By then, global events had turned the mold from a mere medical breakthrough into a key military asset: War had broken out, and it was clear that a miracle drug that could reduce the death rate from infections would be a major boost to the side that was first able to develop it.

Image: Howard Florey, who shared the 1945 Nobel Prize in Physiology or Medicine with Ernst Boris Chain and Alexander Fleming for their discovery of penicillin. Credit: Universal History Archive/Getty Images

With the help of a biochemist and brilliant laboratory engineer named Norman Heatley, Florey and Chain had built an elaborate contraption that could convert, in the span of an hour, 12 liters of broth filled with the penicillin mold into two liters of penicillin medication. By early 1941, after experiments on mice, Florey and Chain decided they were ready to try their new treatment on an actual human. In a nearby hospital they found a police constable named Albert Alexander, who had become “desperately and pathetically ill” — as one of the Oxford scientists wrote — from an infection acquired from a rose-thorn scratch. Alexander’s condition reminds us of the kind of grotesque infections that used to originate in the smallest of cuts in the era before antibiotics; already he had lost his left eye to the bacteria, and the other had gone blind. The night after Heatley visited Alexander in the hospital, he wrote in his diary, “He was oozing pus everywhere.”

Within hours of receiving an initial dose of penicillin, Alexander began to heal. It was like watching a reverse horror movie: The man’s body had been visibly disintegrating, but suddenly it switched directions. His temperature settled back to a normal range; for the first time in days, he could see through his remaining eye. The pus that had been dripping from his scalp entirely disappeared.

Image: Ernst Boris Chain at work. Credit: Hulton-Deutsch Collection/Corbis, via Getty Images

As they watched Alexander’s condition improve, Florey and his colleagues recognized they were witnessing something genuinely new. “Chain was dancing with excitement,” a colleague would write of the momentous day; Florey was “reserved and quiet but nonetheless intensely thrilled by this remarkable clinical story.” Yet for all their genius, Florey and Chain had not yet solved the problem of scale. In fact, they had such limited supplies of penicillin that they took to recycling the compound that had been excreted in Alexander’s urine. After two weeks of treatment, they ran out of the medicine entirely; Alexander’s condition immediately worsened, and on March 15 the policeman died. His remarkable, if temporary, recovery had made it clear that penicillin could battle bacterial infections. What was less clear was whether anyone could produce enough of it to make a difference.

To solve the scale problem, Florey turned to the Americans. He wrote to Warren Weaver, the visionary head of the Rockefeller Foundation, explaining the promising new medicine. Weaver recognized the significance of the finding and arranged to have the penicillin — and the Oxford team — brought over to the United States, far from the German bombs that began raining down on Britain. On July 1, 1941, Florey and Heatley took the Pan Am Clipper from Lisbon, carrying a locked briefcase containing a significant portion of the world’s penicillin supply. In America, the team was quickly set up with a lab at the Department of Agriculture’s Northern Regional Research Laboratory in Peoria, Ill. The project quickly gained the support of U.S. military officials, who were eager to find a drug that would protect the troops from deadly infections — and of several American drug companies, including Merck and Pfizer.

Image: Flasks growing penicillin culture, 1943. Credit: Daily Herald Archive/SSPL/Getty Images

It might seem strange that Florey and Heatley were set up in an agricultural lab when they were working on a medical drug. But Peoria turned out to be the perfect spot for them. The agricultural scientists had extensive experience with molds and other soil-based organisms. And the heartland location had one meaningful advantage: its proximity to corn. The mold turned out to thrive in vats of corn steep liquor, which was a waste product created by making cornstarch.

While the scientists experimented with creating larger yields in the corn steep liquor, they also suspected that there might be other strains of penicillin out in the wild that would be more amenable to rapid growth. At the same time, U.S. soldiers and sailors collected soil samples around the globe — Eastern Europe, North Africa, South America — to be shipped back to the American labs for investigation. An earlier soil search in the United States had brought back an organism that would become the basis for streptomycin, now one of the most widely used antibiotics in the world. In the years immediately after the end of the war, Pfizer and other drug companies would go on to conduct major exploratory missions seeking out soil samples everywhere, from the bottoms of mine shafts to wind-borne samples gathered with the aid of balloons. In the end Pfizer collected a staggering 135,000 distinct samples.

Image: Staff members of the United States Department of Agriculture in 1944 discussing tests related to methods of mass production of penicillin in Peoria, Ill. Credit: Fritz Goro/The LIFE Picture Collection, via Getty Images

The search for promising molds took place closer to home as well. During the summer months of 1942, shoppers in Peoria grocery stores began to notice a strange presence in the fresh produce aisles, a young woman intently examining the fruit on display, picking out and purchasing the ones with visible rot. Her name was Mary Hunt, and she was a bacteriologist from the Peoria lab, assigned the task of locating promising molds that might replace the existing strains that were being used. (Her unusual shopping habits ultimately gave her the nickname Moldy Mary.) One of Hunt’s molds — growing in a particularly unappetizing cantaloupe — turned out to be far more productive than the original strains that Florey and Chain’s team had tested. Nearly every strain of penicillin in use today descends from the colony Hunt found in that cantaloupe.

Aided by the advanced production techniques of the drug companies, the United States was soon producing a stable penicillin in quantities sufficient to be distributed to military hospitals around the world. When the Allied troops landed on the Normandy beaches on June 6, 1944, they were carrying penicillin along with their weapons.

Image: Penicillin being mass-produced at the Commercial Solvents Corporation in Indiana, circa 1944. Credit: Bettmann/Getty Images

Penicillin, alongside the other antibiotics developed soon after the war ended, triggered a revolution in human health. Mass killers like tuberculosis were almost entirely eliminated. People stopped getting severe infections from simple cuts and scrapes, like the rose-thorn scratch that killed Albert Alexander. The magical power of antibiotics to ward off infection also opened the door to new treatments. Radical surgical procedures like organ transplants became mainstream.

The antibiotics revolution marked a more general turning point in the history of medicine: Physicians now had genuinely useful drugs to prescribe. Over the subsequent decades, antibiotics were joined by other new forms of treatment: the antiretroviral drugs that have saved so many H.I.V.-positive people from the death sentence of AIDS, the statins and ACE inhibitors used to treat heart disease and now a new regime of immunotherapies that hold the promise of curing certain forms of cancer for good. Hospitals are no longer places we go to die, offering nothing but bandages and cold comfort. Routine surgical procedures rarely result in life-threatening infections.

Those medical breakthroughs were also propelled by a statistical breakthrough: the randomized controlled trial, or R.C.T., developed in the late 1940s, which finally allowed researchers to test the efficacy of experimental treatments and to detect health risks from dangerous pollutants. The methodology of the R.C.T. allowed private companies and government agencies to determine empirically whether a given drug actually worked. In the early 1960s, Congress passed the landmark Kefauver-Harris Drug Amendments, which radically extended the demands made on new drug applicants. The amendments introduced many changes to the regulatory code, but the most striking one was this: For the first time, drug companies would be required to supply proof of efficacy. It wasn’t enough for Big Pharma to offer evidence that the right ingredients were listed on the label. Companies had to show proof — made possible by the invention of the R.C.T. — that their supposed cures actually worked.
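
The logic of a randomized controlled trial can be sketched in a few lines of code. Everything below is hypothetical, with made-up recovery rates rather than data from any real study: because patients are assigned to treatment or control at random, a sizable gap in outcomes between the two arms points to the drug itself rather than to differences in who happened to receive it.

```python
import random

random.seed(0)  # reproducible toy example

def simulate_trial(n=1000, control_recovery=0.40, treatment_recovery=0.55):
    """Hypothetical sketch: returns the observed recovery rate in each arm."""
    treated, control = [], []
    for _ in range(n):
        arm = treated if random.random() < 0.5 else control  # random assignment
        p = treatment_recovery if arm is treated else control_recovery
        arm.append(1 if random.random() < p else 0)          # 1 = recovered
    return sum(treated) / len(treated), sum(control) / len(control)

treated_rate, control_rate = simulate_trial()
print(f"treated: {treated_rate:.1%}  control: {control_rate:.1%}")
# A persistent gap under random assignment is evidence of efficacy, which is
# the kind of proof the Kefauver-Harris amendments began to require.
```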

The decade following the initial mass production of antibiotics marked the most extreme moment of life-span inequality globally. In 1950, when life expectancy in India and most of Africa had barely budged from the long ceiling of around 35 years, the average American could expect to live 68 years, while Scandinavians had already crossed the 70-year threshold. But the post-colonial era that followed would be characterized by an extraordinary rate of improvement across most of the developing world. The gap between the West and the rest of the world has been narrowing for the past 50 years, at a rate unheard-of in demographic history. It took Sweden roughly 150 years to reduce childhood mortality rates from 30 percent to under 1 percent. Postwar South Korea pulled off the same feat in just 40 years. India nearly doubled life expectancy in just 70 years; many African nations have done the same, despite the ravages of the AIDS epidemic. In 1951, the life-span gap that separated China and the United States was more than 20 years; now it is just two.

Image: A World Health Organization smallpox-program worker vaccinating residents in Benin in 1968. Credit: Smith Collection/Gado/Getty Images

The forces behind these trends are complex and multivariate. Some of them involve increasing standards of living and the decrease in famine, driven by the invention of artificial fertilizer and the “green revolution”; some of them involve imported medicines and infrastructure — antibiotics, chlorinated drinking water — that were developed earlier. But some of the most meaningful interventions came from within the Global South itself, including a remarkably simple but powerful technique called oral rehydration therapy.

One endemic disease that kept life expectancies down in low-income countries was cholera, which kills through the severe dehydration and electrolyte imbalance caused by acute diarrhea. In some extreme cases, cholera victims have been known to lose as much as 30 percent of their body weight through expelled fluids in a matter of hours. As early as the 1830s, doctors had observed that treating patients with intravenous fluids could keep them alive long enough for the disease to run its course; by the 1920s, treating cholera victims with IV fluids had become standard practice in hospitals. By that point, though, cholera had become a disease that was largely relegated to the developing world, where hospitals or clinics and trained medical professionals were scarce. Setting up an IV for patients and administering fluids was not a viable intervention during a cholera outbreak affecting hundreds of thousands of people in Bangladesh or Lagos. Crowded into growing cities, lacking both modern sanitation systems and access to IV equipment, millions of people — most of them small children — died of cholera over the first six decades of the 20th century.

The sheer magnitude of that loss was a global tragedy, but it was made even more tragic because a relatively simple treatment for severe dehydration existed, one that could be performed by nonmedical professionals outside the context of a hospital. Now known as oral rehydration therapy, or O.R.T., the treatment is almost maddeningly simple: give people lots of boiled water to drink, supplemented with sugar and salts. (Americans are basically employing O.R.T. when they consume Pedialyte to combat a stomach bug.) A few doctors in India, Iraq and the Philippines argued for the treatment in the 1950s and 1960s, but in part because it didn’t seem like “advanced” medicine, it remained a fringe idea for a frustratingly long time.
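
The home version of the remedy is simple enough that its arithmetic fits in a few lines. The proportions used here, roughly six level teaspoons of sugar and half a teaspoon of salt per liter of boiled water, are the commonly cited home-mix rule of thumb rather than figures given in this article; treat the sketch as an illustration of the idea, not as medical guidance.

```python
# Illustrative only: commonly cited home-mix proportions for oral rehydration
# solution (about 6 level tsp sugar and 1/2 tsp salt per liter of boiled water).
# These figures are an assumption for this sketch, not a prescription.

def home_ors_mix(liters_of_water):
    """Return (teaspoons of sugar, teaspoons of salt) for a given volume of water."""
    return 6 * liters_of_water, 0.5 * liters_of_water

sugar_tsp, salt_tsp = home_ors_mix(2)
print(f"For 2 liters of boiled water: {sugar_tsp} tsp sugar, {salt_tsp} tsp salt")
```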

That finally changed in 1971, after Bangladesh’s fight for independence from Pakistan sent a flood of refugees across the border into India. Before long, a vicious outbreak of cholera had arisen in the crowded refugee camps outside Bangaon. A Johns Hopkins-educated physician and researcher named Dilip Mahalanabis suspended his research program in a Kolkata hospital lab and immediately went to the front lines of the outbreak. He found the victims there pressed against one another on crowded hospital floors coated in layers of watery feces and vomit.

Mahalanabis quickly realized that the existing IV protocols were not going to work. Only two members of his team were even trained to deliver IV fluids. “In order to treat these people with IV saline,” he later explained, “you literally had to kneel down in their feces and their vomit.”

And so Mahalanabis decided to embrace the low-tech approach. Going against standard practice, he and his team turned to an improvised version of oral rehydration therapy, delivering it directly to patients like those sprawled on the floor of the Bangaon hospital. Under Mahalanabis’s supervision, more than 3,000 patients in the refugee camps received O.R.T. The strategy proved to be an astonishing success: Mortality rates dropped by an order of magnitude, to 3 percent from 30 percent, all with a vastly simpler method of treatment.

Inspired by the success, Mahalanabis and his colleagues started a widespread educational campaign, with fieldworkers demonstrating how easy it was for nonspecialists to administer the therapy themselves. “We prepared pamphlets describing how to mix salt and glucose and distributed them along the border,” Mahalanabis later recalled. “The information was also broadcast on a clandestine Bangladeshi radio station.” Boil water, add these ingredients and force your child or your cousin or your neighbor to drink it. Those were the only skills required. Why not let amateurs into the act?

In 1980, almost a decade after Bangladeshi independence, a local nonprofit known as BRAC devised an ingenious plan to evangelize the O.R.T. technique among small villages throughout the young nation. Teams of 14 women, each accompanied by a cook and a male supervisor, traveled to villages, demonstrating how to administer oral saline using only water, sugar and salt. The pilot program generated encouraging results, and so the Bangladeshi government began distributing oral hydration solutions in hundreds of health centers, employing thousands of workers.

The Bangladeshi triumph was replicated around the world. O.R.T. is now a key element of UNICEF’s program to ensure childhood survival in the Global South, and it is included on the World Health Organization’s Model List of Essential Medicines. The Lancet called it “potentially the most important medical advance of the 20th century.” As many as 50 million people are said to have died of cholera in the 19th century. In the first decades of the 21st century, fewer than 66,000 people were reported to have succumbed to the disease, on a planet with eight times the population.

Image: Rahima Banu, top left, in 2000. In 1975, when she was a toddler, she was recorded as having the last known infection of naturally occurring smallpox in the world. Credit: C.D.C./World Health Organization; Dr. Stanley O. Foster

Of all the achievements that brought the great escape to the entire world, though, one stands out: the vanquishing of smallpox. After thousands of years of conflict and cohabitation with humans, the naturally occurring variola major virus infected its last human being in October 1975, when the telltale pustules appeared on the skin of a Bangladeshi toddler named Rahima Banu. (A less deadly cousin of the virus, variola minor, was eliminated in Somalia two years later.) Banu lived on Bhola Island, on the coast of Bangladesh, at the mouth of the Meghna River. Officials from the World Health Organization were notified of the case and sent a team to treat the young girl. With local field workers, they vaccinated 18,150 individuals who lived within a 1.5-mile radius of her house. She survived her encounter with the disease, and the vaccinations on Bhola Island kept the virus from replicating in another host.

Four years later, after an extensive global search for lingering outbreaks, a commission of scientists signed a document on Dec. 9, 1979, certifying that smallpox had been eradicated. In May of the following year, the World Health Assembly officially declared that “the world and all its peoples have won freedom from smallpox” and paid tribute to all the nations “which by their collective action have freed mankind of this ancient scourge.” It was a truly epic achievement, one that required a mix of visionary thinking and on-the-ground fieldwork spanning dozens of different countries. Dec. 9, 1979, should be commemorated with the same measure of respect that we pay to the moon landing: a milestone in the story of human progress.

The original advocates for vaccination, back in Edward Jenner’s age, dreamed of wiping the smallpox virus off the face of the earth. On the eve of his first term as president, Thomas Jefferson wrote about removing smallpox from “the catalog of evils.” But in the early 1800s, the fight against variola was progressing on a patient-by-patient basis. Eradicating smallpox entirely on a global scale was a technical impossibility. What moved smallpox eradication from an idle fantasy to the realm of possibility?

One key factor was a scientific understanding about the virus itself. Virologists had come to believe that variola could survive and replicate only inside human beings. Many viruses that cause disease in humans can also infect animals — think of Jenner’s cowpox. But variola had lost the ability to survive outside human bodies; even our close relatives among the primates are immune. This knowledge gave the eradicators a critical advantage over the virus. A traditional infectious agent under attack by a mass vaccination effort could take shelter in another host species — rodents, say, or birds. But because variola had abandoned whatever original host brought it to humans, the virus was uniquely vulnerable to the eradication campaign. If you could drive the virus out of the human population, you could truly wipe it off the face of the earth.

Scientific innovations also played a crucial role in the eradication projects. A C.D.C. consultant named William Foege promoted a “ring vaccination” technique that helped clear smallpox from infected areas without having to vaccinate every single person. The invention of the “bifurcated” needle allowed fieldworkers to use what was called a multiple-puncture vaccination technique. Like O.R.T., the bifurcated needle was much less technologically advanced than its predecessor: the expensive “jet injectors” that were previously used in mass vaccination efforts. It also required less than a quarter of the vaccine that earlier techniques did, an essential attribute for organizations trying to vaccinate millions of people around the world. And like O.R.T., it democratized the field, making it easier for nonspecialists to perform vaccinations. Another crucial asset was a heat-stable vaccine, developed around 1950, that could be stored for 30 days unrefrigerated, an enormous advantage in distributing vaccines to small villages that often lacked refrigeration and electricity.
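
The ring idea itself can be captured in a small sketch. What follows is a conceptual illustration, not the W.H.O.’s actual field protocol, and the contact network is invented: starting from a known case, walk outward through its contacts and its contacts’ contacts, and vaccinate that ring rather than the whole population.

```python
from collections import deque

def ring_to_vaccinate(contacts, index_case, ring_depth=2):
    """Hypothetical helper: people within `ring_depth` contact-steps of the case."""
    ring, frontier = set(), deque([(index_case, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if person in ring or depth > ring_depth:
            continue
        ring.add(person)
        for neighbor in contacts.get(person, []):
            frontier.append((neighbor, depth + 1))
    ring.discard(index_case)  # the case is isolated and treated, not vaccinated
    return ring

# Invented contact lists for a hypothetical village.
contacts = {
    "case": ["a", "b"],
    "a": ["case", "c"],
    "b": ["case"],
    "c": ["a", "d"],
    "d": ["c"],
}
print(ring_to_vaccinate(contacts, "case"))  # a, b and c form the ring; d lies outside
```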

Image
Bifurcated needles, a key technology that helped democratize vaccination against smallpox.Credit...SSPL/Getty Images

But another key breakthrough was the development of institutions like the W.H.O. and the C.D.C. themselves.

Starting in the mid-1960s, the W.H.O. — led by a C.D.C. official, D.A. Henderson — worked in concert with hundreds of thousands of health workers, who oversaw surveillance and vaccinations in the more than 40 countries still suffering from smallpox outbreaks. The idea of an international body that could organize the activity of so many people over such a vast geography, and over so many separate jurisdictions, would have been unthinkable at the dawn of the 19th century.

But as with chlorination and oral rehydration therapy, smallpox eradication was a triumph of bottom-up organization. Just locating smallpox outbreaks in countries as vast as India, in an age without cellphones and the internet and in many cases electricity, was a feat of staggering complexity. The ring-vaccination approach offered a more efficient use of the vaccine — as opposed to simply vaccinating the entire population — but officials still needed to find the cases to build the ring around.
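
For readers who like to see the logic spelled out, here is a minimal sketch of the containment step in code. It is purely illustrative, not the W.H.O.’s field protocol: the household list, the coordinates and the build_ring helper are invented for this example, and the real campaign relied on field workers and case registers rather than location data. It simply selects the households that fall inside a 1.5-mile vaccination ring like the one drawn around Rahima Banu’s home on Bhola Island.

```python
from math import radians, sin, cos, asin, sqrt

def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # Earth's radius is roughly 3,959 miles

def build_ring(case_location, households, radius_miles=1.5):
    """Return the households that fall inside the vaccination ring around a reported case."""
    case_lat, case_lon = case_location
    return [
        h for h in households
        if distance_miles(case_lat, case_lon, h["lat"], h["lon"]) <= radius_miles
    ]

# Hypothetical data: one reported case and two surveyed households.
case = (22.68, 90.64)  # rough coordinates for Bhola Island, for illustration only
households = [
    {"id": "hh-1", "lat": 22.69, "lon": 90.65},  # about a mile away: inside the ring
    {"id": "hh-2", "lat": 22.90, "lon": 90.80},  # roughly 18 miles away: outside it
]
print([h["id"] for h in build_ring(case, households)])  # -> ['hh-1']
```

The hard part, as the campaign discovered, was never the geometry; it was learning where the cases were in the first place.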

In India alone, that kind of surveillance work required thousands of district health personnel, and more than a hundred thousand fieldworkers, overcoming challenging physical conditions and local resistance to do their work. And even that wasn’t a big enough labor force to track every single outbreak in the country. Eventually the eradicators decided to widen their surveillance network further, by offering a reward to anyone who reported a smallpox case. (The reward money increased steadily as the smallpox caseload dropped, ultimately reaching the equivalent of $1,000.) The wide-network approach proved to be a spectacular success. Outbreaks dropped precipitously during the last four months of 1974: 2,124 to 980 to 343 to 285. During the final stages of the project, fieldworkers would visit each of the country’s 100 million households — once a month in endemic states, once every three months throughout the rest of the country — to trace the remaining spread of the virus.

Eradication was ultimately as dependent on that wide network as on the bifurcated needle or any other technological advance. Smallpox eradication might have been originally dreamed up in the headquarters of public-health institutions in Atlanta and Geneva, but it took an army of villagers to make it a reality.

It is fitting that what is arguably the most impressive feat in the history of health revolved around smallpox, because the very first breakthroughs that made a material difference in extending our lives — variolation and vaccination — were also attempts to lessen the threat of that terrible disease. But the list of new ideas that propelled the great escape is long and varied. Some of them took the form of tangible objects: X-ray machines, antiretroviral drugs. Some of them were legal or institutional in nature: the creation of the Food and Drug Administration, seatbelt laws. Some of them were statistical breakthroughs: new ways of tracking data, like the invention of R.C.T.s, which finally allowed us to determine empirically whether new treatments worked as promised, and the epidemiological studies that established a causal link between cigarettes and cancer. Some of them were meta-innovations in the way that new treatments are discovered, like the development of “rational drug design,” which moved drug development from the Fleming model of serendipitous discovery to a process built on the foundations of chemistry.

Looking forward, how likely is it that humans can continue their runaway growth in life expectancy? It’s by no means a given that we can. The infection count of the Covid-19 pandemic is still growing; even before the outbreak, the United States had experienced a significant rise in opioid overdoses and suicides — the so-called deaths of despair — which helped push the country’s life expectancy down for three years straight, the longest period of decline since the end of the Spanish flu. As the current pandemic has made clear, substantial health gaps still exist between different socioeconomic groups and nations around the world. (Provisional data suggests that African-Americans lost close to three years of expected life in 2020, while the country as a whole lost one year.)


And paradoxically, the epic triumph of doubling life expectancy has created its own, equally epic set of problems for the planet. In 1918, there were fewer than two billion human beings alive in the world, and today there are nearly eight billion. Demagogues sometimes rant about irresponsible birthrates in developing-world countries, but the truth is the spike in global population has not been caused by some worldwide surge in fertility. In fact, people are having fewer babies per capita than ever. What changed over the past two centuries, first in the industrialized world, then globally, is that people stopped dying — particularly young people. And because they didn’t die, most then lived long enough to have their own children, who repeated the cycle with their offspring. Increase the portion of the population that survives to childbearing years, and you’ll have more children, even if each individual has fewer offspring on average. Keep their parents and grandparents alive longer, and the existing population swells as the surviving generations stack up. Repeat that pattern all over the world for four or five generations, and global population can grow to eight billion from two billion, despite declining fertility rates.
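
The arithmetic behind that claim is easy to check with a toy model. The sketch below is not a demographic projection; the survival and fertility numbers are hypothetical, chosen only to show that a population can grow briskly even as births per adult fall, so long as far more children survive to have children of their own.

```python
def cohort_sizes(initial_cohort, generations, survival_rate, births_per_adult):
    """Toy model: each generation's survivors reach childbearing age and have children.
    The population grows whenever survival_rate * births_per_adult exceeds 1."""
    sizes = [initial_cohort]
    for _ in range(generations):
        survivors = sizes[-1] * survival_rate       # those who live to have children
        sizes.append(survivors * births_per_adult)  # the next generation they produce
    return sizes

# Hypothetical numbers, chosen only to illustrate the arithmetic described above.
high_mortality = cohort_sizes(1000, 5, survival_rate=0.5, births_per_adult=2.0)
low_mortality = cohort_sizes(1000, 5, survival_rate=0.95, births_per_adult=1.5)
print([round(n) for n in high_mortality])  # stays flat: 1000 in every generation
print([round(n) for n in low_mortality])   # grows ~1.4x per generation, despite 25% fewer births per adult
```

With half of each generation dying young, two births per surviving adult merely holds the line; let nearly everyone survive, and even a much lower birthrate compounds into rapid growth — the pattern that carried the world from two billion people to eight.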

All those brilliant solutions we engineered to reduce or eliminate threats like smallpox created a new, higher-level threat: ourselves. Many of the key problems we now face as a species are second-order effects of reduced mortality. Climate change, for understandable reasons, is usually framed as a byproduct of the Industrial Revolution, but had we somehow managed to adopt a lifestyle powered by fossil fuels without reducing mortality rates — in other words, if we had invented steam engines and coal-powered electrical grids and automobiles but kept global population at 1800 levels — climate change would be much less of an issue. There simply wouldn’t be enough humans to make a meaningful impact on carbon levels in the atmosphere.

Runaway population growth — and the environmental crisis it has helped produce — should remind us that continued advances in life expectancy are not inevitable. We know from our recent history during the industrial age that scientific and technological progress alone does not guarantee positive trends in human health. Perhaps our increasingly interconnected world — and our dependence on industrial livestock, particularly chickens — will lead us into what some have called an age of pandemics, in which Covid-19 is only a preview of even more deadly avian-flu outbreaks. Perhaps some rogue technology — nuclear weapons, bioterror attacks — will kill enough people to reverse the great escape. Or perhaps it will be the environmental impact of 10 billion people living in industrial societies that will send us backward. Extending our lives helped give us the climate crisis. Perhaps the climate crisis will ultimately trigger a reversion to the mean.

No place on earth embodies that complicated reality more poignantly than Bhola Island, Bangladesh. Almost half a century ago, it was the site of one of our proudest moments as a species: the elimination of variola major, realizing the dream that Jenner and Jefferson had almost two centuries before. But in the years that followed smallpox eradication, the island was subjected to a series of devastating floods; almost half a million people have been displaced from the region since Rahima Banu contracted smallpox there. Today large stretches of Bhola Island have been permanently lost to the rising sea waters caused by climate change. The entire island may have disappeared from the map of the world by the time our children and grandchildren celebrate the centennial of smallpox eradication in 2079.

What will their life spans look like then? Will the forces that drove so much positive change over the past century continue to propel the great escape? Will smallpox turn out to be just the first in a long line of threats — polio, malaria, influenza — removed from Jefferson’s “catalog of evils”? Will the figurative rising tide of egalitarian public health continue to lift all the boats? Or will those momentous achievements — all that unexpected life — be washed away by an actual tide?


Steven Johnson’s article is excerpted from his 13th book, “Extra Life: A Short History of Living Longer.” Johnson is also a host of a four-part PBS/BBC series of the same title airing this month.

A correction was made on June 3, 2021: An earlier version of this article misidentified Norman Heatley’s professional field. He was a biochemist by training, not an engineer.


A version of this article appears in print on , Page 11 of the Sunday Magazine with the headline: The Living Century.
