Against medical advice: a miner smoking in Heilongjiang Province, China, October 2015.
Jason Lee / REUTERS

For the first time in recorded history, bacteria, viruses, and other infectious agents do not cause the majority of deaths or disabilities in any region of the world. Since 2003, the number of people who die each year from HIV/AIDS has fallen by more than 40 percent. Deaths from malaria, tuberculosis, and diarrheal diseases have fallen by more than 25 percent each. In 1950, there were nearly 100 countries, including almost every one in sub-Saharan Africa, South Asia, and Southeast Asia, where at least one out of five children died before his or her fifth birthday, most of them from infectious diseases. Today, there are none. The average life expectancy in developing countries has risen to 70.

But the news is not all good. In the past, gains in longevity went hand in hand with broader improvements in health-care systems, governance, and infrastructure. That meant the byproducts of better health—a growing young work force, less deadly cities, and a shift in countries’ health-care needs to the problems of older people—were sources of wider prosperity and inclusion. Today, improvements in health are driven more by targeted medical interventions and international aid than by general development. Without that development, the changes that now accompany declines in infectious diseases are potential sources of instability: rising youth unemployment, overcrowded and underbuilt cities, surging rates of premature chronic diseases, and more migration.

Many developing countries are not investing enough to ensure that children who survive past adolescence get a good education, solid job opportunities, and high-quality health care as adults. Many rich countries, meanwhile, are embracing policies on trade, immigration, and climate change that make those tasks even harder.

There is a paradox in humanity’s progress against infectious diseases: the world has been getting healthier in ways that should make us worry. The recent hard-won gains threaten to bring a host of new and destabilizing problems. Whether the dramatic improvements in global health turn out to be a blessing or a curse depends on what the world does next.


Infectious disease, the historian William McNeill once wrote, has been “one of the fundamental parameters and determinants of human history.” Epidemics helped bring down the Roman Empire; parasites delayed the colonization of Africa; and measles, smallpox, and other infections enabled the Spanish conquest of the Aztec and Inca Empires. The invention of the printing press came partly in response to the scarcity of labor that followed the Black Death in fourteenth-century Europe.

The power of microbes over human affairs comes from several traits. Infectious diseases, by definition, spread. The risk of transmission is greatest when large numbers of people and animals come into close contact. And people who haven’t previously been exposed to a particular disease—isolated populations, such as the Aztecs and the Incas, as well as infants and children—are the most vulnerable to it. These traits mean that diseases such as smallpox and malaria shaped the outcomes of wars, enabling some conquests and thwarting others. They also explain why for most of history, the only large cities were wealthy industrial centers or the capitals of empires, such as Rome, which could draw enough migrants from the countryside to compensate for the death toll caused by dysentery, tuberculosis, and other diseases of urban density.

The measures taken to fight infectious diseases have driven human history just as much as the diseases themselves have. Preventing and controlling pestilence depend on cooperation between people and governments. Individuals and communities could isolate themselves from the risk of infectious disease only briefly, and even then only at great cost. The historian Mark Harrison has argued that starting in the fourteenth century, the need to control infectious diseases helped create the modern state by forcing local and national authorities to begin assuming greater power over their citizens’ lives.

Recent dramatic declines in plagues and parasites have brought more modest economic benefits than past declines did.

The same factors that made infectious diseases so influential through history also explain why overcoming them can lead to so much prosperity. As mortality rates from infectious diseases declined in the United States and Europe at the end of the nineteenth century, larger urban areas became more viable. Packed in cities, people swapped ideas, improved on one another’s inventions, and started successful businesses. Indeed, no country has ever grown wealthy without urbanizing first. 

Better health brought other benefits, too. Lower child mortality meant a larger share of young working-age adults in the population. Once fewer children died of infectious diseases, parents generally had fewer of them, freeing up women to join the labor force and leaving more resources to educate the children they did have. The government measures taken to reduce the incidence of infectious disease, such as quarantining the sick, mandating vaccinations, and building sewers and safe water systems, set early precedents for other forms of social regulation, such as compulsory schooling and military service, and for public investments in roads, railways, and ports. 

This combination of better health and broad social improvement was a recipe for prosperity. As the economist Robert Gordon has written, “The historic decline in infant mortality centered in the six-decade period of 1890–1950 is one of the most important single facts in the history of American economic growth.” In the 1950s, when China began a dramatic campaign against infectious diseases, it was among the world’s poorest countries and on the cusp of a famine that would kill some 30 million people. Between 1960 and 1976, average life expectancy in China rose by 21 years, despite the Great Famine and the Cultural Revolution. The success of that campaign, which was built around immunization programs and a massive rural hygiene, health-education, and sanitation effort, helped the country emerge decades later as one of the great global economic powers. Most of the countries that achieved sustained economic booms over the past 50 years did so some two decades after getting infectious diseases and child mortality under control.


This same opportunity to harness the benefits of a healthier population is now emerging in today’s poor countries. But taking advantage of it will be harder than it was in the past, since more recent improvements in public health have not been accompanied by the same economic and social benefits.

Progress against infectious diseases over the past two decades has unfolded differently from such progress in the past. When now-wealthy countries took on infectious diseases in the nineteenth and early twentieth centuries, they did so before the invention of effective medicines for most diseases. The main drivers of progress in public health were government-mandated measures—such as milk pasteurization, laws against overcrowded tenements, and investments in clean water and sewage treatment systems—and better social norms around hygiene, childcare, and girls’ education. In fact, nearly two-thirds of the gains in U.S. life expectancy since 1880 came before widespread access to antibiotics and the development of most vaccines. Only half of the decline in death rates in China and other developing countries between World War II and 1970 was due to antibiotics and immunization, with education, sanitation, and improved local oversight of public health playing large roles.

Today, large-scale quality-of-life improvements are no longer the main drivers of public health progress in many countries. Take Niger. Its citizens are poorer now, with the country’s per capita GDP at $363 in 2016, than they were in 1980, when the per capita GDP was $419. A child in Niger today can expect to receive five years of schooling, tied for the lowest amount of education in the world. The national government spends just $17 per person each year on health care. Of the 188 countries that the United Nations ranked in its 2015 Human Development Index, Niger finished second to last. Yet despite these difficulties, a person born in Niger today can expect to live to be 61 years old, 16 years longer than someone born in that country 25 years ago. Infant mortality has declined by nearly 60 percent over the same period. Death and disability from infectious diseases have fallen by 17 percent since 1990.

The average government in the developing world spends just $23 per person on health care each year.

The progress in Niger—and in other poor countries like it—reflects the tireless efforts of foreign donors, international agencies, and local governments. From 2002 to 2012, aid to address infectious diseases in poor countries rose from $11 billion to $28 billion. The returns on that investment have been impressive: longer lives, fewer dead children, and less human suffering. Yet the focus on narrow medical interventions against specific infectious diseases has come at the expense of broader investments in infrastructure, governance, good health care, and the other determinants of health. Without those societal gains, recent dramatic declines in plagues and parasites have brought more modest economic benefits than past declines did.

In fact, although extreme poverty has declined everywhere, in Africa and Southeast Asia, the places that have recently seen the greatest progress against infectious diseases, the middle class has hardly grown at all. According to the Pew Research Center, the middle class in developing countries—people making between $10 and $20 per day—expanded by 385 million people between 2001 and 2011. But that growth occurred almost exclusively in China, eastern Europe, and South America. When low-income countries achieved an average life expectancy of 60 in 2011, their median per capita GDP was $1,072, a quarter of the figure for high-income countries when they reached that same average life expectancy, in 1947. In other words, the world has gotten dramatically better at lengthening life spans and reducing child suffering in poor places, but the improvements in much of everything else that matters to people’s well-being have failed to keep pace.

A child receives a polio vaccination in Jalalabad, Afghanistan, December 2015.
Parwiz / REUTERS


The decline in infectious diseases has enabled many more people in poor countries to survive past infancy and adolescence, but their prospects for good health on reaching adulthood have not improved nearly as much. The life expectancy of a 15-year-old in the average low-income country is no better than it was in 1990. That is in large part due to the rise in chronic diseases. 

People have to die sometime, so it is unsurprising that fewer children dying from plagues and parasites means more adults dying from cancer, heart attacks, and diabetes. Yet the decline in infectious diseases does not explain why so many people in poor countries are developing chronic ailments at much younger ages and with much worse outcomes than people in wealthier countries. Deaths from hypertensive heart disease (caused by high blood pressure) among people under 60 have increased by nearly 50 percent in sub-Saharan Africa in the past 25 years. In 1990, heart disease, cancer, and other noncommunicable diseases caused about a quarter of deaths and disabilities in poor countries. By 2040, that share is expected to jump as high as 80 percent in countries that are still quite poor. At that point, the share of total deaths and disabilities from chronic diseases in Bangladesh, Ethiopia, and Myanmar, for example, will be roughly the same as it is in the United Kingdom and the United States, but the diseases will affect much younger people.

Part of the problem is that most noncommunicable diseases are chronic, require more sophisticated health-care infrastructure, and cost more to treat than infectious diseases. Yet the average government in the developing world still spends just $23 per person on health care each year. In comparison, the United Kingdom spends $2,695 per person on health care, and the United States spends $3,860. So great is the disparity that in 2014, the governments of all 48 sub-Saharan African countries together spent less on health care ($67 billion) than the government of Australia did ($68 billion).

International donors have been slow to adjust to a world in which infectious diseases no longer pose the chief threat to public health. Although noncommunicable diseases now cause the majority of deaths in developing countries, they receive less than two percent of annual health aid. It is simply unsustainable to spend a lot of money to save someone from a preventable and treatable infectious disease in childhood only for that same person to succumb to a preventable and, in many cases, equally treatable chronic disease in middle age, when he or she will leave behind a family that needs to be cared for and a job that needs to be done. Those knock-on effects mean that noncommunicable diseases, in addition to claiming lives, also sap the labor force and diminish economic productivity. The World Economic Forum projects that these diseases will cost developing countries some $21 trillion in lost economic output between 2011 and 2030.


Perhaps no places have been more affected by the rise and fall of infectious diseases than cities. History remembers the great urban epidemics, such as the Plague of Athens, which reduced the city to “unprecedented lawlessness,” according to the Greek historian Thucydides. But it was everyday killers—tuberculosis, typhoid fever, and other food- and fecal-borne diseases—that for millennia made large cities deadly for their inhabitants. In the late seventeenth century, John Graunt, an amateur demographer in England, noted that London was recording significantly more deaths than christenings and that about 6,000 migrants had to come from the countryside each year to make up the shortfall. In the United States, as late as 1900, life expectancy was ten years higher in rural areas than in towns and cities.

The combination of public health reform, laws against overcrowded tenements, and better sanitation revolutionized urban health. In 1857, no U.S. city had a sanitary sewer system; by 1900, 80 percent of Americans living in cities were served by one. According to the economists David Cutler and Grant Miller, improved access to filtered and chlorinated water alone accounted for nearly half of the decline in mortality in U.S. cities between 1900 and 1936. Clean running water had the secondary benefit of enabling more manufacturing, especially in the textile sector, and indoor plumbing freed women from the drudgery of carrying fresh water into their homes and dirty water out of them. Building these waterworks and sanitation systems also marked the first major undertakings for many city governments that required significant public financing, usually in the form of long-term bonds. Having learned how to finance big projects, city councils later turned to the same methods to build railways, ports, highways, canals, and electrical grids. 

As infectious disease rates fall in developing countries, their cities are now experiencing population booms. By 2020, 1.48 billion more people will live in cities than did in 2000, and the vast majority—1.35 billion of them—will be in lower-income countries. But urban infrastructure has not kept pace, leaving many city dwellers living in slums. The UN estimates that in 2014, roughly one out of every eight humans, some 881 million people, lived in slums in poor countries. Some 96 percent of the urban population of the Central African Republic, for example, lives in slums. By 2030, the global population of slum dwellers is expected to reach two billion.

Although the urban residents of poor nations are healthier than their parents and grandparents, many do not enjoy the accompanying benefits that residents of now-wealthy metropolises once did. In too many developing countries, the electricity in cities is unreliable. The municipal water systems are old and poorly maintained and suffer from low or intermittent water pressure, which reduces the effectiveness of adding chlorine to kill bacteria and other microbes. Waste treatment plants are rare in Africa and Asia and treat only 15 percent of municipal wastewater in Latin America. 

Many urban transportation networks have also failed to keep up with all the extra people. In the past ten years, according to the World Bank, the average driving speed in Dhaka, the capital of Bangladesh, which has nearly 16 million inhabitants, has dropped to less than four miles an hour, little faster than walking. Sitting in traffic consumes 3.2 million of Dhaka’s residents’ work hours each day. Clogged roads, slums, and overwhelmed electrical and sewer systems threaten to cancel out the economic benefits that urbanization usually provides. If that pattern persists, fast-growing cities in developing countries may become the first to keep their residents poor rather than make them rich.

The slum of Petare in Caracas, Venezuela, February 2018.
Marco Bello / REUTERS


Both the promise and the peril of the recent decline in infectious diseases are most acute in sub-Saharan Africa. By 2035, more sub-Saharan Africans will be reaching working age (15 to 64) each year than will people in the rest of the world combined. Each year for the next ten years, 11 million young people in sub-Saharan Africa will join the job market. 

In the past, countries with a fast-growing work force of young adults employed them in labor-intensive manufacturing industries, from the textile mills in nineteenth-century Lancashire to smartphone factories in Shenzhen today. Yet manufacturing has made up the same share of economic output in sub-Saharan Africa since the 1960s. In 2010, only seven percent of the region’s work force was employed in factories, compared with 15 percent in Asia and 12 percent in Latin America. On top of that, agricultural employment is falling as climate change makes it harder for African farmers to earn a living. And most of the private-sector jobs created in the region over the past two decades have been in temporary or day labor.

Progress against infectious diseases cannot be measured just in terms of the lives that were once lost to plagues and parasites.

Some of the reasons for the lack of manufacturing jobs in sub-Saharan Africa are unsurprising: too few roads and ports, too little access to reliable electricity, too much corruption, and too many cumbersome labor regulations. Robots in wealthy countries are doing more and more of the jobs for which companies might have once hired low-skilled workers in poorer countries. But the biggest factor operating against manufacturing in sub-Saharan Africa may be that the decline of infectious diseases arrived in the region too late.

With improved health, the working-age populations of many African countries are growing, but they face stiff competition from workers in China and other countries that achieved their big gains against plagues and parasites earlier. A few African countries, such as Ethiopia, have made some headway in textile manufacturing, but the wages in China and other low-cost Asian labor centers are not rising fast enough to push most factory owners to leave for Africa. That poses a problem for African countries trying to build a domestic consumer base, make inroads into global markets, and employ their more than 200 million young people.

One alternative is to increase employment in the service sector, but here, too, poor countries are running into problems. Many service-sector jobs require specialized education—medical school, law school, an accounting degree. Although sub-Saharan Africa has greatly increased school attendance, the World Bank reports that as many as 40 percent of children in the region still do not meet basic learning standards in numeracy and around half fall short in literacy. Lower-skilled services, such as building and grounds maintenance, are harder to automate and can employ large numbers of people, but they do not offer the same track to the middle class that manufacturing jobs do.

Demographics are working against many poor countries just when a young, growing work force should be the engine of economic prosperity. The World Bank estimates that the working-age population in developing countries will increase by 2.1 billion by 2050. Unless national employment rates improve, that will mean nearly 900 million more young adults without work. A disproportionate number of young unemployed or underemployed adults can lead to social unrest, particularly in weak or corrupt states already riven by ethnic or religious conflicts.


In the past, people have often responded to dramatic reductions in infectious diseases and potentially destabilizing youth bulges by emigrating. The nineteenth-century wave of migration from Europe to North America came primarily from countries with a surplus of young adults, around 20 years after sharp declines in infant mortality. In the 1980s and 1990s, similar demographic factors pushed large numbers of people to migrate from Latin America and the Caribbean, the Middle East, and South Asia to Spain, the United Kingdom, and the United States.

Today, it is sub-Saharan Africans who are on the move. Between 1990 and 2013, the number of economic migrants leaving sub-Saharan Africa increased sixfold, from less than one million to six million each year. Most went to France, Italy, Portugal, the United Kingdom, and the United States. In 2016, 311,000 migrants passed through Niger on their way to Europe, and over 5,000 of them died trying to cross the Mediterranean in ramshackle boats. Most came from Niger, Nigeria, and neighboring poor countries that all experienced sharp declines in child mortality and infectious diseases in the past 20 years. 

This wave of migration has sparked a backlash in the United States and Europe, where more and more politicians are campaigning for office—and sometimes winning—on platforms opposing immigration and espousing economic nationalism. Yet limiting trade undermines economic growth, making it even harder for developing countries to generate enough jobs to keep pace with their rising numbers of young adults. 

Populist politicians in the United States, Europe, and the rest of the developed world must come to terms with the inconsistencies in their policies on global health, trade, and immigration. More economic opportunity alone will not stop all young people from emigrating—their aspirations often go beyond a better job—but it can make waves of migration shorter, less desperate, and less intense. American and European voters can choose opposition to trade from low-wage countries or opposition to immigration; they cannot have both.

The lesson is not that progress against disease is not worthwhile or that it came too soon to developing nations. Nor is it that the war against microbes is over: global health threats, such as pandemic flu and antibiotic-resistant bugs, still loom. There is no worthier goal than reducing unnecessary pain and preventing deaths, especially among children. And a dire future is not inevitable; healthier populations can still lead poor countries to prosperity, just as they did in the past. 

To make sure that they do, the world needs to pair global health aid with investments that can help countries improve their health-care systems, make their cities more livable, and enable their companies to employ more people more productively. Voluntary family-planning and girls’ education initiatives have helped reduce fertility rates in countries, such as Senegal, to sustainable levels and better integrate women into the economy. Programs that encourage private investors to put their money toward building infrastructure and electricity generation, such as the U.S. government’s Power Africa initiative, which aims to get enough government and private investment to provide energy to 20 million African households and companies by 2030, can make it easier for entrepreneurs to start businesses and for factories to hire more young workers.

At the same time, developing countries need to devote more resources to their cities and health-care systems. Establishing more easily enforceable urban land rights can promote investment in formal housing, free up workers to move to find jobs, and create the foundation for a system of property taxes. Strong health-care systems can help doctors spot disease outbreaks quickly and diagnose chronic diseases early enough that patients can still be treated. That makes investments in basic health-care infrastructure a cost-effective way to improve public health. Brazil’s Family Health Strategy, for example, covers more than half the population, costs the government roughly $50 per person per year, and has sharply reduced deaths from heart disease, diabetes, and infectious diseases. 

Progress against infectious diseases cannot be measured just in terms of the lives that were once lost to plagues and parasites. The real miracles in global health will happen when the people whose lives are saved by better health care can seize the opportunities and gain the prosperity that have come with health improvements in the past.