In the late eighteenth century, the English political economist Thomas Malthus took a look at two sets of numbers and had an unnerving vision: with food supplies increasing arithmetically while the number of people grew geometrically, the world population would eventually run out of food. "By that law of our nature which makes food necessary to the life of man," he wrote in 1798, "the effects of these two unequal powers must be kept equal. This implies a strong and constantly operating check on population from the difficulty of subsistence. This difficulty must fall some where and must necessarily be severely felt by a large portion of mankind."
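Malthus's contrast between the two growth laws can be made concrete with a toy simulation. This is purely illustrative: the units and growth rates below are hypothetical, chosen only to show how a geometrically growing population must eventually overtake an arithmetically growing food supply.

```python
# Toy illustration of Malthus's argument (hypothetical units, not his data):
# food grows arithmetically (1, 2, 3, ...) while population grows
# geometrically (1, 2, 4, ...). Find the first generation in which
# population exceeds the food supply.

def first_shortfall(food_increment=1, growth_factor=2, start=1, max_generations=100):
    food, population = start, start
    for generation in range(1, max_generations + 1):
        food += food_increment          # arithmetic growth
        population *= growth_factor     # geometric growth
        if population > food:
            return generation, food, population
    return None

gen, food, pop = first_shortfall()
print(gen, food, pop)  # -> 2 3 4: by the second generation, population (4) exceeds food (3)
```

Raising the arithmetic increment only delays the crossover; as long as the population multiplier exceeds one, the shortfall eventually arrives, which is the core of Malthus's point.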
He was right, at least at the time: in Malthus' day, food production was essentially limited by the availability of land, whereas procreation faced few restraints. Malthus did not foresee, however, that new technologies in the late nineteenth century and throughout the twentieth century would dramatically raise agricultural productivity. Farmers worldwide learned to use new fertilizers, petrochemical-based herbicides and insecticides, genetically improved plants (especially wheat, corn, and rice), and massive diversions of water for irrigation, notably in China and South Asia. Crop yields soared, so much so in the United States that by the 1950s chronic surpluses and low prices were becoming problems. The economist Willard Cochrane wrote in 1965 that thanks to the recent technological revolution in U.S. agriculture, the previous decade had witnessed "the greatest gain in productive efficiency of any ten-year period in the history of American farming."
Throughout the 1960s, 1970s, and 1980s, crop yields continued to rise, not only in rich countries but also in many parts of the developing world. In India, Mexico, and elsewhere the "green revolution" was launched by plant breeders, such as the legendary Norman Borlaug. New varieties of wheat, maize, and rice raised yields by amounts that seemed miraculous at the time. The effort provided a new model for traditional farmers and improved their food security. And it encouraged a sense of purpose for agricultural research: to end world hunger. But it also exacerbated the disadvantages of poor, landless farmers relative to land-rich ones, who could afford the innovations. Landed farmers could find the credit to invest in irrigation and purchase high-yielding seeds, but those without access to credit, and thus the new inputs, were left behind.
In the United States, farmers who were quick to adopt the new technologies outpaced their neighbors who were using outmoded methods and then bought them out, enlarging their land holdings and enhancing their eligibility for government subsidies, which are based on the amount of land in production. In the rest of the developed world, successful farmers became more productive and richer, and the income gap between those farmers and landless rural wage earners widened. Nevertheless, the Malthusian problem seemed to have been largely solved, at least in the aggregate. Yet as farmers in rich countries continued their race toward even higher yields, low-technology agriculture in the poorest countries, such as Bangladesh and Mali, fell further behind.
By some measures, the overall situation has continued to improve. According to the U.S. Department of Agriculture, wheat yields in the United States rose from roughly 26 bushels per acre in 1965 to roughly 43 bushels per acre in 1998 and then to roughly 45 bushels per acre in 2008. Over the same periods, corn yields rose from about 74 bushels per acre to about 134 bushels per acre and then to 154 bushels per acre. This progress, however, has also bred overconfidence or, as the Nobel Prize-winning economist Amartya Sen put it, "Malthusian optimism." This overconfidence, in turn, has bred a sort of complacency about those still in need. In their recent book Enough, the Wall Street Journal reporters Roger Thurow and Scott Kilman note that when food outputs outstrip population growth, the fear of famine eases and "the people who are still too poor to eat enough seem to evaporate from public consciousness." As Thurow and Kilman explain, clear advances in food production in the United States, South America, and parts of Asia throughout the late twentieth century masked the ongoing crisis in poor countries, and international development efforts to spread the green revolution flagged. Meanwhile, the world's poor farmers were still unable to take advantage of the technological advances that had brought food security and economic development to others. Some scientists, philanthropists, and governments of developed nations seemed to have lost sight of what had once been the green revolution's central goal: food security for all.
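The slowdown hiding inside those USDA figures becomes visible if one converts them to compound annual growth rates. A back-of-the-envelope check, using only the yields quoted above:

```python
# Compound annual growth rate (CAGR) of the USDA yield figures cited above.
def cagr(start, end, years):
    """Average annual growth rate that takes `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# Wheat: 26 -> 43 bu/acre over 1965-1998, then 43 -> 45 bu/acre over 1998-2008
print(f"wheat 1965-1998: {cagr(26, 43, 33):.2%}")   # ~1.54% per year
print(f"wheat 1998-2008: {cagr(43, 45, 10):.2%}")   # ~0.46% per year

# Corn: 74 -> 134 bu/acre over 1965-1998, then 134 -> 154 bu/acre over 1998-2008
print(f"corn 1965-1998: {cagr(74, 134, 33):.2%}")   # ~1.82% per year
print(f"corn 1998-2008: {cagr(134, 154, 10):.2%}")  # ~1.40% per year
```

In annual-rate terms, wheat yield growth fell by roughly two-thirds after 1998 and corn yield growth by about a quarter, which is the pattern of diminishing gains discussed below.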
More recently, rising food prices have intensified the risks of large-scale hunger. The reasons for these increases are complex, but one of them is that demand for food is increasing as populations and incomes grow, especially in China and South Asia, even as the supply of food is increasingly being diverted to other uses, such as the production of biofuels. As a result, the specter of Malthus is again stalking the world's poor.
In June 2009, the Food and Agriculture Organization, a UN agency, projected that hunger now affects one billion people -- about one-sixth of the world's population -- due to "stubbornly high food prices" and the global economic slowdown, which has depressed the incomes of the world's poorest people. A July 2009 report by the FAO warned that "domestic prices in developing countries remain generally very high and in some cases are still at record levels." Speaking at the United Nations Conference on Trade and Development in mid-2009, Akinwumi Adesina, an agricultural economist with the Alliance for a Green Revolution in Africa, noted that the global recession's dampening of prices on commodity markets was masking "the next storm." Jacques Diouf, director general of the FAO, has stressed that the world's poor, mostly landless laborers and the residents of urban slums, both groups that are largely beyond the reach of global media, are suffering a "silent hunger crisis."
The crisis has been intensifying thanks to three ominous trends that are only now coming into focus. First, the rate of increases in crop yields appears to be slowing. Second, and this is related, agricultural research expenditures have diminished since the 1980s, especially in Africa. Third, global food supplies have begun to fall relative to demand and prices have begun to rise -- problems that are being exacerbated by the increasing use, in rich countries, of grain not only as food and feed but also as biofuel. By the middle of this decade, despite record harvests in most of the world, food shelves were increasingly empty. When the 2008 fall harvest in the Northern Hemisphere began, world grain stocks amounted to just 62 days of consumption, a near-record low. Lester Brown, founder and president of the Earth Policy Institute, wrote in May 2009 that world grain production had fallen short of consumption in six of the previous nine years.
The next month, the U.S. Department of Agriculture reported that despite near-record grain harvests in 2009, which might keep prices in check briefly, shortages could again raise the cost of food to consumers in the next several years, and perhaps even sooner. Oilseed reserves remain at low levels relative to demand, and soybean reserves are nearing a 25-year low. Greg Wagner, an analyst with AgResource, in Chicago, noted last June that "the dynamics for higher food prices are already in place, but they are being masked by problems in the larger economy" -- problems such as the crisis in global financial markets and chronic government deficits. The International Monetary Fund's index of primary commodity prices, which measures the average price variation in a group of critical food grains and oilseeds, rose from a base of 100 in 2005 to a high averaging 157 in 2008, fell to 126 in March 2009 as global demand collapsed with the economic crisis, but then rose back up to 143 in May 2009, despite weakened demand. By August 2009, at the beginning of the fall harvest season in the Northern Hemisphere, the index still stood at more than 135. Was Malthus right after all?
RESEARCH OR DESTROY
Since World War II, gains in agricultural productivity around the world have been defined by greater output per acre of land. These gains have primarily resulted from substantial increases in the use of agrochemicals, fertilizers, large farm equipment, water, and (mainly in Asia) labor. But all these inputs have come at a cost. As greater yields were coaxed from the land, the costs of extraction rose as well. Some of the added costs -- higher prices for supplies and irrigation and more hours on the tractor or behind oxen and mules -- were borne by farmers. But some costs were borne by villagers. Over-irrigation and the excessive use of fertilizers and agrochemicals polluted and depleted water supplies and sapped the soil's fertility. For example, satellite data show that due to increased crop irrigation, the level of groundwater in the aquifers of northern India fell by about four inches per year from 2002 through 2008 -- representing about the same total volume of water as melted from Alaska's glaciers over the same period. Now, the aquifers cannot replenish fast enough to maintain current yields over time. Thus, the impressive climb in average agricultural yields over the last half of the twentieth century is but a surface reality. The deeper reality is that in the twenty-first century, as water supplies and soil quality have declined, unsustainable techniques have pushed biophysical systems to their limits. Although yields have continued to increase, they have been doing so at diminishing rates.
The economists Philip Pardey and Julian Alston, who have spent their careers investigating these subtle trends, have concluded that the freight train of yield-increasing technology began slowing in the 1990s. This was due only in part to biological limits; it was also due to cutbacks in agricultural research, a result of the complacency that arose from ever-increasing yields. Pardey and Alston, among others, have demonstrated that the improvements brought by investments in research come with a lag: they peak after about 25 years, and their effects persist for as long as 25 years more. Hence, the consequences of decisions taken in the 1970s and 1980s to limit the growth in funding for agricultural research have only recently become apparent.
In real 2008 dollars, U.S. investment in agricultural development abroad fell to $60 million in 2006, down from an average of $400 million a year in the 1980s. In rich countries, public investment in research, which had grown annually by more than two percent in the 1980s, shrank by 0.5 percent annually between 1991 and 2000. Global official aid to developing countries for agricultural research fell by 64 percent between 1980 and 2003. The decline was most marked in poor countries, especially in Africa.
Exactly who was engaged in research also changed; public and private researchers increasingly switched seats. Until the late 1970s, the public sector -- especially the U.S. Department of Agriculture and the international network of agricultural research centers, known as the Consultative Group on International Agricultural Research -- had taken the lead. The CGIAR system, which includes the International Rice Research Institute in the Philippines and a maize and wheat research center in Mexico, led the green revolution. In a 2008 review of agricultural research and development (R & D) policy, Pardey, Alston, and the agricultural economist Jennifer James concluded that "support for publicly performed agricultural R&D among developed countries is being scaled back in some cases and slowing down in many others."
With public investment lagging, multinational corporations -- Monsanto, Pioneer, and Dow Chemical, to name a few -- have lured many of the most talented scientists to their private laboratories, which are better equipped and better funded than national and international research stations, especially those in developing countries. In the process, agricultural research has increasingly focused on lucrative technical projects for the private sector. Many of the resulting technological innovations, such as genetically engineered corn and soybeans, have proved both profitable and advantageous to farmers, but they have also changed the nature of the research and its beneficiaries.
The research now primarily pays off for large commercial farmers. Public research tends to cast its benefits more widely, including to many traditional farmers, whom it allows to make small but significant improvements, such as adding nutrients to the soil or replacing draft animals with mechanical tillage. Together, diminished investments in agricultural research and the shift of the research from the public sector to the private sector have redirected the benefits to large, already successful commercial farmers.
Although the Obama administration has called for renewed support for scientific research, it has shortchanged agriculture in its proposed budget for the fiscal year starting October 1, 2009. It has suggested spending close to $148 billion on R & D this year -- $555 million more than Congress allocated in fiscal year 2008 -- but the budget for the Department of Agriculture has nonetheless been cut by over six percent, according to Science News. The Department of Agriculture conducts some research itself, but the majority of its research funds are passed on to universities designated as land-grant institutions, such as the University of Wisconsin, the University of California-Davis, and Iowa State University. There, the funds end up improving productivity in agriculture by supporting public research projects. It is these public projects that are most threatened by the proposed budgets.
Politicians and the public are scarcely aware of research efforts in the life sciences; they take notice only when the media report a breakthrough, usually in connection with human health. The results of the research percolate slowly and extend over a very long time, especially in the plant sciences. But what politician will pay attention if investments in agricultural research take a generation to pay off? The problem, as the early-twentieth-century economist Arthur Pigou once put it, is that humanity's "telescopic faculty is defective" and people are, therefore, myopic; present concerns dominate the future.
Alston, Pardey, and the economist Vernon Ruttan concluded in a 2008 paper that the slowdown in productivity-enhancing research is likely a prime reason for the slowdown in agricultural productivity. "The consequences may be severe," they warned, "given expectations of global population growth and the implied growth in demand for food, in conjunction with a shrinking natural resource base and the diversion of the existing resources to produce energy crops for biofuels."
THE CORN OF PLENTY
The biofuel connection, in particular, remains as controversial today as Benjamin Senauer and one of us, Carlisle Ford Runge, argued it was in these pages in "How Biofuels Could Starve the Poor" in 2007. The poor's need for food is inextricably connected to the debate about clean energy and carbon emissions. Food is, after all, energy itself, measured in caloric units, and its production consumes energy and emits carbon. Crops such as corn, soybeans, rapeseed, and oil palm are now the primary sources of biofuels, which means that the prices for these crops track the price of oil. Most important, the demand for crops for biofuel production is huge: over a third of the corn grown in the United States in 2009 will be consumed not by livestock or humans but by ethanol factories. This has created, as Brown puts it, "an epic competition between cars and people for the grain supply."
Yet in the United States, the biofuel juggernaut continues. Legislation passed in 2005 mandated that the production of ethanol reach 7.5 billion gallons by 2012. In 2007, the Energy Independence and Security Act raised this mandate to 15 billion gallons and also mandated 20.5 billion gallons by 2015 and 36 billion gallons by 2022. Much of the total ethanol mandate was to be satisfied by corn, the rest by new sources of ethanol, such as cellulose. As a result, the mandate for corn-based ethanol also increased, from 7.5 billion gallons by 2012 under the 2005 legislation to 13 billion gallons by 2012 under the 2007 legislation. This, in turn, has driven up the demand for corn for ethanol production, which was roughly 200 million bushels per year until 2005 and rose to about 800 million bushels per year between 2005 and 2009. The Energy Independence and Security Act capped corn-based ethanol production at 15 billion gallons from 2015 onward, so that 21 billion of the 36 billion gallons mandated by 2022 are supposed to come from cellulose, among other things. But these advanced biofuels have yet to be produced commercially, which means that corn production will almost certainly have to increase further in order to feed the ethanol maw. This is all the more likely given that President Barack Obama has proposed raising the overall ethanol mandate to 60 billion gallons by 2030 -- a move that, according to Doug Koplow, founder and director of the energy consulting firm Earth Track, could cost U.S. taxpayers $1 trillion.