Throughout the history of capitalism, economic bubbles have been commonplace. They have emerged wherever liquid financial markets exist. The range challenges the imagination: from the iconic tulip bulb bubble, to gold and silver mining bubbles, to bubbles around the debt of newly established countries of unknowable wealth, to -- again and again -- real estate and stock bubbles.

The central dynamic is always the same: The price of a financial asset becomes detached from the real value of the economic asset it represents. So the price of dotcom shares in 1998–2000 soared out of any relationship with the underlying cash flows -- present or future -- of the startup companies striving to exploit the commercial promise of the Internet. Speculators in the financial asset can profit, even when the project they have financed fails.

Economic bubbles have also been necessary. Occasionally, the object of speculation has been one of those fundamental technological innovations -- canals, railroads, electrification, automobiles, aviation, computers, the Internet -- that eventually transforms the economy. In these cases, the prospects of short-term financial gain from riding a bubble mobilize far more investor capital than prudent professional investors would otherwise dole out. Moreover, the very momentum of the bubble forces those careful investors to join the herd lest their relative underperformance leave them with no funds to invest: Warren Buffett, who successfully steered clear of the great dotcom/telecom bubble of 1998–2000, is the exception that proves the rule.

Economic bubbles, as everyone knows, have also inevitably burst. And the consequences can be grave or transient. When the speculation infects the credit system that fuels the entire economy -- and especially when its object offers no prospect of increased economic productivity -- the consequences of its collapse extend far beyond the short term and are unequivocally negative, perhaps even catastrophic.

But when the damage of the speculation is limited to the market for equity and debt securities, the adverse economic consequences of the bubble’s popping may be muted. Further, when the object of speculation is a transformational technology, a new economy can emerge from the wreckage. That is why, for example, the consequences of the tech bubble in 2001 were radically different from those of the housing bubble in 2008.


So what can we learn from the history of productive bubbles that could help us anticipate where and how (if not when) the next one may emerge? Here, understanding the role of the state is singularly important.

Productive bubbles have generally followed investments by the state -- that other source of financial support for projects of uncertain economic value. For example, the bonds that financed the building of the Erie Canal in the early nineteenth century were guaranteed by the state of New York. In the mid-nineteenth century, the federal government subsidized railroad construction through massive grants of public lands. At the start of the twentieth century, the government granted AT&T a monopoly on long-distance telephony in return for universal service, which helped make voice communication ubiquitous. Following World War I, as the Roaring Twenties took off, the U.S. Navy and Herbert Hoover’s Commerce Department sponsored the creation of RCA to exploit all American patents on wireless communications, thereby launching broadcast radio. Further, it was the states that made electrification possible: Their regimes of regional monopolies and price regulations enabled massive investment in expensive infrastructure. This pattern has continued into the present day; since World War II, unprecedented government investment in science has built the platforms on which entrepreneurs and venture capitalists have danced.

After each of these booms of investment, a bust followed. During the 1880s, 75,000 miles of railroad track were laid down in the United States. During the four years following the crash of 1893, more than half of that trackage was in receivership, but no one tore up the rails. Even the crash of 1929 and the ensuing Great Depression did not reverse the electrification of the American economy. And following the bursting of the dotcom/telecom bubble in 2001, the "dark fiber" that was prematurely laid down came to be fully utilized -- and then some.

The government’s interventions in the market economy were not based on pure economic calculus. During the nineteenth century, the United States pursued mercantilist policies of protection and subsidies for domestic industry, as have all countries playing catch-up. The overriding mission was economic integration and coast-to-coast development: the canals and turnpikes, railroads and telephone lines were built in the name of America’s “manifest destiny” to expand across the continent.

In the twentieth century, the drive toward national development was followed by the imperative of national security. During World War II, science went to war on an unprecedented scale, yielding innovations from radar to the atomic bomb. And the commitment continued through the decades of the Cold War. From 1950 through 1978, federal government agencies accounted for more than 50 percent of all R&D spending. From silicon to software and the Internet, the entire array of information and communication technologies that we use today originated in government programs aimed at promoting national security.

State agencies not only funded scientific research; they also served as creative and collaborative customers for the products that followed. They pulled the suppliers down the learning curve to low-cost, reliable production. In other words, they rendered new technologies ripe for commercial exploitation.

Washington was not the only national capital to sponsor the computer revolution. In direct contrast with their European counterparts, however, the U.S. agencies -- the Defense Department, NASA, and the Atomic Energy Commission -- did not pick "national champions." Rather, competition for contracts was open to such emerging players as Texas Instruments and Intel. And government agencies insisted on a transparent intellectual property regime, which created a reservoir of accessible technology that private-sector entrepreneurs could draw on in the following decades.

In the second half of the twentieth century, conquering disease came to complement national security as a motive for state investment. U.S. President Richard Nixon’s metaphor of the “war on cancer” represented more than a play on words; it invoked an open-ended commitment that transcended cost-benefit analysis, one that has underwritten the budgets of the National Institutes of Health for a generation.

By coincidence, just as the computer technologies that the federal government had fostered were maturing in the 1980s, the first wave of modern biotechnologies also came into public view. And the hugely successful initial public offerings (IPOs) of Apple Computer and Genentech in 1980 marked the end of seven years characterized by an utter lack of exuberance in the stock market.


Although delayed by Federal Reserve Chairman Paul Volcker's painful defeat of inflation, the "mini-bubble" in IPOs in 1983 launched the greatest bull market in the history of capitalism, culminating in the dotcom/telecom blowout at the end of the millennium. And along the way, successive IPO windows opened up for biotech startups, continuing even after the bubble burst in 2001.

Now years, even decades, of building out the new digital economy lie before us. And with that, there will be numerous opportunities for speculation-worthy innovation: further extension of the virtual social world; making the mobile and cloud computing environments safe and reliable; moving from speech recognition to natural language understanding; extracting actionable information from big data. Progress in the biosciences has comparable potential.

But what is the next domain in which state investment and speculative mania could combine to deliver another new economy? Its first manifestations have recently lit up the sky in Germany and China. Haphazard movement toward the low-carbon economy of tomorrow is already discernible.

In fact, the first bubbles of the next economy have already been generated: In recent years, the German government has offered generous subsidies to support the rapid expansion of solar panel production, subsidies that China aggressively matched with its own. The classic pattern of stock-market speculation driving massive increases in supply, followed by price collapse and bankruptcy, has played out in both countries.

Meanwhile, the United States has tied itself into its own complicated knot. Former President George W. Bush's Energy Policy Act of 2005 authorized a program of loan guarantees "to support innovative clean energy technologies that are typically unable to obtain conventional private financing due to high technology risks." On taking office in January 2009, during the post–Lehman Brothers implosion of the economy, President Barack Obama substantially expanded this program as part of his stimulus plan. The success to date of one recipient, Tesla, in generating enough enthusiasm for its high-end electric cars to produce a micro-bubble in its stock and repay its loan has hardly offset the political cost of writing off loans guaranteed to Solyndra (advanced solar cells) and A123 (novel battery technology) when each company went bankrupt.

Washington faced a conflict between stimulating the economy in the short run and supporting the supply of alternative sources of energy in the long run. Drawing on the history of the Pentagon's sponsorship of digital technologies, the Department of Energy could have been more open, competitive, and transparent in funding multiple sources of innovative battery technology. But that would have entailed a multi-year program incompatible with the immediate need to put a floor under the collapsing U.S. economy.

At a more general level, there is no doubt that establishing the foundations for the low-carbon economy will require direct and indirect support from government on a massive scale. But the ideological and institutional barriers to such commitments are formidable.

On the ideological front, the United States has now distinguished itself among the countries of the world for the number of its political leaders who deny that climate change actually exists. Institutionally, government support for clean tech and green tech is much more difficult to mobilize than was the commitment to computing in the name of national security. An enormous, profitable, and politically entrenched conventional energy industry already exists, in contrast with the nascent information processing industry of the post–World War II era. The Advanced Research Projects Agency-Energy (ARPA-E) is explicitly modeled on the fabled Defense Advanced Research Projects Agency (DARPA). But it commands a trivial amount of resources relative to DARPA’s endowment during the 1960s and 1970s. It also lacks the set of big brothers -- collectively the Department of Defense -- whose role as customers for the new technologies was transformational. The Department of Energy could have done better in sponsoring new battery technologies, but it is not clear where it would have found the source of demand to pull the innovations sufficiently close to the commercial market for the private sector to take over.

Behind the scenes and out of sight, many of the technologies that will ultimately power the next new economy have been sponsored by what the sociologist Fred Block has called the “hidden developmental state.” The most visible and economically significant of the new production technologies, hydraulic fracturing of shale hydrocarbons (fracking), was fostered by various arms of the federal government starting in the 1970s.

It is true that fracking has taken a full generation from research and experimentation to large-scale deployment. Yet on this and other fronts, the seeds of innovation have been sown and watered. When it comes to engendering a financial mania, a wave of speculative investment at the scale necessary to construct the foundations of a new economy, what matters is a plausible story -- not the hard numbers that would satisfy rational agents of the old neoclassical economic theory. From the electric charging stations proliferating in Silicon Valley to the shale fields of North Dakota and West Virginia, by way of the busted solar bubbles of China and Germany, those stories are already beginning to accumulate.

  • WILLIAM H. JANEWAY is Managing Director of Warburg Pincus and Chairman of the Board of Trustees of Cambridge in America, University of Cambridge.