Legions of pundits have argued that the dollar's status as an international currency has been damaged by the great credit crisis of 2007-9 -- and not a few have argued that the injury may prove fatal. The crisis certainly has not made the United States more attractive as a supplier of high-quality financial assets. It would be no surprise if the dysfunctionality of U.S. financial markets diminished the appetite of central banks for U.S. debt securities. A process of financial deglobalization has already begun, and it will mean less foreign financing for the United States' budget and balance-of-payments deficits. Meanwhile, the U.S. government will emit vast quantities of public debt for the foreseeable future. Together, these trends in supply and demand are a recipe for a significantly weaker dollar. And as central banks suffer capital losses on their outstanding dollar reserves, they will start considering alternatives.
This is especially likely because these trends are superimposed on an ongoing shift toward a more multipolar world. The growing importance of emerging markets has sharply reduced the United States' economic dominance, weakening the logic for why the dollar should constitute the largest part of central-bank reserves and be used to settle trade and financial transactions.
As emerging markets grow, they naturally accumulate foreign reserves as a form of self-insurance. Central banks need the funds to intervene in the foreign exchange market so that they can prevent shocks to trade and financial flows from causing uncomfortable currency fluctuations. This capacity becomes more important as previously closed economies open up and when international markets are volatile, as has been the case recently. It is only logical, in other words, for emerging markets to accumulate reserves.
But in what form? There is a growing feeling among economists and government officials that any system that