The Transformation of Diplomacy
How to Save the State Department
During the Cold War, the United States preferred to husband, rather than expend, its military power. The idea was not to fight but to defend, deter, and contain, a cold peace infinitely preferable to nuclear cataclysm. When U.S. policymakers strayed from this principle, attempting to unify the Korean Peninsula in 1950 or deploying combat troops to Vietnam in the 1960s, the results proved unhappy in the extreme.
Husbanding did not imply timidity. To impart credibility to its strategy of containment, the United States stationed substantial forces in Western Europe and Northeast Asia. For allies unable to defend themselves, U.S. garrisons offered reassurance, fostering an environment that facilitated recovery and development. Over time, regions deemed vulnerable stabilized and prospered.
Beginning in the 1990s, however, official thinking regarding the utility of force changed radically. The draft “Defense Planning Guidance” prepared in 1991 under the aegis of Paul Wolfowitz, then U.S. undersecretary of defense for policy, hinted at the emerging mood. The mere avoidance of war no longer sufficed. Describing an international order “shaped by the victory of the United States” over communism and in the just-concluded war against Iraq, the document identified opportunities to “shape the future security environment in ways favorable to [the United States].”
Shaping the future—here was an enterprise worthy of a superpower charged with fulfilling history’s purpose. Lending such expectations a semblance of plausibility was an exalted appreciation of American military might. By the early 1990s, concepts such as “defend and deter” seemed faint-hearted, if not altogether cowardly. One army field manual from that era credited U.S. forces with the ability to achieve “quick, decisive victory on and off the battlefield anywhere in the world and under virtually any conditions.” Once considered a blunt instrument, force was now to serve as an all-purpose chisel.
Rarely has a benign-sounding proposition yielded greater mischief. Pursuant to the imperative of shaping the future, military activism became the order of the day. Rather than adhere to a principled strategy, successive administrations succumbed to opportunism, cultivating a to-do list of problems that the United States was called on to solve. More often than not, the preferred solution involved the threat or actual use of force.
Putting the chisel to work gave rise to a pattern of promiscuous intervention. After 9/11, confidence in the efficacy of American military might reached its apotheosis. With his “freedom agenda” providing ideological camouflage, President George W. Bush embraced preventive war, initially targeting “an axis of evil.” U.S. military policy became utterly unhinged.
So it remains today, with U.S. forces more or less permanently engaged in ongoing hostilities. In one theater after another, fighting erupts, ebbs, flows, and eventually meanders toward some ambiguous conclusion, only to erupt anew or be eclipsed by a new round of fighting elsewhere. Nothing really ends. Meanwhile, as if on autopilot, the Pentagon accrues new obligations and expands its global footprint, oblivious to the possibility that in some parts of the world, U.S. forces may no longer be needed, whereas in others, their presence may be detrimental. During the Cold War, peace never seemed anything but a distant prospect. Even so, presidents from Harry Truman to Ronald Reagan cited peace as the ultimate objective of U.S. policy. Today, the term “peace” itself has all but vanished from political discourse. War has become a normal condition.
The next U.S. president will inherit a host of pressing national security challenges, from Russian provocations, Chinese muscle-flexing, and North Korean bad behavior to the disorder afflicting much of the Islamic world. Americans will expect Washington to respond to each of these problems, along with others as yet unforeseen. To a considerable extent, the effectiveness of that response will turn on whether the people making decisions are able to distinguish what the U.S. military can do, what it cannot do, what it need not do, and what it should not do.
As a prerequisite for restoring prudence and good sense to U.S. policy, the next administration should promulgate a new national security doctrine. In doing so, it should act promptly, ideally within the first 100 days, when presidential authority is least constrained and before the day-to-day crush of crisis management consumes the ability to act proactively.
The central theme of that doctrine should be pragmatism, with a sober appreciation for recent miscalculations providing the basis for future policy. Before rushing ahead, take stock. After all, in Afghanistan, Iraq, and elsewhere, U.S. troops have made considerable sacrifices. The Pentagon has expended stupendous sums. Yet when it comes to promised results—disorder curbed, democracy promoted, human rights advanced, terrorism suppressed—the United States has precious little to show.
Ever since President George Washington warned against foreign entanglements in his Farewell Address, doctrines have played a recurring role in guiding American statecraft. In some instances, they provide an orientation for future action, specifying intentions and reordering priorities. Such was the case with the eponymous doctrine of Truman in 1947, which committed the United States to assisting countries vulnerable to communist subversion, and that of President Jimmy Carter in 1980, which designated the Persian Gulf as a vital U.S. national security interest, adding that region to the places Washington considered worth fighting for and thereby inaugurating the militarization of U.S. policy in the Middle East. The Bush Doctrine of 2002, which announced that the United States would no longer “wait for threats to fully materialize” before striking, also falls in this category.
In other instances, doctrines aim to curb tendencies that have proved harmful. In 1969, tacitly acknowledging the Vietnam-induced limits to presidential freedom of action, President Richard Nixon warned Asian allies to ratchet down their expectations of U.S. assistance. Henceforth, Washington might provide arms and advice, but not troops. And in 1984, Reagan’s secretary of defense, Caspar Weinberger, spelled out strict requirements for intervening abroad. Both the Nixon and the Weinberger Doctrines sought to preclude further U.S. involvement in unnecessary and unwinnable wars.
Today, the United States needs a doctrine that combines both functions. At a minimum, a new national security doctrine should codify and expand on President Barack Obama’s admirable, if cryptic, dictum “Don’t do stupid stuff.” Beyond that, it should establish criteria governing the use of force and clarify the respective responsibilities of the United States and U.S. allies.
Such criteria will not, of course, apply always and everywhere. Nor should they be expected to. The Ten Commandments and the Sermon on the Mount do not encompass every conceivable circumstance, yet they remain useful guides to human conduct. It is the absence of appropriate guidelines that invites stupid stuff—as evidenced by the persistent misapplication of U.S. military power in recent years.
A new U.S. national security doctrine should incorporate three fundamental provisions: employ force only as a last resort, fully engage the attention and energies of the American people when going to war, and enjoin U.S. allies capable of providing for their own security to do just that.
Back in 1983, Reagan assured Americans and the world at large, “The defense policy of the United States is based on a simple premise: The United States does not start fights. We will never be an aggressor.” As was often the case with “the Gipper,” words and actions aligned only imperfectly, with U.S. military intervention on behalf of Saddam Hussein’s Iraq in its war of aggression against Iran offering but one example. Still, Reagan was right that the United States would do well to avoid starting fights. The next president should return to that position, explicitly abrogating the Bush Doctrine and permanently renouncing preventive war. He or she should restore defense and deterrence as the principal mission of U.S. forces.
Strong legal and moral arguments favor such a posture. Yet the principal rationale for using force only as a last resort—and, even then, strictly for defensive purposes—is not to uphold the rule of law or to abide by some moral code. Rather, it is empirical. When weighing pain against gain, preventive war just doesn’t pay.
Post–Cold War illusions about employing violence to shape the international order stemmed from specific assumptions about changes in the nature of war that had ostensibly endowed the United States with something akin to outright military supremacy. Thoroughly tested in Afghanistan and Iraq, those suppositions have proved utterly false. Even in an era of big data, pilotless aircraft, and long-range precision-guided weapons, the nature of war remains fixed. Today’s war managers, accessing battlefield imagery fed directly into their headquarters hundreds or thousands of miles from the fight, are hardly better informed than the “chateau generals” of World War I, who peered at maps depicting the western front and fancied themselves in charge. War remains what it has always been: an arena of chance that is exceedingly difficult to predict or control. As always, surprise abounds.
Along with prerogatives, power confers choice. As the world’s most powerful nation, the United States should choose war only after having fully exhausted all other alternatives and only when genuinely vital interests are at stake. The point is not to specify a fixed hierarchy of interests and then to draw a line, everything above which is worth fighting for and everything below which isn’t. That’s a losing game. Rather, the point is to restore a bias in favor of restraint as an antidote to the penchant for reckless or ill-considered interventionism, which has cost the United States dearly while reducing places like Iraq and Libya to chaos. No more ready, fire, aim. Instead, keep the weapon oiled and loaded but holstered.
When the state does go to war, however, so, too, should the nation. Since the end of the Cold War, the prevailing practice in the United States has been otherwise, reflecting expectations that a superpower should be able to wage distant campaigns while life on the home front proceeds unaffected. During the wars in Afghanistan and Iraq—the longest in U.S. history—the vast majority of Americans heeded Bush’s post-9/11 urging to “enjoy life, the way we want it to be enjoyed.” The we-shop-while-they-fight contract implicit in this arrangement has undermined U.S. military effectiveness and underwritten political irresponsibility.
The next administration will inherit a deeply flawed civil-military relationship that dates back to the Vietnam War. Nearly half a century ago, disenchantment with that conflict led Americans to abandon the citizen-soldier tradition that until then had formed the foundation of the U.S. military system. By rescinding their prior acceptance of conscription, the American people effectively opted out of war, which became the exclusive purview of regulars—the “standing army” that the founders had warned of.
As long as the United States confined itself to small-scale contingencies, such as invading Grenada or bombing Kosovo, or to campaigns of limited duration, such as the Gulf War of 1990–91, the arrangement worked well enough. In an era of long wars, however, its shortcomings have become glaringly apparent. When the invasions of Afghanistan and Iraq produced twin quagmires, the United States found itself requiring more soldiers than war planners had anticipated. Avenues that in the past had enabled the country to field large armies—in the nineteenth century, summoning masses of volunteers to the colors, and in the twentieth, relying on the draft—no longer existed. Although today more than enough young men and women are available for service, few choose to sign up. Washington’s appetite for war exceeds the willingness of military-age Americans to fight (and perhaps die) for their country.
To make up the difference, the state has resorted to expedients. It subjects the less than half a percent of Americans who do serve to repeated combat tours. It offers blandishments to foreign governments in return for token troop contributions. It hires contractors to perform functions previously assigned to soldiers. The results do not comport with recognized standards of success or even fairness. If winning implies achieving stated political objectives, U.S. forces don’t win. If fairness in a democracy implies shared sacrifice, then the existing U.S. military system is unfair.
Meanwhile, a people substantively disengaged from their military find that they have precious little say as to how that military is used. As senior officials and senior commanders experiment endlessly with ways of translating military might into some approximation of a desired outcome, flitting from “shock and awe” to counterinsurgency to counterterrorism to targeted assassination and so on, citizens awaken to the fact that they have been consigned to the status of onlookers.
Remedying this defective relationship will not be easy. A first step toward doing so should be to require the people to pay for the wars that the state undertakes in their name. When U.S. forces go off to fight in some foreign land, taxes should increase accordingly, ending the disgraceful practice of saddling future generations of ordinary Americans with debts piled up by present-day members of the national security elite. Should the next president decide that determining the outcome of the Syrian civil war or preserving the territorial integrity of Ukraine requires large-scale U.S. military action, then Americans collectively should pony up to cover the costs.
A second step follows from the first: confer on Americans as a whole the responsibility for fighting wars that exceed the capacity of regular forces. How to do this? While still filling the ranks of active-duty forces with self-selected volunteers, back up those regulars with reserves that mirror American society in terms of race, gender, ethnicity, region, and, above all, class.
Of course, the only way to create a military reserve that looks like the United States is to empower the state to require involuntary service. The trick is to make that empowerment politically palatable. In that regard, narrowly defining the state’s authority will be essential, as will ensuring that, as implemented, conscription is equitable and inclusive: no exemptions for the well-to-do.
This two-tiered formula—a standing army of volunteer professionals backed by conscription-based reserves—would require reallocating responsibilities. Small policing actions or brief punitive campaigns would remain the exclusive purview of regulars. For anything larger or more protracted, mobilizing the more numerous citizen reserves would give the population as a whole an immediate stake in an ongoing conflict, Washington’s war thereby becoming the people’s war. Of course, history offers few assurances that small wars stay small or that campaigns designed to be brief keep to schedule. In war, all slopes are slippery. An appreciation of that fact might incentivize Americans who are subject to being called up (and their families) to pay attention to how Washington employs its regulars in the first place.
To be sure, funding wars on a pay-as-you-go basis and creating conscription-based reserves would require enabling legislation. It is doubtful that today’s Congress possesses the requisite political courage to enact it. Still, there is value in articulating essential principles. This the next administration should do, initiating a long-overdue reassessment of a broken military system.
The final piece of a new U.S. military doctrine should be to put an end to free-riding. American responsibility for defending others should extend only to friends and allies unable to defend themselves. The core issue here is not one of affordability, although one may wonder why U.S. taxpayers and soldiers should shoulder burdens that others are capable of shouldering. Rather, it is one of ultimate strategic purpose.
Exercising global leadership is not an end in itself but a means to an end. Its purpose is not to accumulate clients and dependencies or to justify the existence of a massive national security apparatus. It is (or should be) to nurture a community of like-minded nations willing and able to stand on their own. Sooner or later, every parent learns that there comes a time to let go. That lesson is no less applicable to statecraft.
Europe offers a case in point. Nowhere is free-riding more pronounced and less justified. In the immediate aftermath of World War II, the battered democracies of Western Europe did need U.S. protection. Today, no more. For Europeans, the dangers that made the twentieth century such a trial have all but vanished. Those that remain are eminently manageable.
With the good news come fresh complications, of course. Chief among them is the challenge of securing a vastly expanded perimeter now encompassing over two dozen nominally united, but still largely sovereign, nation-states. In practice, threats to that perimeter are coming from two directions. From the south, waves of desperate refugees are arriving on European shores. To the east lies Russia, nursing grudges. The United States has rightly refrained from assuming responsibility for Europe’s refugee crisis. So, too, should it refrain from assuming responsibility for Europe’s Russia problem.
Understandably, when it comes to Russia, Europeans are only too happy to resurrect a division of labor dating from the onset of the Cold War, when it fell to the United States to carry most of the load. Yet today’s Russia hardly compares to the Soviet Union of yesteryear. More thug than totalitarian, Vladimir Putin is not Joseph Stalin reborn. The Kremlin’s roster of client states begins and pretty much ends with Bashar al-Assad’s Syria, not exactly an asset. When Obama disparaged Russia as a mere “regional power” after it annexed Crimea, the appraisal stung because it hit the mark. Apart from having stockpiles of essentially useless nuclear weapons, Russia lags far behind Europe in most relevant measures of power. Its population is less than one-third that of the European Union. Its economy, heavily dependent on commodity exports, is one-ninth the size of Europe’s.
Should it choose to do so, Europe—even after the British vote to leave the EU—is fully capable of defending its eastern flank. The next administration should nudge Europeans toward making that choice—not by precipitously withdrawing U.S. security guarantees but through a phased and deliberate devolution of responsibility. The sequence might go as follows: Begin by ending the practice of always having an American serve as the supreme allied commander in Europe; NATO’s next military commander should be a European officer. Then, establish a schedule for shutting down the major U.S. military headquarters in such places as Frankfurt and Stuttgart. Next, specify a date certain for terminating U.S. membership in NATO and withdrawing the last U.S. troops from Europe.
When should Washington actually cut the transatlantic umbilical cord? Allowing ample time for European publics to adjust to their new responsibilities, for European parliaments to allocate the necessary resources, and for European armies to reorganize, 2025 sounds about right. That year will mark the 80th anniversary of the victory in World War II—an eminently suitable occasion for Washington to declare “mission accomplished.” But to get things rolling, the next administration’s message to Europe should be clear from day one: ready your defenses; we’re going home.
A drawdown in Europe should mark just the beginning of an effort to overhaul the Pentagon’s global posture, which today finds the U.S. military maintaining an active presence in some 150 countries. In that regard, the new administration should revisit prevailing assumptions regarding the supposed benefits of scattering U.S. troops across the planet. Costs and benefits, rather than habit or dogma or (worst of all) domestic politics, should determine where the U.S. military goes and what it does when it gets there. Where the forward deployment of U.S. forces contributes to stability—as is arguably the case in East Asia—the next administration should affirm that presence. Yet where the evidence suggests that U.S. troops have become redundant or where U.S. military efforts show little or no signs of succeeding, it should reduce, reconfigure, or terminate that presence altogether.
Call it the corollary to Obama’s “stupid stuff” rule. When what you are doing isn’t needed (for example, U.S. Southern Command standing ready “to conduct joint and combined full-spectrum military operations” across the length and breadth of South America), ring down the curtain. When ongoing efforts, such as the never-ending war on terrorism, show few signs of progress, consider alternatives. That’s not isolationism. It’s common sense.
What are the programmatic implications of maintaining a more modest overseas presence and curbing Washington’s penchant for interventionism? The Asia-Pacific would absorb greater U.S. military attention, a shift that would leave ground forces as currently configured particularly hard-pressed to justify their existence. The active-duty U.S. Army is already shrinking; it would grow smaller still. As U.S. forces pulled out of Europe and as the failure of U.S. military efforts to pacify the Middle East became ever more evident, opportunities to trim the Pentagon’s overall spending would present themselves. Here, too, prudence dictates an incremental approach. Currently, the United States lavishes more on its armed forces than do the countries with the next seven most generously endowed militaries combined. Pegging the Pentagon budget to merely the size of the next six offers a good place to start and would free up some $40 billion per year. The prospect of reallocating that tidy sum should excite the interest of liberals and conservatives alike.
Yet such a cut, obliging the Pentagon to get by with a mere half-trillion dollars per year, would still leave the United States with easily the strongest military on the planet. The competition to ensure that it remains the strongest would pit the world’s best navy against the world’s best air force, a race that should spur innovation. Goodbye, carrier battle groups and piloted aircraft. Hello to a new generation of weapons that are more precise, more lethal, and more survivable—and better suited to a strategy of defense and deterrence.
Come November, “America First” may reemerge as a central theme of U.S. policy. Once thought to have been permanently discredited by the events of World War II, the phrase is today making a comeback, with Donald Trump, the Republican presidential candidate, employing it to signal his own predisposition when it comes to foreign affairs. Depending on how officials interpret that sentiment, the American people and the world at large may welcome or deplore its revival.
Yet whoever wins the election and whatever proclivities he or she brings to office, it will be incumbent on the next administration to undertake a critical appraisal of the country’s recent military disappointments. Formulating a new national security doctrine offers an essential step toward fulfilling that solemn duty, but only a preliminary one: the implications of such a doctrine will take years to play out.
In the meantime, proponents of the status quo will mount a fierce counterattack. Die-hard interventionists will insist that adversaries are likely to misread self-restraint as weakness. Reflexively opposing anything that might jeopardize the Pentagon’s spending, beneficiaries of the military-industrial complex will argue for redoubling efforts to achieve permanent military dominance. Leaders of the armed services, for their part, will remain preoccupied with protecting their turf and their share of the budget.
All will argue that safety lies in doing more and trying harder, leaving intact inclinations that have warped U.S. policy since the end of the Cold War. In all likelihood, however, more of the same will only make matters worse, at considerable cost to Americans and to others.