How a Great Power Falls Apart
Decline Is Invisible From the Inside
As the casualties and financial costs of the United States’ Middle Eastern wars have mounted, Americans’ appetite for new interventions—and their commitment to existing ones—has understandably diminished. The conventional wisdom now holds that the next phase in the United States’ global life should be marked by military restraint, allowing Washington to focus on other pressing issues. This position seems to be one of the few principles uniting actors as diverse as foreign policy realists, progressives, nearly all of the presidential candidates in the 2020 Democratic primary, and President Donald Trump.
It’s not hard to see why Americans would look at U.S. military involvement in Afghanistan, Iraq, and Libya and conclude that such interventions should never be repeated. The costs of these wars have been extraordinary: at a rally in Ohio in April 2018, Trump estimated them at $7 trillion over 17 years and concluded that the country has nothing to show for the effort “except death and destruction.” Although the precise financial cost depends on how one counts, what is certain is that more than 4,500 U.S. military personnel have been killed in Iraq and nearly 2,500 in Afghanistan, plus tens of thousands injured in both wars—to say nothing of the casualties among allied forces, military contractors, and local civilians. Critics of these resource-intensive operations blame them for bogging down the United States in a region of second-tier importance and distracting Washington from the greater threats of China and Russia, as well as from pressing domestic issues.
With the costs so high, and the benefits seen as low, the imperative is obvious to political leaders in both parties: get out of the existing conflicts in Afghanistan, Iraq, and Syria and avoid starting new ones. In his State of the Union address this year, Trump declared that “great nations do not fight endless wars.” Scores of House Democrats have signed a pledge to “end the forever war,” referring to the global war on terrorism and U.S. military involvement in Afghanistan, Iraq, Jordan, Niger, Somalia, Syria, Thailand, and Yemen, as have many of the Democrats running for president. Joe Biden, the former vice president and current presidential candidate, has also promised to “end the forever wars.” He has described the Obama administration’s withdrawal of U.S. troops from Iraq as “one of the proudest moments of [his] life” and has called for pulling U.S. forces out of Afghanistan.
Many experts are of a similar mind. Discussions of “offshore balancing,” a strategy in which the United States would dramatically scale back its global military presence and reduce the frequency of its interventions, were once mostly confined to the halls of academia, but today the idea is garnering new attention.
Faced with such a sweeping political consensus, one might conclude that Washington should simply get on with it and embrace restraint. The problem is that such a strategy overlooks the interests and values that have prompted U.S. action in the first place and that may for good reasons give rise to it in the future. The consensus also neglects the fact that, despite the well-known failures of recent large-scale interventions, there is also a record of more successful ones—including the effort underway today in Syria.
Assuming that nonintervention will be a central tenet of future U.S. foreign policy would, if anything, lead Americans to think less seriously about the country’s military operations abroad and thus generate not only less successful intervention but possibly even more of it. Instead of settling into wishful thinking, policymakers should accept that the use of military force will remain an essential tool of U.S. strategy. That, in turn, requires applying the right lessons from recent decades.
The first sign that the sweeping consensus around “ending endless war” is more problematic than it first appears is the telling set of caveats that emerges even among its most ardent advocates. Consider the many qualifications that Democratic presidential candidates are applying to a withdrawal from Afghanistan. Biden has said that he would bring U.S. combat troops home during his first term but that he remains open to a “residual presence” to conduct counterterrorism operations—roughly the same approach as Trump’s. Senator Cory Booker of New Jersey has promised that as president he would immediately begin a “process” to withdraw troops from Afghanistan, while somehow ensuring that the country does not again become a safe haven for terrorists. Pete Buttigieg, the mayor of South Bend, Indiana, who served as a naval officer in Afghanistan, has agreed that “it’s time to end this endless war,” and yet he envisions a peace agreement that keeps U.S. special operations forces and intelligence operatives there. Such concessions, responsible policy though they are, stop well short of terminating the United States’ longest war.
Even the most committed anti-interventionists continue to come up with exceptions. The foreign policy manifesto of Senator Bernie Sanders of Vermont, published in Foreign Affairs in June, is titled “Ending America’s Endless War,” and yet he has acknowledged that “military force is sometimes necessary, but always—always—as the last resort.” His foreign policy adviser has emphasized Sanders’ commitment to collective defense among NATO allies and has said that genocide and mass atrocities would “weigh heavily” on Sanders when contemplating military action. Advocates of offshore balancing, such as the scholar John Mearsheimer, favor using force if a regional balance of power is breaking down, and Mearsheimer has written that his approach would not preclude operations to halt genocides like the one that befell Rwanda in 1994.
Even at a rhetorical and intellectual level, then, the end of intervention is not nearly as clear-cut as today’s politicians suggest. The reality of being commander in chief complicates things further: on the campaign trail, Bill Clinton, George W. Bush, Barack Obama, and Trump each pledged to engage in fewer foreign military adventures and redirect resources toward needs at home. In office, each reluctantly proceeded to not only continue existing wars but also launch new offensives.
The result is that, according to a Congressional Research Service estimate, the United States has employed military force over 200 times since the end of the Cold War. Many of these operations have taken place in or around the Middle East, including in Afghanistan, Iraq, Libya, Somalia, Syria, and Yemen. But other, less frequently recalled interventions have occurred elsewhere, as in Bosnia, Colombia, Haiti, Kosovo, Liberia, and the Philippines. What’s more, the tendency to intervene is not simply the product of the United States’ emergence as an unbridled superpower after the Cold War. Between 1948 and 1991, during a time of supposedly stabilizing bipolar competition, the United States sent its military to fight abroad more than 50 times. American military action is not, as many believe, a feature of post–Cold War overstretch; it has been a central element of the United States’ approach to the world for decades.
Just because the United States has intervened so frequently over its history does not mean that it will continue to do so or that it should. The case against intervention generally takes five forms. And although there are elements of truth to each, they also threaten to obscure other, more complicated realities.
The first argument holds that the United States need not employ military means in response to terrorism, civil wars, mass atrocities, and other problems that are not its business. Washington has used force against terrorists in countries ranging from Niger to Pakistan, with massive human and financial expenditures. And yet if more Americans die in their bathtubs each year than in terrorist attacks, why no war on porcelain? The post-9/11 overreach, this camp contends, endures some 18 years later, having stretched well beyond eradicating the original al Qaeda perpetrators and their Afghan base. In this view, as the threats have diminished, so should American attention. The civil wars in Libya, Syria, and Yemen may be tragic, but they do not demand a U.S. military response any more than did the atrocities in Rwanda, eastern Congo, or Darfur.
Adopting such a cramped view of American interests, however, carries its own costs. Terrorism remains a threat, and the effect of successful attacks on Americans goes beyond their immediate casualties to include increased pressure to restrict civil liberties at home and wage impromptu operations abroad—operations that end up being costlier and less effective than longer-term, better-planned ones would be. After the Islamic State (or ISIS) took hold in Iraq and Syria and footage of terrorists decapitating American hostages horrified the public, Obama undertook a far larger operation than would have likely been necessary had he left a residual force in Iraq after 2011. As for genocide and civil war, certain cases can pose such serious threats to U.S. interests, or be so offensive to American values, as to merit intervention. Successive presidents have used military might to prevent, halt, or punish mass atrocities—Clinton to cease the genocide against Bosnian Muslims in the Balkans, Obama to protect the Yezidi minority in Iraq, and Trump after Bashar al-Assad’s chemical attacks against his own people in Syria. There is every reason to believe that similar cases will arise in the future.
The second argument against intervention highlights its supposedly poor track record. For all of the United States’ good intentions—stopping terrorists, ending genocide, stabilizing countries, spreading democracy—Washington simply is not very successful in its attempts. Iraq and Libya look worse today than when the wars against Saddam Hussein and Muammar al-Qaddafi began, and the Taliban currently control more of Afghanistan than at any time since 2001. Long gone are U.S. aspirations to turn these countries into democracies that would radiate liberalism beyond their borders.
Yet this argument ignores the many other times in which the use of American force worked. It ejected Saddam from Kuwait, it ended a war in Bosnia, it stopped ethnic cleansing in Kosovo, it paved the way for a democratic transition in Liberia, and it helped defeat narcoterrorists and bring temporary peace to Colombia. Even in Afghanistan, it should not be forgotten that Washington denied al Qaeda a safe haven, and in Iraq and Syria, it eliminated ISIS’ physical presence, limited the flow of foreign fighters, and liberated cities from depravity. Then there are other, harder-to-measure effects of U.S. intervention, such as enforcing norms against ethnic cleansing and deterring countries from offering terrorists sanctuary or engaging in wars of aggression. To get an accurate picture of intervention’s mixed track record, one cannot cherry-pick the disastrous cases or the successful ones.
The third argument against intervention points to the slippery slope involved in such efforts: start a military campaign, and the United States will never get out. After the 1995 Dayton peace accords formally ended the ethnic conflict in Bosnia, U.S. troops stayed in the area for ten years, and NATO retains a presence in Kosovo to this day. The United States seems to be stuck in Afghanistan, too, because without a peace deal with the Taliban, the U.S.-backed government could fall. In Iraq, Obama removed all U.S. troops, only to send them back in when ISIS established a vast presence there. Check in to a military intervention, and it often seems like you can never leave.
Once deployed, American troops often do stay a long time. But staying is not the same as fighting, and it is wrong to think of troops who are largely advising local forces the same way as one thinks about those who are actively engaged in combat. There is a stark difference between what it meant to have U.S. forces in Iraq during the peak of the war and what it means to have U.S. troops there now to train Iraqi forces—just as there is a massive gulf between deploying troops to Afghanistan during the troop surge there and keeping a residual presence to strengthen the government and its security forces. Some American interests are worth the price of continued military deployments, and the aim should be to diminish those costs in blood and treasure as the conditions stabilize. Even once they do, there may remain a case for an enduring role, particularly when the U.S. troop presence is the only thing maintaining the domestic political equilibrium, as was the case in Iraq before the 2011 withdrawal and as is true in Afghanistan today.
The fourth argument can be boiled down to the plea, “Why us?” Why must the United States always run to the sound of the guns, especially when other countries are capable of taking on such burdens and may have more skin in the game? Europe is geographically closer to Libya and Syria, at far greater risk from terrorism and refugee flows, and possesses capable military forces of its own. Middle Eastern allies have their own resources, too. The American role might not be so indispensable after all.
For all the contributions of U.S. partners, however, more often than not, only the United States has the will and the capability to lead successful military operations. France led a successful operation in Côte d’Ivoire in 2004 and in Mali in 2013, and the United Kingdom led one in Sierra Leone in 2000, but those were exceptions. Iraq would not have left Kuwait in 1991 had the United States not led the effort; mass slaughter in the Balkans during the 1990s would not have ended without a dominant U.S. role, even though it took place on European soil. In Afghanistan and Syria, U.S. allies have made it clear that they will stay as long as the United States does but will head for the exit otherwise. U.S. friends in Europe have proved decidedly uninterested in taking matters into their own hands, and when Washington has declined to meaningfully intervene itself, they have often stood idly by. In Libya after Qaddafi’s fall, the Europeans failed to impose security even as growing numbers of refugees and migrants set sail across the Mediterranean. In Syria before U.S. bombing began, they undertook no military campaign against ISIS, even as the arrival of Syrian refugees destabilized European politics. When U.S. allies do take matters into their own hands, they can make a bad situation worse. Saudi Arabia and the United Arab Emirates decided to intervene in Yemen’s civil war, but their brutal and indiscriminate campaign led to a humanitarian disaster and strengthened the very Iranian role it sought to eliminate.
The final reason most frequently offered for getting out of the intervention business relates to its costs, both direct ones—the lives lost and damaged, the dollars borrowed and spent—and opportunity costs. It is increasingly clear that China and Russia represent the foremost challenge to the United States over the long term and that the competition with them has begun in earnest. If that’s the case, why tie up scarce resources in less important military interventions?
Here, too, a dose of subtlety is in order. The prospect of great-power competition should indeed structure the United States’ coming approach to national security, but a continued focus on counterterrorism is required as well. After all, the George W. Bush administration entered office hoping to focus on China, only to see its best-laid plans upended by the 9/11 attacks. Withdrawing prematurely from terrorist safe havens such as Afghanistan, Iraq, and Syria would threaten the great-power emphasis necessary in the next phase of the United States’ global life. A major terrorist attack on U.S. soil, for instance, would likely cause Washington to once again embrace counterterrorism as its chief national security priority, leaving it more vulnerable to threats from China and Russia. Unless the United States chooses to give up its global role and focus only on Asia and Europe, it must engage in great-power competition while attending to security challenges in other regions.
Every possible intervention, past and future, raises difficult what-ifs. If presented again with a situation like that in Rwanda in 1994—800,000 lives in peril and the possibility that a modest foreign military effort could make a difference—would the United States once again avoid acting? Should it have stayed out of the bloodbath in the Balkans or intervened earlier to prevent greater carnage? Should it have left Qaddafi to attack Benghazi? Pursued al Qaeda after the 1998 attacks on the U.S. embassies in Kenya and Tanzania, perhaps obviating the need to overthrow the Taliban three years later?
In such discussions, the gravitational pull of the Iraq war bends the light around it, and for obvious reasons. The war there has been so searing, so badly bungled, and so catastrophically costly that, according to former Secretary of Defense Robert Gates, anyone thinking of a similar engagement “should ‘have his head examined,’ as General MacArthur so delicately put it.” Almost everything that could go wrong in Iraq did. What started as a war to eliminate weapons of mass destruction found none. The impulse to liberate the Iraqi people from tyranny pushed them into a civil war. The desire to open another front in the war on terrorism created far more terrorists than it eliminated. A war that some U.S. officials promised would be a “cakewalk” exacted an unbearable toll on U.S. troops, their families, and the Iraqi people themselves.
Ironically, many among Washington’s political and national security elite, especially on the Republican side, were for years unable to admit publicly that the invasion was the mistake it so clearly was. After the 2003 invasion, politics and a resistance to suggesting that American sacrifices were in vain kept such observations private. Republican political leaders’ failure to admit that the war’s costs exceeded its benefits undermined their credibility, which was already tarnished by their general support for the war in the first place. That, in turn, may have helped usher in the blunt anti-interventionism so prevalent today. Washington needs a subtler alternative to it.
U.S. military interventions take diverse forms—an isolated drone strike in a remote area of Pakistan is as different from a theoretical future war with China as is possible to contemplate. As a result, there are no precise rules about when leaders should and should not use force. Context matters, and human judgment always comes into play. Yet it is possible to sketch out several principles, informed by the experience of recent decades, that should guide the general conduct of U.S. decision-making.
The first guideline is to avoid overlearning the supposed lessons of past interventions. It’s often said that generals are always fighting the last war, and the same can be said of policymakers. Sometimes, they draw the right lessons, but sometimes, they do not. President Harry Truman sent troops north of the 38th parallel in Korea, drawing China into the Korean War, so in Vietnam, U.S. ground forces remained on their side of the demilitarized zone—which put enormous emphasis on extensive bombing campaigns against the North. Hoping to avoid a Vietnam-style quagmire, when the George H. W. Bush administration fought the Gulf War, it sought to limit its objective to the specific aim of restoring Kuwaiti sovereignty. But because Saddam was left in power, the Iraq problem festered. The second Iraq war was supposed to finish the job—but it showed how a purportedly short conflict can lead to an indefinite occupation. To prevent that from happening in Libya, Obama decided to use airpower to help oust Qaddafi but keep American boots off the ground; he was thus unable to contain the chaos that followed. And so in Syria, Obama and Trump would fight terrorists without attempting to remove Assad. Sticking to rigid lines based on prior errors can easily lead to new and different pitfalls.
Another guideline is to pick interventions that meet clear conditions and commit to those that are chosen. The United States should generally undertake interventions only when political leaders—namely, the president and a majority of Congress—believe that force is necessary to attain a clearly stated objective. They should have a reasonable expectation that allies, especially those in the region in question, will join the effort, and they should make serious efforts to enlist them. They should conclude that the benefits of a military intervention over the long run are reasonably expected to exceed the costs. And they should undertake military interventions in which they are prepared for the possibility that U.S. forces will have to stay for a long time, indefinitely if necessary.
Guidelines such as these cannot possibly supply all the answers policymakers might need, but they can point to the right questions. Requiring decision-makers to clearly define the objectives of a possible intervention, for example, will force them to distinguish between managing a problem (such as preventing Afghanistan from becoming a terrorist safe haven) and solving it (such as rendering that country a Taliban-free modern democracy). Enlisting allies in the effort should involve an honest assessment of their strengths and weaknesses, whether those allies are a leader such as Afghan President Hamid Karzai, exiles in Iraq, European troops in Libya, or the Syrian Democratic Forces. And the judgment about an operation’s likely costs and benefits should include an analysis of the success or failure of various approaches in the past, such as targeted counterterrorism operations or a full-fledged counterinsurgency campaign.
One traditional way of thinking about intervention is represented by the Powell Doctrine, developed by General Colin Powell during the Gulf War, which emphasizes the importance of using decisive force, having a clear exit strategy, and mobilizing U.S. public support. But the opposite has proved at least equally important in recent wars: there will be cases in which the employment of modest force over an open-ended timeline will be the better strategy. Policymakers’ general unwillingness to contemplate a long-term U.S. presence in a foreign country, along with their tendency to see conflicts as temporary problems that can be solved in a limited period of time, often makes them rush for the exits when the going gets tough. Had the United States not frantically sought an off-ramp in both Afghanistan and Iraq, for instance, its prospects for success in both conflicts would have been brighter—and, paradoxically, the wars might have ended sooner. Even many years after the initiation of those conflicts, sustainable, low-cost, and long-term American engagement is preferable to unconditional withdrawal.
A new set of guidelines would also take a more nuanced approach to determining whether an intervention is politically sustainable. The usual model holds that presidents should paint a picture of the threat for Americans and then elicit their support for war, hoping to wind down operations before the public grows weary of the conflict. Yet political support hinges less on a war’s duration than it does on its financial costs, casualties, and perceived progress. Reducing losses and making concrete steps toward a conflict’s stated objective are critical to maintaining popular support over the long run. Instead of suggesting that ultimate success is just around the corner, policymakers should articulate the case for an enduring engagement and then work to lower the human and financial costs associated with it.
Perhaps the most difficult guideline is to rigorously estimate the long-term costs and benefits. Although the need to run a cost-benefit analysis seems patently obvious, recent experience suggests that it is not. In the run-up to the Iraq war, for example, U.S. leaders minimized the estimated cost of troops and reconstruction aid and wildly overinflated their projections of success. During the deliberation over intervention in Libya, it appears that policymakers ignored the lesson that would-be nuclear proliferators might draw in watching the United States topple a leader who had previously turned over his weapons of mass destruction. Most important is an examination of the specific case itself, including the history of the people and the forces at play. Analogies to past wars and unrelated historical experiences, or aspirations to abstract principles—such as needing to be on the right side of history—add little value.
Applying these guidelines would rule some past and potential interventions in and others out. Intervention in the Balkans and Rwanda likely would have passed the test, particularly given the limited objectives (in the Balkans, an end to atrocities without toppling governments) and the military means required (in Rwanda, reinforcing UN peacekeepers already on the ground or jamming radio broadcasts). The 2001 decision to attack al Qaeda and the Taliban in Afghanistan would have met the mark, too, as would have the anti-ISIS campaign in Iraq and Syria, given that nonmilitary approaches were unable to shut down the safe havens. The 2003 Iraq war would not have met the test, given a realistic projection of the costs and benefits and the ever-changing objectives. In Libya, these principles would have led Washington to either mount a limited operation to stop a massacre in Benghazi and leave Qaddafi in power or stay out of the fight altogether. Instead, the Obama administration chose to topple the regime and then disengage.
For the ongoing interventions in Afghanistan, Iraq, and Syria, the guidelines would rule in favor of a residual, indefinite troop presence. Preventing these countries from regressing into terrorist hubs and, in the cases of Afghanistan and Iraq, supporting the governments that keep them from doing so are objectives that merit continued U.S. engagement. Additionally, the costs of redeploying to these countries after a descent into terror-ridden chaos—as happened in Iraq after 2011—would almost certainly be higher than the costs of remaining. Simply ignoring the emergence of terrorist sanctuaries could be even more catastrophic.
Several practical changes would help policymakers evaluate possible military interventions. To ensure that cost-benefit analyses are as accurate as possible, for example, they must be based on the entire range of possible costs down the line—not just the expected casualties and direct expenses associated with operations but also those of contractors and intelligence personnel, as well as longer-term costs, such as veterans’ care. They should also include the likely effect of military action on civilians living in the country in question and the likely effect of military inaction on the U.S. population.
Congress must also play a role far beyond its power of the purse and its ability to authorize force. For all the focus on the outdated 2001 Authorization for Use of Military Force, which permitted the use of U.S. military force against the perpetrators of 9/11, legislators would do better to concentrate on the conduct of the wars themselves. That means investigating on-the-ground conditions, measuring progress, interrogating policymakers and military leaders, and offering alternative strategies. To do that, Congress would have to use the full panoply of its informal powers to engage in oversight: conducting hearings and briefings, sending congressional delegations, initiating investigations, and so on.
Ironically, it is the counter-ISIS mission in Syria—the one that so frequently elicits calls for its end—that provides a reasonably successful example of how U.S. military intervention can work in practice. With the deployment of roughly 2,000 special operations forces, the United States armed, trained, and advised up to 70,000 local Arab and Kurdish fighters. The operation has banished Iran, Russia, and Syrian government forces from a third of the country, eliminated ISIS’ physical caliphate and forestalled its resurgence, deterred a Kurdish-Turkish clash, and kept refugee flows in check. U.S. casualties and financial expenditures have been relatively low, and international support relatively high: fewer than ten U.S. troops have lost their lives in Syria, and U.S. operations there compose only a fraction of the $15 billion budget for Operation Inherent Resolve, as the military campaign against ISIS in Iraq and Syria is known. Such financial costs are significant, and the human losses tragic, but there is reason to believe that they will be much lower in the future, given the elimination of ISIS’ physical caliphate.
Still, Washington could cut yet more costs by allowing more regular troops to relieve the burden placed on elite special operations forces. Over time, it could reallocate expensive military equipment—such as F-35 and F-22 aircraft—to arenas of great-power competition and instead invest in cheaper aircraft for anti-ISIS bombings in Iraq and Syria. Doing so would free up resources for missions in other regions and reduce the financial burden. If calls for disengagement from Syria prevail, however, it is likely that conditions on the ground will eventually deteriorate, and the United States may once again have to deploy ground forces to prevent the reemergence of a terrorist stronghold.
Ultimately, the unpredictability of world events puts a priority on human judgment and undermines rigid formulas. That is precisely why it is so unwise for 2020 presidential candidates to make categorical commitments to end the United States’ involvement in Afghanistan, Iraq, and Syria, and why it is unwise for Trump to focus on an exit from those conflicts rather than on the conditions that would safely enable one. This uncertainty is also a reason why voters must place a priority on the judgment of their would-be leaders.
Amid all the justified frustration with the United States’ post–Cold War approach and pledges to dial back intervention and end forever wars, far more subtlety is needed when it comes to considering if, when, and how the United States should use force abroad. No grand strategy can be built on the presumption that military intervention is mostly an erroneous activity of yesteryear, rather than an enduring feature of U.S. foreign policy.
Now, as the world enters its post–post–Cold War phase, Americans need to do some hard thinking. Their country remains a global power, with strongly held interests and values that require defending. The United States need not look abroad for monsters to destroy. But it must not lull itself into believing that such monsters have disappeared.