The Bush administration's primary justification for going to war against Iraq last year was the threat posed by Saddam Hussein's weapons of mass destruction (WMD) programs. But almost as soon as U.S. forces took Baghdad, it became clear that this fear was based on bad intelligence and faulty assumptions. Since then, the failure to find WMD in Iraq has caused a furor.
Sympathetic analysts argue that Washington had no way of knowing how serious the threat of Iraqi WMD was, so intelligence agencies provided the administration with a wide-ranging set of estimates. In the post-September 11 security environment, the argument goes, the Bush administration had little choice but to assume the worst. Critics charge that the White House inflated and manipulated weak, ambiguous intelligence to paint Iraq as an urgent threat and thus make an optional war seem necessary. A recent report by the Carnegie Endowment for International Peace, for example, found not only that the intelligence community had overestimated Iraqi chemical and biological weapons capabilities but also that administration officials "systematically misrepresented" the threat posed by Iraqi weapons.
Public debate has focused on the question of what went wrong with U.S. intelligence. Given the deteriorated state of Iraq's unconventional weapons programs and conventional military capabilities, this is only appropriate. But missing from the discussion is an equally important question: What went right with U.S. policy toward Iraq between 1990 and 2003? On the way to their misjudgments, it now appears, intelligence agencies and policymakers disregarded considerable evidence