It was an act of political bravery heard around Washington, if not around the world. By January 1954, Senator Joseph McCarthy’s Permanent Subcommittee on Investigations had upended lives and destroyed careers, all in an effort to expose a fantastic conspiracy inside American government and society. That month, the committee was up for reauthorization. When senators’ names were called to approve a motion to keep it going, only one nay came from the floor: that of the junior Democratic senator from Arkansas, J. William Fulbright. “I realized that there was just no limit to what he’d say and insinuate,” Fulbright later said of McCarthy. “As the hearings proceeded, it suddenly occurred to me that this fellow would do anything to deceive you to get his way.” Within a year, Fulbright had helped persuade 66 other senators to join him in censuring McCarthy and ending his demagogic run. By the spring of 1957, McCarthy was gone for good, dead of hepatitis exacerbated by drink.
President Harry Truman once called Fulbright “an overeducated Oxford S.O.B.,” and the senator might have felt that was about right. As a Rhodes scholar, promoter of the United Nations, enemy of McCarthyism, chair of the hearings that helped expose the horrors of the Vietnam War, and founder of the academic exchange program that bears his name—now in its 75th year—he had a good claim to being the most broadly influential American internationalist of the twentieth century. From his first run for federal office, in 1942, until his death, in 1995, he cast himself as a political tinker wandering in a divided America: a salvage man trying to pull what he could from a country that was, for much of his career, riven by race, class, and geography.
Fulbright’s ideas were shaped at a time of party polarization and chin-jutting demagoguery unmatched until the rise of Donald Trump. His life is therefore an object lesson about global-mindedness in an age of political rancor and distrust—but not exactly in the ways one might think.
In addition to being a foreign policy visionary, Fulbright was, as his biographer Randall Woods put it 26 years ago, “a racist.” He vocally opposed the racial integration of public schools mandated by the Supreme Court’s 1954 ruling in Brown v. Board of Education. In the 1960s, he filibustered or voted against the era’s monumental civil rights legislation. Later in life, he would claim his stance was tactical. Electoral viability in his home state of Arkansas depended on defending states’ rights and a gradualist approach to equality for Black Americans, he said. But to those who knew him, that argument was only partly true. “To his mind the blacks he knew were not equal to whites nor could they be made so by legislative decree,” Woods wrote.
Americans today are less than one lifetime removed from the system of apartheid that Fulbright defended. The United States has had only one president who came of age when full racial equality was the law of the land. Eighty-one of the 100 current U.S. senators were born in an era when people could be arrested for marrying across racial lines. Americans are more armed, more forgiving of extrajudicial killing, and more comfortable with state-sanctioned confinement and execution than the citizens of any other free country. A hardening segment of the population sees broader social empowerment as an existential threat, and the country’s institutions have proved weak when challenged by officials determined to subvert them. If one were analyzing another country similarly placed in history, the warning lights for the fate of democracy would be flashing red.
In a moment of crisis, Fulbright is a clarifying case. He was a figure who committed his life to global understanding yet found it impossible to apply the same ideals to his homeland. What seems like a contradiction in Fulbright’s outlook, however, is really a blind spot in Americans’ own. The combination of open-mindedness abroad and bigotry at home was not unique to him. His opinions aligned with a deeper conviction in U.S. statecraft that the interests of a great power are best pursued by placing a partition between domestic politics and foreign policy. Yet in an age of savvy authoritarianism, foreign competitors now have a more clear-eyed view of American society than at any point in recent history. Their grasp of American studies is often starkly discerning, with an understanding of the fissures of class, race, and locale—and an unprecedented ability to exploit them.
Remaking U.S. internationalism will require that Americans bridge the old divide between committed globalists and concerned localists. The task is more complicated than leading through “the power of our example,” as President Joe Biden has often repeated, especially when that example includes an organized effort to upend electoral democracy. To counter their own illiberal nationalists and braying chauvinists, Americans should start by practicing the sober self-awareness that Fulbright claimed was critical to living intelligently in the world. Both Fulbright’s vision and his myopia form the story of his country’s twentieth-century rise. And with new limitations on access to voting and more “America first” candidates preparing for electoral runs, the central question of his life remains deeply relevant today: What price does a racially ordered polity pay for its global role?
Fulbright was representative of a certain species of midcentury internationalist: white, male, patrician in style if not background, and schooled in both the superiority of Anglo-Saxon civilization and the obligations of noblesse. He grew up in northwestern Arkansas at the social apex of an otherwise provincial and largely white, southern upland. His mother, Roberta, was a local businesswoman with an extensive telephone list and a gift for persuasion. Her ambitions were realized through Bill, as he was known, whom she helped usher toward the Rhodes scholarship, a college lectureship, and the presidency of the University of Arkansas, all before his 35th birthday.
Fulbright’s political career began with a term in the U.S. House of Representatives and then a race for the Senate. His Senate tenure would extend from President Franklin Roosevelt to President Gerald Ford, and he still holds the record for the longest continuous service as chair of the Senate Foreign Relations Committee. Because of the scholarships that he established by an act of Congress just after World War II, Fulbright was close to a household name before people quite knew why. The original funding for the Fulbright Program came from an ingenious bit of budgeting and backdoor internationalism: selling wartime assets left behind by the United States in other countries, which were hard to repatriate and of little value if converted to dollars, in order to pay the local expenses of Americans studying and researching there. It would eventually grow into the world’s largest foreign scholarship program, supported bilaterally by Washington and partner governments. In the 1950s, the program put Fulbright himself squarely in McCarthy’s sights. Scholarship recipients were America-haters who promoted communism, McCarthy alleged. To Fulbright, this was nonsense. “You can put together a number of zeros and still not arrive at the figure one,” he told McCarthy during one hearing.
Over the next two decades, Fulbright would stage-manage some of the most deeply civic moments of the era. As the Vietnam War devolved into both a foreign policy quagmire and a national crisis, Fulbright convened a series of Senate hearings that interrogated the war’s origins, its cost in lives and prestige, and pathways to ending it. The televised hearings, which ran intermittently from 1966 to 1971, brought high-level debate about the conflict into American living rooms. Across the administrations of Lyndon Johnson and Richard Nixon, a who’s who of foreign and defense decision-makers was called to testify. The diplomat and strategist George Kennan confirmed that many professed communists, such as the North Vietnamese leader Ho Chi Minh, were in fact nationalists. Kennan recommended “a resolute and courageous liquidation of unsound positions”—in other words, stop the war. Long before he became a U.S. senator, a 27-year-old John Kerry, wearing his fatigues and service ribbons and representing Vietnam Veterans Against the War, posed the most arresting question of the age: “How do you ask a man to be the last man to die for a mistake?” Secretary of State Dean Rusk defended the Johnson administration’s policies, only to be met by the incredulous drawl of Fulbright, sounding like a southern lawyer descending on a dodgy witness.
If there was a moment when the White House began to lose middle America, the Fulbright hearings marked it. From the outset, Johnson was so worried about their impact that he pressed one television network to air I Love Lucy reruns instead of live coverage. In the first month alone, the president’s approval rating on the war slid from 63 percent to 49 percent. Fulbright’s role was all the more powerful because he had earlier supported the 1964 Gulf of Tonkin Resolution, which facilitated the United States’ all-out attack on the North Vietnamese. By the time Nixon was inaugurated in 1969, however, Fulbright had transformed into something he could never have predicted—an antiwar activist. The counterculture had the streets, but Fulbright had the Constitution’s requirement that the Senate hold the presidency to account, even when both institutions were controlled by the same party. It was an enactment of the founders’ vision that has never since been equaled.
Fulbright’s political philosophy was on full display in those moments before the cameras. As a student at Oxford in the 1920s, he had settled into a loose belief in progress and an expectation of cooperation among nations, tempered by a certain pessimism about humans’ ability to get it all right. As a legislator, he often seemed to channel the conservative British statesman Edmund Burke. Legislatures worked best, Burke believed, when they were composed of the best people: educated, curious about the world, expert in their craft. Their role was not only to make laws but also to inform their constituents—“to teach the nation what it does not know,” as the nineteenth-century English constitutionalist Walter Bagehot put it.
The world was a plurality, which demanded tolerance for differences of opinion and culture, as well as properly functioning international institutions that would promote mutual dependence. Fulbright pressed for engagement with the Soviet Union during the Cold War, and when the communist system began to falter, late in his life, he still counseled restraint and outreach rather than a victory dance. Change had to come about in evolutionary ways, he believed. For both a nation’s adversaries and a legislator’s own voters, it was no good pushing people onto ground they were not ready to inhabit. Government, at home and abroad, worked best when it practiced pragmatism and followed the law.
Although some of these ideas get framed as Wilsonian today, many of them—pluralism, tolerance, the primacy of the rule of law—had avatars among white opponents of racial equality. It was here that Fulbright’s outlook connected with those of other segregationists, such as President Woodrow Wilson himself. In 1956, Fulbright signed the Declaration of Constitutional Principles, also known as the Southern Manifesto, along with 100 other members of Congress. The document codified southern resistance to racial integration as a matter of states’ rights. It denounced outside “agitators and troublemakers” and pledged the use of “all lawful means” to resist federal law.
The document might have been even more extreme had Fulbright not worked behind the scenes to soften it. The word “lawful” may have been one of his insertions. Still, other southern Democrats, such as Al Gore, Sr., and Johnson, then the Senate majority leader, decided not to sign the manifesto. Throughout the rest of the 1950s and into the 1960s, when civil rights legislation came to the Senate floor, Fulbright again held the line. “The Negroes of my State vote freely and without coercion,” he proclaimed during one filibuster. A defense of southern prerogatives was a stand for constitutional restraint, he maintained. Change via federal mandate did violence to the unique conditions the South had inherited from slavery, including the mere fact that white majorities lived alongside large African American minorities.
When Fulbright looked back on those moments, even in his 80s, he cited the constraints imposed on him by the wishes of his constituents. It would take time for them to come around to the idea of equality, he believed. The constituents he could most readily see, however, were the white ones. The African American communities of Arkansas’s Mississippi Delta, whom he also represented, were largely invisible. The problem was that they didn’t vote, Fulbright claimed. But to the degree that was true, he must have known why. The vast southern system of disenfranchisement, coercion, and terror was still firmly in place throughout his time on Capitol Hill.
Americans typically tell the story of the civil rights movement as a struggle between subjugators and emancipators, which of course it was. But Fulbright also occupied a zone inhabited by so many white leaders of the era, especially if they took an interest in global affairs. It was a position whose evil lay in its sheer banality. With the great questions of war and peace clamoring for attention, they felt, full citizenship for Black Americans just wasn’t that important.
The contradictions in Fulbright’s outlook are puzzling only from a specific perspective. U.S. foreign policy is often narrated from New England—the “city upon a hill” described by the Massachusetts Bay colonist John Winthrop, the Harvard and Yale men who designed global institutions and managed the Cold War, and so on—but it was born in the South.
The peculiarities of a slaveholding region were central to the emergence of U.S. foreign relations and, later, westward expansion, as Sven Beckert, Matthew Karp, Heather Cox Richardson, and other historians have shown. The wealth derived from cotton, tobacco, and other commodities—the fruit of the forced labor of nearly four million women and men on plantations stretching from the Chesapeake Bay to the Gulf of Mexico—spurred a commitment to free trade. National leaders from southern states defended slavery not just as a domestic institution but also as the basis for alliances and world order. A consistent strand in U.S. foreign policy thinking before the Civil War was the South’s other indigenous Jeffersonianism—not Thomas’s but that of Senator Jefferson Davis, the future Confederate president. “Among our neighbors of Central and Southern America, we see the Caucasian mingled with the Indian and the African,” Davis said in a speech in 1858. “They have the forms of free government, because they have copied them. To its benefits they have not attained, because that standard of civilization is above their race.” For Davis and other white southerners, the United States’ calling was not to spread universal freedom and republicanism. It was to model the superiority of a political economy founded on the supposedly natural ranking of races.
After the end of Reconstruction, the influence of southern voices and ideas grew both locally and nationally. The South didn’t so much lose the Civil War as outsource it, spreading new theories and techniques of segregation beyond the region itself. Domestically, the Jim Crow system cemented the legal, economic, and political power of whites, as did the brutal counterinsurgencies against Native Americans fought by the regular military on the western plains. Places that had no association with the old Confederacy, from Indiana to California, rushed to create their own versions of apartheid, including prohibitions on interracial marriage and restrictions on voting.
Internationally, U.S. interventions in Hawaii, the Philippines, Cuba, and Haiti were explained using the same tropes that many antebellum southerners had seen as theirs: manliness, white supremacy, and faith in one’s own noble intent, even when other people experienced it as terror. The map of the world as it appeared to white strategists was one of natural affinities—Europeans and their descendants, Africans and theirs—that rendered foreigners familiar and co-citizens foreign. Politics was the art of managing the unfortunate side effect of enslavement, immigration, and empire, namely, the fact of race mixing. The bedrock principle of politics was the same inside the borders of the United States as beyond: “a harsh and cruel struggle for existence . . . between superior races and the stubborn aborigines,” as the Wisconsin political scientist and diplomat Paul Reinsch put it in his textbook World Politics at the End of the Nineteenth Century in 1900.
The same reasoning was still at work during World War II, enabling the internment of Japanese Americans in the United States and informing different visions of the conflicts in Europe and the Pacific. “In Europe we felt that our enemies, horrible and deadly as they were, were still people,” wrote the war correspondent Ernie Pyle from the Pacific theater. “But out here I soon gathered that the Japanese were looked upon as something subhuman and repulsive, the way some people feel about cockroaches or mice.” The mechanisms that helped sustain and spread these ideas, as the scholar and civil rights leader W. E. B. Du Bois wrote in the American Journal of Sociology in 1944, were part of the structure of U.S. politics: “The power of the southerners arises from the suppression of the Negro and poor-white vote, which gives the rotten borough of Mississippi four times the political power of Massachusetts and enables the South through the rule of seniority to pack the committees of Congress and to dominate it.”
The consonance between domestic order and foreign affairs proved difficult to sustain, however. In the 1950s, the growing opposition to race-based discrimination, pursued in the courts and through acts of bravery by Black Americans, slowly began to weaken the system that southern whites had effectively nationalized after the 1870s. A new global competitor, the Soviet Union, took pains to highlight the hypocrisy at the heart of American claims about freedom and democracy. It is tempting to look back on that Soviet approach as a minor element of Cold War jockeying. But at the time, it was of more than passing concern to American diplomats, intelligence analysts, and others who understood the vulnerabilities created by American racism. “Racial discrimination furnishes grist for the Communist propaganda mills,” the U.S. Justice Department told the Supreme Court in an amicus brief for the Brown v. Board of Education case, “and it raises doubts even among friendly nations as to the intensity of our devotion to the democratic faith.”
The communists had a point, of course, even if it was an inconvenient one. “Can’t you just tell the Africans not to drive on Route 40?” President John F. Kennedy once asked an aide after a Maryland diner sparked an international incident by refusing to serve Chad’s representative to the United Nations. For white politicians and intellectuals, the easier thing to accept was that the domestic and foreign worlds were essentially separate, demanding different ethical reasoning and specific analytic models. “Both domestic and international politics are a struggle for power,” the scholar Hans Morgenthau wrote in Politics Among Nations, first published in 1948, yet “different moral, political, and general social conditions prevail in each sphere.” States were atomic, amoral units in the international system, each chasing an object called a national interest. Grand strategy was the technique by which a state pursued its goals, given the available resources and the actions of allies and adversaries. A selective reading of Thucydides and Machiavelli might suggest that this had been the normal way of thinking about world affairs for millennia. As time went on, even when scholars began to open the black box of the state, the drivers of behavior that suggested themselves were personalist or antiseptically structural, such as institutional rivalry, the military-industrial complex, and interest-group politics. Scholars tended to ignore the thing that Du Bois and others had insisted on for a century: the connection between who wielded power at home and the aims a government pursued abroad.
The result was to place the most urgent domestic issues outside the purview of the globally minded. At a time when American politics and international affairs were entangled as never before—with a feedback loop running from Mahatma Gandhi to Martin Luther King, Jr., and then out to anticolonial and human rights struggles around the world—denying these connections was essential to forming a coherent concept of the national interest. After all, a collective will is stable only as long as one controls who counts as the collective. That is how it was possible for virtually every leading white policymaker and global affairs expert of the time to relegate racism, disenfranchisement, and colonialism to the sidelines, as the scholars Kelebogile Zvobgo and Meredith Loken have argued in a key critique. Between 1945 and 1993, they observed in Foreign Policy last year, the word “race” appeared only once in the titles of articles in the top five international relations journals. In a remarkable sleight of hand, scholars stopped recognizing the ties between domestic power and global ambition, something that had been obvious, in its white-supremacist version, to people such as Davis, precisely when that relationship was coming to matter most: at the moment of American ascendancy.
For all their differences, the establishment figures who shaped the United States’ postwar role shared the concept of the international arena as a safe space separate from the concerns of home and inhabited mainly by men like them (the gender, of course, mattered). In these ways, Fulbright was representative of his cadre of foreign policy minds—Kennan, Morgenthau, Dean Acheson, John Foster and Allen Dulles, Henry Kissinger—whose serial biographies once constituted the standard way of writing the history of U.S. foreign relations. Like them, he rejected the isolationism of the aviator and America First celebrity Charles Lindbergh, the haranguing anticommunism of McCarthy, and the miscegenation-phobia of the Alabama politician George Wallace. Each was, in his fashion, déclassé and, what is even worse to a self-invented patrician, zealous.
By contrast, what the great affairs of state really required was sober discernment. “A sound sense of values, the ability to discriminate between that which is of fundamental importance and that which is only superficial,” Fulbright wrote in these pages in 1979, “is an indispensable qualification of a good legislator.” His example of a fundamental matter was emboldening the United Nations. A superficial one, he said, was the poll tax, which he knew was explicitly used to keep Black citizens away from the voting booth. “Regardless of how persuasive my colleagues or the national press may be about the evils of the poll tax, I do not see its fundamental importance,” he wrote.
Most readers would find his dismissal of voting rights shocking today, but the distinction was both telling and commonplace at the time. It exemplified the habit of carving the national interest in ways that avoided the burls: For whom? For what purpose? In whose actual interest? Fulbright had once coined a phrase for what it meant to elide questions such as these. It was on the cover of one of his several books, even if it never occurred to him to turn the analysis back on himself. The title of that book was The Arrogance of Power.
Now, a new generation of historians and political scientists is taking the problems of American democracy seriously and placing them in the appropriate comparative light. They are redefining the place of racism and antiracism in U.S. history and resurrecting thinkers, from Du Bois to the civil rights pioneer Pauli Murray, who drew explicit connections between national politics and foreign policy. That process has accompanied a broad and necessary rethinking of racial hierarchies in college syllabuses, publishers’ lists, film scripts, art exhibitions, symphony repertoires, and other areas. That American college students can still study diplomacy without Ralph Bunche, anthropology without Zora Neale Hurston, and history without Carter G. Woodson is a sign of how far the desegregation of the imagination has yet to go. Rediscovering Black voices such as these isn’t a matter of “political correctness” or “wokeness”—what self-aware person uses such terms?—or even a question of justice, although it might lead in that direction. It is at base about being less dumb.
A new American internationalism can rise on this fresh foundation. It has to start with the braided reality of a country founded on enslavement and Enlightenment ideals—holding in one’s head at once both 1619, the year the first Africans were forcibly brought to the Colonies, and 1776, the year of the Declaration of Independence. It also entails putting away the residual exceptionalism that still divides scholars and journalists in the United States from their counterparts elsewhere and that, in turn, determines what students and the public think is important to know. Mainstream liberals, as well as conservatives, tend to diminish the ills caused by the United States abroad while recasting ones effected closer to home—the American prison system, health-care disparities, voter suppression—as unimportant to an understanding of global affairs. That habit can be undone.
The United States ought to be a laboratory for investigating issues that are too often consigned to the vast abroad. Global development also matters in the Mississippi Delta, in upland Appalachia, and on the Standing Rock Indian Reservation. American authoritarianism—from Jim Crow to Trump—bears a family resemblance to systems of violence and personalist dictatorships in other parts of the world. Corruption has the same sources everywhere and is fed by networks that are multinational. Populism, ethnic nationalism, radicalization, and the politics of nihilism and despair all have American versions, which are now more linked than ever, via the Internet and social media, to their global equivalents. The United States has a well-developed export industry of unfreedom, from ruthless campaign advisers to private security firms, whose paid expertise will continue to shape political outcomes and public safety in communities around the world. Reclaiming the domestic as international requires recognizing these realities, and the first step is easy to state, if not to achieve. It is summed up in a line that Fulbright once quoted from President Abraham Lincoln’s 1862 message to Congress. “We must disenthrall ourselves,” Lincoln wrote, “and then we shall save our country.”
Fulbright’s life, like most people’s, was mottled. He acquiesced to awfulness yet led in areas that required political and moral courage. His failings were his country’s, and especially his region’s. His achievements were his alone. He was brave and weak, persuasive and exasperating, prescient and shortsighted, a futurist in thrall to the past. If the United States had followed the domestic path he supported in the 1950s and 1960s, it would have committed a massive act of injustice and self-betrayal. If it had followed the foreign policies he advocated in the 1960s and 1970s, the era would likely have claimed fewer lives.
In 1982, Fulbright’s alma mater (and my own), the University of Arkansas, held a ceremony renaming its College of Arts and Sciences after him, with an oration by the economist John Kenneth Galbraith. The former senator himself beamed from the dais. Nearly four decades later, in August 2020, the university established a special committee to make recommendations about the future of the college’s name and a prominent statue of Fulbright on campus. By that time, Woodrow Wilson’s name had been dropped from Princeton’s School of Public and International Affairs. Monuments to old secessionists and segregationists had fallen across the country. Congress would soon pass legislation stripping the surnames of Confederate generals from U.S. military bases. This past April, the committee recommended that the Fulbright name and statue be removed.
The reexamination of Fulbright is part of the broader transformation in how Americans talk about themselves in the past tense. Monuments, like nations, are situated in history. As societies change, so do the things they erect to instruct children in the preferred way of recounting it. The meaning of tributes to the dead is no more than what the living do with them. As any visitor to Washington, D.C., can confirm, the Victims of Communism Memorial—unveiled by President George W. Bush in 2007 and now a gathering place for clients from a nearby homeless shelter—has ironically become a monument to the victims of capitalism. The usefulness of statues resides in whether they enable human achievement or inhibit it in the here and now. If the latter is the case, it is best to let them go. Ghosts do not care either way.
There may come a time when societies no longer feel that buildings need human names or that people of note warrant bronzing. Until then, there are plenty of ways to remember the people whose worldviews exceeded their biographies. One of them is the transformative experience of being a Fulbrighter. Since 1946, over 400,000 people (myself included) from more than 160 countries have benefited from an array of Fulbright programs; at present, around 3,000 American students, teachers, and scholars do so annually. Among the awardees are 39 heads of state or government, 60 Nobelists, and 88 Pulitzer Prize winners. The Fulbright title remains a marker of brainy, worldly achievement. “Aren’t you the woman who was recently given a Fulbright?” the musician Paul Simon asked on his multi-Platinum album Graceland. It would be hard to imagine a more profitable investment in building a world both more peaceful and more inclined to think of the United States as, on balance, a force for good.
This legacy is a remarkable monument not to a man but to an idea, one lived out imperfectly in a single life and betrayed repeatedly by the country that professed it. Fulbright’s own biography is evidence that the best of what the United States produced in the last century was inseparable from the worst—a complicated, grownup fact that ought to inform how Americans approach everything from education in international affairs to foreign-policy making. And to generations of people in Africa, Asia, Europe, and the Americas, Fulbright’s most enduring contribution is something that the United States now has an opportunity to bring back home: the astonishing, liberating idea that governments have a duty to help people lose their fear of difference.