The Fragile Republic
American Democracy Has Never Faced So Many Threats All at Once
When the U.S. president used his power to target immigrants, the press, and his political opponents, the sheer overreach of his actions shocked many citizens. Tensions among the country’s political leaders had been escalating for years. Embroiled in one intense conflict after another, both sides had grown increasingly distrustful of each other. Every action by one camp provoked a greater counterreaction from the other, sometimes straining the limits of the Constitution. Fights and mob violence often followed.
Leaders of the dominant party grew convinced that their only hope for fixing the government was to do everything possible to weaken their opponents and silence dissent. The president signed into law provisions that made it more difficult for immigrants (who tended to support the opposition) to attain citizenship and that mandated the deportation of those who were deemed dangerous or who came from “hostile” states. Another law allowed for the prosecution of those who openly criticized his administration, such as newspaper publishers.
Much of this may sound familiar to anyone living through the present moment in the United States. But the year was 1798. The president was John Adams, and the legislation was known as the Alien and Sedition Acts. Adams’s allies in Congress, the Federalists, argued that in anticipation of a possible war with France, these measures were necessary to protect the country from internal spies, subversive elements, and dissent. The Federalists disapproved of immigrants, viewing them as a threat to the purity of the national character. They particularly disliked the Irish, the largest immigrant group, who sympathized with the French and tended to favor the opposition party, the Republicans. As one Federalist member of Congress put it, there was no need to “invite hordes of Wild Irishmen, nor the turbulent and disorderly of all the world, to come here with a basic view to distract our tranquility.”
Critics of the new laws raised their voices in protest. The Republicans charged that they amounted to barefaced efforts to weaken their faction, which happened to include most Americans not of English heritage. Two leading Republicans, Thomas Jefferson and James Madison, went so far as to advise state governments to refuse to abide by the Sedition Act, resolving that it was unconstitutional.
Political conflicts boiled over into everyday life. Federalists and Republicans often resided in different neighborhoods and attended different churches. The Federalists, centered particularly in New England, prized their Anglo-American identity, and even after the American Revolution, they retained their affinity with the mother country. Republicans saw themselves as cosmopolitan, cherishing the Enlightenment ideals of liberty and equality, and they championed the French Revolution and disdained Great Britain. As early as 1794, partisans in urban communities were holding separate Fourth of July ceremonies. Republicans read aloud the Declaration of Independence—penned by Jefferson, the founder of their party—as evidence that independence had been their own achievement, whereas Federalists offered toasts to their leader, President George Washington. The Republicans viewed themselves as the party of the people; one prominent politician among them chided the Federalists for celebrating not “we the people” but “we the noble, chosen, privileged few.”
On the streets, mock violence—the burning of effigies—was swiftly devolving into the real thing, as politically motivated beatings and open brawls proliferated. In one case, on July 27, 1798, Federalists in New York marched up Broadway singing “God Save the King” just to antagonize the Republicans; the latter responded by singing French revolutionary songs. Soon, the singing contest became a street fight.
Watching the growing chaos and division, Americans of all stripes worried that their experiment in self-government might not survive the decade. They feared that monarchy would reassert itself, aristocracy would replace representative government, or some states might secede from the union, causing its demise. The beginnings of American democracy were fragile—even at a time when some of the U.S. Constitution’s framers themselves, along with other luminaries of the era, held public office.
Of course, the early republic was by no means a fully realized democracy. The bold democratic ideals of equality and government by consent, which were enshrined in the nation’s founding documents, were paired with governing practices that repudiated them, most blatantly by sanctioning slavery. The U.S. Constitution established representative government, with public officials chosen directly or indirectly by a quickly expanded electorate of white men of all classes, who gained suffrage rights well before their peers in Europe. Yet nearly one in five Americans, all of them of African descent, were enslaved, lacking all civil and political rights. The Constitution not only implicitly condoned this practice but even granted extra political power to slaveholders and the states in which they resided.
After two centuries of struggle, the United States democratized; not until the 1970s, however, could it be called a truly robust and inclusive democracy. That long path included numerous periods when the country lurched toward greater authoritarianism rather than progressing toward a stronger democracy. Time and again, democratic reforms and the project of popular government were put at risk of reversal, and in some instances, real backsliding occurred. In the 1850s, divisions over slavery literally tore the country apart, leading to a destructive civil war in the next decade. In the 1890s, amid the convulsive changes of the industrial era and an upsurge in labor conflict and farmers’ political organizing, nearly four million African Americans were stripped of their voting rights. During the Great Depression of the 1930s, many Americans welcomed the presidency of Franklin Roosevelt, who was willing to use greater executive power than his predecessors—but others worried that Roosevelt was paving the way for the type of strongman rule on the rise in several European countries. During the Watergate scandal of the 1970s, in the wake of unrest over racism and the Vietnam War, President Richard Nixon tried to use the tools of executive power that were developed in the 1930s as political weapons to punish his enemies, creating a constitutional crisis and sapping citizens’ confidence in institutions of all kinds.
These crises of democracy did not occur randomly. Rather, they developed in the presence of one or more of four specific threats: political polarization, conflict over who belongs in the political community, high and growing economic inequality, and excessive executive power. When those conditions are absent, democracy tends to flourish. When one or more of them are present, democracy is prone to decay.
Today, for the first time in its history, the United States faces all four threats at the same time. It is this unprecedented confluence—more than the rise to power of any particular leader—that lies behind the contemporary crisis of American democracy. The threats have grown deeply entrenched, and they will likely persist and wreak havoc for some time to come.
Although the threats have been gathering steam for decades, they burst ever more vividly and dangerously into the open this year. The COVID-19 pandemic and the economic crisis it precipitated have dramatically exposed the United States’ partisan, economic, and racial fault lines. Americans of color have disproportionately been victims of the novel coronavirus. African Americans, for example, have been five times as likely as whites to be hospitalized for COVID-19 and have accounted for nearly one in four deaths from the disease—twice their share of the population. The pandemic-induced recession has exacerbated economic inequality, exposing the most economically vulnerable to job losses, food and housing insecurity, and the loss of health insurance. And partisan differences have shaped Americans’ responses to the pandemic: Democrats have been much more likely to alter their health behavior, and even the simple act of wearing a mask in public has become a partisan symbol. The Black Lives Matter protests that erupted after the police killing of George Floyd in Minneapolis in May have further highlighted the deep hold that systemic racism has long had on American politics and society.
President Donald Trump has ruthlessly exploited these widening divisions to deflect attention from his administration’s poor response to the pandemic and to attack those he perceives as his personal or political enemies. Chaotic elections that have occurred during the pandemic, in Wisconsin and Georgia, for example, have underscored the heightened risk to U.S. democracy that the threats pose today.
The situation is dire. To protect the republic, Americans must make strengthening democracy their top political priority, using it to guide the leaders they select, the agendas they support, and the activities they pursue.
Not long ago, lawmakers in Washington frequently cooperated across party lines, forging both policy alliances and personal friendships. Now, hostility more often prevails, and it has been accompanied by brinkmanship and dysfunction that imperil lawmaking on major issues. The public is no different. In the 1950s, when pollsters asked Americans whether they would prefer that their child “marry a Democrat or a Republican, all other things being equal,” the vast majority—72 percent—either didn’t answer or said they didn’t care. By contrast, in 2016, a majority of respondents—55 percent—expressed a partisan preference for their future son-in-law or daughter-in-law. For many Americans, partisanship has become a central part of their identity.
Vibrant political parties are essential to the functioning of democracy. Yet when parties divide lawmakers and society into two unalterably opposed camps that view each other as enemies, they can undermine social cohesion and political stability. The framers of the U.S. Constitution, attuned to such threats because of Great Britain’s previous century of experience with violent parties and factions, hoped their new country could avoid parties altogether. Yet no sooner was the new government up and running than political leaders—including some of the founders themselves—began to choose sides on the critical issues of the day, leading to the formation of the sharply antagonistic Federalist and Republican factions. That bout of polarization subsided only after the deadlocked presidential election of 1800, during which both sides prepared for violence and many feared civil war. The outcome was ultimately decided peacefully in the House of Representatives when, after multiple inconclusive votes, one member of Congress shifted his support from Aaron Burr to Jefferson.
Polarization grows when citizens sort themselves so that, instead of having multiple, crosscutting ties to others, their social and political memberships and identities increasingly overlap, reinforcing their affinity for some groups and setting them apart from others. In the mid-twentieth century, this process commenced once again as white southerners, beginning as early as the 1930s and accelerating by the 1960s, distanced themselves from the Democratic Party and its uneven but growing embrace of the cause of racial equality, shifting gradually toward the Republicans.
When parties divide society into unalterably opposed camps, they can undermine political stability.
Polarization intensifies as ambitious political entrepreneurs take advantage of emerging divisions to expand their power. They may do this by adopting opposing positions on issues, highlighting and promoting underlying social differences, and using inflammatory rhetoric in order to consolidate their supporters and weaken their opponents. Contemporary polarization in Congress advanced in this way starting in 1978. A young Republican congressman named Newt Gingrich, lamenting his party’s decades of minority status, launched a long-term attack on the institution of Congress itself in order to undermine public trust in the institution and convince voters that it was time for a change. He told Republicans, “Raise hell all the time. . . . This party does not need another generation of cautious, prudent, careful, bland, irrelevant, quasi-leaders. . . . What we really need are people who are willing to stand up in a slugfest and match it out with their opponent.” He rallied the base, found ways to embarrass the Democratic majority, and proved to be a master of attracting media attention.
As a political strategy, polarization delivered: congressional elections became more competitive than they had been for the previous half century. Every election from 1980 to the present has presented an opportunity for either party to take control of each chamber of Congress. In 1994, Republicans finally won a majority in the House of Representatives after being in the minority for 58 of the preceding 62 years, and they elected Gingrich as Speaker. Partisan control of Congress has seesawed ever since.
Party leaders from Gingrich onward encouraged their fellow partisans to act as loyal members of a team, prioritizing party unity. They shifted staff and resources away from policy committees and toward public relations, allowing them to communicate constantly to voters about the differences between their party and the opposition. Such messaging to the base helps parties be competitive in elections. But this approach hinders democratic governance by making it more difficult for Congress to work across party lines and address major issues. This occurs in part because polarization makes many of the attributes of a well-functioning polity—such as cooperation, negotiation, and compromise—more costly for public officials, who fear being punished at the polls if they engage in these ways with opponents. As division escalates, the normal functioning of democracy can break down if partisans cease to be able or willing to resolve political differences by finding a middle ground. Politics becomes a game in which winning is the singular imperative, and opponents transform into enemies to be vanquished.
Polarization is not a static state but a process that feeds on itself and creates a cascade of worsening outcomes. Over time, those who exploit it may find it difficult to control, as members of the party base grow less and less trustful of elites and come to believe that none of them is sufficiently devoted to their core values. These dynamics give rise to even less principled actors, as epitomized by Trump’s rise. During the 2016 U.S. presidential campaign, numerous established Republican politicians, such as Senators Lindsey Graham of South Carolina and Marco Rubio of Florida, expressed their disdain for Trump, only to eat their words once he was nominated and to support him faithfully once he was in the White House.
The culmination of polarization can endanger democracy itself. If members of one political group come to view their opponents as an existential threat to their core values, they may seek to defeat them at all costs, even if it undermines normal democratic procedures. They may cease to view the opposition as legitimate and seek permanent ways to prevent it from gaining power, such as by stacking the deck in their own favor. They may become convinced that it is justifiable to circumvent the rule of law and defy checks and balances or to scale back voting rights, civil liberties, or civil rights for the sake of preserving or protecting the country as they see fit.
Democracy has been most successful in places where citizens share broad agreement about the boundaries of the national community: who should be included as a member and on what terms—that is, whether all should have equal status or whether rights should be parceled out in different ways to different groups. Conversely, when a country features deep social divisions along lines of race, gender, religion, or ethnicity, some citizens may favor excluding certain groups or granting them subordinate status. When these divisions emanate from rifts that either predated the country’s founding or emerged from it, they can prove particularly pernicious and persist as formidable forces in politics.
Such formative rifts may come to a head as the result of some political change that prompts opposing political parties to take divergent stands on the status of certain groups. Politicians may deliberately seek to inflame divisions as a political strategy, to unite and mobilize groups that would not otherwise share a common goal. Or social movements might mobilize people on one side of a rift, leading to a countermobilization by those on the other side. In either case, when such divisions are triggered, those who favor a return to earlier boundaries of civic membership and status may be convinced that they must pursue their goals even if democracy is curtailed in the process.
The United States at its inception divided the political community by race, creating a formative rift that has organized the country’s politics ever since. A commitment to white supremacy has often prevailed, impelling many Americans to build coalitions around appeals to racism and segregation in order to further their political interests. The quest to preserve slavery drove U.S. politics for decades. Even after slavery ended, white supremacy often reigned through decades of voting restrictions, the denial of rights, discrimination, and segregation. Yet a countervailing commitment to equality and inclusion also emerged in American politics, fueled by the ideals of the Declaration of Independence and sustained by the persistent efforts of enslaved and oppressed Americans themselves. This tradition repeatedly and powerfully challenged slavery and white supremacy and brought about critical reforms that expanded rights and advanced American democracy.
Even after slavery ended, white supremacy reigned through voting restrictions and segregation.
The American gender divide, also codified in law, made men’s dominance in politics and society appear to be natural and rendered the gender hierarchy resistant to change. A countervailing commitment to equality emerged, however, in the nineteenth-century women’s movement, articulated in the 1848 Declaration of Sentiments at the Seneca Falls Convention: “We hold these truths to be self-evident: that all men and women are created equal.” Yet not until 1916 would the two major political parties embrace the cause of women’s suffrage at the national level, ushering in the 19th Amendment’s ratification in 1920.
Certainly, some tendencies of human nature can help explain why formative rifts can prove so potent. Many people trust communities that seem familiar to them and that they associate with virtue and safety, and they feel distrustful of other groups, whose customs strike them as strange and even dangerous. When political figures or events ignite voters’ anger, especially around matters pertaining to race or gender, political participation is often elevated, particularly among those who favor traditional hierarchies and are willing to put democracy itself at risk in order to restore them.
Yet views about who belongs in the political community do not always foster political conflict; it all depends on how they map onto the political party system. In some periods, for example, neither party strongly challenged white supremacy, in which case the status quo prevailed, its restrictions on democracy persisting unchallenged. In other periods, the conflict between racially inclusive and white supremacist visions of American society and democracy has overlapped with partisan divisions and fueled intense political conflict. At such moments, democracy stood on the brink—with the promise of its expansion existing alongside the threat of its demise.
The first half of the nineteenth century featured white man’s democracy on southern terms, as neither party challenged the South’s devotion to slavery. In the 1850s, however, the region’s dominance of national politics began to decline. As that happened, its ability to use the political system to protect slavery eroded, and subsequently southerners abandoned democratic means for resolving the conflict. The party system reorganized itself around the slavery question, and ruinous polarization ensued. In response to the election of President Abraham Lincoln, the South seceded, and the country plunged into a violent civil war, the ultimate democratic breakdown.
In the decades after the Civil War, the country made strides at building a multiracial democracy, as newly enfranchised African American men voted at high rates and over 2,000 of them won election to public office, serving as local officials, in state legislatures, and in the U.S. Congress. But in the 1890s, the forces of white supremacy rebounded, resulting in violent repression and the removal of voting rights from millions of African Americans. Sixty years of American apartheid followed, not only in the authoritarian enclaves of the South but in northern states as well and in national institutions such as the federal bureaucracy and the U.S. military.
In the contemporary period, the conflict between egalitarian and white supremacist visions of American society once again overlaps with the party system and coincides with intense polarization. Over the past several decades, as the U.S. population has become more racially and ethnically diverse, the composition of the Republican Party has grown to be far whiter than the population at large, and the Democratic Party has forged a more diverse coalition. Attitudes among party members have diverged, as well: since the 1980s, Republicans have become far more likely to express racist views, and Democrats, far less so, as revealed by the American National Election Studies. This political chasm has been further exacerbated by rising hostility to immigration and simmering disagreement about the status of immigrants in American society. The resulting divergence makes for extremely volatile politics.
Democratic fragility can also result from high rates of economic inequality, which can undermine the institutions and practices of existing democracies. Countries in which inequality is on the rise are more likely to see democracy distorted, limited, and potentially destabilized. By contrast, countries in which inequality is low or declining are less likely to suffer democratic deterioration.
People typically assume that inequality makes democracy vulnerable by increasing the chances that the less well-off will rise up against the wealthy, but that is rarely the case. Rather, as inequality grows, it is the affluent themselves who are more likely to mobilize effectively. They realize that working- and middle-class people, who greatly outnumber them, tend to favor redistributive policies—and the higher taxes necessary to fund them, which would fall disproportionately on the rich. Fearful of such policy changes, the rich take action to protect their interests and preserve their wealth and advantages. For a time, this may skew the democratic process by giving the rich an outsize voice, but it can eventually cause more fundamental problems, endangering democratic stability itself. This can occur when the wealthiest citizens seek to solidify their power even if it entails harm to democracy. They may be willing to abide a polarizing politics of “us versus them” and the adoption of repressive measures if that is what it takes for leaders to protect their interests.
Among wealthy democracies in the world today, the United States is the most economically unequal. After a period during the mid-twentieth century when low- and middle-income Americans experienced quickly rising incomes, since the late 1970s, they have seen slow or stagnant wage growth and shrinking opportunities. The affluent, meanwhile, have continued to experience soaring incomes and wealth, particularly among the richest one percent of the population. The compensation of chief executives skyrocketed from 30 times the annual pay of the average worker in 1978 to 312 times as much by 2017.
In the late eighteenth century and the nineteenth century up through the Civil War, the widespread existence of slavery made for extreme inequality in the American South. Other regions of the country during that same period, however, featured greater equality than did the countries of Europe, being unencumbered by feudalism and the inherited structure of rigid social classes. But as the nineteenth century proceeded, economic inequality grew throughout the country, and by the late nineteenth century—“the Gilded Age,” as Mark Twain called it—the United States had nearly caught up with the intensely class-stratified United Kingdom. These disparities would endure until the U.S. stock market crashed in 1929. The wealthy lost much during the Great Depression, and then, after World War II, a strong economy and government policies fostered upward mobility and the growth of a large middle class. By later in the twentieth century, however, economic inequality was growing once again, owing not only to deindustrialization and globalization but also to policy changes that favored the wealthy.
Greater political inequality generally accompanies rising economic inequality, and the United States has been no exception in this regard. In the age of the robber barons, in the late nineteenth and early twentieth centuries, the Industrial Revolution generated vastly unequal wealth paired with unequal political power. Decades of bloody repression of workers ensued as an ascendant class of capitalists enjoyed protection from the courts.
Many Americans had already been living on the edge of destitution when the Great Depression plunged the country into soaring rates of joblessness and poverty. Under Roosevelt’s leadership, the United States responded with the New Deal, a collection of policies to provide social protection, restructure the economy, and ensure labor rights. Along with World War II, the New Deal helped revive the American economy and reduce economic inequality, while largely preserving existing racial and gender hierarchies and inequalities. These changes helped sustain three decades of shared prosperity and relatively low polarization in American politics.
But beginning in the 1970s, economic inequality began to grow, and the affluent and big business in the United States became more politically organized than ever, in ways that presented major obstacles to democracy. Since the 1990s, the amount of money spent on politics—on both campaign contributions and lobbying—has escalated sharply, owing to the deep pockets and strong motivations of wealthy Americans and corporations. Even more striking is the degree to which the rich have organized themselves politically to pursue their policy agenda at the state and national levels. When government responds primarily to the rich, it transforms itself into an oligarchy, which better protects the interests of the wealthy few. Keeping watch over democracy is not their concern.
A final factor in democratic backsliding is the demise of checks on executive power, which typically results when powerful leaders take steps to expand their power and autonomy relative to more broadly representative legislatures and courts that are expected to protect rights. These executive actions might be perfectly legal, such as filling the courts and government agencies with political allies. But executives might also be tempted to stack the deck against their political opponents, making it hard to challenge their dominance; circumvent the rule of law; or roll back civil liberties and civil rights.
The American founders sought to thwart executive tyranny and to prevent a single group of leaders from seizing control of all the levers of government power at once. But separation-of-powers systems, such as that of the United States, are notoriously prone to intractable political conflicts between the executive and the legislative branches, each of which can claim democratic legitimacy because it is independently elected. Moreover, a president engaged in such a conflict might be tempted to assume a populist mantle—to equate his supporters with “the people” as a whole and present his preferred policies as reflective of a single popular will, as opposed to the multiplicity of voices and interests represented in the legislature.
Across most of the first 125 years of the country’s history, the very idea of a president achieving autocratic powers would have seemed inconceivable because the office was limited and Congress prevailed as the dominant branch. In the early twentieth century, however, presidential power began to grow, with the presidency eventually becoming a much more dominant office than the framers ever envisioned. Certainly, the president cannot single-handedly create or repeal laws, as those powers are vested in Congress. But in other respects, an aspiring autocrat who occupied the White House would find considerable authority awaiting him.
Presidents throughout the twentieth century and into the twenty-first have expanded the powers of the office through the use of executive orders and proclamations, the administrative state, an enlarged White House staff and the creation of the Executive Office of the President, and the president’s control over foreign policy and national security. Meanwhile, Congress has ceded considerable authority to the executive branch, often in moments of crisis, and has enabled presidents to act unilaterally and often without oversight. As a result, the ordinary checks and balances that the framers intended to ensure democratic accountability have grown weaker.
This “imperial presidency,” as some have dubbed it, has afforded presidents near-complete autonomy in foreign policy decisions and allowed them to commit the country to expensive and risky interventions abroad, with the executive seeking congressional approval only later. A vast national security apparatus has grown in tandem. It has secretly conducted domestic surveillance and engaged in political repression, often targeted at immigrants, minorities, and the politically vulnerable. In the hands of a leader who sees himself as above the law, these tools provide ample means to further the leader’s own agenda, at great cost to accountable democratic government.
U.S. presidents are afforded near-complete autonomy in foreign policy decisions.
Although presidential power had grown over the first third of the twentieth century, it was Roosevelt who truly launched the process of executive aggrandizement. He took office at a moment of deep crisis, and many Americans expected him to assume dictatorial powers like those on display in Europe—some even urged him to do so. Roosevelt managed to steer the country through the crisis in a manner that preserved democracy, but he did so through an unprecedented expansion of presidential power. As the fascist threat grew in the 1930s, Roosevelt secretly authorized extensive domestic wiretapping, ostensibly to counter the danger of Nazi subversion. And during World War II, he ordered the mass incarceration of more than 100,000 people of Japanese descent, some 70,000 of whom were U.S. citizens.
In the 1970s, Nixon built on those precedents in order to weaponize the presidency, turning the national security apparatus against his personal and political enemies. Nixon’s White House and campaign operatives engaged in a wide array of skullduggery and law breaking to harass, surveil, and discredit his antagonists, including, most famously, the botched Watergate burglary in 1972 that ultimately brought Nixon down.
The four threats to democracy have waxed and waned over the course of U.S. history, each according to its own pattern. Even a single threat could put democracy at risk, as occurred with the escalation of polarization in the 1790s and executive aggrandizement in the 1930s and 1970s. Because the other threats were absent, however, little backsliding occurred during those periods. When several threats coalesced, by contrast, democratic progress was gravely endangered: in both the 1850s and the 1890s, the combination of polarization, economic inequality, and racial conflict produced calamities.
Today, for the first time ever, the country is facing all four threats at once. Polarization has become extreme, prompting members of Congress to act more like members of a team than like representatives or policymakers. Among ordinary citizens, polarization is fostering a sense of politics as “us versus them,” in which people’s political choices are driven by their hostility toward the opposition. Economic inequality has skyrocketed, and wealthy individuals and business leaders are highly motivated and organized to protect their interests and expand their riches, even if they must tolerate or embrace racist, nativist politics to achieve their goals. And in the face of political dysfunction and stalemate, the power of the executive branch has grown exponentially.
Trump’s nomination and election were one result of these trends; his presidency has become a driving force behind them. He is polarization personified, utterly dismissive of and vicious toward all opponents. He has repeatedly stoked racial antagonism and nativism. Despite the populist atmospherics of his rallies and rhetoric, his approach to governing has been plutocratic, not redistributive, delivering robust benefits to the wealthy and business interests and relatively little to everyone else. And more than any president since Nixon, Trump views the presidency as his personal domain and has wielded its power to promote his personal interests—political and financial—at the expense of democratic accountability.
Throughout his time in the White House, Trump has launched a frontal attack on elections and the public’s confidence in them. This began with his unsubstantiated 2016 claims that the electoral system was “rigged” and his warnings that he would not accept the results if they went against him; even after he won, he made spurious allegations of voter fraud to wave away the fact that he had lost the popular vote. He has also tolerated and even encouraged foreign interference in U.S. elections, failing to condemn Russian meddling in 2016 and later making a bald-faced effort to coerce Ukraine into launching a baseless investigation of former Vice President Joe Biden, his likely opponent in the 2020 election, in order to generate dirt to use against him.
Even more dangerous is Trump’s assault on the rule of law. Previous presidents have stretched the law and even violated it in pursuit of policy goals and political advantage. But few have so brazenly blurred the line between presidential power and personal gain. Trump has made no secret of his belief that the FBI and the Justice Department are not public entities responsible for upholding the rule of law; rather, he regards them as a private investigative force and a law firm that can protect him and his allies and harass and prosecute his enemies. In William Barr, he has found an attorney general who is willing to provide this personal protection.
Trump has also chipped away at bedrock values of American democracy, such as the idea of a free press, going so far as to threaten to revoke the licenses of news outlets that have published critical reporting on him and his administration; although he has not followed through, his frequent attacks on the mainstream media as “fake news” and “enemies of the people” have further undermined confidence in the press, with invidious effects. And when it comes to civil rights, Trump’s frequent verbal assaults on immigrants and members of other minority groups have been accompanied by policy and administrative changes that have scaled back the rights of vulnerable communities.
Americans may wish to assume that their democracy will survive this onslaught. After all, the country has weathered severe threats before. But history reveals that American democracy has always been vulnerable—and that the country has never faced a test quite like this.
Democratic decay is not inevitable, however. Politics does not adhere to mechanical principles, in which given circumstances foreordain a particular outcome. Rather, politics is driven by human beings who exercise agency and choice and who can set their sights on preserving and restoring democracy. Political leaders and citizens can rescue American democracy, but they must act before it is too late.
Some will say that focusing on the risk of backsliding misses the bigger point that American democracy has been far from perfect even in the past half century, never mind prior to the 1960s. And yet in recent decades, American democracy—despite its limitations—has nonetheless carried on some of the country’s best-established traditions and has marked a vast improvement over earlier periods with respect to free and fair elections and the integrity of rights.
Some political scientists and commentators believe that the only way to improve democracy in the United States would be through deep structural reforms. The equal representation of states in the Senate, for example, gives extra representation to residents of sparsely populated states and diminishes the power of people who live in more densely populated places. The Electoral College makes possible a perverse and undemocratic result in which the candidate for president who receives the most votes does not win—the result in two of the last five presidential elections.
But changes to such long-standing features of the U.S. political system seem unlikely. Amending the Constitution is difficult under the best of circumstances, and probably next to impossible in today’s polarized climate. Moreover, those in power are the beneficiaries of the current arrangements and have little incentive to change them.
Absent such changes, one key to protecting democracy is surprisingly simple: to allow that goal to explicitly guide political choices. In evaluating a policy or a proposal, Americans should lean away from their ideological tendencies, material interests, and partisan preferences and instead focus on whether the measure at hand will reinforce democracy or weaken it. The most important thing Americans can do is to insist on the rule of law, the legitimacy of competition, the integrity of rights, and strong protection for free and fair elections. These pillars are the rules of the game that permit all Americans to participate in politics, regardless of which party wins office.
Today’s Republican Party has abandoned its willingness to protect those pillars of democracy, despite its legacy of having done so in earlier periods. The party has tolerated increasingly repressive and antidemocratic behavior as it has sought to maintain and expand its power. Republican officials and leaders now sanction the unjust punishment of their political enemies, efforts to limit voting by those who favor Democrats, and even the dismissal of election results that do not favor their party. In other countries where support for illiberal or authoritarian rule has emerged, opposition parties have embraced the role of champion of democracy. In the United States, that obligation now falls to the Democratic Party.
But ordinary citizens must become engaged as well. Earlier generations of Americans made immense personal sacrifices for the sake of democracy. During World War II, Americans defeated Nazism and fascism through military service overseas and substantial efforts on the home front. During the 1950s and 1960s, Americans marched for civil rights, took part in lunch counter sit-ins, and volunteered for Freedom Summer. The time has come once again for Americans to defend democracy and carry on that long legacy.
The first half of 2020 deepened the crisis of democracy in the United States. A global pandemic, a deep recession, and feckless leadership have exposed and further exacerbated all four threats to democracy. At the same time, the widespread Black Lives Matter protests that have filled streets and public squares in cities and towns across the country since the spring are forcing unprecedented numbers of Americans to confront their country’s shameful history of racial inequality. If this reckoning bears electoral fruit in November and beyond, the United States might once again pull itself back from the brink. Crisis might lead to renewal.