“He who establishes conventional wisdom owns history,” a historian once told me. So it’s no surprise that ever since last year’s extraordinary U.S. presidential election, all sides have been bitterly fighting over what happened—and why. The explanations for Donald Trump’s surprise victory have varied widely. But one factor that clearly played an important role was the alienation and disaffection of less educated white voters in rural and exurban areas. Trump may have proved to be a uniquely popular tribune for this constituency. But the anger he tapped into has been building for half a century.

The roots of that anger lie all the way back in the 1960s, when President Lyndon Johnson launched his so-called War on Poverty. Only by properly understanding the mistakes made in that war—mistakes that have deprived generations of Americans of their fundamental sense of dignity—can the country’s current leaders and political parties hope to start fixing them. And only once they properly understand the problem will they be able to craft the kind of cultural and political agenda that can heal the country’s wounds.


On April 24, 1964, Johnson paid a highly publicized visit to Inez, the biggest town in eastern Kentucky’s Martin County. Inez was the heart of coal country, the most typical Appalachian town that Johnson’s advisers could find. In the 1960s, “typical Appalachian” meant a place suffering from crippling despair. The citizens of Inez were poor. Many of them were unemployed, and their children were malnourished. Johnson had chosen Inez to illustrate that dire poverty was not just a Third World phenomenon: it existed right here at home, and not just in cities but in rural America as well. But he also came to Inez to announce that this tragedy could be remedied.

In one famous photo op, Johnson stopped by the home of a man named Tom Fletcher, an unemployed 38-year-old father of eight. The president climbed up onto Fletcher’s porch, squatted down next to him, and listened to the man’s story. According to a 2013 article in the Lexington Herald-Leader by John Cheves, “Fletcher never finished elementary school and could not really read. The places where he had labored—coal mines, sawmills—were closed. He struggled to support his wife and eight children.” The president used Fletcher’s struggles as a springboard for his own announcement. “I have called for a national war on poverty,” he declared. “Our objective: total victory.” Years later, Cheves reports, Johnson still remembered the encounter. “My determination,” he wrote in his memoirs, “was reinforced that day to use the powers of the presidency to the fullest extent that I could, to persuade America to help all its Tom Fletchers.” Over the next five decades, the federal government would spend more than $20 trillion trying to achieve Johnson’s dream with social welfare programs such as Medicaid, food stamps, and Aid to Families with Dependent Children. 

Tom Fletcher personally received some of this largess: he got welfare benefits and found employment through government make-work initiatives, laboring on crews that cleared brush and picked up trash from roadsides. But he never held down a steady job, Cheves recounts, and although his standard of living rose along with the national average, he never made it out of poverty. By 1969, he no longer worked at all and relied instead on disability checks and other public assistance. After his first wife died, he married a woman four decades his junior, with whom he had two more children. In a cruel final twist, Fletcher’s second wife murdered one of those children (and tried to kill the other) as part of a scam to collect on their burial insurance. In 2004, with his wife still in prison, Fletcher died, never having gotten much closer to the American dream than he was when Johnson climbed onto his porch.

Visit the area today, and despite Johnson’s promises, you’ll see that idleness and depression still hang heavy in the air. In Inez, as across the country, the welfare state and modern technology have made joblessness and poverty less materially painful. Homes have electricity and running water. Refrigerators, personal computers, and cars are ubiquitous. Economic growth and innovation have delivered material abundance, and some of the War on Poverty’s programs have proved effective at bolstering struggling families. 

But even though poverty has become less materially miserable, it is no less common. In Martin County, just 27 percent of adults are in the labor force. Welfare is more common than work. Caloric deficits have been replaced by rampant obesity. Meanwhile, things aren’t much better on the national level. In 1966, when the War on Poverty programs were finally up and running, the national poverty rate stood at 14.7 percent. By 2014, it stood at 14.8 percent. In other words, the United States had spent trillions of dollars but seen no reduction in the poverty rate.

Of course, the poverty rate doesn’t take into account rising consumption standards or a variety of government transfers, from food stamps to public housing to cash assistance. But the calculations that determine it do include most of the income that Americans earn for themselves. So although the rate is a poor tool for gauging material conditions, it does capture trends in Americans’ ability to earn success. And what it shows is that progress on that front has been scant.

The War on Poverty has offered plenty of economic analgesics but few cures. This is a failure not just in the eyes of conservative critics but also according to the standard set by the man who launched the campaign. On signing the Appalachian Regional Development Act in March 1965, Johnson argued that the United States should aspire to more than simply sustaining people in poverty. “This nation,” he declared, “is committed not only to human freedom but also to human dignity and decency.” R. Sargent Shriver, a key Johnson adviser on the War on Poverty, put it even more explicitly: “We’re investing in human dignity, not doles.”

A man waits to apply for a job outside the offices of Local Union 46 in Queens, New York, April 2012.
Keith Bedford / Reuters


At its core, to be treated with dignity means being considered worthy of respect. Certain situations bring out a clear, conscious sense of our own dignity: when we receive praise or promotions at work, when we see our children succeed, when we see a volunteer effort pay off and change our neighborhood for the better. We feel a sense of dignity when our own lives produce value for ourselves and others. Put simply, to feel dignified, one must be needed by others. 

The War on Poverty did not fail because it did not raise the daily caloric consumption of Tom Fletcher (it did). It failed because it did nothing significant to make him and Americans like him needed and thus help them gain a sense of dignity. It also got the U.S. government into the business of treating people left behind by economic change as liabilities to manage rather than as human assets to develop.

The dignity deficit that has resulted is particularly acute among working-class men, most of whom are white and live in rural and exurban parts of the United States. In his recent book Men Without Work, the political economist (and American Enterprise Institute scholar) Nicholas Eberstadt shows that the percentage of working-age men outside the labor force—that is, neither working nor seeking work—has more than tripled since 1965, rising from 3.3 percent to 11.6 percent. And men without a high school degree are more than twice as likely to be part of this “un-working” class.

These men are withdrawing not only from the labor force but from other social institutions as well. Two-thirds of them are unmarried. And Eberstadt found that despite their lack of work obligations, these men are no more likely to spend time volunteering, participating in religious activities, or caring for family members than men with full-time employment.

That sort of isolation and idleness correlates with severe pathologies in rural areas where drug abuse and suicide have become far more common in recent years. In 2015, the Proceedings of the National Academy of Sciences published an extraordinary paper by the economists Anne Case and Angus Deaton. They found that, in contrast to the favorable long-term trends in life expectancy across the rest of the developed world, the mortality rate among middle-aged white Americans without any college education has actually risen since 1999. The main reasons? Since that year, among that population, fatalities due to chronic liver disease and cirrhosis have increased by 46 percent, fatalities from suicide have risen by 78 percent, and fatalities due to drug and alcohol poisoning are up by a shocking 323 percent.

Unsurprisingly, those left behind hold a distinctly gloomy view of the future. According to a survey conducted last year by the Kaiser Family Foundation and CNN, fewer than one-quarter of white Americans without a college degree expect their children to enjoy a better standard of living in the future than they themselves have today, and half of them believe things will be even worse. (In contrast, according to the same survey, other historically marginalized communities have retained a more old-school American sense of optimism: 36 percent of working-class blacks and 48 percent of working-class Hispanics anticipate a better life for their children.)

To be sure, rural and exurban whites who possess few in-demand skills and little education are hardly the only vulnerable group in the United States today. But the evidence is undeniable that this community is suffering an acute dignity crisis. Left behind every bit as much as the urban poor, millions of working-class whites have languished while elites have largely ignored them or treated them with contempt. 

Americans from all walks of life voted for Trump. But exit polls unambiguously showed that a central pillar of his support came from modern-day Tom Fletchers: Trump beat Hillary Clinton among white men without a college degree by nearly 50 percentage points. Tellingly, among counties where Trump outperformed the 2012 GOP candidate Mitt Romney, the margins were greatest in those places with the highest rates of drug use, alcohol abuse, and suicide.

Many analysts and policy experts saw Trump’s campaign as a series of sideshows and unserious proposals that, even if implemented, would not actually improve things for his working-class supporters. For example, academic research clearly shows that trade protectionism—a major theme of Trump’s campaign—is more likely to destroy jobs than create them. Yet Trump won regardless, because he was the first major-party nominee in decades who even appeared to care about the dignity of these working-class voters whose lives are falling apart.


If its goal is to instill dignity, the U.S. government does not need to find more innovative ways to “help” people; rather, it must find better ways to make them more necessary. The question for leaders, no matter where they sit on the political spectrum, must be, Does this policy make people more or less needed—in their families, their communities, and the broader economy? 

Some may ask whether making people necessary is an appropriate role for government. The answer is yes: indeed, it represents a catastrophic failure of government that millions of Americans depend on the state instead of creating value for themselves and others. However, it’s not enough to merely make people feel that they are needed; they must become more authentically, objectively necessary.

The single most important part of a “neededness agenda” is putting more people to work. The unemployment rate is relatively low today, at around 4.7 percent, after peaking at around ten percent in 2010, in the wake of the financial crisis. But the unemployment rate can be a misleading metric, since it does not take into account people who are no longer even looking for work. A more accurate measure of how many Americans are working is the labor-force participation rate: the percentage of working-age adults who are either employed or actively seeking work. That figure hit a peak of just over 67 percent in 2000 and has since fallen to around 63 percent today. The decline has been particularly pronounced among men. In 1954, 98 percent of prime-age American men (those between the ages of 25 and 54) participated in the labor force; today, that figure has fallen to 88 percent.

Involuntary unemployment saps one’s sense of dignity. According to the American Enterprise Institute economist Kevin Hassett, recent data suggest that a ten percent increase in the jobless rate may raise the suicide rate among men by almost 1.5 percent. And a study published by the sociologist Cristobal Young in 2012 found that receiving unemployment insurance barely puts a dent in the unhappiness that follows the loss of a job. Feeling superfluous triggers a deep malaise that welfare benefits do not even come close to mitigating.

Corn wilts during a drought in Glenham, South Dakota, August 2006.
Jonathan Ernst / Reuters

Increasing the labor-force participation rate will require significant tax and regulatory reforms to encourage more firms to locate and expand their operations in the United States. A logical first step would be to reform the draconian American approach to taxing corporations. On average, between federal and state policies, U.S. businesses pay a tax rate of around 39 percent. That is far above the worldwide average of 22.5 percent and even more out of alignment with the average rates paid by companies in Asia (20.1 percent) and Europe (18.9 percent). One promising, revenue-neutral plan, put forward by the economists Eric Toder and Alan Viard (the latter of the American Enterprise Institute), would cut the U.S. rate to 15 percent (in conjunction with other important structural reforms).

Putting more people to work must also become an explicit aim of the social safety net. Arguably, the greatest innovation in social policy in recent history was the Personal Responsibility and Work Opportunity Reconciliation Act of 1996. The PRWORA, which became synonymous with the phrase “welfare reform,” made several major changes to federal policy. It devolved greater flexibility to the states but established new constraints, such as a limit on how long someone could receive federal welfare benefits and a work requirement for most able-bodied adults. The PRWORA was denounced at the time as a callous right-wing scheme. Critics insisted that people were only jobless because there were no opportunities to work and that the new requirements would force single mothers and vulnerable children into poverty. The opposite has happened. According to the poverty expert Scott Winship, child poverty in single-parent homes has fallen by more than ten percent since 1996. Overall child poverty now sits at an all-time low.

This demonstrates that commonsense limits on welfare can increase people’s incentives to seek employment without crushing them or their families. Congress should apply that lesson to other programs. Housing vouchers and food stamps have weak work requirements that are rarely enforced. Simply bringing those requirements closer to the ones created by the PRWORA could help many Americans reenter the labor force.

Federal disability insurance, or SSDI, is in even more urgent need of reform. Many workers and employers have come to view SSDI as just another form of unemployment insurance. Its enrollment numbers have swelled by almost 40 percent since 2005, even as research offers no evidence of an accompanying uptick in actually disabling conditions. Economists have proposed several interesting ideas for curtailing this surge, which would keep more people in the work force. One plan would adjust employers’ payroll tax burdens depending on how frequently their workers enroll in SSDI; another would require employers to obtain private disability insurance policies, which have a better track record than SSDI when it comes to keeping employees in jobs where they are needed.

These policies represent fairly traditional conservative thinking, and as most conservatives would likely point out, putting them in place years ago might have mitigated much of the suffering that now afflicts so many Americans. But conservatives have failed to get their proposals enacted, in no small part because they have made the wrong arguments for them. Why reform taxes? “To boost earnings and GDP.” Why require work for welfare? “To make those lazy welfare queens work!” Such rhetoric has made good policies sound out of touch and inhumane. The most compelling reason for tax reform and further welfare reform is to create more opportunities for people at the periphery of society.

The truth is that not all good economic policy aligns perfectly with conservative orthodoxy. Take, for example, the challenge of helping low-wage workers earn enough to support their families. For years, conservatives have railed against increases in the minimum wage, citing evidence that such increases do not decrease poverty rates and may well destroy jobs at the bottom of the pay scale. Although well intentioned, minimum-wage policies are more likely to restrict poor Americans’ opportunities to earn a stable living than to enhance them. So governments at every level should forget about increasing minimum wages—which is where the usual conservative argument ends. But they should also experiment with reducing minimum wages to help people trapped in long-term unemployment, making these vulnerable people more attractive to hire. Governments would then supply those workers with direct wage subsidies to increase their take-home income. For example, Michael Strain of the American Enterprise Institute has proposed that the federal government let employers hire long-term unemployed people at $4 per hour and then itself transfer an additional $4 per hour to each of these workers. Another promising idea is the expansion of an existing subsidy, the Earned Income Tax Credit, a refundable tax credit for low-income people who work. The EITC prioritizes families but is less generous to individuals without children; Washington should consider increasing the credit for the latter. Such pro-work policies would help achieve the noble goal of ensuring that hard work results in sufficient rewards, without the negative consequences that accompany minimum-wage hikes.

Creating more opportunities for Americans to work would also require addressing the broken U.S. immigration system, which has a significant effect on the labor market. Economists disagree vigorously about the precise nature of that effect, but it’s reasonable to conclude that illegal immigration tends to moderately reduce wages in low-skill industries, whereas the legal immigration of high-skilled individuals has a positive effect on the overall economy and job creation. Congress and the Trump administration should therefore prioritize the enforcement of existing immigration laws, not through mass deportations but by targeting low-wage employers who hire and exploit illegal immigrants. But they should also significantly loosen the current quotas that limit the number of high-skilled immigrants who can enter the United States.

Volunteers help demolish a damaged house in Moore, Oklahoma, May 2013.
Lucas Jackson / Reuters


Making people more necessary will also require improving human capital through better education. At present, U.S. public schools leave millions of young people behind, especially the poor. This is not for lack of funding. According to the National Center for Education Statistics, U.S. government spending per pupil (adjusted for inflation) has more than doubled since 1970. Yet math and reading scores for 17-year-olds haven’t budged in four decades, and the achievement gap between poor and rich students has widened by about a third. 

Policies designed to increase competitive pressures on public schools—vouchers to allow low-income families to send their children to private schools, the devolution of more latitude to state and local authorities, and the expansion of charter schools—are the right place to begin. But these ubiquitous proposals are only the start.

For several generations, American education has moved away from teaching skills that help people specialize and gain greater job security. According to one trade association estimate, nearly 3.5 million manufacturing positions will be created over the next decade, but as many as two million may go unfilled. Another estimate suggests that the U.S. welding industry alone may face an imminent shortage of nearly 300,000 skilled workers. Much of the blame for such gaps goes to the “college or bust” mentality that pervades American society and has resulted in a disconnect between supply and demand in the blue-collar labor market. Employers in several sectors are begging for more workers, but many young adults don’t have the necessary skills because they were never encouraged to learn them. There’s a fairly easy policy fix for this problem. Career- and technical-training programs take, on average, only two years to complete, and students can attend them while still enrolled in high school. To get more students to pursue such options, governments should reallocate financial assistance toward trade schools and apprenticeship programs.

For that change to work, however, politicians and other influential figures will need to use moral suasion to attack the cultural fixation on gaining a four-year degree at any cost. More than 90 percent of high school seniors aspire to postsecondary education, and about 80 percent try it out within two years of graduating from high school, but only about 40 percent successfully earn a degree. That leaves too many young Americans with unfulfilled dreams, college debt, and no credentials or marketable skills—an outcome that could be avoided if they pursued a more practical direction.

Skills-based training isn’t only for the young. The crisis of dignity is most acutely felt among middle-aged populations that have been badly served by decades of lackluster federally funded job-training programs. Instead of relying on top-down directives from Washington, training programs should be embedded in the private sector and gently overseen by authorities at the state and local level, where officials could entice companies through tax incentives to train and hire workers who have been out of the labor force for long periods of time.


A public policy agenda focused on building dignity and neededness would mark a departure from the status quo, but not an unthinkable or radical one. But on their own, these policies would not produce the dramatic change that is necessary. Only a profound cultural shift can achieve that. 

Today, the top and the bottom of American society live in separate worlds. They do not attend school together, socialize together, or work together. They hardly know each other. As a result, few people in either of these two Americas even recognize the social trends that are widening the cultural gulf between them. Some differences are trivial, such as regional accents or entertainment preferences. Other differences, however, are more consequential: for example, the birthrate among unmarried mothers. Whereas less than ten percent of births to college-educated women occur out of wedlock, the comparable figure for women with only a high school degree or less is more than 50 percent. Children born out of wedlock are more likely to grow up without a father, and those brought up in such circumstances are less likely to graduate from high school, more likely to suffer from mental health problems, and less likely to work later in life. In other words, class-based cultural differences are more than a matter of curiosity. They are a major factor in producing the misery that so many Americans experience.

Of course, the United States does not need a cabinet-level secretary of middle-class morals. But legislators and officials should try to ensure that any social policy passes a simple test: Does it weaken family integrity or social cohesion—for example, by encouraging single parenthood, fragmenting communities, erecting barriers to religious expression, or rewarding idleness?

Moral suasion can be even more powerful than policy. Before elites on the left and the right do battle over policy fixes, they need to ask themselves, “What am I personally doing to share the secrets of my success with those outside my social class?” According to the best social science available, those secrets are not refundable tax credits or auto-shop classes, as important as those things might be. Rather, the keys to fulfillment are building a stable family life, belonging to a strong community, and working hard. Elites have an ethical duty to reveal how they have achieved and sustained success. Readers can decide for themselves whether this suggestion reflects hopeless paternalism, Good Samaritanism, or perhaps both.


A few months after the launch of the War on Poverty in 1964, voters in Kentucky’s Martin County headed to the polls to choose the next president of the United States. They rewarded the candidate who had traveled there, listened to them, and pledged to fight for their dignity. The deeply conservative community, where Richard Nixon had easily won in the 1960 presidential contest, made a brief exception: Johnson, a liberal Democrat, won Martin County with just over 51 percent of the vote. The outcome of the 2016 election was similar in one important respect: the man who swept Martin County with a staggering 89 percent of the vote was the candidate who had promised to return dignity to its people.

But merely backing the winning candidate will not guarantee dignity for today’s Tom Fletchers. The War on Poverty proved that beyond all doubt, having led to five decades of debt and welfare dependence, which, when blended with the Great Recession, helped produce the anger and disillusionment that drove the current populist surge.

Many elites and officials have reacted to Trump’s victory with a combination of shock, alarm, and depression. But they should see it as an opportunity for learning and reform, and they should respond with a positive policy agenda that is radically pro-work and serious about developing human capital. And they should learn to treat people at the periphery of society—from Inez to Detroit to the Rio Grande Valley—with enough respect to share with them the cultural and moral norms that can bring happiness and success in life. Doing so would be politically prudent. But much more important, it would help fulfill the moral obligation that leadership brings: to maximize the inherent dignity that all Americans are born with, remembering that we all possess a deep need to be needed.
