The United States’ employment rate—the share of working-age persons in a paying job—rose steadily throughout the second half of the twentieth century. In the 1980s and 1990s, this increase stood out among affluent democracies, a number of which were then experiencing low job growth and persistently high levels of unemployment. Commentators took to referring to the U.S. economy as the “great American jobs machine.”
This article is the second in a two-part series. Part One looked at the decline in men’s employment since 1970.
But that trend ended around 2000, as Figure 1 shows. The employment rate was about four percentage points lower in 2016 than it had been in 2000, and nearly nine percentage points lower than a projection based on 1950–2000 trends would have predicted.
The United States’ recent employment record isn’t just disappointing when compared to past performance. It is also mediocre relative to job growth in other affluent nations. Among 21 rich, long-standing democratic countries, the U.S. employment rate was the fifth highest in 2000. It is now fifteenth. What has gone wrong, and how can it be fixed?
DEFINING THE PROBLEM
The most obvious culprit is the Great Recession. Its onset in 2008 cut short the employment-growth phase of the 2000s business cycle, limiting it to just four years, compared with seven years in the 1980s and eight years in the 1990s. And the fall in the employment rate during the Great Recession, four percentage points between 2007 and 2010, was more severe than in any other downturn since the 1930s.
A second factor contributing to the recent jobs slowdown is a fall in government employment. As Figure 2 shows, the share of the working-age population employed by federal, state, and local governments increased until 1980 and then flattened out in the 1980s and 1990s. Since 2000 it has dropped by about one percentage point.