It has been more than eight years since many of the United States’ cashiers, dishwashers, janitors, lifeguards, baggage handlers, baristas, manicurists, retail employees, housekeepers, construction laborers, home health aides, security guards, and other minimum-wage workers last got a raise. The federal minimum wage now stands at just $7.25. In real terms, these workers’ earnings have declined by nearly 13 percent since the last hike, in 2009—and have fallen by over one-third since 1968, when the real federal minimum wage was at its peak of $11.38 in today’s money (although only $1.60 then). Although most Americans think the minimum wage should go up—one 2017 poll found that 75 percent supported raising it to $9.00 per hour—today’s Republican-controlled Congress is unlikely to act.
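The inflation adjustment behind these figures is simple to reproduce. Here is a minimal sketch using only the article's own numbers; the implied 1968-to-today price-level ratio is backed out from those figures, not taken from official CPI tables:

```python
# Sketch of the real-wage arithmetic above. The price-level ratio is
# derived from the article's own figures ($1.60 in 1968 ≈ $11.38 today),
# not from official CPI data.
def in_todays_dollars(nominal, price_ratio):
    """Convert a past nominal wage into today's dollars."""
    return nominal * price_ratio

price_ratio_1968 = 11.38 / 1.60                       # ≈ 7.1x price level since 1968
peak_real = in_todays_dollars(1.60, price_ratio_1968)  # $11.38, the 1968 peak

# Decline from the 1968 peak to today's $7.25 federal minimum:
decline = 1 - 7.25 / peak_real                        # ≈ 0.363, i.e. over one-third
```

The same arithmetic with the 2009 price level yields the nearly 13 percent real decline since the last hike.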
But the lack of progress on Capitol Hill should not give one the impression that little is happening with regard to the minimum wage. In fact, never has there been so much action—it’s just that it is happening at the state and, increasingly, city levels. The “Fight for 15” has become a rallying cry on the left and has resulted in some notable successes. Twenty-nine U.S. states plus the District of Columbia now have minimum wages that exceed the federal minimum, as do about 40 municipalities.
Proponents of the minimum wage claim that a high minimum wage is the best way to ensure an acceptable standard of living for all Americans, whereas opponents counter that it is likely to destroy jobs. In the debate between these two camps, feelings often run high. But behind the emotion, economics, both theoretical and empirical, can help one make sense of the issues at stake. The bottom line is that there is not much evidence that the minimum wage is currently a job killer in the United States, and so there is room for it to go up. Raising the minimum wage, however, is not a particularly effective tool to combat poverty and share the benefits of growth.
The recent experiments with minimum-wage increases in the United States have not gone uncontested. In 2013, voters in SeaTac, Washington, made their city the first in the country to raise its minimum wage to $15 per hour. During the campaign, groups supporting and opposing the measure spent a combined $264 per vote, one of the highest figures on record. And 27 states have passed laws requiring cities to abide by the state minimum. Those behind such pushback make an appealing argument: that employers forced to pay above-market wages will choose to cut their payroll.
Opponents of the minimum wage have long invoked economic theory to claim that the measure destroys jobs. “Just as no physicist would claim that ‘water runs uphill,’ no self-respecting economist would claim that increases in the minimum wage increase employment,” the Nobel Prize–winning economist James Buchanan wrote in The Wall Street Journal in 1996. “Such a claim, if seriously advanced, becomes equivalent to a denial that there is even minimal scientific content in economics, and that, in consequence, economists can do nothing but write as advocates for ideological interests.”
Strong stuff, but wrong stuff. That view is rooted in either ignorance or dishonesty. What Buchanan was invoking is the textbook model of perfect competition, taught as a benchmark in many introductory economics courses. That model predicts that raising wages above the level the market would dictate will move the economy from a point where the demand for and supply of labor are in balance to one where demand falls short of supply—in other words, to a place with fewer jobs. On this logic, the law of demand tells us that the minimum wage must destroy jobs just as surely as the law of gravity tells us that an apple will fall to the ground.
But it is important not to mistake the labor market on planet Econ 101 for the labor market on planet Earth. The predictions of this model are not akin to the laws of physics, and alarm bells should go off anytime an economist, even a Nobel laureate, claims that they are.
The predictions of any economic model are only as good as the assumptions behind it. And the assumption of perfect competition is a bad approximation for real-world labor markets. Perfect competition assumes that labor markets are frictionless, meaning that hiring workers costs employers nothing in time or money (so there are no vacancies) and that workers can always find an alternative to their current job (so there is no unemployment). In that world, finding or losing a job is no big deal, and if an employer tried to cut wages by just one cent, all its workers would leave right away.
In the real world, of course, workers celebrate getting jobs and grow depressed when they lose them, since it takes time to find a new one. Employers, for their part, know that it takes time and money to hire workers. And they know that if they lower wages, they may find it harder to recruit and retain workers, but not all of the existing ones will quit immediately. There are economic models that capture these features, perhaps taught more often in Econ 301 than in Econ 101. And these models imply that the relationship between minimum wages and employment is more complicated than the model of perfect competition predicts.
In a perfectly competitive market, a minimum wage above the market-clearing level moves the labor market to a place where employment is determined by the demand for labor alone. Some people are unemployed, but no company has any difficulty filling vacancies for minimum-wage workers. In a more realistic model, by contrast, employment is influenced by both demand and supply factors. Vacancies and unemployment coexist, because open positions cannot be filled instantaneously with the unemployed who want jobs. An increase in the minimum wage may depress the demand for labor, but by making work more attractive, it also raises the supply of labor. The overall employment level, therefore, may go up or may go down.
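The contrast can be made concrete with a toy version of the frictionless "Econ 101" market. The linear demand and supply curves below are illustrative assumptions, not estimates; the point is only that in this model a binding wage floor mechanically lowers employment, because the short side of the market—demand—sets it:

```python
# Toy frictionless labor market (illustrative linear curves, not data).
def demand(w):            # employers hire fewer workers at higher wages
    return max(0.0, 100 - 5 * w)

def supply(w):            # more people offer to work at higher wages
    return max(0.0, 10 * w - 20)

# The curves cross at w = 8 (demand = supply = 60): the market-clearing wage.
def employment(min_wage, clearing_wage=8.0):
    w = max(min_wage, clearing_wage)   # the floor matters only if it binds
    return min(demand(w), supply(w))   # short side of the market trades

employment(7.25)   # 60.0: non-binding floor, market-clearing employment
employment(10.0)   # 50.0: binding floor, employment set by demand alone
```

A frictional model of the kind described above would not share this mechanical prediction; there, employment can move in either direction, which is why the question is ultimately empirical.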
Economic theory does not imply that a minimum wage always destroys jobs, nor does it imply that it will never do so. All it tells us is that the impact of minimum wages on jobs is not an issue that can be settled with pen and paper; it is an empirical question.
... AND IN PRACTICE
Until 25 years ago, the conventional wisdom among economists was that the empirical evidence proved that the minimum wage destroys jobs and that the only interesting question was, How many? But beginning in the 1990s, new research cast doubt on that conclusion. The key work was the 1995 book by David Card and Alan Krueger, Myth and Measurement: The New Economics of the Minimum Wage, which concluded that much of the existing literature was flawed. Krueger’s position as chief economist at the U.S. Department of Labor at a time when the Clinton administration was advocating a raise in the minimum wage only increased the controversy surrounding the book.
The debate on the impact of the minimum wage has raged back and forth ever since. Each new study claiming that the minimum wage costs jobs is met by another study critiquing it, and vice versa. Much of this debate concerns issues of little practical importance, generating more heat than light. A great deal of the research, for example, focuses on teenagers, the group most affected by the minimum wage, with 25 percent of them reporting hourly wages at or below the minimum. But teens account for a small and shrinking share of total employment, now representing less than two percent of all hours worked in the United States. And less than ten percent of all minimum-wage hours are worked by teens, down from 25 percent in 1979. So findings about teenagers have little to say about the minimum wage’s overall impact on employment. Nonetheless, a Congressional Budget Office report in 2014 used evidence (and selective evidence at that) from the teen labor market to estimate the impact of proposed increases in the federal minimum wage on total employment.
Setting these problems aside, it is important to note that although some studies claim to have found that minimum wages in the United States reduce employment, none of those findings is very robust, statistically speaking. If a different (but equally reasonable) model of the relationship between minimum wages and employment is used, the job-killing results often disappear. This conclusion is borne out by a number of meta-studies that have combined the estimates from many different studies to get a sense of the average effect. There is no reason to think that, up to the minimum-wage levels observed in these data, an increase will cause workers to lose their jobs.
Although the evidence for the minimum wage as a job killer is thin, there is some evidence that it has other negative effects. It does appear to push up prices to some degree, although the effect is small, because minimum-wage labor often accounts for a small share of the cost of a good or service. A study of a 25 percent increase in the minimum wage in San Jose, California, in 2013 found that restaurant prices rose by just 1.45 percent.
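The small price effect is consistent with back-of-envelope pass-through arithmetic. The labor-cost share in the sketch below is a hypothetical figure chosen for illustration, not a number reported in the San Jose study:

```python
# Back-of-envelope pass-through sketch. The 6% cost share for
# minimum-wage labor is an illustrative assumption, not study data.
def price_rise(wage_increase, min_wage_cost_share):
    """Price increase if higher wage costs are fully passed through."""
    return wage_increase * min_wage_cost_share

price_rise(0.25, 0.06)   # 0.015: a 25% wage hike raises prices ~1.5%
```

Even with full pass-through, a small cost share keeps the price effect small, close to the 1.45 percent observed in San Jose.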
There is good evidence, however, that the minimum wage boosts the incomes of the workers earning it. Whereas the studies of the impact of minimum wages on employment are conflicting, the studies of the impact on earnings point in one direction: not only do workers paid the minimum wage see their incomes go up accordingly, but so do those paid slightly above the old minimum.
Studies of other countries support the conclusion that the minimum wage can increase pay without reducing employment. In 1999, for example, the United Kingdom introduced a national minimum wage; an independent commission that has researched the impact extensively has yet to find any strong evidence of job losses. That has remained the case after 2016, when the government introduced a much higher minimum wage for workers over the age of 25, which now stands at about $9.80 per hour. In 2015, Germany introduced a national minimum wage (currently about $10.30 per hour), and early studies of its impact also suggest little effect on jobs. Opponents of the minimum wage often look to France for evidence for their side, since the country has both a high minimum wage (about $11.30 per hour) and high unemployment (nearly ten percent). But the main brake on employment there is probably not the minimum wage; it is restrictive labor laws.
Policymakers around the world have begun to take notice of all this accumulating evidence. The International Monetary Fund and the Organization for Economic Cooperation and Development—institutions that once criticized minimum wages—now recommend that when set at a reasonable level, a minimum wage should be part of a well-designed labor policy.
But what is a reasonable level? The experiences of the United Kingdom and Germany may be the most relevant for the U.S. debate, because the minimum wage is higher in those countries than in the United States, both in absolute terms and as a share of the median worker’s earnings. (There are some U.S. states, such as Arkansas and West Virginia, where the minimum wage represents a similar fraction of the median worker’s earnings, but not many.) The United States thus appears to have considerable room for an increase. Moving the federal minimum to around $10.50 would be very unlikely to raise unemployment. Indeed, a minimum wage at that level would constitute a similar proportion of the median worker’s earnings as do the minimum wages on the books in the United Kingdom and Germany. And it would still be less, in real terms, than the minimum wage in place in 1968.
HOW HIGH IS TOO HIGH?
At some point, a high enough minimum wage would start to destroy jobs. The problem is that no one can say where exactly that point lies. I know of only one study for any country that seems to demonstrate a clear negative impact of a national minimum wage, a study of Denmark’s experience by Claus Thustrup Kreiner, Daniel Reck, and Peer Ebbesen Skov. When Danes turn 18, the minimum wage rises by 40 percent, to something close to $15 per hour, and the study found that their employment rate dropped by a third. Employers switch to employing 17-year-olds, who are similarly skilled but can be paid much less. That study, however, does not necessarily tell us about the impact of a minimum-wage rise that affects all workers, as most employers cannot switch to similar but cheaper labor. And studies of similar age-related rises in the United Kingdom have found no effect on employment.
But the United States may soon find out what happens when the minimum wage rises dramatically. If implemented, the proposed measures in many U.S. cities would increase the minimum wage to new highs and affect a larger-than-ever share of workers. After Seattle increased its hourly minimum wage to $13, part of a projected plan to increase it to $15, the city commissioned a study by economists at the University of Washington to look into the effects. The study, released in June 2017, concluded that the measure had harmed the low-wage workers it was intended to help, since it caused employers to reduce hours and delay new hiring. But those findings have been contested and are unlikely to be the final word; there will be many other studies to come, of Seattle and elsewhere.
What is likely, however, is that some credible study finding that the minimum wage has cost jobs will eventually turn up. This is almost inevitable, because those campaigning for a higher minimum wage are not motivated by a finely calibrated assessment of the costs and benefits to low-wage workers. Rather, they have rallied around the cause of a “living wage,” meaning an income that would allow workers to enjoy what could be regarded as a decent standard of living—something very different from economists’ idea of the highest wage that a labor market can bear. Market economies do not on their own guarantee that all people will be able to find an employer prepared to pay them a wage sufficient for a decent standard of living for themselves and their families. If activists determine that a single mother of two must make, say, $20 an hour to get by, and they succeed in convincing the legislature to raise the minimum wage to that level, then it is entirely plausible that unemployment will rise.
When a study showing such an effect does arrive, it will be important to draw the right inference. Some will incorrectly conclude that the minimum wage costs jobs, full stop. The better conclusion is that such a study tells us something about the appropriate level for a minimum wage, not about whether a minimum wage should be used at all. Economists would do well to shift their research focus accordingly. The question should be not, What is the effect of the minimum wage on employment? but, What is the appropriate level for the minimum wage?
WAGES AND WANT
Even though most sweeping arguments against minimum-wage increases have proved misguided, it is important to be realistic about what the policy can achieve. Supporters often argue that the minimum wage reduces poverty. That is true, but it is a particularly blunt instrument for doing so. As an hourly rate, the minimum wage on its own reveals little about the household income of those who earn it. That depends on how many hours are worked, how many other adults there are in the household and how much they earn, and how many dependents there are. A minimum wage of $7.25 means something very different for a teenager working a summer job as a lifeguard than it does for a woman who has to support her family of four through shifts at McDonald’s. A federal minimum wage of $15 would mean that the vast majority of households would have earnings above the poverty line if they had at least one full-time worker. But there would still be some households with only part-time workers. Moreover, given how high $15 is compared to the historical high-water mark and compared to the minimum wage in similar countries, such a level would most likely be reached not on a nationwide basis but only in certain cities. In other words, the minimum wage is unlikely to eliminate poverty on its own.
Policies such as the earned-income tax credit, a federal benefit linked to a taxpayer’s income and number of dependents, are also needed. But the earned-income tax credit shouldn’t be used alone, either. The risk with a tax credit is that employers will use the federal subsidy as a reason to cut wages, effectively pocketing some of the income that was intended for low-income workers. The minimum wage stops employers from doing this.
There is another important limit to the minimum wage. The United States doesn’t just have a problem with low wages at the bottom of the income distribution; it also has a problem generating growing real wages for the average worker. The gains from what economic growth there has been have flowed disproportionately to those at the top of the income distribution, who already have the most. The United States is not the only Western economy grappling with stagnant or declining living standards for the average citizen, and in recent years, populists on both the left and the right have fed off the resulting widespread dissatisfaction with self-serving political elites. On the left, one symptom of this discontent has been the push for much higher minimum wages.
But the minimum wage is very unlikely to ever affect the earnings of the average worker, and so it will do little to reverse the economic forces that have upended politics around the globe. There is evidence that minimum wages can boost earnings for people making more than the minimum, but these spillover effects peter out before one gets anywhere close to the average worker. If the United States is to share the benefits of growth across the population, it will take much more than a higher minimum wage to do so.