We are living in a time of crisis. From the immediate challenge of the COVID-19 pandemic to the looming existential threat of climate change, the world is grappling with massive global dangers—to say nothing of countless problems within countries, such as inequality, cyberattacks, unemployment, systemic racism, and obesity. In any given crisis, the right response is often clear. Wear a mask and keep away from other people. Burn less fossil fuel. Redistribute income. Protect digital infrastructure. The answers are out there. What’s lacking are governments that can translate them into actual policy. As a result, the crises continue: the death toll from the pandemic skyrockets, the world makes dangerously slow progress on climate change, and so on.

It’s no secret how governments should react in times of crisis. First, they need to be nimble. Nimble means moving quickly, because problems often grow at exponential rates: a contagious virus, for example, or greenhouse gas emissions. That makes early action crucial and procrastination disastrous. Nimble also means being adaptive. Policymakers need to continuously adjust their responses to crises as they learn from their own experience and from the work of scientists. Second, governments need to act wisely. That means incorporating the full range of scientific knowledge available about the problem at hand. It means embracing uncertainty, rather than willfully ignoring it. And it means thinking in terms of a long time horizon, rather than merely until the next election. But so often, policymakers are anything but nimble and wise. They are slow, inflexible, uninformed, overconfident, and myopic.
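The arithmetic behind that point is easy to sketch. In the following illustration (the specific numbers are hypothetical, not from the article), a problem that doubles every three days becomes roughly 25 times larger if action is delayed by just two weeks:

```python
# Illustrative sketch: why early action matters when a problem grows
# exponentially. Numbers are hypothetical, chosen only for illustration.

def cases_after(days, initial=100, doubling_time=3.0):
    """Exponential growth: size doubles every `doubling_time` days."""
    return initial * 2 ** (days / doubling_time)

act_now = cases_after(0)      # problem size at the moment action is possible
act_later = cases_after(14)   # the same problem after a two-week delay
print(round(act_later / act_now, 1))  # ~25.4x larger
```

The same logic is why procrastination is disastrous: each increment of delay does not add a fixed cost but multiplies the eventual one.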

Why are governments doing so badly? Part of the explanation lies in the inherent qualities of crises. Crises typically require navigating between risks. In the COVID-19 pandemic, policymakers want to save lives and jobs. With climate change, they seek a balance between avoiding extreme weather and allowing economic growth. Such tradeoffs are hard as it is, and they are further complicated by the fact that costs and benefits are not evenly distributed among stakeholders, making conflict a seemingly unavoidable part of any policy choice. Vested interests attempt to forestall needed action, using their money to influence decision-makers and the media. To make matters worse, policymakers must pay sustained attention to multiple issues and multiple constituencies over time. They must accept large amounts of uncertainty. Often, then, the easiest response is to stick with the status quo. But that can be a singularly dangerous response to many new hazards. After all, with the pandemic, business as usual would mean no social distancing. With climate change, it would mean continuing to burn fossil fuels.

But the explanation for humanity’s woeful response to crises goes beyond politics and incentives. To truly understand the failure to act, one must turn to human psychology. It is there that one can grasp the full impediments to proper decision-making—the cognitive biases, emotional reactions, and suboptimal shortcuts that hold policymakers back—and the tools to overcome them. 

Avoiding the Uncomfortable

People are singularly bad at predicting and preparing for catastrophes. Many of these events are “black swans,” rare and unpredictable occurrences that most people find difficult to imagine, seemingly falling into the realm of science fiction. Others are “gray rhinos,” large and not uncommon threats that are still neglected until they stare you in the face (such as a coronavirus outbreak). Then there are “invisible gorillas,” threats in full view that should be noticed but aren’t—so named for a psychological experiment in which subjects watching a clip of a basketball game were so fixated on the players that they missed a person in a gorilla costume walking through the frame. Even professional forecasters, including security analysts, have a poor track record when it comes to accurately anticipating events. The COVID-19 crisis, in which a dystopic science-fiction narrative came to life and took everyone by surprise, serves as a cautionary tale about humans’ inability to foresee important events. 

Not only do humans fail to anticipate crises; they also fail to respond rationally to them. At best, people display “bounded rationality,” the idea that instead of carefully considering their options and making perfectly rational decisions that optimize their preferences, humans in the real world act quickly and imperfectly, limited as they are by time and cognitive capacity. Add in the stress generated by crises, and their performance gets even worse.

Because humans don’t have enough time, information, or processing power to deliberate rationally, they have evolved easier ways of making decisions. They rely on their emotions, which serve as an early warning system of sorts: alerting people that they are in a positive context that can be explored and exploited or in a negative context where fight or flight is the appropriate response. They also rely on rules. To simplify decision-making, they might follow standard operating procedures or abide by some sort of moral code. They might decide to imitate the action taken by other people whom they trust or admire. They might follow what they perceive to be widespread norms. Out of habit, they might continue to do what they have been doing unless there is overwhelming evidence against it. 

Humans evolved these shortcuts because they require little effort and work well in a broad range of situations. Without access to a real-time map of prey in different hunting grounds, for example, a prehistoric hunter might have resorted to a simple rule of thumb: look for animals where his fellow tribesmen found them yesterday. But in times of crisis, emotions and rules are not always helpful drivers of decision-making. High stakes, uncertainty, tradeoffs, and conflict—all elicit negative emotions, which can impede wise responses. Uncertainty is scary, as it signals an inability to predict what will happen, and what cannot be predicted might be deadly. The vast majority of people are already risk averse under normal circumstances. Under stress, they become even more so, and they retreat to the familiar comfort of the status quo. From gun laws to fossil fuel subsidies, once a piece of legislation is in place, it is hard to dislodge it, even when cost-benefit analysis argues for change.

Another psychological impediment to effective decision-making is people’s natural aversion to tradeoffs. Tradeoffs serve as a reminder that we cannot have it all, that concessions need to be made in some areas to gain in others. For that reason, people often employ decision rules that are far from optimal but minimize their awareness of the need for tradeoffs. They might successively eliminate options that do not meet certain criteria—for example, a user of a dating app might screen people based on height and then miss someone who would have been the love of his or her life but was half an inch too short. Tradeoffs between parties make for conflict, and people dislike conflict, too. They see it not as an opportunity to negotiate joint gains but as a stressful confrontation. Years of teaching negotiation have shown me that although everybody understands that negotiations involve dividing a finite pie (with the conflict that entails), it is much harder to get across the idea that they are also often about creating solutions that make all sides better off.

Believing Is Seeing

A further hindrance to crisis response is the lack of an easily identified culprit. Some crises, such as military standoffs during the Cold War or, more recently, terrorist attacks, have clear causes that can be blamed and villains who can be fought. But many others—the pandemic and climate change being prime examples—do not. They are more ambiguous, as they are caused by a range of factors, some proximate, others not. They become catastrophes not because of any particular trigger or evildoer but because of the action or inaction of policymakers and the public. When it isn’t clear who is friend and who is foe, it’s difficult to see a clear and simple path of action. 

Psychologists speak of the “single-action bias,” the human tendency to consider a problem solved with a single action, at which point the sense that something is awry diminishes. For example, one study found that radiologists will stop scrutinizing an x-ray for evidence of pathology after they have identified one problem, even though multiple problems may exist. This bias suggests that humans’ preferred way of dealing with risks evolved during simpler times. To avoid being killed by lions at the watering hole, there was an easy, one-step solution: stay away from the lions. But today, many crises have no culprit. The enemy is human behavior itself, whether that be the burning of fossil fuels, the consumption of virus-infected animals, or the failure to wear masks or abide by social-distancing rules.

The solutions to these problems are often inconvenient, unpopular, and initially expensive. They involve making uncomfortable changes. When that is the case, people tend to exploit any ambiguity in the cause of the problem to support alternative explanations. When the COVID-19 pandemic began, for instance, some embraced a conspiracy theory that falsely claimed that the virus was the intentional product of a Chinese lab. For many, that idea was easier to swallow than the scientific consensus that the virus emerged from bats. Indeed, in a survey of Americans that my colleagues and I conducted in April, a mind-boggling 29 percent of respondents held this view. 

Another psychological barrier to effective governance in times of crisis relates to how people learn and revise their beliefs. If people followed the Bayesian method of inference, they would update their beliefs in the face of new information. Over time, as more and more information became available, a consensus would emerge—for example, that climate change is caused by human activity. But not everyone sees and acknowledges the same new information and integrates it in the same rational way. In practice, they give more weight to concrete personal experience than abstract statistical information. The death of a single close friend from COVID-19 is much more of a wake-up call than a news report about high infection rates. Someone who loses a house in a wildfire will grasp the risk of climate change more than someone who looks at a graph of rising temperatures. Personal experience is a powerful teacher, far more convincing than pallid statistics provided by scientific experts, even if the latter carry far greater evidentiary value.
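The Bayesian benchmark invoked above can be made concrete. In the sketch below (all numbers are hypothetical, chosen only for illustration), an observer who starts out skeptical but weighs each new piece of supporting evidence by Bayes’ rule converges steadily toward near certainty—precisely the rational updating that, as the paragraph above notes, real people often fail to perform:

```python
# Illustrative sketch of Bayesian belief updating. An observer starts
# with a skeptical prior that a crisis is real and updates after each
# piece of supporting evidence. All probabilities are hypothetical.

def bayes_update(prior, p_evidence_if_true=0.8, p_evidence_if_false=0.3):
    """Bayes' rule: P(crisis | evidence) after one observation."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

belief = 0.1  # skeptical starting point
for _ in range(5):  # five independent supporting observations
    belief = bayes_update(belief)
print(round(belief, 2))  # ≈ 0.94: near-consensus after repeated evidence
```

The failure mode described in the text amounts to distorting this calculation: overweighting one vivid personal experience and underweighting the accumulated statistical evidence.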

People vastly underestimate the likelihood of low-probability events, until they personally experience one. At that point, they react, and perhaps even overreact, for a short while, until the perceived threat recedes again. After an official is the victim of an email hack, for example, he or she may take greater cybersecurity precautions for a while but will likely become less vigilant as the months go on.

The value of personal experience is reflected in the phrase “seeing is believing.” But the opposite can also be the case: sometimes, believing is seeing. In other words, people who are committed to their beliefs, especially when those beliefs are shared by ideological allies, will pay selective attention to information that confirms their preexisting notions and fail to see evidence that contradicts them. That is why, over time, people often grow more divided, rather than more united, about the causes of and solutions to crises. Beliefs about COVID-19 and climate change have become more polarized, with Democrats more likely to subscribe to science-based explanations of both crises and express greater concern and Republicans more likely to endorse conspiracy theories that downplay the risks.

The Self-Aware Policymaker

One response to all these psychological biases is for officials to change their ways and embrace more rational decision-making processes, which would lead to better policies. They would need to acknowledge the true extent of their ignorance about future events and creatively guard against high-impact surprises, both foreseeable and unforeseeable. (With the COVID-19 crisis, for example, they would plan for the possibility that a vaccine cannot be identified or proves short-lived.) Policymakers would seek to guide and educate the public rather than follow it. Some might view this approach as paternalistic, but it need not be, provided that it is implemented with input from groups across society. Indeed, people regularly delegate decision-making to those with greater expertise—going to a doctor for a diagnosis, for instance, or letting a lawyer handle legal issues. In principle, at least, elected officials are supposed to take care of the big-picture strategic planning that individuals don’t have the time, attention, or foresight to do themselves.

It might seem as if the politician who deviates from public opinion to think about more long-term problems is the politician who fails to get reelected. But public opinion is malleable, and initially unpopular changes can gain support over time. In 2003, for example, New York City banned smoking in restaurants and bars. After an initial outcry and a drop in Mayor Michael Bloomberg’s popularity, the city came to see that the new policy was not as detrimental as originally thought, support for the ban rose, and Bloomberg won reelection twice. In 2008, the Canadian province of British Columbia also instituted an unpopular policy: a carbon tax on fossil fuels. Again, disapproval was followed by acceptance, and the province’s premier, Gordon Campbell, won an election the next year. Some reforms don’t poll well at first, but it would be a mistake to see failure as a foregone conclusion. Passing initially unpopular reforms may require creative policies and charismatic politicians, but eventually, the public can come around. 

A man wearing a decorated face mask in New York City, May 2020 (Mike Segar / Reuters)

Another approach to improving crisis decision-making would be to work with, rather than against, psychological barriers. In 2017, the Behavioral Science and Policy Association published a report that identified four categories of policy problems with which the insights of psychology could help: “getting people’s attention; engaging people’s desire to contribute to the social good; making complex information more accessible; and facilitating accurate assessment of risks, costs, and benefits.” The experts behind the report came up with a variety of tools to meet these objectives. One recommendation was that policymakers should set the proper default—say, automatically enrolling households in energy-reduction programs or requiring that new appliances be shipped with the energy-saving settings turned on. Another was that they should communicate risks using a more intuitive time frame, such as speaking about the probability of a flood over the course of a 30-year mortgage rather than within 100 years. 

In the same spirit, the cognitive scientist Steven Sloman and I put together a special issue of the journal Cognition in 2019 to examine the thought processes that shape the beliefs behind political behavior. The authors identified problems, such as people’s tendency to consume news that confirms their existing beliefs and to let their partisan identities overpower their ability to evaluate probabilities rationally. But they also identified solutions, such as training people to better understand the uncertainty of their own forecasts. Policymakers need not take public opinion as an immutable barrier to progress. The more one understands how people think, feel, and react, the more one can use that information to formulate and implement better policy. 

The field of psychology has identified countless human biases, but it has also come up with ways of countering their effects. Psychologists have developed the concept of choice architecture, whereby decisions are structured in such a way as to nudge people toward good choices and away from bad ones. When companies automatically enroll their employees in retirement plans (while allowing them to opt out), the employees are more likely to save. When governments do the same with organ donation, people are more likely to donate. Psychologists also know that although playing on negative emotions, such as fear or guilt, can have undesirable consequences, eliciting positive emotions is a good way to motivate behavior. Pride, in particular, is a powerful motivator, and campaigns that appeal to it have proved effective at convincing households to recycle and coastal communities to practice sustainable fishing. All these techniques are a form of psychological jujitsu that turns vulnerabilities into strengths.

Effective public leaders understand and use the richness of human behavior. German Chancellor Angela Merkel comes to mind. Combining the rationality of the scientist she was with the human touch of the politician she is, she has proved adept at managing emergencies, from Europe’s currency crisis to its migration crisis to the current pandemic. Such leaders are evidence-based, analytic problem solvers, but they also acknowledge public fears, empathize with loss and pain, and reassure people in the face of uncertainty. They are not prisoners of psychology but masters of it.

ELKE U. WEBER is Gerhard R. Andlinger Professor in Energy and the Environment and Professor of Psychology and Public Affairs at Princeton University.