The movement toward unmanned weapons systems seems inevitable. In a recent discussion with Defense News, Bradford Tousley, the director of the Tactical Technology Office at the Defense Advanced Research Projects Agency (DARPA), explained that unmanned technology is the “natural evolutionary path” in the future of warfare. That view is widely shared among defense policy experts. Unmanned and autonomous weapons systems have been central to the Department of Defense’s Third Offset Strategy, which calls for greater investments in these technologies to compete with rivals such as China. Today, each branch of the U.S. military is investing heavily in the research, development, and use of autonomous and unmanned systems.
Perhaps the best known of these new weapons systems are unmanned aerial vehicles—known as UAVs in the military but colloquially referred to as drones. Since 2002, the United States has carried out almost 800 drone strikes in Pakistan, Somalia, and Yemen; since 2015, it has conducted 3,911 in Afghanistan alone. And the United States’ reliance on unmanned weapons is growing: since U.S. President Donald Trump took office in January 2017, the rate of unmanned strikes has increased markedly.
Drones appear to provide a useful example of how militaries can bring unmanned systems onto the battlefield. According to their advocates, unmanned systems of the future will not only increase combat effectiveness by improving decision-making and targeting accuracy but also give the United States the ability to pursue national security goals without risking the lives of its military personnel. Despite the strong push toward unmanned systems from technocrats in the Pentagon, however, little attention has been paid to how they are viewed from the battlefield. What, for instance, do those on the ground think about the utility of drones relative to that of manned aircraft?
Our research suggests that operators on the ground see drones as riskier and less trustworthy than manned aircraft. These are the personnel who integrate new technologies into training, develop tactics for their use, and ultimately put these tactics into practice. Their perspective on the technologies—and the disconnect between their views and those of the experts—is therefore key to understanding the future of autonomous weapons on the battlefield.
SKIN IN THE GAME
To understand ground forces' perspective on drones, we conducted over 450 surveys and 150 interviews with two types of military personnel: Joint Terminal Attack Controllers and Joint Fires Observers. JTACs and JFOs are both embedded on the front lines, where they call in air strikes and coordinate air support from manned and unmanned aircraft. This means they are knowledgeable about the capabilities of both types of aircraft and highly invested in their ability to accurately and responsively drop weapons on the battlefield. In addition, they do not carry the potential biases we might see from pilots of manned or unmanned aircraft, since the future of their job is not tied to investments in either.
To conduct our study, we visited marine, air force, and army installations in the United States. We talked to JTAC and JFO leaders at the Pentagon and in think tanks and disseminated the survey link through closed Facebook groups and JTAC communities of interest. The survey presented people with a series of hypothetical scenarios and then asked them if they preferred to call in an air strike from a manned aircraft or an unmanned aircraft or if they had no preference.
The surveys showed that JTACs and JFOs strongly preferred manned over unmanned aircraft across all demographic categories, including age, branch, education, experience, and rank. This preference was strongest in hypothetical scenarios in which the enemy was nearby and there was a high risk of friendly fire: almost 90 percent of the respondents preferred manned aircraft in such circumstances. Their main concern was that drones, remotely controlled by pilots hundreds of miles from the battlefield, were unable to maintain situational awareness in combat environments and were therefore more likely to make mistakes that could risk friendly lives. For example, one JTAC wrote that “a manned aircraft would be less likely to lose sight of my position and make any mistakes that may result in fratricide.”
This preference for manned aircraft persisted even in scenarios in which controllers were told that the enemy had air defense systems that could pose a threat to pilots and crew. In total, 63 percent of our survey respondents preferred a manned aircraft in scenarios that featured a high risk to both aircrew and ground troops. That is partly because most drones cannot survive against modern air defense systems. They can’t evade radar; they don’t have the most sophisticated chaff, flare, or jamming systems; and they aren’t maneuverable enough to defeat a missile. As one respondent put it, “While the threat is understood to the pilot, it’s either a pilot operating in a dangerous situation in addition to the guys on the ground, or an unmanned asset that will get blown out of the sky due to a lack of capability, leaving the boots on the ground in a worse situation.”
In addition to the threat of losing air support, however, respondents believed that pilots who were at risk themselves would make better decisions than those controlling drones from the safety of the rear. As one respondent explained, “I’ve called in air before from the ground. If the [aircraft] are afraid, they won’t come. If they think they can do it, they will do everything they can to help.” Troops preferred manned aircraft because their pilots “feel the sense of urgency.” The pilots’ “on-site judgment,” as one respondent explained, “makes me feel safer than someone controlling a computer screen.”
WE HAVE THE TECHNOLOGY
Respondents’ strong preference for manned aircraft derives, in part, from an engineering problem with the UAVs. Today’s drones are simply not as good as manned aircraft at providing most kinds of air support. Drones carry smaller payloads than manned aircraft. They suffer from communications lags and reliability issues. They are slow and unwieldy, and they lack 360-degree situational awareness. As one of our respondents put it, “Manned aircraft are all-around better-performing aircraft. [They] are faster, more nimble, and can employ ordnance in a changing environment.”
Our survey results reflected these engineering limitations. We presented respondents with a second set of scenarios that did not explicitly include enemies shooting at ground troops or aircraft and instead focused solely on mission characteristics such as time sensitivity and overall probability of success. In these scenarios, between 70 and 80 percent of respondents still preferred manned aircraft—a slight but significant decline from the share that preferred manned aircraft in scenarios in which lives were in serious danger. In these lower-threat scenarios, respondents were also more likely to give technical rather than behavioral explanations for their answers. When lives were at stake, respondents were concerned about "human" elements such as judgment. But when that threat was absent, they were more interested in technical specifications, such as an aircraft's speed and maneuverability or the number of weapons it could carry.
Clearly, engineering concerns shaped these preferences to some degree, but respondents’ greater preference for manned aircraft in life-threatening scenarios suggests that the human element dominated their concerns. Perhaps most telling, when we asked JTACs in interviews if they would prefer support from ten remotely piloted A-10s or one manned A-10, they chose the latter. This shows that they were worried not about the capabilities of the aircraft itself but about something more fundamental regarding the human-machine relationship. Building better drones will not solve this problem.
THE WARM FUZZY
The respondents’ preference for piloted aircraft in more dangerous situations raises an important question about the military’s push toward unmanned technology. In the same DARPA discussion cited above, Tousley mentioned that one of the highest barriers to the progression of unmanned weapons systems was trust: operators, he explained, need to better understand whether a machine will be able to perform its mission. The implication is that with better engineering and more knowledge, operators will be more likely to trust the technology and use unmanned systems. We heard a similar theme when conducting meetings at the Pentagon. Staff officers in Washington suggested that if controllers were better trained and had more experience with unmanned systems, they would be more likely to support them.
Our findings, however, suggest this is not the case. In fact, more experienced JTACs and JFOs were slightly less likely to support unmanned systems. We also found no correlation between support for unmanned aircraft and combat experience in the later years of the U.S. wars in Iraq or Afghanistan, when the use of drones became more common. Our interviews with JTACs, moreover, made clear that they knew a lot about UAV capabilities. Their lack of trust in them was not due to a lack of knowledge.
Instead, the trust issue was a human issue. Not once did any of our respondents refer to a drone pilot as a human. Instead, drones were discussed in abstract terms that explicitly avoided any reference to a human controlling the machine. UAVs were “robots” or “machines” whose “operators,” as one respondent put it, were playing a video game “a world away.” This is despite the fact that most of them knew there was a human controlling the drone, and some even knew these pilots personally. Yet across the JTAC community, we heard a familiar narrative: drone pilots were coffee-drinking gamers whose distance from the battlefield severed their emotional connection to friendly ground troops. For instance, one of the JTACs explained that he preferred manned aircraft because their pilots “are in the fight, not just sipping a latte playing a video game.”
What the controllers lacked with the unmanned systems was a “warm fuzzy”—a term that came up over and over in our interviews and refers to the belief that a remotely operated machine can make the same gut decisions that a human would make. As one respondent explained, “Having real-time human eyes in the sky above when dropping ordnance or during a TIC [troops in contact] is crucial. It is as real for that pilot as it is for the guy on the ground, and that cannot be said for the individual flying a UAV from a safe building in the United States.” In short, they were looking for pilots with skin in the game. Without that, how could they trust the machine or the machine’s operator?
Despite the strong preference we saw among JTACs for manned aircraft, even frontline troops shared the sense that the rise of unmanned systems was inevitable. As one marine noted in frustration to a room of controllers discussing the future integration of UAVs into combat air support: "It doesn't matter if we think manned or unmanned is better, unmanned is the future and we better get on board."
Policymakers should reexamine their apparent commitment to an unmanned future. The belief that unmanned systems are inevitable—and that they can achieve U.S. national security objectives cheaply and easily—risks blinding leaders to some of these systems’ limitations. In particular, our interviews suggest that there are important, unresolved trust issues between humans and machines that make battlefield personnel wary of employing unmanned aircraft in situations in which their lives are at risk. The implications of this finding go beyond UAVs and speak to the future development of unmanned systems within other domains. In domains where there is less direct risk to life—such as underwater, space, and cyber—we might expect to see a more rapid acquisition of unmanned technologies. In domains where humans are in direct physical contact with the enemy, however, troops will be reluctant to delegate decisions to machines. Instead, they will want to work with humans they can trust.