Umit Bektas / Reuters Afghan residents look at a robot that is searching for IEDs (improvised explosive devices) during a road clearance patrol by the U.S. Army in Logar province, eastern Afghanistan, November 23, 2011.

Battle Bots

How the World Should Prepare Itself for Robotic Warfare

People living today find their lives bookended by two eras of weaponry: nuclear arms and the emerging threat of lethal autonomous weapons. On the one hand, nuclear Armageddon remains a real risk as states fail to live up to their legal obligations to disarm and dispose of their nuclear stockpiles. Lethal autonomous robotic weaponry, on the other hand, offers a glimpse of war’s future, and few if any rules govern the use of drones, a precursor to future autonomous weapons. Globally agreed norms and rules must be created and strengthened across all areas of weapons proliferation so that future generations are safeguarded from large-scale violence. To set realistic and reliable international norms for new and dangerous weapons, states must establish the conditions necessary for global cooperation. If world leaders want a less violent world, they must set the guidelines and improve the current international order.

Two recent UN meetings dealt with the pressing concerns of nuclear arsenals and future weapons: the 2015 Review Conference of the Parties to the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) that ended in May, and the Meeting of Experts on Lethal Autonomous Weapons Systems in Geneva this past April. Each event focused on a distinct threat to global security: the former on the risk of nuclear war and the latter on the emerging risk of lethal autonomous military robotic technology.

Despite the importance of these issues, both meetings ended inconclusively: 70 years after nuclear bombs were dropped on Hiroshima and Nagasaki, nuclear proliferation still threatens global security. The 1968 Nuclear Non-Proliferation Treaty turns 47 this year, yet states continue to rely on its outdated legal architecture to govern this aging technology, with no real new disarmament in sight. At the same time, the United States and other nations are pursuing lethal robotic technologies that can provide autonomous support in warfare, creations that might appear on the battlefield in the not-so-distant future given the capabilities of current prototypes. Given the proliferation of drones and their importance within U.S. foreign policy, it appears that evolving military doctrines see autonomous systems as their best option for defense, allowing Washington to avoid casualties and the need to put troops on the ground, regardless of unintended targets and public perceptions of drone technology. In other words, the loss of effective human control in combat looms as a real possibility.


A Royal Marine poses for photographers with the Unmanned Vehicle Robot, Testudo, at the launch of the Defence Technology Plan in London February 26, 2009.

Maintaining nuclear weapons while developing lethal military robots is a costly and dangerous strategy for keeping states secure. The potential proliferation of these technologies to rogue states and actors will only exacerbate conflict, fuel violence and crime, and intensify the risk of terrorist attacks. To contain these threats, the international community should look to its existing norms for guidance. The powerful global norms that forbade the use of chemical weapons, for instance, represent a set of principles that could be emulated as new technologies emerge. The 2014 Arms Trade Treaty, likewise, regulates the transfer of conventional arms based on long-standing human rights practices and international law. The treaty set high international norms: it prohibited certain arms transfers, created state responsibilities to uphold embargoes, and reinforced human rights law. This model works well for existing weapons, but for future lethal robots and other autonomous warfare technology, preventive prohibition could keep costly, dangerous technologies from spreading in the first place. The Campaign to Stop Killer Robots, for example, calls for preemptive prohibitions instead of retroactive bans, saving lives and avoiding protracted international debate down the road.


A U.S. Army pyrotechnic specialist sends an EOD (Explosive Ordnance Disposal) robot toward a suspected IED (Improvised Explosive Device) in Mainjui village in the Arghandab valley, some 10 km (6 miles) northwest of Kandahar, in southern Afghanistan, May 6, 2010.

There are noticeable gaps between the development of lethal warfare technologies and the legal regimes that regulate them. International humanitarian law is forced to play catch-up with new weapons technologies, waiting on a handful of countries to test new weapons before deeming them permissible under international law. Even with this review process, international humanitarian law has yet to regulate drones, a technology that is now several years old, which gives the international community little confidence that it will be responsive to lethal autonomous weapons in the future.

The secrecy surrounding emerging weaponry makes the establishment of international norms particularly challenging. As political scientist Martha Finnemore writes in her book National Interests in International Society, “It is precisely the control over the use of arms that states guard most jealously.” Security is assumed to be a different beast from other international concerns, since it requires secrecy and autonomy in order to succeed. Meanwhile, conventional wisdom holds that weapons development can occur only within ungoverned, or under-governed, areas of national security.


These assumptions, however, are not always true: states can cooperate on security and defense, and have done so in United Nations operations and elsewhere. States have changed their military doctrines by no longer deploying certain weapons systems, and have prohibited the use of certain arms in response to the establishment of international norms. Landmines, cluster munitions, blinding lasers, and chemical weapons were reduced or prohibited as international prohibitions changed, and future weapons could undergo the same treatment. Cooperation has occurred in many instances, motivated not only by national security interests but also by powerful humanitarian and legitimacy concerns. And when states do cooperate, they improve world security. When cooperation occurs within the high-stakes politics of the security domain, it often comes with great difficulty and reluctance on the part of major powers. Yet even reluctant agreements among world powers set important precedents for smaller nations, establishing a more ordered and secure world. Even if unruly actors within ungoverned spaces do not abide by international norms, the world will benefit from official condemnation of their tactics, which in turn will curb proliferation for all but a few agents of war.

It would be easy to assume that states relinquish certain weapons only once they no longer deem them useful, or that nations agree to disarm only those weapons systems that they do not need or do not have, or whose loss carries low stakes. These explanations, however, are not sufficient. Cooperation has occurred in many instances through a combination of national security concerns and humanitarian factors. The emergence of new weapons technologies will give rise to different cooperation processes, some with more traditional relationships and others with new forms of participation that involve broader input from civil society. New motivations for states to find comity on arms regulation only bolster the resolve of the international community, reinforcing the idea that the United Nations can establish global norms that work.

Nuclear weapons and future lethal autonomous technologies will imperil humanity if governed poorly. They will doom civilization if they’re not governed at all. It therefore becomes essential to examine how national security doctrines of high politics of the security domain can be made more transparent, commonly articulated through international norms, and monitored on the global stage under international law.
