10 Things You Should Know About Killer Robots

March 27, 2019

Ploughshares

1. WHAT ARE KILLER ROBOTS?

Actually, pretty much what they sound like. These are autonomous weapons systems that could kill human beings without any human involvement in the critical functions of target selection and the employment of lethal force.

Lethal autonomous weapons systems could take many physical forms, and could also operate in large numbers and distributed architectures.

The emergence of killer robots would constitute an unprecedented development in human history, one that some have referred to as the third major revolution in warfare, after gunpowder and nuclear weapons.

2. AREN’T DRONES THE SAME THING?

Well, there are some similarities. With armed drones, however, there is still clear and meaningful human involvement.

To this day, crucial legal and ethical issues around the use of lethal force by armed drones remain essentially unresolved. In the case of killer robots, questions concerning compliance with international humanitarian law, as well as legal and ethical accountability, are dramatically more complex.

If a killer robot autonomously engages a target, who is ultimately responsible? The coder who worked on its algorithms? The military commander who deployed it? The developer of facial recognition software?

3. DO KILLER ROBOTS ACTUALLY EXIST?

They definitely could—very soon. Several national militaries have or will soon have precursor autonomous weapons. More than 130 military systems can now autonomously track targets.

The international community is at a critical juncture: it must act now to curb their emergence. But while the prospect of a world in which these systems become normalized is real, it is not inevitable.

4. WHAT CAN BE DONE TO PREVENT THE DEVELOPMENT OF KILLER ROBOTS?

A robust legal and regulatory framework must be put in place to limit such advances in military technologies. Civil society, progressive states, and tech experts from around the world have all called for a BAN—a legal prohibition under international law—on weapons that would function without meaningful human control.

A ban would bolster international law and establish norms for contemporary warfare. It would inform military doctrine and rules of engagement, set clear rules on predictability, and lead to common international norms around what is and is not an acceptable use of artificial intelligence in military systems.

Until a ban is negotiated, states should be encouraged to give serious consideration to the implications of the emergence and proliferation of killer robots and to declare their support for an international prohibition on the development of such weapons. Individual states can independently enact legislation consistent with this objective.

5. SHOULD ONLY STATES BE CONCERNED?

Advances in autonomy and artificial intelligence are not necessarily taking place in “killer robot” labs or military hubs—though some of this is surely happening. Lots of research on AI and autonomous technologies is going on in academic institutions and private companies. Think Google’s self-driving car. Without regulation, the results of such benign efforts could end up as enabling technologies for killer robots.

Groups and individuals in industry and academia can pledge not to direct their research and innovation to the development of autonomous weapons systems. They can also proactively put in place commercial and legal restrictions that limit the end use or end users of certain systems, as some companies have already done.

6. IS A DIPLOMATIC FORUM ADDRESSING THIS ISSUE?

The United Nations Convention on Certain Conventional Weapons (CCW) in Geneva is the primary venue for such multilateral discussions. Since 2014, the 125-member body, whose mandate is to consider weapons “Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects,” has engaged in discussions on autonomous weapons systems.

In 2016, a UN Group of Governmental Experts was established to examine emerging lethal autonomous weapons systems. The aim is to identify rules and principles that would apply to these weapons.

7. IS THERE SUPPORT FOR A BAN?

A growing number of countries have supported the call for a ban on autonomous weapons systems. Austria, Brazil, and Chile have called for a new CCW mandate to ensure the negotiation of a legally binding instrument that prohibits weapons that do not have meaningful human control.

The Campaign to Stop Killer Robots, a coalition of nongovernmental organizations from more than 50 countries, was established to advocate for a ban. The Campaign has the support of 4,500 AI experts, the United Nations, and the European Parliament, among others. According to a December 2018 Ipsos survey commissioned by the Campaign and conducted in 26 countries, more than three in five respondents oppose the development of weapons systems that would select and attack targets without human intervention.

8. WHY CAMPAIGN WHEN THERE IS ALREADY THE CCW?

Yes, the CCW exists—but there are questions about its effectiveness. After more than five years of discussions on autonomous weapons systems, it is still caught up in matters of procedure and definitions.

Research into autonomy and artificial intelligence (AI) is advancing rapidly, leaving policymakers in the dust. In past diplomatic processes, civil society involvement in arms control and disarmament has been instrumental in energizing states, providing expertise, and raising much-needed awareness.

9. WHAT ABOUT THE BENEFITS OF ARTIFICIAL INTELLIGENCE?

We can keep those! No one is denying the many social and economic benefits of advances in AI—both realized and potential. But the dual-use nature of these systems does not preclude the development of international regulations to control or prohibit harmful uses.

In other areas of arms control, the international community has been able to develop regulatory frameworks that clearly differentiate between benign and harmful uses of certain technologies. This needs to happen with AI and killer robots.

10. ISN’T THE CALL FOR A BAN PREMATURE?

We can grasp the concept of fully autonomous weapons systems today, whatever the pace of technological advances or the ups and downs of multilateral policymaking. Now is the time to put in place an unambiguous, multilateral legal prohibition on their development, possession, and use.

Arms control efforts face immense challenges when they address weapons that have already been used in conflict or whose use has become normalized. With killer robots, the international community has a rare opportunity to act proactively, establishing legal restrictions on their development and use before it is too late.