By Branka Marijan
Published in The Ploughshares Monitor Volume 41 Issue 1 Spring 2020
Disarmament and arms control have not featured prominently, if at all, in mandate letters to Canada’s foreign ministers in many years. But at the end of 2019, Canadian Foreign Minister François-Philippe Champagne was given a new mandate to “advance international efforts to ban the development and use of fully autonomous weapons systems.”
Before this, Canada had not been convinced of the need for new international regulations on autonomous weapons, arguing that existing international humanitarian law was sufficient to address the challenge. In other words, weapons that could make decisions on their own were already illegal and a ban was not needed. But countries holding this position have begun to waver as it becomes increasingly clear that advances in artificial intelligence (AI), which further remove human decision-makers from the actions of weapons systems, pose novel challenges.
One wonders, then, if political positions were still in flux when Champagne outlined Canada’s foreign-policy priorities at a February 21 event in Montreal and neglected to mention autonomous weapons or, indeed, any issues related to disarmament or arms control.
Such an omission does not necessarily indicate a lack of intent to fulfill the mandate. However, it does seem to suggest that Canada needs to dedicate more resources and effort to these particular concerns.
The next round of talks on autonomous weapons at the Convention on Certain Conventional Weapons (CCW) is scheduled for June 22-26 and August 10-14. Now is the time for Canada to get up to speed on the ban.
Advancing the ban
Currently, 30 countries, including Brazil and Austria, support a ban on autonomous weapons systems. Others are seriously exploring the issue.
At a seminar in Rio de Janeiro on February 20, Brazil reiterated its call for new regulation of these weapons. Because of the pandemic, Germany had to postpone a March consultation in Berlin. Austria and Japan are planning events for early 2021.
Progressive states will need to address the lack of momentum at the CCW, caused by the opposition of key states such as Russia and the arguments of others that a new legal document is premature. Most urgently, they should push for the swift adoption of a negotiating mandate, which sets out the instructions that govern the negotiators. Without such a mandate, no new legally binding instrument can be developed.
A new legal instrument would provide greater clarity on permissible types of weapons and uses. Under existing regulations, it is not clear who would be held accountable for any decisions made by a weapons system. This critical gap in accountability must be addressed.
Counterviews of allies
The vast majority of countries agree that decisions over human lives should remain firmly in human hands. However, some of Canada’s traditional allies, including the United Kingdom, France, and Australia, are more optimistic about autonomous weapons technologies and see less need to emphasize this point. Australia, in particular, supports more autonomy in weapons systems.
According to a paper by the Australian government issued ahead of the March 2019 CCW meetings, Australia does not support use of the phrase “human control.” Instead, it presents a model of a “System of Control” that covers “all aspects of a weapon system from design through to engagement.” According to Ray Acheson, the Director of the civil-society organization Reaching Critical Will, this phrase seems to imply that “if the weapon will operate within specific rules of engagement and targeting directives, then these ‘controls’ are sufficient.”
The need for human control
Ban supporters disagree, as they have during years of discussion at the CCW. They are adamant that the principle of meaningful human control over the selection and engagement of targets is essential to whatever legal instrument is developed.
In a recent article in Foreign Policy, Arthur Holland Michel reported an incident in which the U.S. Navy tested a network of AI systems. He noted that “the one human involved in this kill chain was a commanding officer on the chosen destroyer, whose only job was to give the order to fire.” Here the choice of target was made by machines. Ban supporters do not consider such a token level of human control meaningful.
How Canada should prepare to support the ban
If Canada is to fulfill its new foreign-policy mandate, it must develop a strong position in favour of a ban, which will need to include a clear explication of the appropriate level of human control over weapons systems. And the Department of National Defence and the Canadian Armed Forces will need to be fully supportive of that stand.
Strong, Secure, Engaged, Canada’s defence-policy statement, declares: “The Canadian Armed Forces is committed to maintaining appropriate human involvement in the use of military capabilities that can exert lethal force.” Clarity is needed on what constitutes appropriate human involvement. As well, cases in which human control would not be appropriate or necessary should be clearly defined. For this, Canada will need to engage legal specialists.
Canada, home to leading AI researchers and specialists, is well positioned to hold successful, insightful discussions on autonomous weapons. The Canadian government is already a leader in the ethical uses of AI for government services; standards focus on fairness, “explainability” of decisions, and eliminating bias.
So far, such standards and policies have not been developed for the military. Clearly establishing such standards is now a priority.
There is much that Canada can do to prepare for the CCW meetings this year and beyond. Because the CCW is frequently deadlocked, Canada and other ban supporters might need to go outside the CCW framework. Canadian leadership in such an endeavour would be widely welcomed.
Photo: Stop Killer Robots campaigners pose at the Broken Chair statue in Geneva to commemorate victims of landmines and cluster bombs. Credit: Clare Conboy