For some time, Canada’s silence has been a standard feature of international discussions on autonomous weapons. True to form, Canada remained quiet at the April 26-27 informal, virtual sessions on lethal autonomous weapons systems hosted by Brazil, the current chair of the United Nations Convention on Certain Conventional Weapons (CCW).
Military research and development in recent years has focused on artificial intelligence (AI) tools that gather and analyze data quickly. Combined with improved sensors, they make possible faster and seemingly more accurate targeting of enemy positions. Now this R&D is being operationalized. Last September, according to Secretary of the Air Force Frank Kendall, the United States Air Force, for the first time, used AI to help identify a target or targets in “a live operational kill chain.”
Two titans from the Cold War era seem set to go another round, this time over the prospect of Ukraine’s membership in the North Atlantic Treaty Organization (NATO), which the United States calls a sovereign Ukrainian decision and Russia opposes vehemently. Whatever the outcome of the current standoff, another confrontation between the United States and Russia that merits closer attention is brewing — one that may fundamentally reshape the US-Russia security relationship in the not-so-distant future.
At first glance, it might appear that seven years of international discussions on autonomous weapons have had few concrete results. At the time of writing, the third session of the 2021 United Nations (UN) Group of Governmental Experts on emerging technologies in the area of lethal autonomous weapons systems (LAWS) was scheduled to take place in early December in Geneva, Switzerland. The most that is expected from these meetings is a proposal to continue talking.
Canada is in dire need of a solid diplomatic strategy that responds to the growing nexus between emerging technologies and national security. Newly appointed foreign minister Mélanie Joly would do well to prioritize the development of robust and forward-looking policies to tackle tech-related security concerns, as is increasingly being done in the foreign ministries of a number of countries—including key Canadian allies as well as would-be adversaries.
Over the past few months, experts have been surprised by the media attention given to the Turkish-made Kargu-2 kamikaze drone or loitering munition. Everyone, it seems, wants to know if the use of the Kargu-2 in Libya in March 2020 was the first instance of an autonomous weapon being used in conflict.
Responsible uses of artificial intelligence (AI) have featured prominently in recent national discussions and multilateral forums. According to the Organisation for Economic Co-operation and Development (OECD), 60 countries have multiple initiatives and more than 30 have national AI strategies that consider responsible use. However, the use of AI for national defence has generally not yet been addressed.
To no one’s surprise, United Nations discussions on the regulation of autonomous weapons have stalled. Last year, the global pandemic caused delays, with only one week of discussions—partly in Geneva, Switzerland and partly virtual—taking place from September 21-25. November’s annual meeting of the Convention on Certain Conventional Weapons (CCW), at which the 2021 schedule for discussions on autonomous weapons would have been set, was cancelled.
The United States is at the forefront of advancements in autonomous swarming technologies. A U.S. government-appointed panel has even said that the country has a “moral imperative” to develop weapons …
According to a recent report by Canada’s privacy commissioner Daniel Therrien and three provincial counterparts, Clearview AI has broken Canada’s privacy laws. Therrien told reporters that the company’s technology and …