I think I finally REALLY get it. I’ve been reading analysis of autonomous weapons and AI-powered tech by Ploughshares Senior Researcher Branka Marijan for years, but I’ve never completely understood why so many individuals, organizations, and even countries oppose weapons that can target and kill humans without any human involvement in the decision-making process.
Responsible uses of artificial intelligence (AI) have featured prominently in recent national discussions and multilateral forums. According to the Organisation for Economic Co-operation and Development (OECD), 60 countries have multiple AI initiatives and more than 30 have national AI strategies that address responsible use. However, the use of AI in national defence has generally not yet been addressed.
To no one’s surprise, United Nations discussions on the regulation of autonomous weapons have stalled. Last year, the global pandemic caused delays, with only one week of discussions—partly in Geneva, Switzerland and partly virtual—taking place from September 21-25. November’s annual meeting of the Convention on Certain Conventional Weapons (CCW), at which the 2021 schedule for discussions on autonomous weapons would have been set, was cancelled.
During several years of discussions on autonomous weapons at the United Nations Convention on Certain Conventional Weapons (CCW), a number of arguments against regulating these weapons have surfaced. Some seem intentionally misleading, while others are out of touch with the rapid development of emerging technologies and with current trends in academic research and analysis.
Disarmament and arms control have not featured prominently, if at all, in mandate letters to Canada’s foreign ministers in many years. But at the end of 2019, Canadian Foreign Minister François-Philippe Champagne was given a new mandate to “advance international efforts to ban the development and use of fully autonomous weapons systems.”
It sounds like the stuff of science fiction. Enabled by significant advances in artificial intelligence (AI) and robotics, fully autonomous weapons systems with the ability to select targets and employ lethal force with no human involvement—also known as killer robots—may soon emerge.
The question now is what happens next and how the mandate will be implemented when UN discussions on this issue resume in June. While fully autonomous weapons systems do not yet exist, experts agree that they soon will.
In a recent New York Times opinion piece, Glenn S. Gerstell, the general counsel for the United States National Security Agency, explains why the United States cannot afford to lose …
If you are in Vancouver, please join us for a conversation about the issue of autonomous weapons and the work of the Campaign to Stop Killer Robots. We will provide the information and refreshments. It’s FREE!
1. What are Killer Robots? Actually, pretty much what they sound like. These are autonomous weapons systems that could kill human beings without any human involvement in the critical functions …