Ukraine’s battle-tested tech

June 12, 2024

By Branka Marijan

Published in The Ploughshares Monitor Summer 2024

The Russian invasion of Ukraine has played out as a game of technological cat-and-mouse, as Ukraine and Russia develop systems and countermeasures that in turn lead to further innovation. Ukraine has become a testing ground for military technology, particularly artificial intelligence (AI). While both countries are engaged in this game, the activities of Ukraine are better documented.

In attempts to defend and regain its territory, Ukraine has experimented with everything from cost-effective drones to advanced applications of machine learning and computer vision. Its rapid escalation of warfare technology has drawn the keen interest both of allied countries motivated by a desire to support Ukraine and of international corporations eager to seize a lucrative opportunity to market their technologies as ‘battle-tested’.

Ukraine, too, sees possible economic benefits. As reported in “How tech giants turned Ukraine into an AI war lab” in TIME magazine, Mykhailo Fedorov, Ukraine’s Minister of Digital Transformation, has stated that “our big mission is to make Ukraine the world’s tech R&D lab.”

The allure of battle-tested technology cannot be overstated. In the defence sector, proven effectiveness in actual combat conditions significantly enhances a technology’s marketability. This is especially true for AI systems that may have been developed for civilian applications or military software that has not been tested in an actual war. However, the deployment of these advanced technologies on battlefields in Ukraine also raises profound questions about the wider uses of AI in warfare and the role some private companies are playing in shaping global norms on the use of emerging technologies.

Field experiments

Ukraine is actively experimenting with new technologies such as drones and is keen to expand its domestic defence sector. However, it cannot afford to support this sector on its own. For example, according to The Kyiv Independent, only 58 of an estimated 200 domestic drone companies have contracts with the Ukrainian government.

Instead, Ukraine has turned to allies, including Canada, for financial support. So far, Denmark has contributed $28.5 million and Canada $2.1 million to support Ukraine’s drone manufacturers.

Some Ukrainian drone companies are also considering a move outside Ukraine. Such movement could expand the global impact of Ukraine’s military innovation.

Know-how and practical battle experience are significant advantages when deploying new technologies. In “Techcraft on display in Ukraine,” published on the War on the Rocks website, the authors argue that a tech-savvy local population offers “techcraft,” or “the field-expedient use of technology in war.” This capacity has played a pivotal role in deploying and testing new technologies. The article describes, for example, how Ukrainian soldiers and volunteers have adapted readily available off-the-shelf quadcopters or even produced components on 3D printers.

Palantir’s role

The real-time battlefield testing of new technologies is being carried out with the support of foreign tech companies. Among the most notable is Palantir, the American-based software company that is a major supplier to the U.S. military and allied countries. Palantir has provided the Ukrainian military with software to track Russian troops and, according to The Washington Post, has a team of engineers in Ukraine that is constantly experimenting with new tools.

By all accounts, Palantir’s software has given Ukraine a distinct advantage. As TIME reports, Palantir enjoys a special status with major Western militaries, which allows its software to integrate data from commercial satellites with classified data from allied states.

But Palantir is also widely criticized by human rights organizations such as Amnesty International for creating an intricate web of surveillance tools that governments are using to track asylum seekers and even to arrest individuals.

With so much fanfare generated by eager customers and avaricious manufacturers, it is hard to gauge with any accuracy the effectiveness and impact of some of this new tech. Examining the use of Palantir tools in predictive policing by the Los Angeles Police Department, sociologist Sarah Brayne noted in an article in The Intercept that police officers found a gap between promise and practice. Nevertheless, Palantir went ahead with an expansion into healthcare.

In Ukraine, Palantir could also operationalize CEO Alex Karp’s vision of building AI weapons for the United States and its allies. As Karp admitted in the TIME article, “There are things that we can do on the battlefield that we could not do in a domestic context.” Palantir and other players view the war in Ukraine as an opportunity to display the utility of their tools to prospective Western customers, particularly against a larger state like Russia. Western European states are clients that Palantir and other U.S. companies are particularly keen to acquire. Supporting Ukraine has also done wonders for their sometimes-tarnished images.

Clearview AI has perhaps benefitted the most. Ukrainian officials have found its facial recognition technology useful for identifying war dead and Russian soldiers. Before the current conflict in Ukraine, Clearview AI had come under scrutiny in several countries. Then-Canadian privacy commissioner Daniel Therrien found that the company had broken Canada’s privacy laws, noting that, by scraping images from social media to create a database, it had essentially carried out illegal mass surveillance. But in Ukraine, Clearview’s CEO Hoan Ton-That found a more permissive and welcoming environment and a chance to recast the company’s image and demonstrate its usefulness to security and defence agencies.

Need for regulation

It is understandable that a Ukraine at war welcomes support from tech companies. Looking to the future, it also recognizes the economic and strategic value of becoming a key player in military technology. Still, precedents being set now could have far-reaching consequences. Ukrainian military commander Yaroslav Honchar noted in the Nature article “Lethal AI weapons are here: How can we control them?” that Ukraine “already conducts fully robotic operations, without human intervention.” Such uses are outpacing international efforts to require human control over weapon systems.

The fog of war has blurred our view of the actual effectiveness of new AI tools in combat. Companies are keen to claim that their tools are exceeding expectations, but the reality might be different. Ukrainian commanders could be motivated by the need to rally their troops, offer hope to the population and gain more support from allies.

The applicability of tools in different contexts is also being questioned. A state fighting an irregular force in densely populated urban areas might find that some tools that are useful in Ukraine produce significant civilian casualties. Even in Ukraine, the deployment of new AI-enabled weapon systems could have unpredictable results. Outsiders don’t yet know how such risk is being addressed or whether Ukraine might become desperate enough to deploy tools that take the lives of some of its own citizens. Or whether foreign companies are being held accountable for developing various tools or collecting data on Ukrainian citizens.

There is also no guarantee that individuals and companies now supporting Ukraine will not at some future date provide technology to states that oppress their populations. Or that the tools will not be diverted to nonstate armed groups. States embroiled in conflicts with minority communities within their borders could see value in some of the systems currently being tested in Ukraine.  

Before these technologies become uncontrollable, the international community must seize the opportunity to engage in a serious and constructive dialogue about the future of warfare.

The experiences of Ukraine offer invaluable lessons, not only about the potential of military technology but about the ethical and legal challenges it presents. All the world’s nations need to learn these lessons.
