Burn-In: What our future with tech could look like

Why a blog about a recently released novel? Because, as many readers know, fiction is often a compelling way to capture our concerns about the future—and the present. In the techno-thriller Burn-In: A Novel of the Real Robotic Revolution, authors P.W. Singer and August Cole examine the many roles that technology can assume in 21st-century national and global security and in future conflict.

This book, admittedly “a work of fiction,” is worthy of consideration by security analysts because “it also wrestles with real issues that will have to be faced in the coming years.” Hence the many superscripts in the body of the novel that match the pages and pages of notes at the end. As a blurb on the back cover says, Burn-In “takes tech seriously.”

The title refers to the process of running a reliability test on a device to detect any problems. In this case, the device being tested is a Tactical Autonomous Mobility System (TAMS), or robot. Charged with training this TAMS to navigate different national-security scenarios is FBI Special Agent Lara Keegan, a former Marine.

Set in and around Washington, D.C., in what feels like the not-too-distant future, the story expands to include other political, societal, and cultural events that deserve their own analysis. But it is Keegan’s training of the robot and the robot’s constant observation of Keegan that fascinate the reader, because TAMS’s learning depends directly on what Keegan teaches it and on what it “experiences.”

Initially skeptical of her charge, Keegan must remind herself, as the story unfolds, that TAMS is just a machine. But is a growing faith in this robot a good thing? Keegan feels real-world concern about “automation bias,” the human tendency to rely too heavily on automated systems. If she follows a recommendation from TAMS and the decision is wrong, is she responsible? Is TAMS?

TAMS does have great value as a source of information; for example, it can scan social networks for data. As one of the scientists involved in TAMS’s creation notes, “There’s no way any of us can keep up with everything in the feed, the cloud, as well as whatever surrounds you. In turn, you need that data operationalized.”

But TAMS also has weaknesses: it is incapable of understanding human emotions, for example. It also gets stuck in an obstacle course, showing that tasks that are simple for humans can be difficult for a robot.

In describing the robot’s development, the authors expose readers to ongoing debates about whether militaries and law-enforcement agencies should develop and use such technology. All the points about TAMS’s strengths and weaknesses have been made about autonomous systems by various countries at the United Nations Convention on Certain Conventional Weapons (CCW) since 2014.

Keegan also contemplates issues of trust and ethics that have been raised at the CCW. She reflects, “Machines don’t have ethics. They can be programmed to lie and not even know it. And I can lie to a machine and not break a sweat over it.” She recognizes that no matter how much “training” an autonomous system has in human intuition and understanding, it will never function or think as a human does.

The world of the novel becomes so alive for the reader that it creates a sense of unease. So many of the scenarios depicted in the book seem to be unfolding in real time. The authors and fans of the book continue to keep track of these “coincidences” on Twitter. Look for #BurnInBook.

With the focus on an FBI Special Agent and U.S. security and defence institutions, it is not surprising that Burn-In is popular with U.S. military and intelligence communities. However, civil-society researchers and advocates will also relate to some of the scenarios that depict future uses of tech by militaries and domestic-security authorities.

Burn-In is highly relevant and should be read as a cautionary tale about the pressing need for societies to pay close attention to the technology now being developed and used. The book offers a convincing picture of the impact that current technologies, including data collection, facial-recognition software, and cyber weapons, could have on society if they are not regulated and kept under control. As the story reveals, the price for not doing so is simply too high.