Autonomous Weapons Would Take Warfare To A New Domain, Without Humans

Apr 23, 2018
Originally published on April 23, 2018 7:28 pm

Killer robots have been a staple of TV and movies for decades, from "Westworld" to "The Terminator" series. But in the real world, killer robots are officially known as "autonomous weapons."

At the Pentagon, Paul Scharre helped create the U.S. policy for such weapons. In his new book, "Army of None: Autonomous Weapons and the Future of War," Scharre discusses the state of these weapons today.

"Killer robots" might be a bit sensational, he says, but what he's talking about is a weapon that could "go out on its own and make its own decisions about who to kill on the battlefield."

At least 30 countries have autonomous weapons that are supervised by humans for defensive purposes, Scharre says.

"[These are] things that would target incoming missiles and shoot them down entirely on their own," he says. "Humans are sitting there at the console that could turn it off if they need to. But in a simple way, those are autonomous weapons."

Scharre says that while today's weapons are not like those seen in the movies, the technology is advancing whether people like it or not.

"Things like more advanced hobby drones, the same technology that will go into self-driving cars, all of those sensors and intelligence will make autonomous weapons also possible," he says.

In his book, Scharre looks at the question: "How hard would it be for someone to build a simple, autonomous weapon in their garage?"

And while that's a scary scenario, he says it's already happening on some level, as students today are learning programming skills with free and readily available online resources.

"These tools are available for free download. You can download them online," he says. "[It] took me about three minutes online to find all of the free tools you would need to download this technology and make it happen."

And while high school students aren't building autonomous weapons, the ability to do so is within reach. Because of that, Scharre says, the debate isn't so much about whether this type of technology should be created as about what should be done with it.

"What do we do with this? Do we build weaponized versions of them? Do you build them en masse? Do militaries invest in this?" are all questions being asked as this technology would drastically change warfare.

"[It would create] a domain of warfare where humans have less control over what happens on the battlefield — where humans are no longer deciding who lives and who dies, and machines are making those decisions," Scharre says.

Debates like this are happening in countries all around the world, including those that have repeatedly violated international norms.

In Russia, the military is working to create a fleet of armed ground robots.

"They're building large, ground combat vehicles that have anti-tank missiles on them," Scharre says. "Russian generals have talked about a vision in the future of fully robotized units that are independently conducting operations, so other countries are leaning hard into this technology."

Scharre says one of the fears as this technology advances is that "flash wars" could occur. Much like a "flash crash" in the stock market, a "flash war" would unfold at such a fast pace that humans could not keep up, let alone intervene.

"The worry is that you get an equivalent — a flash war, where algorithms interact in some way and the robots start shooting each other and running amuck, and then humans are scrambling to put a lid back on it," Scharre says.

But though some scenarios are terrifying, other people argue that autonomous weapons could save lives by avoiding the mistakes that come with human error.

"Just like self-driving cars could someday make the roads much safer, some people have argued, 'Well, maybe autonomous weapons could be more precise and more humane. By avoiding civilian casualties in war and only killing the enemy,' " Scharre says.

Drawing on his own military experience as part of an Army Ranger sniper team, Scharre describes a situation in which an autonomous weapon would have killed a girl the Taliban was using as a scout, but whom the soldiers did not target.

He says it's situations like that which highlight the difference between what is legal under the laws of war and what is morally right, a distinction autonomous weapons might not be able to make.

"That is one of the concerns that people raise about autonomous weapons is a lack of an ability to feel empathy and to engage in mercy in war," Scharre says. "And that if we built these weapons, they would take away a powerful restraint in warfare that humans have."

There's still a lot to consider and discuss when it comes to autonomous weapons and the fast-advancing technology behind them, Scharre says. But as for whether humans are doomed, he says there is no clear answer.

"We do have the opportunity to shape how we use technology. We're not at the mercy of it," Schare says. "The problem at the end of the day isn't the technology. It's getting humans to cooperate together on how we use the technology and make sure that we're using it for good and not for harm."

Noah Caldwell and Emily Kopp produced and edited the audio for this story. Wynne Davis adapted it for Web.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

ARI SHAPIRO, HOST:

Killer robots have been a staple of TV and movies for decades from "Westworld" to "The Terminator."

(SOUNDBITE OF FILM, "THE TERMINATOR")

MICHAEL BIEHN: (As Kyle Reese) It doesn't feel pity or remorse or fear. And it absolutely will not stop ever until you are dead.

SHAPIRO: In the real world, killer robots are officially known as autonomous weapons. At the Pentagon, Paul Scharre helped create the U.S. policy for autonomous weapons. And now he has a new book out called "Army Of None: Autonomous Weapons And The Future Of War." And Paul Scharre is our guest on this week's All Tech Considered.

(SOUNDBITE OF ULRICH SCHNAUSS' "NOTHING HAPPENS IN JUNE")

SHAPIRO: Welcome to the program.

PAUL SCHARRE: Thanks. Thanks for having me.

SHAPIRO: I defined an autonomous weapon as a killer robot. Can you give us a better definition?

SCHARRE: Yeah. I probably wouldn't use language quite that sensational...

SHAPIRO: OK.

SCHARRE: ...But it captures - you know, it captures the essence of the idea. We're talking about a weapon that could go out on its own and make its own decisions about who to kill on the battlefield.

SHAPIRO: Do they exist today?

SCHARRE: You know, in some crude forms a little bit. There are at least 30 countries that have autonomous weapons that are supervised by humans for defensive purposes, things that would target incoming missiles and shoot them down entirely on their own. Now, humans are sitting there at the console. They could turn it off if they need to. But in a simple way, those are autonomous weapons.

SHAPIRO: So if people decided they wanted to race towards autonomous weapons as fast as they could, they wouldn't have far to run.

SCHARRE: Well, the technology is taking them there really whether they like it or not. Things like more advanced hobby drones, the same technology that will go into self-driving cars, all of those sensors and intelligence will make autonomous weapons also possible.

SHAPIRO: So this is not a debate over whether we should create these technologies. The technologies are already created.

SCHARRE: Right. The debate really is, what do we do with this? Do we build these? Do we build weaponized versions of them? Do you build them en masse? Do militaries invest in this and take warfare to a whole new domain, a domain of warfare where humans have less control over what happens on the battlefield?

SHAPIRO: And these debates are not only happening in the United States and Western democracies. These debates are happening in autocratic countries, in highly isolated countries, in countries that have violated international norms repeatedly.

SCHARRE: Right. I mean, Russia is building a fleet of armed ground robots for war on the plains of Europe. And Russian generals have talked about a vision in the future of fully roboticized units that are independently conducting operations. So other countries are leaning hard into this technology.

SHAPIRO: So if people listening are starting to get worried, let me just assure them it gets worse.

SCHARRE: (Laughter).

SHAPIRO: You describe a lot of terrifying scenarios. One of them is what you call a flash war, which is sort of like the flash crash that happened in the stock market partially as a result of automated trading. What is a flash war?

SCHARRE: Well, just as we've seen an arms race in speed in stock trading - where trading has now moved to timescales of milliseconds at which humans cannot possibly be engaged and compete - the fear is that we'd see something similar in warfare, where countries automate decisions on the battlefield, taking humans out of the loop because there's an advantage in speed.

But just like we've seen accidents in stock trading where algorithms are interacting in surprising ways and you get things like flash crashes, the worry is that you get an equivalent - a flash war where algorithms interact in some way and the robots start shooting each other and running amok, and then humans are scrambling to put a lid back on it.

SHAPIRO: You also raise the possibility that autonomous weapons could save lives because machines wouldn't make the same mistakes that people make. Explain that.

SCHARRE: Well, that's certainly one of the arguments against a ban or people even arguing in favor of building these weapons. And I'd compare them to looking at cars. Just like self-driving cars could someday make the roads much safer, some people have argued, well, maybe autonomous weapons could be more precise and more humane by avoiding civilian casualties in war and only killing the enemy.

SHAPIRO: You also served in the U.S. military. You have fought in wars. And you describe instances where you could legally have used lethal force and killed a person, but you understood that that would not have been the right choice in that scenario. Tell us about one of those instances. And I wonder what an autonomous weapon would have done had it been in your shoes.

SCHARRE: There was an incident early in the wars in Afghanistan where we were up on a mountaintop in eastern Afghanistan near the Pakistan border. I was part of a ranger sniper team. And a little girl came along that was scouting out our position. And we watched the girl. She watched us. After a while, she left. And soon after, some Taliban fighters came, and we took care of them. And later we talked about, you know, what would we do if we were in a similar situation? Something that never came up was shooting this girl. No one discussed it. No one - it would have been wrong.

SHAPIRO: Even though the Taliban was using her as a scout and it would have been legal.

SCHARRE: Well, and here's the thing. The laws of war do not set an age for combatants. It's based on your actions. And if you're scouting for the enemy, you're participating in hostilities. So an autonomous weapon that was designed to obey the laws of war would have shot this little girl. So there is an important difference in what is legal and what is right. And that is one of the concerns that people raise about autonomous weapons, is a lack of ability to feel empathy and to engage in mercy in war. And that if we build these weapons, they would take away a powerful restraint in warfare that humans have.

SHAPIRO: So how do we make sure this doesn't happen?

SCHARRE: Well, there are a number of people who've called for an international treaty that would ban autonomous weapons. There have been conversations underway at the United Nations for five years now. But progress is moving very slowly diplomatically. Meanwhile, the technology keeps racing forward.

SHAPIRO: And we've seen Syria violate international treaties. We've seen North Korea violate international treaties. Even if there were an international treaty like this, what guarantee would there be that some country wouldn't see that as an opportunity to get ahead of the pack?

SCHARRE: Well, that is exactly one of the objections against a treaty. These treaties only really constrain countries who care about the laws of war in the first place. And so a treaty that took away powerful weapons from the most law-abiding nations and then only gave them effectively to rogue states would hardly be in anyone's interests.

SHAPIRO: So I ask this only half in jest - are we doomed?

SCHARRE: I mean, I think that's one of the things that the book really wrestles with - is this inevitable? Do we control our technology, or does our technology control us?

SHAPIRO: Well, that does kind of dodge the question. Are we doomed?

(LAUGHTER)

SCHARRE: You know, one of the things I walk through at the end of the book is, what are some options going forward? I think there are ways to think about narrower regulations that might be more feasible to avert some of the most harmful consequences - maybe a more narrow ban on weapons that target people. And there have been some discussions underway internationally in trying to frame - reframe the issue and think about, what is the role of humans in warfare?

So if we had all the technology in the world, what role would we want people to play in war and why? I think that's a valuable conversation to have. And, you know, we do have the opportunity to shape how we use technology. We're not at the mercy of it. The problem at the end of the day isn't the technology. It's getting humans to cooperate together on how we use the technology and make sure that we're using it for good and not for harm.

SHAPIRO: Paul Scharre's new book is called "Army Of None: Autonomous Weapons And The Future Of War." Thank you for joining us.

SCHARRE: Thank you. Thanks for having me. Transcript provided by NPR, Copyright NPR.