ARMY OF
NONE
Autonomous Weapons and
the Future of War
PAUL SCHARRE
For Davey, William, and Ella,
that the world might be a better place.
And for Heather.
Thanks for everything.
THE MAN WHO SAVED THE WORLD
On the night of September 26, 1983, the world almost ended.
It was the height of the Cold War, and each side bristled with nuclear weapons. Earlier that spring, President Reagan had announced the Strategic Defense Initiative, nicknamed Star Wars, a planned missile defense shield that threatened to upend the Cold War's delicate balance. Just three weeks earlier, on September 1, the Soviet military had shot down a commercial airliner flying from Alaska to Seoul that had strayed into Soviet air space. Two hundred and sixty-nine people had been killed, including an American congressman. Fearing retaliation, the Soviet Union was on alert.
The Soviet Union deployed a satellite early warning system called Oko to watch for U.S. missile launches. Just after midnight on September 26, the system issued a grave report: the United States had launched a nuclear missile at the Soviet Union.
Lieutenant Colonel Stanislav Petrov was on duty that night in bunker Serpukhov-15 outside Moscow, and it was his responsibility to report the missile launch up the chain of command to his superiors. In the bunker, sirens blared and a giant red backlit screen flashed "launch," warning him of the detected missile, but still Petrov was uncertain. Oko was new, and he worried that the launch might be an error, a bug in the system. He waited.
Another launch. Two missiles were inbound. Then another. And another. And another: five altogether. The screen flashing "launch" switched to "missile strike." The system reported the highest confidence level. There was no ambiguity: a nuclear strike was on its way. Soviet military command would have only minutes to decide what to do before the missiles would explode over Moscow.
Petrov had a funny feeling. Why would the United States launch only five missiles? It didn't make sense. A real surprise attack would be massive, an overwhelming strike to wipe out Soviet missiles on the ground. Petrov wasn't convinced the attack was real. But he wasn't certain it was a false alarm, either.
With one eye on the computer readouts, Petrov called the ground-based radar operators for confirmation. If the missiles were real, they would show up on Soviet ground-based radars as they arced over the horizon. Puzzlingly, the ground radars detected nothing.
Petrov put the odds of the strike being real at 50/50, no easier to predict than a coin flip. He needed more information. He needed more time. All he had to do was pick up the phone, but the possible consequences were enormous. If he told Soviet command to fire nuclear missiles, millions would die. It could be the start of World War III.
Petrov went with his gut and called his superiors to inform them the system was malfunctioning. He was right: there was no attack. Sunlight reflecting off cloud tops had triggered a false alarm in Soviet satellites. The system was wrong. Humanity was saved from potential Armageddon by a human in the loop.
What would a machine have done in Petrov's place? The answer is clear: the machine would have done whatever it was programmed to do, without ever understanding the consequences of its actions.
THE SNIPER'S CHOICE
In the spring of 2004, two decades later, in a different country, in a different war, I stared down the scope of my sniper rifle atop a mountain in Afghanistan. My sniper team had been sent to the Afghanistan-Pakistan border to scout infiltration routes where Taliban fighters were suspected of crossing back into Afghanistan. We hiked up the mountain all night, our 120-pound packs weighing heavily on the jagged and broken terrain. As the sky in the east began to lighten, we tucked ourselves in behind a rock outcropping, the best cover we could find. We hoped our position would conceal us at daybreak.
It didn't. A farmer spied our heads bobbing above the shallow rock outcropping as the village beneath us woke to start their day. We'd been spotted.
Of course, that didn't change the mission. We kept watch, tallying the movement we could see up and down the road in the valley below. And we waited.
It wasn't long before we had company.
A young girl of maybe five or six headed out of the village and up our way, two goats in trail. Ostensibly she was just herding goats, but she walked a long slow loop around us, frequently glancing in our direction. It wasn't a very convincing ruse. She was spotting for Taliban fighters. We later realized that the chirping sound we'd heard as she circled us, which we took to be her whistling to her goats, was the chirp of a radio she was carrying. She slowly circled us, all the while reporting on our position. We watched her. She watched us.
She left, and the Taliban fighters came soon after.
We got the drop on them: we spotted them moving up a draw in the mountainside that they thought hid them from our position. The crackle of gunfire from the ensuing firefight brought the entire village out of their homes. It echoed across the valley floor and back, alerting everyone within a dozen miles to our presence. The Taliban who'd tried to sneak up on us had either run or were dead, but they would return in larger numbers. The crowd of villagers swelled below our position, and they didn't look friendly. If they decided to mob us, we wouldn't have been able to hold them all off.
"Scharre," my squad leader said. "Call for exfil."
I hopped on the radio. "This is Mike-One-Two-Romeo," I alerted our quick reaction force, "the village is massing on our position. We're going to need an exfil." Today's mission was over. We would regroup and move to a new, better position under cover of darkness that night.
Back in the shelter of the safe house, we discussed what we would do differently if faced with that situation again. Here's the thing: the laws of war don't set an age for combatants. Behavior determines whether or not a person is a combatant. If a person is participating in hostilities, as the young girl was doing by spotting for the enemy, then they are a lawful target for engagement. Killing a civilian who had stumbled across our position would have been a war crime, but it would have been legal to kill the girl.
Of course, it would have been wrong. Morally, if not legally.
In our discussion, no one needed to recite the laws of war or refer to abstract ethical principles. No one needed to appeal to empathy. The horrifying notion of shooting a child in that situation didn't even come up. We all knew it would have been wrong without needing to say it. War does force awful and difficult choices on soldiers, but this wasn't one of them.
Context is everything. What would a machine have done in our place? If it had been programmed to kill lawful enemy combatants, it would have attacked the little girl. Would a robot know when it is lawful to kill, but wrong?
THE DECISION
Life-and-death choices in war are not to be taken lightly, whether the stakes are millions of lives or the fate of a single child. Laws of war and rules of engagement frame the decisions soldiers face amid the confusion of combat, but sound judgment is often required to discern the right choice in any given situation.
Technology has brought us to a crucial threshold in humanity's relationship with war. In future wars, machines may make life-and-death engagement decisions all on their own. Militaries around the globe are racing to deploy robots at sea, on the ground, and in the air; more than ninety countries have drones patrolling the skies. These robots are increasingly autonomous and many are armed. They operate under human control for now, but what happens when a Predator drone has as much autonomy as a Google car? What authority should we give machines over the ultimate decision: life or death?