STRATEGY, EVOLUTION, AND WAR
FROM APES TO ARTIFICIAL INTELLIGENCE
KENNETH PAYNE
© 2018 Georgetown University Press. All rights reserved. No part of this book may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying and recording, or by any information storage and retrieval system, without permission in writing from the publisher.
The publisher is not responsible for third-party websites or their content.
URL links were active at time of publication.
Library of Congress Cataloging-in-Publication Data
Names: Payne, Kenneth, 1974– author.
Title: Strategy, evolution, and war : from apes to artificial intelligence / Kenneth Payne.
Description: Washington, DC : Georgetown University Press, 2018. | Includes bibliographical references and index.
Identifiers: LCCN 2017057435 (print) | LCCN 2018012277 (ebook) | ISBN 9781626165816 (ebook) | ISBN 9781626165793 (hardcover : alk. paper) | ISBN 9781626165809 (pbk. : alk. paper)
Subjects: LCSH: Strategy. | Strategy--Psychological aspects. | Artificial intelligence--Military applications.
Classification: LCC U162 (ebook) | LCC U162 .P39 2018 (print) | DDC 355.02--dc23
LC record available at https://lccn.loc.gov/2017057435
This book is printed on acid-free paper meeting the requirements of the American National Standard for Permanence in Paper for Printed Library Materials.
19 18   9 8 7 6 5 4 3 2   First printing
Printed in the United States of America
Cover design by John Barnett/4eyesdesign.com.
For Valerie Payne-Morris
INTRODUCTION
This book explores the evolutionary basis of human strategy in war and considers the prospects of a radically distinct approach to strategy using artificial intelligence (AI). Strategy is defined here as the purposeful use of violence for political ends. This definition sharpens the focus onto war and away from other human endeavors that may require a similar aptitude for planning imaginatively to realize goals.
My conclusion is that strategy is soon to undergo a dramatic transformation because machines will make important decisions about war and will do so without input from human minds. The results may include a shift in the utility of deterrence, a profound reordering of existing power balances, and challenges to the notion of the warrior. AI will likely enhance the power of the offensive side in conflict and will make decisions about escalation in nonhuman ways at a pace that is inhumanly fast.
I take an evolutionary approach to understanding strategy, seeing it largely as a product of our intense and complex social relationships. Living in a group entails cooperation and shared planning about future goals, especially when it comes to conflict with other groups. This process of negotiating goals within a group and working together to achieve them is what we call strategy, particularly when it involves goals that conflict with those of rival groups. Until now, all strategy has been inherently biological, connected intimately to our evolved origins as social primates. Our decision-making, conscious and otherwise, is informed by our human biology and the motivations that derive from it. I describe this evolutionary process in the first third of the book; in the second third I explore how the arrival of culture has modified it. The last third explores how artificial intelligence might change this biological basis and radically alter the essence of strategy.
Human strategy reflects the human condition, starting with the basic reality that we are social and embodied agents. We are motivated to reproduce and survive and have evolved a series of subordinate goals and procedures to enhance our prospects of survival. Warfare between groups of humans is the consequence of this biological imperative, even if the links sometimes seem obscure or tangential. Moreover, the prospect of violent conflict has shaped the evolution of our minds, including the evolution of human consciousness. It has given us an acute sensitivity to what other humans might think or want. We are emotional, and these emotions act as an important heuristic, streamlining our decision-making in ways that have proved adaptive.
We will see that much follows from the difference between these basic human realities and the realities of machines. What machines might seek to achieve, and how they go about achieving it, will not necessarily resemble our own goals and methods. Indeed, I suggest that there will be radical differences, with large strategic consequences. One caveat: the main effects of AI are likely to be felt not from AI acting strategically on its own account but from AI acting in our interests, which may produce rapid power shifts and unintended consequences. Even so, AIs will challenge some of the fundamental tenets of strategy, including the importance of mass, the dominance of the defense, the utility of surprise, the efficacy of preventive warfare, and the link between a society's values and the way in which it approaches strategy. All of this is possible with technology not far distant from what is available today. That is not to say that more advanced AI, capable of dealing flexibly with real-world situations, will pose no serious dangers, or that it could not generate some internal motivations of its own. But even then it might not look much like the malign, anthropomorphized versions of AI that populate alarmist science fiction and journalistic accounts.
To give some idea of the argument that will follow: I suggest that there have been two macro-level revolutions in human strategic affairs. The first came when our species underwent some sort of cognitive transformation, which seems to have gathered pace somewhere around one hundred thousand years ago; the second, now under way, is one in which artificial intelligence will profoundly alter the way in which cognition shapes strategy. AI has the potential to affect humans more broadly than just in strategic affairs, of course, reshaping our societies, our bodies, and perhaps even our minds as the distinction between it and human intelligence breaks down. But these possibilities are of interest here primarily insofar as they have an impact on strategic affairs.
My argument stresses continuity in strategic affairs, from the evolution of human intelligence to the advent of AI. Not everyone is convinced. Take the basic question of violence and human nature. Despite increasingly persuasive evidence of a violent human prehistory, there remains a school of thought that war is a modern phenomenon, distinct from the more peaceful way humans lived in our evolutionary past (Fry 2007). This thinking suggests that modern civilization, rather than our evolved human nature, has unleashed the scourge of war, waged with powerful modern weapons for material gain. Others, like Harry Turney-High, allow that primitive man was violent but argue that the wars of modern states are qualitatively different from, and altogether more serious than, the wars that came before (Turney-High 1971). Not only were primitive warriors ill-disciplined and ill-equipped, he suggests, but they were motivated by existential passions, including the rather Freudian motive of releasing pent-up frustrations by fighting: violence as stress relief. For Martin van Creveld, too, primitive war differs from the wars of modern states in its instrumentality, with modern states applying reason and logic when it comes to the use of violence (van Creveld 1991). I disagree. There are great continuities in human affairs to set against the compelling differences between cultures, and these continuities often reflect the influence of our evolved psychology. Primitives, as much as moderns, are driven to fight for reasons that can be instrumental, existential, or some mixture of both.