Penguin supports copyright. Copyright fuels creativity, encourages diverse voices, promotes free speech, and creates a vibrant culture. Thank you for buying an authorized edition of this book and for complying with copyright laws by not reproducing, scanning, or distributing any part of it in any form without permission. You are supporting writers and allowing Penguin to continue to publish books for every reader.
While the author has made every effort to provide accurate telephone numbers, Internet addresses, and other contact information at the time of publication, neither the publisher nor the author assumes any responsibility for errors or for changes that occur after publication. Further, the publisher does not have any control over and does not assume any responsibility for author or third-party Web sites or their content.
Never stop being excited by it.
INTRODUCTION
On July 8, 2015, as I was in the midst of working on this book, United Airlines suffered a computer problem and grounded its planes. That same day, the New York Stock Exchange halted trading when its system stopped working properly. The Wall Street Journal's website went down. People went out of their minds. No one knew what was going on. Twitter was bedlam as people speculated about cyberattacks from such sources as China and Anonymous.
But these events do not seem to have been the result of a coordinated cyberattack. The culprit appears more likely to have been a lot of buggy software that no one fully grasped. As one security expert stated in response to that day's events, "These are incredibly complicated systems. There are lots and lots of failure modes that are not thoroughly understood." This is an understated way of saying that we simply have no idea of the huge number of ways that these incredibly complex technologies can go wrong.
Our technologies, from websites and trading systems to urban infrastructure, scientific models, and even the supply chains and logistics that power large businesses, have become hopelessly interconnected and overcomplicated, such that in many cases even those who build and maintain them on a daily basis can't fully understand them any longer.
In his book The Ingenuity Gap, the professor Thomas Homer-Dixon describes a visit he made in 1977 to the particle accelerator in Strasbourg, France. When he asked one of the scientists affiliated with the facility if there was someone who understood the complexity of the entire machine, he was told that "no one understands this machine completely." Homer-Dixon recalls feeling discomfort at this answer, and so should we. Since then, particle accelerators, as well as pretty much everything else we build, have only increased in sophistication.
Technological complexity has been growing for a long time. Take the advent of the railroads, which required a network of tracks and a switching system to properly route trains across them. The railroads spurred the development of standardized time zones in the United States in order to coordinate the many new trains that were crisscrossing the continent. Before this technology and the complexity it entailed, time zones were less necessary.
But today's technological complexity has reached a tipping point. The arrival of the computer has introduced a certain amount of "radical novelty" to our situation, to use the term of the computer scientist Edsger Dijkstra. Computer hardware and software are much more complex than anything that came before them, with millions of lines of computer code in a single program and microchips that are engineered down to a microscopic scale. As computing has become embedded in everything from our automobiles and our telephones to our financial markets, technological complexity has eclipsed our ability to comprehend it.
In recent years, scientists have even begun to recognize the inextricable way that technology and nature have become intertwined. Geologists who study the Earth's rock layers are asking whether there is enough evidence to formally name our current time period the Anthropocene, the Epoch of Humanity. Formal title or not, the relationship between our human-made systems and the natural world means that each of our actions has even more unexpected ramifications than ever before, rippling not just to every corner of our infrastructure but to every corner of the planet, and sometimes even beyond. The totality of our technology and infrastructure is becoming the equivalent of an enormously complicated vascular system, both physical and digital, that pulls in the Earth's raw materials and emits roads, skyscrapers, large populations, and chemical effluent. Our technological realm has accelerated the metabolism of the Earth and done so in an extraordinarily complicated dance of materials, even changing the glow of the planet's surface.
We are of two minds about all this complexity. On the one hand, we built these incredibly complicated systems, and that's something to be proud of. They might not work as expected all the time, but they are phenomenally intricate edifices. On the other hand, almost everything we do in the technological realm seems to lead us away from elegance and understandability, and toward impenetrable complexity and unexpectedness.
We already see hints of the endpoint toward which we are hurtling: a world where nearly self-contained technological ecosystems operate outside of human knowledge and understanding. As a journal article in Scientific Reports in September 2013 put it, there is a "complete new machine ecology beyond human response time," and this paper was talking only about the financial world. Stock market machines interact with one another in rich ways, essentially as algorithms trading among themselves, with humans on the sidelines.
This book argues that there are certain trends and forces that overcomplicate our technologies and make them incomprehensible, no matter what we do. These forces mean that we will have more and more days like July 8, 2015, when the systems we think of as reliable come crashing down in inexplicable glitches.
As a complexity scientist, I spend a lot of time preoccupied with the rapidly increasing complexity of our world. I've noticed that when faced with such massive complexity, we tend to respond at one of two extremes: either with fear in the face of the unknown, or with a reverential and unquestioning approach to technology.
Fear is a natural response, given how often we are confronted with articles on such topics as the threat of killer machines, the dawn of superintelligent computers with powers far beyond our ken, or the question of whether we can program self-driving cars to avoid hitting jaywalkers. These are technologies so complex that even the experts don't completely understand them, and they also happen to be quite formidable. This combination often leads us to approach them with alarm and worry.
Even if we aren't afraid of our technological systems, many of us still maintain an attitude of wariness and distaste toward the algorithms and technologies that surround us, particularly when we are confronted with their phenomenal power. We see this in our responses to the inscrutable recommendations of an Amazon or a Netflix, or in our annoyance with autocorrect's foibles. Many of us even rail at the choices an application makes when it tells us the best route from one location to another. This phenomenon of "algorithm aversion" hints at a sentiment many of us share, which appears to be a lower-intensity version of technological fear.