Introduction: The Ostrich Paradox
When dawn broke on the morning of September 8, 1900, the people of Galveston had no inkling of the disaster that was about to befall them. The thickening clouds and rising surf hinted that a storm was on the way, but few were worried. The local weather bureau office, for its part, gave no reason to think otherwise; no urgent warnings were issued, no calls were made to evacuate. But by late afternoon it became clear that this was no ordinary storm. Hurricane-force winds of more than 100 miles per hour were soon raking the city, driving a massive storm surge that devoured almost everything in its path. Many tried to flee, but it was too late. By the next day, more than 8,000 people were dead, the greatest loss of life from a natural disaster in US history.
Fast-forward to September 2008, when Hurricane Ike threatened the same part of the Texas coast, but this time was greeted by a well-informed populace. Ike had been under constant surveillance by satellites, aircraft reconnaissance, and land-based radar for more than a week, with the news media blasting a nonstop cacophony of reports and warnings urging those in coastal areas to leave. The city of Galveston was also well prepared: A 17-foot-high seawall, constructed after the 1900 storm, stood ready to protect the city, and government flood insurance policies were available to residents at risk of property loss. Unlike in 1900, Texas residents should have had little reason to fear. On their side was a century of advances in meteorology, engineering, and economics designed to ensure that Ike would, indeed, pass as a forgettable summer storm.
But for some reason it didn't quite work out that way. Warnings were issued, but many in low-lying coastal communities ignored them, even when told that failing to heed the warnings meant they faced certain death. In the end, Ike caused more than $14 billion in property damage and 100 deaths, almost all of it needless.
Why Are We Underprepared for Disasters?
The gap between protective technology and protective action illustrated by the losses in Hurricane Ike is, of course, hardly limited to Galveston or to hurricanes. While our ability to foresee and protect against natural catastrophes has increased dramatically over the past century, it has done little to reduce material losses from such events.
Rather than seeing decreases in damage and fatalities with the aid of science, we've instead seen the worldwide economic cost of hazards, and their impact on people's lives, increase exponentially through the early twenty-first century, with five of the 10 costliest natural disasters in history with respect to property damage occurring since 2005. While scientific and technological advances have allowed deaths to decrease on average, horrific calamities still occur, as in the case of the 230,000 people estimated to have lost their lives in the 2004 Indian Ocean earthquake and tsunami, the 87,000 who died in the 2008 Sichuan earthquake in China, the 160,000 who lost their lives in the 2010 Haiti earthquake, and the 8,000 fatalities in the 2015 Nepal earthquake. Even in the United States, Hurricane Katrina in 2005 caused more than 1,800 fatalities, making it the third-deadliest such storm in US history.
The purpose of this book is to explain this disconnect and to propose a solution. In part 1, we explore six reasons that individuals, communities, and institutions often underinvest in protection against low-probability, high-consequence events. In each chapter, we examine a specific bias that foils our ability to make good decisions in these types of situations. To illustrate the shortcomings of our mind-sets, we share tragic stories from global disasters. These are the stories that motivated us to write this book and to offer a new approach to preparedness planning that will help prevent such tragedies.
In part 2, building on this foundation, we describe how knowledge of these biases can be used to anticipate the kinds of errors that occur when people are faced with potential disasters, and how we might avoid those errors. Our approach to preparedness planning provides individuals, firms, and policy makers with the means to anticipate the cognitive biases that often impede risk preparedness, so as to guide the design of more effective tactics that save lives and protect resources.
This new approach, the behavioral risk audit, seeks to reverse the traditional mind-set used when policies for protection are designed. Rather than proposing economic or engineering solutions to preparedness, and hoping that people will adopt them, the behavioral risk audit starts with an understanding of the psychological biases that inhibit adoption, and then proposes policies that work with, rather than against, our natural psychologies. As such, the intellectual foundation of the approach lies in the social sciences, notably behavioral economics and psychology, not engineering or natural science.