The Ostrich Paradox: Why We Underprepare for Disasters

David Oliver Kasdan (Department of Public Administration, Sungkyunkwan University, Seoul, South Korea)

Disaster Prevention and Management

ISSN: 0965-3562

Article publication date: 4 June 2018


Citation

Kasdan, D.O. (2018), "The Ostrich Paradox: Why We Underprepare for Disasters", Disaster Prevention and Management, Vol. 27 No. 3, pp. 360-362. https://doi.org/10.1108/DPM-06-2018-304

Publisher


Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited


The Ostrich Paradox provoked mixed emotions for me in the academic sense. On the one hand, I was heartened to see a solid example of the convergence of behavioral economics and disaster management being realized, because it promises to yield significant gains for the safety and sustainability of society. On the other hand, I was frustrated that Professors Meyer and Kunreuther had beaten me to the punch in publishing on a subject that had been preoccupying me for the past few years. Thankfully—for my research agenda—their slim volume serves as a provocative start to integrating behavioral economics into disaster management. While Professor Kunreuther has been writing around these subjects for many years in his research on insurance risk (e.g. Camerer and Kunreuther, 1989; Kunreuther et al., 1978; Slovic et al., 1974) and Professor Meyer has approached them from a consumer psychology and risk background, there is still much to be examined, tested and implemented at the intersection of behavioral economics and disaster management before the matter is settled.

Before working through the contents of the book, a brief explanation of the intriguing title is in order. In fact, the authors give a very light treatment of what the “Ostrich Paradox” actually is. They contend that “while ostriches are often characterized as hapless birds who bury their heads in the sand whenever danger approaches, they are, in fact, highly astute escape artists, birds who use their great speed to overcome their inability to fly” (p. 4). The metaphor is applied to our behavioral biases in the face of disaster, insofar as we tend to make choices for emergency management that are akin to burying our heads in the sand. We should, Meyer and Kunreuther argue, recognize the cognitive proclivities that limit our choices in disaster management and apply behavioral economics to overcome these shortfalls. Such an approach would capitalize on our astute escape artistry, hence their encouragement to “be more, not less, like ostriches—hence the paradox—if we are to be better prepared for disaster” (p. 4, italics in original).

My take on the paradox is that we are well aware of the limitations of our thinking, yet we are unable to overcome them because of behavioral, political, economic and other constraints on the approach to disaster management that have been reified for decades. A better way is available, but we are stuck on a path that has been so well-trod that it has formed a rut of traditional disaster management that is nigh impossible to escape. The UN is now promoting a more enlightened tone and broad prescriptions for behaviorally informed disaster management (e.g. The Global Assessment Report on Disaster Risk Reduction series), but the take-up has been slow as policy change includes risks that—according to some calculus—may be greater than the risk of losses that are admittedly included in the calculus for the old way of doing things. In other words, we have yet to try the ostrich’s alternative tactics because burying our head in the sand is the instinctive default.

In just over 100 pages, Meyer and Kunreuther cover many of the behavioral excuses that have been used to rationalize underpreparation for disasters, and they offer a generalized means of countering those biases in disaster risk management. The first part of the book explains some foundational biases from behavioral economics and social psychology, built on the System 1/System 2 paradigm of decision making (Kahneman, 2011). The authors extend the shortfalls of our human decision-making systems to the context of unfamiliar, risk-laden situations, where both our instinctive heuristics (System 1) and our controlled focus (System 2) fail to guide us to optimal outcomes. The failure to reason correctly is due to the “six core biases”: myopia, amnesia, optimism, inertia, simplification and herding. A chapter is dedicated to each bias, in which the authors provide an example, the conceptual definition and a critical analysis of why the bias persists in disaster mismanagement despite our knowing better. For those who are unfamiliar with behavioral economics, the explanations are clear and well contextualized in real cases from recent history.

The second part of the book offers the practical contribution for emergency managers in the “Behavioral Risk Audit.” Starting from the dire premise that “we are not all that optimistic about the ability of people to make good protective decisions when faced with low-probability, high-consequence events” (p. 71), Meyer and Kunreuther describe a four-step analysis of a hazard context to help mitigate risk. They argue that most disaster managers begin their work by considering the hazard itself, whereas the more enlightened approach is to assess the biases that affect the actors and the potential for people not to behave rationally. This behavioral risk audit should enable disaster managers to employ choice architecture—what is now popularly known as “nudges”—to subtly encourage people to make better decisions about disaster mitigation and preparedness.

They apply the audit to the US National Flood Insurance Program (NFIP) which, coincident with the publication of this book, has come under increased scrutiny after the 2017 flooding of Houston from Hurricane Harvey confirmed that the NFIP’s intentions have been twisted from protecting people from disaster risk to encouraging disastrously risky development behaviors. While the audit is perfectly sensible and makes a clear case for recognizing counterproductive biases in disaster risk management, its direct application to improved policy is still a bit of a reach.

The book concludes with a call for proactive mitigation strategies that build in foils to the biases that keep us from adequately preparing for disaster, yet the authors acknowledge the usual political, economic and social blockades to making meaningful change in our behaviors. Meyer and Kunreuther “propose four guiding principles as an umbrella for how societies should approach the management of long-term risk” (p. 93) and apply them to the case of rising sea levels. Here they offer well-formulated policy solutions that could significantly mitigate disaster risk, given an environment that accommodates our cognitive limitations. Yet they acknowledge the difficulty of overcoming the biases and the predominantly reactive perspective on disaster management. Indeed, the problem with getting the ostrich to run is that she rarely pulls her head out of the sand to find a direction.

Commentary on David Kasdan’s Book Review of The Ostrich Paradox

David Kasdan has summarized the key points in The Ostrich Paradox extremely well in an interesting and concise manner. As he points out, the motivation for documenting these six systematic biases was to recognize that we all have a tendency to rely on them when making choices. They normally are not very costly for most of the decisions we make on a daily basis. For disasters, however, which fortunately are low-probability events, we generally do not prepare until it is too late, to a large extent because of these biases.

We are myopic, with overly short future time horizons when appraising immediate costs and the potential benefits of protective investments. We exhibit amnesia by forgetting too quickly the lessons of past disasters. We are optimistic, so we underestimate the likelihood that losses will occur from future hazards. Inertia leads us to maintain the status quo when there is uncertainty about the potential benefits of investing in alternative protective measures. We like to simplify the situation when making choices involving risk: we often assume that the risk we face is below our threshold level of concern and hence do not prepare for a potential disaster. Finally, a herd mentality leads us to make choices by observing the actions of our friends and neighbors, who may be just as ill-informed about the risk as we are. Taken together, these six biases imply that people will tend to underprepare for disasters.

As Kasdan notes in his review, our main reason for documenting these biases is to develop a behavioral risk audit that accepts the fact that people will continue to make choices in the ways just highlighted. The behavioral risk audit is designed to develop strategies that work with rather than against people’s risk perceptions and natural decision biases, using the principles of choice architecture to frame alternatives in ways that lead individuals to pay attention to the risk. When risk communication is combined with short-term economic incentives, individuals are likely to consider investing in protective measures that reduce the potential consequences of future disasters.

To illustrate how one might apply these principles to address each of these biases, consider the problem of getting homeowners to invest in protective measures against a natural disaster. To deal with myopia, spread the high upfront costs of a loss-reduction measure over time by providing a long-term loan, so that the annual cost of investing is relatively small. Amnesia could be addressed by insurers offering multi-year policies, so that a person has less incentive to cancel an insurance policy after not experiencing a loss. Optimism can be addressed by stretching the time horizon, so that a person who ignores a flood that has a 1 in 100 chance of occurring next year learns that the likelihood of at least one flood in the next 25 years is greater than 1 in 5 (see the short calculation below). To deal with inertia, a flood insurance policy could automatically be made part of the homeowner’s policy, with the option to cancel it for those who decide they do not want the coverage; most homeowners are likely to maintain this default option. Given that individuals like to simplify their decision process, communication can focus on the worst-case scenario they would face should they experience a disaster without having invested in protective measures. The herd mentality can be addressed by trying to establish social norms around preparedness, for instance by placing seals of approval on homes that have invested in risk-reducing measures.
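The stretched-horizon figure for the optimism bias follows from elementary probability. Assuming, as is standard for such back-of-the-envelope estimates (the commentary does not state its assumptions), that floods occur independently from year to year, each with probability 1/100:

\[
P(\text{at least one flood in 25 years}) \;=\; 1 - \left(1 - \tfrac{1}{100}\right)^{25} \;=\; 1 - 0.99^{25} \;\approx\; 1 - 0.778 \;=\; 0.222 \;>\; \tfrac{1}{5}.
\]

The same one-year risk, merely reframed over the horizon of a typical homeownership, moves from seemingly negligible to clearly worth attention.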

David Kasdan appropriately highlights the importance of utilizing a behavioral risk audit to improve the decision-making process. The challenge we face in the area of low-probability, high-consequence events is to get the key stakeholders, such as real estate agents, banks and financial institutions, insurers, developers and public officials, to treat problems such as disasters as important enough to place high on their agendas. They are likely to exhibit the same systematic biases as those who are at risk, and we need to get them to pay attention now.

The Wharton Risk Management and Decision Processes Center at the University of Pennsylvania is addressing this problem via a Policy Incubator that brings key decision makers together in a neutral forum to design strategies they feel have a good chance of being implemented, using the behavioral risk audit as a guide for getting humans to prepare for disasters in the smart ways that ostriches do.

Howard Kunreuther and Robert Meyer

References

Camerer, C.F. and Kunreuther, H. (1989), “Decision processes for low probability events: policy implications”, Journal of Policy Analysis and Management, Vol. 8 No. 4, pp. 565-592.

Kahneman, D. (2011), Thinking, Fast and Slow, Farrar, Straus and Giroux, New York, NY.

Kunreuther, H., Ginsberg, R., Miller, L., Sagi, P., Slovic, P., Borkan, B. and Katz, N. (1978), Disaster Insurance Protection: Public Policy Lessons, Wiley, New York, NY.

Slovic, P., Kunreuther, H. and White, G.F. (1974), Decision Processes, Rationality and Adjustment to Natural Hazards, Earthscan Publications.
