Gerstein’s Flirting with Disaster

Marc Gerstein, Flirting with Disaster: Why accidents are rarely accidental, Union Square Press, 2008.

With Michael Ellsberg, and a foreword and afterword by Daniel Ellsberg.

Foreword

Learning from Past Disasters, Preventing Future Ones

I [Daniel Ellsberg] have participated in several major organizational catastrophes.

… The escalation in Vietnam was not the result of a universal failure of foresight among the president’s advisers, nor of a lack of authoritative, precise, and urgently expressed warnings against his choice of policy.

Since 1961 … I have viewed the nuclear arms race as an ongoing catastrophe that has to be reversed, and a situation that has to be understood. I assumed then, and still believe, that understanding the past and present realities is essential to changing them.

A major theme to be gained from this important book is that organizations do not routinely and systematically learn from past errors and disasters – in fact they rarely ever do.

One reason for this folly is that many aspects of disasters in decision-making are known only within the organization, and not even by many insiders at that.

[The] deliberate decision within organizations not to try to learn internally what has gone wrong constitutes [an] anti-learning mechanism.

[There] is strong and successful resistance within many organizations to studying or recording past actions leading to catastrophe – because doing so would reveal errors, lies, or even crimes.

Introduction

This book is about disasters. From Chernobyl to Katrina, Challenger to Columbia, BP to Vioxx, and the Iraq War. Were these – and practically every major catastrophe that has befallen us in the past twenty-five years – unforeseen, unavoidable misfortunes that no-one could possibly have imagined? No. All of them, it turns out, were accidents waiting to happen, and many influential people inside saw them as such. The events were not really “accidental” at all.

[They] had long buildups and numerous warning signs. What’s more, they display a startling number of common causes, and the same triumph of misguided intuition over analysis … .

1. The Bystanders among us

Organizational bystanders are individuals who fail to take necessary action even when important threats or opportunities arise.

[This often happens in situations with the following characteristics]:

  • Ambiguous precipitating events.
  • A large number of people observe the event.
  • Failure of others to act.
  • Uncertainty regarding one’s ability to help.
  • Presence of formal authorities or “experts”.

Groupthink and other psychological, social, and political phenomena may also contribute.

Factors promoting bystander behaviour work in concert to maintain the status quo … . [Not] “rocking the boat” often becomes a way of life – even while that boat is going right over Niagara Falls.

2. Human Biases and Distortions

This chapter summarises relevant work, mainly from the mainstream psychology literature. On evolutionary psychology it notes that:

[One] might expect humans to value direct experience, prefer hard facts to soft knowledge, and to put their trust in their own abilities rather than in abstractions while, at the same time, reserving respect for people they might encounter from whom they might acquire knowledge and new skills.

Rather than being generally cautious or risk-seeking, human beings appear to have been programmed by evolution to live with false positives to avoid a disadvantageous false negative, and sometimes the reverse. [We] seek to minimize the more costly error.

While some of the self-protective biases developed over countless evolutionary generations are not always functional in today’s society, the principle of avoiding the greater mistake still works.

Unfortunately … the greater mistake appears to be the one that affects them personally rather than the one that produces the greater good.
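The “avoid the more costly error” principle is, at bottom, a comparison of expected costs. Here is a minimal sketch of that comparison – the expected_cost helper and all the numbers are my own illustrative assumptions, not from the book – showing why tolerating false alarms can be the smaller mistake when misses are far more expensive:

```python
# Illustrative sketch (not from the book): compare the expected costs of the
# two possible errors under two detection policies. All numbers are made up.

def expected_cost(p_event, p_false_alarm, p_miss, cost_false_alarm, cost_miss):
    """Expected cost of a policy: false alarms happen when there is no event,
    misses happen when there is one."""
    return ((1 - p_event) * p_false_alarm * cost_false_alarm
            + p_event * p_miss * cost_miss)

# A 'jumpy' policy: many false alarms, almost no misses.
jumpy = expected_cost(p_event=0.01, p_false_alarm=0.20, p_miss=0.01,
                      cost_false_alarm=1, cost_miss=1000)

# A 'relaxed' policy: few false alarms, but it misses half the real events.
relaxed = expected_cost(p_event=0.01, p_false_alarm=0.01, p_miss=0.50,
                        cost_false_alarm=1, cost_miss=1000)

print(f"jumpy policy expected cost:   {jumpy:.2f}")    # about 0.30
print(f"relaxed policy expected cost: {relaxed:.2f}")  # about 5.01
# When misses are far costlier than false alarms, living with false positives
# is the smaller mistake -- the evolved bias Gerstein describes.
```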

3. Understanding Uncertainty

Why did so many people bet against Katrina?

If ever there was “an accident waiting to happen” …

If we are to avoid such colossal mistakes in the future, we must learn to face the probabilities and likelihoods squarely, and be less sanguine that everything will work out okay if we merely follow our instincts.

4. Space Shuttle Challenger

Cold, Warm, and Hot causes of disasters.

5. Chernobyl

Faulty design, and the interplay of humans and technology.

6. The Vioxx Disaster and BP

The seduction of profits.

7. When all the Backups failed

How American F-15s accidentally shot down two U.S. Army Black Hawks.

8. Butterfly Wings and Stone Heads

How complexity influences catastrophe in policy decisions.

9. The Collapse of Arthur Andersen

The role of organizational culture.

10. When Countries go Bankrupt

The Prisoner’s Dilemma writ large.

11. What Have We Learned? What Can We Do?

   Despite its apparent simplicity, the advice offered here is often difficult to implement. [People] are tempted by short-term gains or coerced by social pressure, and then their risky behaviour is strongly reinforced when they repeatedly get away without incident.

Rules to Live By

  • Understand the risks you face.
  • Avoid being in denial.
  • Pay attention to weak signals and early warnings.
  • Don’t subordinate the chance to avoid catastrophe to other considerations.
  • Don’t delay by waiting for absolute proof or permission to act.

Avoiding the Bigger Mistake

   The principle of avoiding the bigger mistake underlies dealing sensibly with all types of risk.

The Disaster Time Line

Disasters can be partitioned into before, during, and after … .

[There] is usually ample time before any low-probability hazard breaks loose. The problem, of course, is getting people to pay attention so that the lead time can be used productively.

Don’t squander your early warnings with delays or half measures.

Real-Time Responses

The most important thing is to think through what’s possible and, most of all, what’s most likely, as well as what we can do and should do in each instance, including the worst-case scenario.

Believe the Facts, not Your Intuition

Some common examples are given, some of which may be surprising.

The most important lesson is that constant vigilance is required in all high-hazard categories.

Take Low-Probability Catastrophic Risks seriously

Understand each threat, its severity and timing, and decide in advance … .

Pay attention to Weak Signals and Near Misses

Perhaps the biggest fallacy associated with the signalling of risk is that certainty of an accident is required in order to take action. On the contrary, taking action on the basis of rational concern before an adverse event occurs is obviously better than preventing harm from the second such accident.

Always pay attention as if the worst had actually occurred, but develop efficient ways of confirming, or disconfirming, the actual danger to minimize your time and effort.

Everyday Life’s Low-Probability Risks

   When the consequences of an adverse event exceed our tolerance … we must fully acknowledge them, along with the costs of full protection. That is often an unpleasant recognition, and is prone to self-deception – and politics.

For very rare events, such as the outbreak of a devastating flu … preparations are a balancing act. [It] is wise to be as self-sufficient as possible [even though] there are so many potential scenarios.

Examine the cumulative risk of all low-probability threats and make your plans according to the rule of avoiding the greater mistake.
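As a rough illustration of why cumulative risk matters (this sketch and its numbers are mine, not the book’s): if the threats are roughly independent, the chance that at least one of them materialises is one minus the product of the chances that each does not, and that figure grows surprisingly fast.

```python
# Illustrative sketch (not from the book): many individually small risks add up.
# Assumes the threats are roughly independent of one another.

def cumulative_risk(probabilities):
    """Probability that at least one of several independent threats occurs."""
    p_none = 1.0
    for p in probabilities:
        p_none *= (1 - p)
    return 1 - p_none

# Twenty hypothetical threats, each judged to have only a 1% chance this year.
threats = [0.01] * 20
print(f"{cumulative_risk(threats):.1%}")  # about 18.2% -- far from negligible
```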

Endeavour to be “Risk-Neutral”

Be “Safe at any Speed”

Moving from Bystander to Witness to Whistle-Blower

  Sometimes just “active watching,” visibly taking notes, or writing a concerned e-mail is enough to change the course of a situation.

[Cautious] legal advisers suggest making one’s protests within the chain of command or other legitimate avenues, but then departing the organization on good terms if one’s complaint comes to naught.

12. Advice for Leaders

Virtually all the accidents described in this book occurred in organizational settings.

This book [is] not about … fraudulent enterprises deeply based on deceit, if not deliberate harm.

[Most] people are not harmed by malice but by risk blindness, the failure to see potential harm and imminent danger until it is too late.

We must all take a greater share in preventing unintended consequences.

[Conflicts] between espoused values and actual practice inevitably draw people into loyalty tests and cover-ups when an apparently sensible short-cut invites a catastrophic outcome.

Suggestions for Professionals and Managers

  1. We shouldn’t be bystanders and shouldn’t encourage bystander behaviour in those around us.
  2. We should all do what we can to ensure that dissent is encouraged, not repressed, and that the channels of complaint are open.
  3. We should do what we can to build viable information and reporting systems that widely disseminate risk-related performance information.
  4. We should not collude in cover-ups, even minor ones.
  5. When there is likely and documentable unacknowledged risk, each of us should assemble our allies and pursue a complaint with the appropriate institutional body.

If all else fails, we should consider blowing the whistle (with documents).

Avoiding Catastrophe is Free

Or at least, cheaper than the alternative.

Suggestions for Leaders

The key to figuring out what to do is realizing that practicalities and shortcuts have costs that inevitably even out in time, and that one’s choice is to either pay now or pay later. (sic)

Unfortunately, unless a crisis erupts, traditional performance policies in many organizations largely ignore safety and ethical risks by focusing on short-term, backward-looking financial indicators. Such measures encourage imprudent risk-taking, although it takes exceptional candour to admit it.

… “What gets measured gets managed” is more true today than ever.

[Imposing] nonnegotiable performance objectives combined with severe sanctions for failure encourages the violation of safety rules, reporting distortions, and dangerous shortcuts.

  1. [Be] wary of excessive optimism. [Admitting] mistakes and accepting the need for radical solutions are essential.
  2. In organizational settings, accidents are never “accidental”: They are inevitably the result of faulty management, particularly the management of safety.
  3. Systematize paying attention to near-misses, weak signals and assessments of engineers and safety officials. … Create monitoring systems, systematic review procedures, and independent channels that do not report through the operational chain of command.
  4. [Recognize] that while every organization tolerates some dissent, on certain subjects it does not. Only leaders can eliminate these “undiscussables.” Without adequate protection for naysayers and internal whistle-blowers, widespread bystander behavior is inevitable.
  5. Create effective contingency plans for serious but low-probability risks. [The] combination of situational unfamiliarity, time pressure, and poor information is lethal – it increases the chances of an accident [hugely].
  6. Every organization requires robust, independent watchdogs. Actions that make watchdogs more “client-centred” and “efficient” run serious risks of reducing their effectiveness.
  7. Leadership must subject itself to relentless review and self-criticism.

Conclusion: An Opportunity, Not a Problem

  1. [Accept] the inevitability of accidents and catastrophes without giving in to them.
  2. [Appreciate] the difference between new ideas and unpracticed old ones. … The latter … is often where the bulk of the value actually lies.

Afterword: When the Leaders are the Problem

[Reasonable] people, who are not malicious, and whose intent is not to kill or injure other people, will nonetheless risk killing vast numbers of people. And they will do it predictably, with awareness.

What are the circumstances … ? [When] the potentially disastrous gamble offers the possibility of avoiding loss altogether, coming out even or a little ahead; and when the alternative to taking the gamble assures the certainty of loss in the short run – a loss that impacts the leader personally.

Dave Marsay
