The Logic of Failure

Today we’re revisiting that issue of the Philosophical Transactions of the Royal Society from 1990 that we’ve talked about a few times here.

As Dr. Richard Cook pointed out previously, this whole issue was dedicated to Human Factors in Hazardous Situations.


The Logic of Failure
Today we’ll be looking over “The logic of failure,” a paper by Dietrich Dörner that he eventually turned into a book with the same title.

In this paper, Dörner describes an experiment in which two groups participated in a computer simulation. The simulation modeled an area of Africa, the Moro, in Burkina Faso.

The simulation ran over the course of 20 years. At the beginning of each year, you could “ask questions” and get information about the state of the area across a number of variables, including wells, cattle, grass, rainfall, population, crops, and birth rate. Once you’d asked what you wanted to know, you could choose a number of interventions, for example, sinking more wells. At the end of the 20 years, the variables would reveal the health of the region. Was there water? Cattle? Arable land?
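
To make the shape of that yearly cycle concrete, here is a minimal sketch in Python of how such a loop might be structured. The variable names, intervention names, and numbers are my own illustrative assumptions, not the actual simulation’s interface.

```python
# Hypothetical sketch of the Moro simulation's yearly loop (not Dörner's actual program).
# All names and numbers here are illustrative assumptions.

state = {
    "wells": 10,
    "cattle": 500,
    "grass": 0.8,       # fraction of land with usable grazing
    "rainfall": 300,    # mm per year
    "population": 600,
    "crops": 200,
    "birth_rate": 0.03,
}

def ask_question(state, variable):
    """Reveal the current value of one variable, as participants could each year."""
    return state[variable]

def apply_intervention(state, name, amount):
    """Apply an intervention, e.g. sinking more wells."""
    if name == "sink_wells":
        state["wells"] += amount
    # ... other interventions: buying cattle, planting crops, etc.

for year in range(20):
    # 1. Information phase: participants choose which questions to ask.
    rainfall = ask_question(state, "rainfall")
    cattle = ask_question(state, "cattle")

    # 2. Intervention phase: participants choose this year's actions.
    apply_intervention(state, "sink_wells", 2)

    # 3. The system then evolves on its own, whether or not anyone checks on it.
    #    (dynamics omitted in this sketch)

# After 20 years, the final state reveals the health of the region:
# Is there water? Cattle? Arable land?
```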

Dörner then looked at the outcome at the end of the period, asking questions like: Who was successful in caring for the area, and what approach did they take? What about those who weren’t successful? What approach did they take? What can we learn from this?

The two groups were German and Swiss executives and an unspecified type and number of students. While this is good to know, I don’t think the focus on which group is which is very important. Sure, it turns out the executives were predominantly the more successful group, but the lesson here isn’t “be more like an executive.” The point is to learn from what that group may have done differently.

The overall lesson here, the most important one, is the idea of matching your mode of thinking to the type of situation at hand. Dörner believes this is teachable, as evidenced by one group having that ability, though he doesn’t really leave us with much advice on how.

Just being aware of this notion of “fit” between your planning and thinking and the situation at hand can be valuable. I could apply this by asking myself, more and more often, what mode I’m in: a sort of mindfulness-of-mode practice, first trying to consciously observe the mode, and later making interventions in my own thinking.

This is the overall recommendation from Dörner: to develop what he calls “strategic flexibility of thinking,” where one first examines the situation and then employs the correct type of planning. Dörner does not leave us with any specific recommendations on how to develop this, though he does say he believes it is learnable, based on the evidence of the successful group: “the executives do not know any more than the students, they simply know how to adapt themselves to this kind of situation.”

Another point Dörner makes is that the less successful group would often make interventions but then rarely or never follow up on them. This gap between the amount of intervention and later follow-up became more drastic as conditions in the Moro got worse. So the worse things got, the less people followed up on their previous actions.

Dörner also describes the sequence of events from asking questions to acting. Unsuccessful participants would typically ask a question and immediately perform an action as a result, then ask another question and act again, and so on. Dörner calls this “ballistic action”.

The most effective people in the simulation did not act immediately after gaining information. Instead, they asked several questions, gained information, acted, and then followed up to see the results, adjusting as necessary.
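
As a rough illustration of the difference (my own sketch, not anything from the paper), the two patterns might look something like this; the MoroSystem stub, its methods, and all the numbers are hypothetical:

```python
# Illustrative contrast between "ballistic action" and a feedback-driven loop.
# The MoroSystem stub and its methods are hypothetical, just enough to run the sketch.

class MoroSystem:
    def __init__(self):
        self.vars = {"cattle": 500, "grass": 0.8, "rainfall": 300}

    def observe(self, name):
        return self.vars[name]

    def act(self, name, amount):
        if name == "buy_cattle":
            self.vars["cattle"] += amount
            # more cattle means more grazing pressure on the land
            self.vars["grass"] = max(0.0, self.vars["grass"] - 0.001 * amount)

def ballistic(system):
    """Ask a question, act on it, never check the result (the unsuccessful pattern)."""
    for year in range(20):
        system.observe("cattle")        # one question
        system.act("buy_cattle", 50)    # immediate action, no follow-up

def feedback_driven(system):
    """Ask several questions, act, then follow up and adjust (the successful pattern)."""
    plan = 50
    for year in range(20):
        before = {v: system.observe(v) for v in ("cattle", "grass", "rainfall")}
        system.act("buy_cattle", plan)
        # follow up: if grazing land is degrading, scale the intervention back
        if system.observe("grass") < before["grass"]:
            plan = max(0, plan - 10)

ballistic(MoroSystem())
feedback_driven(MoroSystem())
```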

Dörner goes on to describe a few different “errors” and “failures” in thinking that the unsuccessful group may have made.

Dörner describes four general situations and the planning and decision-making process that he feels is best suited to each (a rough sketch of this matching follows the list):

  1. You have no control over, or even perception of, the variables; outcomes appear to be, or actually are, random. Recommendation: don’t plan at all, since planning would be pointless.
  2. Low time pressure. Recommendation: spend more time planning, specifically trying to consider the long-term effects in the system.
  3. Extreme time pressure, or the potential for complete loss. Recommendation: perform “risky” planning and bet it all on one solution.
  4. All potential mistakes are reversible or otherwise easily fixable. Recommendation: plan little or not at all and simply act.
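
As a way of keeping those four situations straight, here is a tiny sketch of the matching written as a lookup function; the flag names, the ordering of the checks, and the returned phrases are my own paraphrase of Dörner’s categories, not anything from the paper.

```python
# A paraphrase of Dörner's situation-to-planning-mode matching as a simple lookup.
# The flag names and returned strings are my own shorthand, not Dörner's terms.

def planning_mode(no_control: bool,
                  low_time_pressure: bool,
                  extreme_pressure_or_total_loss: bool,
                  mistakes_reversible: bool) -> str:
    if no_control:
        return "don't plan at all; planning would be pointless"
    if mistakes_reversible:
        return "plan little or not at all; simply act and correct as you go"
    if extreme_pressure_or_total_loss:
        return "'risky' planning: bet it all on one solution"
    if low_time_pressure:
        return "spend more time planning, especially on long-term effects"
    return "no clear match; examine the situation further"

print(planning_mode(no_control=False, low_time_pressure=True,
                    extreme_pressure_or_total_loss=False, mistakes_reversible=False))
```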

Dörner also describes a number of what he calls “modes of faulty behavior in coping with complex systems”:

  • Insufficient goal elaboration
  • Only working on the most immediate or obvious of problems
  • Insufficient formulation of hypotheses about the structure of the system
  • Not considering side effects or long-term results
  • Insufficient ideas about the behavior of the system in time
  • Not considering that the system can change on its own, that it is dynamic
  • Insufficient coordination of different measures
  • Making too many or conflicting adjustments
  • Ballistic action
  • Not noticing that a strategy or hypothesis may be wrong
  • No self-reflection
  • Not adjusting hypotheses or strategies

I don’t find the list, or the focus he places on it, especially helpful, since there’s no real advice on how to detect these modes in the moment, and the list seems to be derived primarily from hindsight. I’m reminded of what Sidney Dekker calls the view from outside the tunnel, where everything is clear in retrospect, but wasn’t to the people acting inside the tunnel.

Dörner goes on to explore the psychological reasons the mistakes he describes occur, which is unsurprising, as Dörner is a psychologist himself. Similar to the local rationality principle, he points out that even though this behavior is not effective for coping with a complex system, it serves other purposes.

Dörner focuses primarily on two such purposes. The first is limiting cognitive work, which manifests as planning less, collecting less information, generating fewer hypotheses, or assuming linear behavior in a system. The second is the tendency of people to want to “guard one’s feeling of competence”.

He also touches on the idea that some of the behaviors may simply be a result of forgetting or assigning more importance or urgency to problems that are clearly visible as opposed to those that may be uncertain in the future. As he sums it up succinctly: “subjects deal with the problems they have, not with those they do not have.”

On a funny note: looking more into the book he wrote from this, I found my favorite Amazon review, a three-star review that criticizes the book because it “is hindered by becoming largely accepted wisdom”. I can only hope to someday have such a work “hindered”!
