Steering the Reverberations of Technology Change on Fields of Practice: Laws that Govern Cognitive Work


This is a short paper by David Woods about how the introduction of new technologies influences cognitive work and how system designers (like us!) can help support cognitive work.

New technology predictions

New ideas or hypotheses about how technology will change some cognitive work (from adopters) are almost always wrong. A consistent pattern emerges instead:

Oversimplifications take place, usually along the lines of "some computer system will substitute for a human." The reasoning behind this varies across predictions or expectations, but they usually boil down to the same claim.

This misunderstanding is so common it has its own name: the "Substitution Myth."

Substitution Myth

The myth is the idea that a computer and a human can exchange work (usually the computer taking over some of the human’s tasks) and things will stay the same or get easier.

This isn’t true, and the myth is harmful. Changing the role of the machine changes the work of the human, creating a new relationship that has its own new problems and complexities. It is more harmful still because it leads our thinking the wrong way: instead of building and innovating in ways that acknowledge reality, we’re stuck in a trial and error cycle.

The cycle goes something like this:

  • New technologies create new capabilities.
  • This creates new complexities (especially when the new tech is implemented clumsily).
  • In response, the sharp end adapts or works around the new complexities. They are responsible for meeting performance goals, so they must find some way to do so.
  • Adaptations occur that were not considered in the original design.
  • As a result, failures sometimes "break through" the adaptations, since they may be brittle or incomplete.
  • Finally, those adaptations and workarounds have hidden the complexity from those who later look back and deem the failures "human error."

New technology leads to different work, not less work

New technologies and new ways of working (like organizational changes) create cycles of transformation followed by adaptation.

The transformation is rarely what was intended, though. Transformations are often predicted to make operators’ lives easier or simplify their work, but this is rarely the case.

Instead, the new technology creates a new way of working at an increased pace. This leads us to the Law of Stretched Systems:

Law of Stretched Systems

Every system is stretched to operate at its capacity; as soon as there is some improvement, for example in the form of new technology, it will be exploited to achieve a new intensity and tempo of activity.

Laws of cognitive work

One way of breaking out of the trial and error cycle caused by the Substitution Myth is to develop guidelines, or "laws," of cognitive work that would guide system development.

Woods doesn’t provide exact laws here, but instead four families of laws.

  1. Laws of Adaptation. This family looks at "how cognitive systems adapt to the potential for surprise."

  2. Laws of Models. This family is concerned with how we understand the world and "the mystery of how expertise is tuned to the future, while paradoxically, the data available is about the past."

  3. Laws of Collaboration. This family is about how work is distributed across multiple people, machines, and so on: cognitive work is social, not solitary.

  4. Laws of Responsibility. People do cognitive work for human purposes, and that work changes things for humans. We have a responsibility to keep in mind that we are affecting people, not just machines. Machines are simply part of how we create change.

Additionally, Woods includes Norbert’s Contrast as a guideline:

Artificial agents are literal minded and disconnected from the world, while human agents are context sensitive and have a stake in outcomes.

Though these are "laws," they can, to a degree, be ignored. Nothing forces system designers to pay attention to them, but that doesn’t change the consequences of the laws.

Takeaways

  • Predictions of how technology will make operators’ work easier or simpler rarely come to pass.
    • Instead, the improvement goes toward more work or a faster pace, as the Law of Stretched Systems tells us.
  • The Substitution Myth, the idea that some form of automation can simply take over some part of a human’s work, is a common myth that helps create a trial and error cycle of system development instead of one that reflects reality.
    • When work is reallocated between human and machine, the relationship changes and a new joint system is formed, one with its own challenges and its own complexities.
  • One way to help avoid these traps would be to develop some guidelines or "laws" of how cognitive work actually occurs:
    • Laws of Adaptation that consider "how cognitive systems adapt to the potential for surprise."
    • Laws of Models that address how we understand the world and expertise.
    • Laws of Collaboration that remind us that cognitive work is distributed across multiple people and/or machines and is social.
    • Laws of Responsibility to remind us that though systems may be composed in part of machines and automation, they ultimately serve human purposes and affect humans.
