A Rose by Any Other Name...Would Probably Be Given an Acronym

Ever felt like there were just too many different terms or acronyms to learn? This week we’re taking a look at a paper that helps organize them and find some commonalities.


This is a paper by Robert R. Hoffman, Paul J. Feltovich, Kenneth M. Ford, David D. Woods, Gary Klein, and Anne Feltovich.

They use the naming scheme for plants as a way to frame the discussion of designing complex sociotechnical systems.

They begin at the “root” and examine a number of different phrases, many of which have developed their own acronyms, that refer to various ways of exploring, designing, and building systems to help humans with cognitive work.

Since everything comes from a common root, it all starts with “Rosaceae Cogitationis Multiflorae,” the genus that all of these terms can be seen as belonging to.

We thus have differently hued variants of the same variety of rose. They are all rooted in the same soil. All drink the same water. All reach toward the same light. To turn a phrase, ex uno plura. From one comes many.

The authors emphasize that the point of the discussion is not to answer “which term is right,” but to understand a bit about the commonalities between them and why they were created.

Some of them were created because their creators felt that previous terms did not cover what they were trying to get at, or because they felt they had identified some new concept.

Typically, though, most of the terms discussed don’t actually describe anything new.

The authors define four other types; I’ll cover a few of them here.

The authors help us wade through the “acronym soup” that has been generated to describe fairly similar engineering pursuits that have emerged from different communities or specialties.

Traditionum Contrarium

This is the bucket for terms coined in reaction to another term (one less preferable to the new term’s creator). It includes things like “human-centered systems” (HCS), which has been used to describe everything from a program at Cornell to funding programs for system development.

One reason for this is as a reaction to the idea that computer systems have typically been designed around the technical requirements of the machine rather than the user, something you could call “technology-centered design” (TCD). Despite this, the authors point out that HCS is really still technology driven, where the concern for the human user is an add-on, not “the primary engine of change.”

“Participatory design” is also an attempt to create an alternative to TCD. Lessons here come from a variety of places, including the study of large-scale accidents like Three Mile Island.

Human-Centered Systems must complement humans; they are not intended to imitate or replace them, as the Turing model for AI would have us believe.

To understand the theme of this “branch” of the family, the authors contrast the Fitts List with David Woods’ analysis. If you’ve seen John Allspaw’s Monitorama talk from this year, you’ve seen this contrast.


Machines are constrained in that… / …and so need people to:

  • Sensitivity to context is low and is ontology-limited / Keep them aligned to the context
  • Sensitivity to change is low and recognition of anomaly is ontology-limited / Keep them stable given the variability and change inherent in the world
  • Adaptability to change is low and is ontology-limited / Repair their ontologies
  • They are not “aware” of the fact that the model of the world is itself in the world / Keep the model aligned with the world


People are not limited in that… / …yet they create machines to:

  • Sensitivity to context is high and is knowledge- and attention-driven / Help them stay informed of ongoing events
  • Sensitivity to change is high and is driven by the recognition of anomaly / Help them align and repair their perceptions because they rely on mediated stimuli
  • Adaptability to change is high and is goal-driven / Affect positive change following situation change
  • They are aware of the fact that the model of the world is itself in the world / Computationally instantiate their models of the world

Urgentis Paniculae

This “urgent panic” section is where terms built on the idea that the pace of technological change is overwhelming, or moving too fast, live.

A great deal of concern has been expressed over an imminent potential disaster when people (more or less poorly trained) are confronted with new and highly complex technologies (more or less human-centered) that themselves run new and highly complex systems (for example, ships to be manned by only 90 people).

Foci Explicationis

Terms that name a particular focus of empirical analysis live in this family.

This includes things like investigating how systems can support particular psychological faculties, as in “Decision-Centered Design” (DCD), and also looking at differences between individuals, as in “User-Centered Design.”


Ultimately, the authors conclude that most of these approaches are quite similar. They all have several commonalities:

  • The goal
  • The systems stance
  • The cognitive processes that are the focus of analysis
  • Method (analytical and evaluative)

This makes it possible for the authors to build a concept map that places all the terms and acronyms inside a framework that makes sense.


  • Many of the approaches and terms have a lot in common
  • The idea that automation should do what machines are good at and leave the rest to humans doesn’t really hold up, and it isn’t a new idea
  • It’s important to consider not human vs. machine, or where the machine is “better,” but how they can work together best as a single unit
  • It’s more useful to understand these commonalities than it is to worry about which specific term (or acronym) is “best”