This week we’re taking a look at an article that proposes a different way of looking at how practitioners create and influence safety.
I’ve been experimenting with sending these out at different times; how are you liking the weekend editions? I’d love to hear about it, so hit reply and let me know.
Gaps in the continuity of care and progress on patient safety
This is an article in the BMJ from March 2000 by Richard Cook, Marta Render, and David Woods.
It provides an interesting way of thinking about technical work and the problems that can arise, and it gives us a frame of reference for looking at events or outcomes that are typically explained away as “human error”.
The idea is that people at the sharp end are very often perceiving some sort of gap or disconnect in process or procedure. They use their hard-won expertise and knowledge to bridge those gaps, developing their own mechanisms and habits to cope with gaps that are often introduced by complexity.
As a result, these gaps (which may persist for long periods of time) rarely cause accidents or incidents. That’s because of the bridging these experts are doing. This can help us see that these experts (as we talked about a bit last week) are creating safety, at least locally if not beyond.
It’s important, though, to realize that when a gap is bridged by some sort of adaptation or workaround, the gap doesn’t go away. Some of those workarounds are going to be very good; they’ll last a long time and be very reliable. But, as we know from software, a lot of our workarounds and implementations are intended to be temporary and are fragile.
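To put that in software terms, here’s a minimal, hypothetical sketch of a fragile bridge; the primary/replica setup and the two-second delay are my own illustration, not from the article. A hard-coded sleep papers over a replication-lag gap, and the bridge holds only as long as an assumption nobody wrote down stays true.

```python
import time

def create_then_read(primary, replica, user_id):
    """Create a user on the primary store, then read it back from a replica."""
    primary.create_user(user_id)

    # The gap: the replica lags behind the primary, so an immediate read can
    # miss the new record. The bridge: wait a fixed two seconds before reading.
    # It works today, but it encodes an unstated assumption (lag < 2s) that
    # nothing enforces. If replication slows under load or after an
    # infrastructure change, this bridge breaks silently.
    time.sleep(2)

    return replica.get_user(user_id)
```

The sleep itself isn’t the point; it’s that the workaround keeps the gap invisible until some change alters the gap, which is exactly the dynamic the authors describe.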
Gaps can also come about through change, especially change in an organization or the technology it uses. When change occurs, new gaps can be created, or old gaps that were bridged previously can be altered. If old gaps change, then workarounds that fit very well before the change may no longer work.
Since this article appeared in a medical journal, the authors of course focus on healthcare. They give an example of this sort of change, and the effect it can have on existing bridges, from when the work of nurses was divided into two distinct groups.
Nurses would continue to do the very high-level work that required certification or special qualifications. But their other work, some monitoring work for example, was given to “patient care technicians.” The idea here is that since patient care technicians don’t need the same level of certification or education, they create a pool of cheaper labor to accomplish those tasks.
This gives hospitals and other healthcare facilities a very big economic benefit. But it creates a situation where nurses may be less able to anticipate gaps in the care of their patients. Because they have more patients to care for and are further distanced from each one, they have to make more complicated inferences about where to spend their attention or how intensively to watch a patient. It also limits how the organization or its teams can restructure work in response to changing demands for attention.
Again, these gaps rarely cause an incident or accident. When accidents do occur, we can say they result from a breakdown in whatever mechanisms these practitioners are using to bridge the gaps, anticipate them, or detect that they even exist.
The authors point out that because these practitioners are so often bridging these gaps, it creates a sort of irony: “that stakeholders can attribute failure to human error only because practitioners in these roles usually bridge the gaps and prevent any escalation toward bad consequences for patients.”
The authors give this alternative to the human error explanation:
“that accidents occur because conditions overwhelm or nullify the mechanisms practitioners normally use to detect and bridge gaps. Safety is increased primarily by understanding and reinforcing practitioners’ ability to detect and bridge gaps”
As we touched on, these practitioners are creating safety locally. Typical responses to preventing error, like putting up guardrails around the system or keeping humans out of it, run counter to that. Because of how practitioners create that safety, those sorts of responses are not going to work, and may even make things worse, since they limit a practitioner’s ability to see and bridge those gaps. New failures, or more frequent forms of failure, could occur.
A good starting point for improving that is understanding the work as those practitioners experience it, in their context.
The authors have a quote in this article that I really like, and I think it reflects a lot of our experiences in software, whether you’ve watched an expert who can tell what’s happening with just a glance at a graph or some scrolling logs, or you’re that expert yourself.
“Work in the real world involves detecting when things have gone awry; discriminating between data and artefact; discarding red herrings; knowing when to abandon approaches that will ultimately become unsuccessful; and reacting smoothly to escalating consequences.”
“It involves recognizing that hazards are approaching; detecting and managing incipient failure; and, when failure cannot be avoided, working to recover from failure.”
That sounds to me like the work of a lot of teams that I’ve seen and gotten to work with: our incident response, monitoring, and ops teams.
This article doesn’t detail exactly how we can better detect gaps, partly because that will vary by field and organization. The authors do suggest it as a good target for research, though, and it’s also a good thing for us, as technical practitioners, to be aware of in our own organizations and processes, continually looking for these gaps.
Takeaways:
- Gaps exist in systems and processes, and sometimes even in people.
- These gaps very rarely lead to accidents because practitioners tend to be adept at bridging them.
- Bridging a gap does not make it disappear.
- Some bridges or workarounds are reliable; others can be fragile.
- Since practitioners are creating safety locally, isolating them from parts of the system can make things worse.
- Looking for gaps can be a beneficial process for organizations and teams seeking improvement.