Learning from “Near Miss” Incidents and the “Loss of Awareness” Argument
Or
The “Crying Wolf” Story: Maybe the Villagers Should Have Asked More Questions
Arash Azadegan, PhD, and Andryi Petronchak, MBA
This week’s train derailment in New York City prompted us to refocus on the topic of near misses. Reports suggest that the driver of the Metro-North Railroad commuter train “experienced loss of awareness”. Labeled an episode of “highway hypnosis”, this particular accident killed four passengers, injured many others, and created havoc in the area’s rail system for days[i].
We have all had a similar situation happen to us. While we are driving, the cell phone starts to ring. The number on the screen seems too important not to answer. We know that answering the call while driving at high speed is wrong; it may be unlawful, and it is clearly dangerous!
The car may have drifted slightly into the other lane. We may have been scared for a split second, put the phone away, slowed down, and felt a moment of guilt. This was our version of a “momentary loss of awareness”. But it ended up as a faultless near miss. Nothing happened, right? Even if something had happened, the damage would have been to our own lives. We forgive ourselves, go back to the norms of our lives, and may soon forget about the whole thing.
Now consider this: we are driving a vehicle at three hundred miles per hour, with a couple of hundred people on board. Here, a “momentary loss of awareness” may have more serious consequences. It may mean derailment. If it ends up there, it damages others’ lives and livelihoods. In that case, things are different. We can’t forgive ourselves, and we can’t go back to the norm. We may never forget about it, ever.
The hypothetical “distance” between guilt and no guilt, between lives lost and lives saved, is decided during those few short seconds. These critical seconds don’t just happen on Sunday mornings on New York’s rail cars. They are what commercial pilots, cruise ship captains, high-speed train drivers, and others in professions with people’s lives in their hands operate in every hour of every workday. Added up, our transportation systems, our service delivery systems, and the chains of events supporting them deal with many hundreds of these critical seconds every week. So why is it that we do not learn from these hundreds and thousands of hours of experience? Why do near misses not provide enough of a basis to avoid the real catastrophe?
A near miss is an incident that does not cause physical harm, sickness, or property damage, but that had a high probability of seriously affecting people or material assets. In other words, a “near miss” is a variation in a normal process that, if continued, could have had a negative effect on people or valuables. Some call it a “false alarm”. Often, a lucky interruption in the sequence of mishaps keeps the damage from taking hold. Near misses can be an effective source of input for organizational learning, because the downside costs of the damage are not present. We have a near miss each time we reach over to reprogram the GPS device and the car swerves into the next lane.
In “The Psychology of the Near Miss”, R.L. Reid [ii] gives an interesting definition of the term from the perspective of the gaming and gambling industry.
“A near miss is a special kind of failure to reach a goal, one that comes close to being successful. A shot at a target is said to hit the mark, or to be a near miss, or to go wide. In a game of skill, like shooting, a near miss gives useful feedback and encourages the player by indicating that success may be within reach. By contrast, in games of pure chance, such as lotteries and slot machine games, it gives no information that could be used by a player to increase the likelihood of future success”.
Reid suggests that how useful a “near miss” is depends on how the information is collected, processed, and interpreted. Too many near misses go undetected because systems are not in place to look for them. Often we treat them as an (un)lucky turn of events that, as Reid suggests, “give no information to increase the likelihood of success”. But running an operation is not about pure chance. Operating a car, a train, or a company is more like a game of skill than a game of chance. So learning from near misses should be part of the process of getting better.
So the question comes up again: Why do we have trouble learning from near misses, or even from false alarms? Why do companies (just like operators, pilots, and even drivers) fail to incorporate these lessons into their processes? There are several possible reasons:
First, reporting any problem (personal or system-related) can be difficult. Thinking about, picturing, and exchanging hypothetical scenarios about near misses tends to stir up the emotional flare-ups of past accidents and the uncomfortable memories that accompany them. Of course, there is also the blame game. Often the person raising an issue ends up among those blamed for the cause, or gets “volunteered” to fix it. For many overworked members of an organization, it seems best not to report anything!
Second, there is more to learning from near misses than reporting them. By emphasizing the importance of near misses, we don’t just want to encourage reporting of every near miss that takes place in the organization. That information also has to make sense from an operational and occupational-safety perspective.
Third, how to learn from near-miss reports, and how many resources to allocate to them, is debatable. Properly interpreting the incidents and incorporating the conclusions into risk management practices can be complicated. “Digesting” information is the ultimate purpose of almost every business process, but each business has to find its own way of processing it.
Fourth, by reporting too many minor near misses, we risk devaluing the information. Processing the real message becomes harder as we “overload” the system. With too much noise, the real message gets lost. Ironically, we create another type of “loss of awareness” of what causes accidents. But this time, instead of the operator losing awareness, the entire system loses it in the details.
Remember the boy from the folktale, screaming “Wolf!” to fool the villagers and amuse himself? When the real wolf attacked, nobody took his cries for help seriously, because the people had grown immune to his many annoying pranks. In his blog post “Public warnings--why crying wolf is downright bad” [iii], Gerald Baron writes about the potential harm of over-reporting. He notes how the public in Italy had grown weary of the “crying wolves” who over-predicted earthquake risks. A spokesperson is quoted as saying: “If the risk is between zero and 40%, today they will tell us it’s 40, even if they think it is closer to zero. They’re protecting themselves, which is perfectly understandable.” In other words, “crying wolf” becomes merely a way for authorities to protect their jobs rather than people’s welfare. The risk comes when there is a real 40% possibility of an earthquake. If the public translates the message as one with close-to-zero likelihood, based on the previous false alarms, large numbers of people will be caught off guard by the actual event while in their state of denial.
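To make this discounting effect concrete, here is a rough back-of-the-envelope sketch; the numbers are our own illustration, not from Baron’s post. Suppose the public judges each new warning only by the track record of past warnings:

$$\hat{P}(\text{event} \mid \text{warning}) \approx \frac{\text{past warnings followed by an actual event}}{\text{all past warnings}} = \frac{1}{20} = 5\%$$

If only one of the last twenty warnings was followed by an actual event, the learned credibility of any new warning is about 5%, no matter what probability the authority announces. Repeated false alarms push the perceived risk toward zero, which is exactly the state of denial described above.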
So if it is so difficult to learn from near misses, maybe we are better off forgetting about them. Let’s look at this option:
What would ignoring near misses do to the organization’s culture, behavior, and learning? Could it be that by ignoring near misses we create a breeding ground for desensitization to the causes of accidents? Maybe too little attention to near misses leads to a “loss of awareness” by the entire organization?
Back to the “crying wolf” story: the villagers’ strategy was to “not” learn from possible false alarms, because the alarms were not providing any useful information. Interestingly, what the story doesn’t include (at least in the versions we could get our hands on) is the villagers asking any questions about details that might have suggested whether the boy was telling the truth, in any of the episodes. There was no investigation of the near miss, the boy, or the wolf!
So was the fault with the boy or with the villagers? While the wolf was clearly harmful, the boy was actually helpful, admittedly only some of the time! So maybe we need to give the “boy who cried wolf” a break and consider the downsides of not listening to him, even if his message was sometimes incorrect.
These days, the problem with handling near misses might lie not in “too much reporting” but in having static, permanent, bias-prone “processing centers” responsible for delivering solutions and making decisions. These processing centers (a safety committee, an activist group, a flight crew, the members of a household, etc.) might instead rotate their leadership and authority to avoid biased decisions. Every member of the organization should feel responsible for the safety of others.
A healthy combination of active safety-related data collection and intelligent filtering of the resulting inflow may create an adaptive yet sensitive mechanism for responding to early warning signs in any organization concerned about safety.
Let’s go back to our “crying wolf” example one more time, and try to retell the story from the “near miss” perspective. Imagine that the villagers had verified the boy’s first call for help, found out where the wolf came from, and reinforced the young shepherd with a guard dog and all of that canine’s keen senses (!). In a modern version of the tale, there would even be wolf-sensing electronic devices placed around the area!
What we are trying to say is that every concerned organization must establish a system that is sensitive to warnings and able to distinguish false alerts from legitimate ones. The organizational structure must foster responsible reporting of near misses by showing the benefits of the process, for example by displaying days without an accident and/or the monetary savings that materialized because near-miss reporting prevented accidents. We think that communicating lessons learned “the hard way” (the powerful “Remember Charlie?” safety video, for example) delivers a clear message about the unparalleled value of accident prevention and near-miss reporting.
This blog suggests a reorientation: events like the one on New York’s Metro-North Railroad have almost always put the focus on what DID happen. Case in point: a few dozen NTSB professionals are looking into the “cause” of the accident as we speak. But maybe the event should also make us think about the times when it did NOT happen, when we had the near miss.
Oftentimes there is a fine line between a near miss and the “real miss”, between what DID and did NOT happen. Oftentimes, that fine line also separates passengers saved from passengers lost. More thorough collection, analysis, and reporting of near misses is necessary. After all, if we really want to save lives, it is too late to start looking for clues after the derailment.
Notes: