02 Human Error
To Err Is Human
[Figure: organizational pyramid (mission, goals, policies, processes, programs) with latent organizational weaknesses, error precursors, and an initiating action leading to an event.]
Myths of Human Error
[Figure: unintentional errors of commission (add a little extra grease; add the wrong grease) versus errors of omission (don't lubricate the bearing; forget to lubricate the bearing).]
Mechanisms of Thought
There are three basic mechanisms of
thought (Rasmussen):
1. Skill-based
2. Rule-based
3. Knowledge-based
These mechanisms span the range from
unconscious to conscious thought
processes.
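The fallback order between the three levels (conscious processing engages only when the level below it fails) can be sketched as follows. This is an illustration only; the function names and example situations are invented, not from the source.

```python
# Illustrative sketch (names invented): the three mechanisms of thought
# as a fallback hierarchy, from unconscious skill to conscious analysis.

def skill_based(situation):
    """Automatic, well-practiced response; works only for routine work."""
    return "automatic action" if situation == "routine" else None

def rule_based(situation):
    """Match the situation against stored 'if X then Y' rules."""
    rules = {"familiar problem": "apply known fix"}
    return rules.get(situation)

def knowledge_based(situation):
    """Conscious, effortful analysis of a truly novel situation."""
    return f"reason from first principles about {situation!r}"

def respond(situation):
    # Prefer the cheapest (least conscious) mechanism that produces an action.
    for mechanism in (skill_based, rule_based, knowledge_based):
        action = mechanism(situation)
        if action is not None:
            return action

print(respond("routine"))           # automatic action
print(respond("familiar problem"))  # apply known fix
print(respond("novel failure"))
```

The ordering reflects the point made later in the slides: we prefer skill- and rule-based solutions because they require less cognitive effort, and resort to knowledge-based processing only when the lower levels fail.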
Mechanisms of Thought
Skill-Based
Skill is the ability to carry out a task;
Skill-based cognitive processing and
performance refers to actions that are automatic
and easy due to an acquired skill.
They usually happen quickly and without
express effort on the part of the actor.
These are unconscious actions that we don't
need to explicitly "think about" in order to
accomplish; any decisions involved are usually
automatic as well.
Skill-Based
Most training is concerned with skill
development, the end goal being the
development of an automatic process.
Typically the actor needs to understand how to
execute a set of instructions, but not understand
the reasons behind them. Through training, the
actor will become proficient enough--skilled
enough--to perform the actions without the need
of instructions.
Rule-based
Rule-based processing involves
matching the context and problem
currently facing the actor against a
stored set of rules. These rules are
typically of the "if X then Y" form, and can
be based on past experience, explicit
instructions, and so forth.
Rule-based
Rule-based processing comes into play
when an automatic skill fails and the
actor needs to fall back upon a set of
explicit instructions or rules at his
disposal. The actor examines and
interprets the current situation, and
chooses a rule that can best solve the
problem.
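The "examine the situation, then choose a rule of the form 'if X then Y'" pattern can be sketched in code. This is a minimal illustration, not from the source: the rule contents, the `alarm` key, and the situation dictionaries are all invented.

```python
# A minimal sketch of rule-based processing: "if X then Y" rules
# matched against the current situation. Rule contents are invented.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Rule:
    condition: Callable[[dict], bool]  # "if X" — does this rule apply?
    action: str                        # "then Y" — what to do

rules = [
    Rule(lambda s: s.get("alarm") == "overheat", "shut down the pump"),
    Rule(lambda s: s.get("alarm") == "low oil", "lubricate the bearing"),
]

def rule_based_response(situation: dict) -> Optional[str]:
    """Examine the situation and choose the first rule that matches."""
    for rule in rules:
        if rule.condition(situation):
            return rule.action
    return None  # no rule fits: fall back to knowledge-based processing

print(rule_based_response({"alarm": "low oil"}))  # lubricate the bearing
```

Returning `None` when no rule matches mirrors the next slide's point: when rule-based processing fails, the actor falls back on knowledge-based processing.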
Knowledge-based
If rules-based processing doesn't solve the
problem, we fall back on knowledge-based
processing (we tend to prefer rules-based
solutions since they require less cognitive effort
on our part).
This is what happens when we are truly faced
with novel or unfamiliar situations, or where low-
level rules aren't appropriate (e.g. making
strategic decisions, or establishing a medical
diagnosis).
In general, this kind of processing involves the
processing of symbolic information.
Knowledge-based
As with rule-based processing, knowledge-
based processing is a conscious process. It
refers to what we typically think of as "analytic
thought," the process and analysis of
personal subjective knowledge.
Where skill is the ability to carry out a task,
knowledge is the possession of "information,
facts, and understanding" about a task.
(you may know a lot about a task but still not
be able to carry it out.)
Factors affecting thoughts and actions:
- the actual data you get from external sources
- a person's subjective values, ethics, attitude
- the social climate
Action
Error of Planning (Mistake)
Error of Execution
Different approaches to human error
The problem of human error can be
viewed in two ways:
1. The person approach
2. The system approach
Each has its model of error causation,
and each model gives rise to different
philosophies of error management.
Person Approach versus System Approach

Person approach:
- Focus on individuals
- Blaming individuals
- Methods: poster campaigns, writing another procedure, disciplinary measures, threat of litigation, retraining, blaming and shaming

System approach:
- Focus on the conditions under which individuals work (rules, expectations, work metrics, communication, etc.)
- Building defenses to avert errors/poor productivity or mitigate their effects
- Methods: creating better systems
Person approach, basis
The long-standing and widespread
tradition of the person approach focuses on
the unsafe acts (errors and procedural
violations) of people on the front line.
Person approach, philosophy
This approach views these unsafe acts
as arising primarily from aberrant
mental processes such as
forgetfulness, inattention, poor
motivation, carelessness, negligence,
and recklessness.
People are viewed as free agents
capable of choosing between safe and
unsafe modes of behavior.
If something goes wrong, a person or
group must be responsible.
Person approach: countermeasures to
errors
The associated countermeasures are
directed mainly at reducing unwanted
variability in human behavior.
Posters that appeal to people's fear,
disciplinary measures, threat of litigation,
retraining, naming, blaming, and
shaming.
Followers of these approaches tend to
treat errors as moral issues, assuming
that bad things happen to bad people
(what has been called the "just world"
hypothesis).
Person approach, why?
Blaming individuals is emotionally more
satisfying than targeting institutions.
Uncoupling a person's unsafe acts
from any institutional responsibility is in
the interests of managers.
The person approach is also legally more
convenient.
Person approach:
shortcomings
Although some unsafe acts in any
sphere are egregious, most are not. In
aviation maintenance, for example, about 90% of
quality lapses were judged blameless.
Person approach:
shortcomings
Effective risk management depends crucially
on establishing a reporting culture. Without a
detailed analysis of mishaps, incidents, near
misses and free lessons, we have no way of
uncovering recurrent error traps.
The complete absence of such a reporting
culture contributed crucially to the
Chernobyl disaster.
Trust is a key element of a reporting culture,
and this, in turn, requires the existence of a
just culture: one in which it is clear where the
line should be drawn between blameless and
blameworthy actions.
Person approach:
shortcomings
Focusing on the individual origins of error isolates
unsafe acts from their system context.
Two important features of human error tend to be
overlooked:
1. It is often the best people who make the worst
mistakes; error is not the monopoly of an unfortunate
few.
2. Far from being random, mishaps tend to fall into
recurrent patterns. The same set of circumstances can
provoke similar errors, regardless of the people
involved.
The pursuit of greater safety is seriously
impeded by an approach that does not seek out
and remove the error-provoking properties within
the system.
Blame and punishment
Anticipation of blame promotes cover up
Fear of criticism in close calls and near
misses precludes rational analysis of
possible injury precursor mechanisms,
and thus the opportunity for constructive
accident prevention
Quit Complaining
Your Job Could Be Worse
System approach
Humans are fallible and errors are to be
expected, even in the best organizations
Errors are seen as consequences rather
than causes, having their origins not so
much in the perversity of human nature as
in upstream systemic factors.
Work Systems Theory
[Figure: work-system elements (tasks, technology, person, environment, organization, individual characteristics) and their potential misfit, leading to outcomes such as errors, performance, quality of care, satisfaction, and injury/illness.]
System approach: countermeasures to
errors
Although we cannot change the human
condition, we can change the
conditions under which humans work.
A central idea is that of system
defenses. All hazardous technologies
possess barriers and safeguards. When
an adverse event occurs, the important
issue is not who blundered, but how and
why the defenses failed.
The Swiss cheese model of how defenses,
barriers, and safeguards may be penetrated by
an accident trajectory.
[Figure: slices of Swiss cheese as defensive layers; the holes are weaknesses in the defensive layers.]
The Swiss cheese model of system
accident
Defenses, barriers, and safeguards
occupy a key position in the system
approach.
High technology systems have many
defensive layers: some are engineered,
others rely on people and others
depend on procedures and
administrative controls.
The Swiss cheese model of system
accident
In an ideal world, each defensive layer would be
intact. In reality, they are more like slices of
Swiss cheese, having many holes, although,
unlike in the cheese, these holes are
continually opening, shutting, and shifting
their location.
The presence of holes in any one slice does
not normally cause a bad outcome. Usually
this can happen only when the holes in many
layers momentarily line up to permit a
trajectory of accident opportunity, bringing
hazards into damaging contact with victims.
The Swiss cheese model of system accident
The holes in the defenses arise for two reasons:
1. Active failures
2. Latent conditions
Two Kinds of Error
Active Error
Latent Error
(leading to latent conditions)
Active Error
An error that occurs at the level of the
frontline operator and whose effects are
felt almost immediately
Latent Error
Errors in design, organization, training,
or maintenance that lead to operator
errors and whose effects typically lie
dormant in the system for lengthy periods
of time
Latent Errors
Adverse consequences that lie dormant within
the system for a long time, only becoming evident
when they combine with other factors to breach
the system's defenses.
[Figure: organizational pyramid (mission, goals, policies, processes, programs) with latent organizational weaknesses, error precursors, and an initiating action leading to an event.]
Active failures in the Swiss cheese model
(Leape, 1997)