20th European Annual Conference on Human Decision Making and Manual Control
DECISION-MAKING MODELS AND CLASSIFICATIONS OF HUMANS DURING CRITICAL EVENTS

Machteld VAN DER VLUGT
Peter A. WIERINGA

Delft University of Technology, Department of Design & Engineering,
Man-Machine Systems
Mekelweg 2, 2628 CD Delft, The Netherlands
m.vandervlugt@wbmt.tudelft.nl
ABSTRACT
It is impossible to create one-hundred-per-cent safe organisations; in the end, critical events are unavoidable. Since one cannot avoid critical events, it seems useful to support humans when they face complex decisions during such events. It is assumed that the existing approaches do not cover decision-making during critical events, because such decision-making appears unpredictable. Complementary knowledge about decision-making in critical circumstances is therefore necessary. A review is given of some of the most widely used models of behaviour during critical events. The impression is that the discussed approaches aim at extending the range of expectable circumstances, based on examined failures, near failures and accidents. The information processing and adaptive behaviour approaches point at developing better user-interfaces, expert, decision support and alarm systems, whereas the organisational approach aims at all layers of the organisation, including management. Implicitly, the discussed approaches search for causes that provoked inaccurate human decisions. However, looking for causes does not seem to be the solution for modelling human decision-making during critical events, because causes do not unambiguously explain why someone chooses a specific sequence of actions. Our proposal is to look for the leading motive that governs decision-making during critical events.
KEYWORDS: Behaviour models, critical situations, decision-making
CRITICAL EVENTS ARE A FACT OF LIFE

Professionals working in complex dynamical environments, such as nuclear power plants, aeroplane cockpits and operating theatres, regularly face complex decision-making. Critical events can lead to decision-making that does not agree with the organisational expectations, established as a norm by procedures and protocols. Under these circumstances, decisions and actions seem poorly directed at realising the professional goals derived from the norm. For example, a driver in a skid reacts impulsively by braking, whereas the correct reaction is to remove the foot from the accelerator and declutch the transmission in an effort to correct the skid.

However, history teaches that preparation can help to limit the incidence of adverse outcomes, but in the end it cannot avert critical events. It is unavoidable that humans sometimes, from the organisation's point of view, behave unlike the organisational expectancy.

Existing behaviour models do not cover decision-making during critical events, which are seen as unpredictable. Complementary knowledge is therefore necessary to develop a model that covers behaviour during critical events. The underlying assumption is that during critical circumstances a severe change of behaviour occurs. The main question is: how does it come about that professionals act seemingly unlike the organisational norm?

Present strategies that cope with modelling complex decision-making aim at expanding the expectable circumstances in which safety can be assured. They include better preparation of professionals by training and learning procedures, and developing better instruments such as alarms, decision support systems and ergonomic user-interfaces based on behaviour models.
The assumption for this research is that human decision-making cannot be wrong. Each decision is taken with the best intent, trying to fulfil the organisational expectations, whether it seemingly deviates from the organisational norm or not. The decisions taken at critical moments can result in adverse outcomes, but that is an essentially different discussion. The main objective is to search for a possibility to support professionals when they run into such circumstances. The first step is to find a functional relation between unexpected behaviour and critical events.

A review of the existing behaviour studies with accompanying models is given, to look for links regarding behaviour during crises. It is, however, not a complete survey. According to Ashby (1956), models are formally mappings of properties of one set to another set, and typically the mapping is many-to-one. This implies there are as many models as purposes. This paper globally distinguishes three approaches:

• the in-homeostatic information processing approach
• the adaptive behaviour approach
• the organisational approach

The information processing and adaptive behaviour approaches focus mainly on individual behaviour. The organisational approach concentrates on improving the conditions under which humans work, which indirectly influence individual behaviour.

This brief review focuses on the differences between the existing approaches and the possibility of expanding the accompanying models with a description that covers behaviour during critical events.

INFORMATION PROCESSING APPROACH

Classifying actions as failures identifies an appearance; it does not explain the decision taken by someone. Examples of such appearances are fixation errors (Keyser and Woods, 1990; Xiao and Mackenzie, 1995; Cook and Woods, 1994), cognitive lockup, errors of omission and errors of commission. However, failures can only be defined in relation to deviations from the norm. In the last decade, unexpected appearances resulted in further investigation of how man-machine systems work, including those unexpected appearances.

The starting point for most human behaviour models is the classical Stimulus-Organism-Response (SOR) model, either by elaborating the model or by criticising it. The SOR model formed the basis that underlies many information processing models elaborating the organism part. The information processing models played and still play an important role in reducing the occurrence and impact of critical events. Many models have been developed; the two most widely used will be discussed.

Skill-Rule-Knowledge Based Behaviour Model

Rasmussen developed a basic model of the human information-processing capacities involved in complex control, to provide a basis for the design of human-machine interfaces, among which decision support systems. The model has been useful over the last decade to explain the behaviour of a human operator carrying out complex dynamical tasks.

The SRK model is mainly directed at the more serious errors made by those in supervisory control of industrial installations, in particular during emergencies in dangerous plants. One of the aspects of the classifications in SRK-based behaviour is the role of the information taken in from the environment, which differs between the levels. Rasmussen identified three levels of information processing. The three levels, skill-based, rule-based and knowledge-based (SB, RB and KB), correspond to increasing uncertainty about the environment or task. Figure 1 pictures the information processing stages, starting with input signals and ending with actions.

For skill-based tasks, there is a clear and unambiguous relation between system states and needed responses, and no doubt about the mapping from stimulus to action. Skill-based behaviour typifies actions that take place without conscious control, as smooth, automated and integrated patterns of behaviour.

Rule-based tasks are characterised by a set of suitable actions governed by clear procedures. Once an operator has correctly recognised the situation, the choice of actions (behaviour) is deterministic, following a set of if-then rules.

Knowledge-based tasks are characterised by uncertainty, the need to develop novel solutions, and delayed or limited feedback. There is no know-how or set of rules to control behaviour. This demands that operators perform complex interpretations and decision-making (Wickens and Huey, 1993).
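The three levels can be read as a dispatch scheme: a familiar sensory pattern triggers an automated response, a recognised situation triggers a stored if-then rule, and anything else falls through to effortful reasoning. A minimal sketch; the situations, patterns and stored rules are invented for illustration and are not part of Rasmussen's model:

```python
# Illustrative sketch of SRK-style dispatch; the concrete patterns and
# rules are invented for the example, not taken from Rasmussen (1986).

# Skill-based: automatic stimulus-to-action mappings (no conscious control).
SKILL_PATTERNS = {"lane_drift": "steer_correction"}

# Rule-based: stored if-then rules for recognised situations.
STORED_RULES = {"engine_overheat": "reduce_load_and_alert"}

def knowledge_based(situation):
    # Knowledge-based: no stored rule applies; a placeholder for
    # deliberate interpretation and problem solving.
    return f"diagnose({situation})"

def respond(situation):
    if situation in SKILL_PATTERNS:          # SB: familiar sensory pattern
        return SKILL_PATTERNS[situation]
    if situation in STORED_RULES:            # RB: recognised task state
        return STORED_RULES[situation]
    return knowledge_based(situation)        # KB: novel, uncertain situation

print(respond("lane_drift"))        # skill-based response
print(respond("engine_overheat"))   # rule-based response
print(respond("smoke_in_cabin"))    # falls through to knowledge-based
```

The point of the sketch is only the ordering: the costlier level is consulted only when the cheaper one has no answer.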
Figure 1: Skill-, rule- and knowledge-based behaviour diagram (adapted from Rasmussen, 1986).

The SRK model couples increasing unfamiliarity to the accompanying necessary processes: the more unfamiliar the task, the more intensive the processing stages. The SRK model does not explain what makes the operator choose decision A or decision B. Poorly directed behaviour during critical events could occur because of unfamiliarity, but it is not necessarily knowledge-based. The stepladder model of Rasmussen (1986) is a further sophistication of the SRK model.

Generic Error Modelling System

Reason's Generic Error Modelling System (GEMS) is a layout to locate human error types (Reason, 1996). The error types are distinguished according to SB, RB and KB performance. Error types refer to the presumed origin of an error within the stages involved in forming and carrying out an action sequence.

Besides error types, Reason distinguishes error forms. According to Reason, error forms originate mainly from two retrieval mechanisms: similarity-matching and frequency-gambling. Both come into increasing prominence when cognitive procedures are insufficiently specified.

Reason assigns slips and lapses to SB behaviour, and mistakes to RB and KB behaviour (Figure 2). The key feature of GEMS is the assertion that, when confronted with a problem, human beings are strongly biased to search for and find a prepackaged solution at the RB level before resorting to the far more effortful KB level.

Figure 2: Outline of the Generic Error Modelling System (adapted from Reason, 1990).

The main difference between GEMS and the SRK model lies in its effort to present error mechanisms acting at all three levels of performance. GEMS focuses on improving safety devices against single failures, both human and mechanical. However, Reason continues, there are no guaranteed technological defences against the insidious build-up of latent failures within the organisation and management. He argues that cognitive psychology can tell something about an individual's potential for errors, but it has little to say about how these individual tendencies interact within complex groupings of people working in high-risk systems (Reason, 1990). In later work Reason concentrates on the conditions under which individuals work, by improving defences, barriers and safeguards within the organisation (called the system approach (Reason, 2000)).
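Similarity-matching and frequency-gambling can be sketched as a ranked retrieval over stored rules: candidates are scored by how well their conditions match the observed cues, and ties are broken in favour of the most frequently used rule. The rules, cue sets and usage counts below are invented for illustration and are not from Reason's work:

```python
# Illustrative sketch of similarity-matching and frequency-gambling
# as retrieval mechanisms; the rules, cues and counts are invented.

RULES = [
    # (name, condition cues, past usage count)
    ("pump_failure", {"low_flow", "high_temp"},              12),
    ("sensor_fault", {"low_flow"},                            3),
    ("coolant_leak", {"low_flow", "high_temp", "low_level"},  5),
]

def retrieve(observed_cues):
    def score(rule):
        name, cues, frequency = rule
        # Similarity-matching: fraction of the rule's conditions observed.
        similarity = len(cues & observed_cues) / len(cues)
        # Frequency-gambling: frequency only breaks similarity ties.
        return (similarity, frequency)
    return max(RULES, key=score)[0]

# pump_failure and sensor_fault both fully match their own conditions;
# frequency-gambling picks the more commonly used pump_failure.
print(retrieve({"low_flow", "high_temp"}))
```

With underspecified cues the gamble dominates, which is exactly why these mechanisms become prominent when cognitive procedures are insufficiently specified.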
ADAPTIVE BEHAVIOUR APPROACH

The followers of the cognitive engineering viewpoint (among whom Hollnagel and Woods (1987), and Bainbridge (1992)) and of action theory (Rauterberg, 1999) criticise the lack of human anticipation and adaptation through the continuously present interaction with the environment. The influence of the environment was obvious from different studies, but it was difficult to include this environment in the sequential models except as events or stimuli. The information processing models are homeostatic models without anticipation, understanding or learning (Rauterberg, 1999). As Bainbridge (1997) pointed out, expert behaviour is often directed at ensuring acceptable future states, rather than at correcting present unacceptable conditions.

Contextual Control Model

Hollnagel developed the Contextual Control Model (COCOM) to simulate human-like behaviour in choosing the next action. According to Hollnagel (2001), the context, as well as the inherent traits and cognitive mechanisms, decides the choice of action. A model of cognition must therefore account for how cognition depends on the context (Hollnagel, 1997). The extent of control plays an important role in predicting correctly (if possible) the next action: the better the control, the more predictable the next action. COCOM describes how the extent of control depends on the context, where the context is considered to be a combination of

• the situation understanding (knowledge and assumptions), and
• the expectations of how the situation will develop (the means and plans that are and will be available).

In short, the extent of understanding and the expectations decide the choice of the next action. The performance (expressed in the extent of control) is a mixture of feedback (error-controlled) and feedforward (cause-controlled) actions. To incorporate COCOM in a simulation program, the FAME operator component was developed (Hoffmann, 1998).
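The mixture of feedback and feedforward action can be illustrated with a small sketch. The linear weighting and the `control` parameter are our own simplification for illustration, not part of COCOM:

```python
# Illustrative sketch: next action as a mixture of feedback
# (error-controlled) and feedforward (cause-controlled) terms.
# The weighting scheme is an assumption made for this example.

def next_action(setpoint, observed, predicted, control):
    """control in [0, 1]: 0 = purely reactive, 1 = fully anticipatory."""
    feedback = setpoint - observed      # correct the error already present
    feedforward = setpoint - predicted  # counter the expected development
    # The better the extent of control, the more the choice of action
    # rests on expectations of how the situation will develop.
    return (1 - control) * feedback + control * feedforward

# A purely reactive operator corrects only the current deviation:
print(next_action(setpoint=100, observed=90, predicted=80, control=0.0))  # 10.0
# An operator in better control also acts on where the process is heading:
print(next_action(setpoint=100, observed=90, predicted=80, control=0.5))  # 15.0
```

The sketch captures only the qualitative claim of the text: as control improves, expectations weigh more heavily than the currently observed error.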
Figure 3 shows the operator model, including:

• the choice of action (action selection module) based on the current understanding (formed by the knowledge module and the competence module), and
• the event feedback (evaluation module).

Figure 3: Operator component based on the Contextual Control Model (from Hoffmann, 1998).

The control mode module represents the extent of control and is, for practical use, expressed by four control modes:

• scrambled, where the event horizon is confined to the present and there is no consideration of preceding events or prediction of future events;
• opportunistic, where an action is chosen to match the current context, with minimal consideration given to long-term effects;
• tactical, where actions are governed by longer-term considerations and extensive feedback evaluation;
• strategic, where the person is fully aware of what is happening and is deliberately making plans.

The control mode is a theoretical parameter to describe the extent of control: that is, the manner of choosing the next action and evaluating the outcome. In COCOM, Hollnagel (1993, 1997) uses two parameters (number of goals and subjectively available time) to decide the extent of control.

Bainbridge's Model of Cognitive Processing

Bainbridge argues that prediction, goal setting and planning must come before the information processing which identifies the mismatch between current and necessary states (Bainbridge, 1997). Also, choosing an action may involve evaluating alternative actions. Extra information about the circumstance or the action may be necessary to evaluate alternatives, and this cannot be described in a sequential model like the information processing models. The model of Bainbridge (1997) focuses on:

• the overview, that is, the temporary composition of inference that describes the current task condition, and
• how this overview provides the context for later processing and for an effective organisation of behaviour.
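Bainbridge's overview-driven processing can be caricatured as a loop in which the current overview selects the next cognitive activity, and the result of that activity updates the overview. The activities and the update logic below are invented for illustration; they are not Bainbridge's own formalisation:

```python
# Illustrative sketch: the sequence of cognitive goals is driven by the
# overview (the temporary structure of inference), not by a fixed order
# of processing stages. Activities and updates are invented.

def choose_activity(overview):
    if not overview.get("understood"):
        return "infer_present_state"     # develop understanding first
    if not overview.get("plan"):
        return "predict_and_plan"        # then predict and set goals
    return "execute_working_method"      # then act (possibly simulated)

def perform(activity, overview):
    # Each working method's result feeds back into the overview.
    if activity == "infer_present_state":
        overview["understood"] = True
    elif activity == "predict_and_plan":
        overview["plan"] = "corrective_actions"
    return overview

overview = {}                            # no understanding, no plan yet
for _ in range(3):
    activity = choose_activity(overview)
    print(activity)
    overview = perform(activity, overview)
# prints: infer_present_state, predict_and_plan, execute_working_method
```

The essential feature is the cycle: understanding decides what to do next, and doing it produces new understanding.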
When performing a complex task, people do not react directly to external stimuli. They build up a temporary structure of inference (overview) which represents their understanding of the present and future situation, and of what they should do. This is done within the context of:

• the task-relevant knowledge and goals,
• the results of previous thinking, and
• the information from the environment, processed for its task relevance.

Building up these inferences is mainly done using cognitive working methods related to cognitive goals. Cognitive goals are concerned with developing a person's understanding of the situation and their plan of what to do. In this way the control of the action is determined by the sequence of cognitive goals rather than by the inherent organisation of the actions. The sequence of cognitive goals is in turn determined by the context: the environment and the previous development.

In short, understanding decides what to do next, which in turn determines new understanding. Figure 4 shows a representation of this cycle, with the knowledge base and the active and reactive relations with the environment.

Figure 4: Cycle of contextual processing (from Bainbridge, 1997).

According to Bainbridge (1997), most identified errors and difficulties concern a failure to form a general understanding of the task: each subtask is dealt with by itself and is not considered within the context of the task as a whole. What role the environment (or, more specifically, the values and norms of the organisation) plays in forming this understanding remains open.

ORGANISATIONAL APPROACH

Opposite the individual approach stands the system, or organisational, approach. The starting point for the organisational approach is that accidents never come singly. The individual models are capable of minimising single failures, but they are not able to cope with multiple failures hidden in all layers of the organisation. The fundamental difference from the individual approach is that individuals are seen as a given cog in the artificial machine. The artificial machine can be optimised, so that the effects of individual acts are avoided or mitigated.

According to Cook and Woods (Cook et al., 1998), a system consists of a sharp end and a blunt end (Figure 5). The sharp end is where the practitioners interact directly with the process, in their roles as pilots, surgeons or power plant operators. The blunt end consists of, for example, policy makers, technology suppliers and regulators; they shape the environment in which practitioners work.

Figure 5: The sharp and blunt ends of a system (from Woods et al., 1994).
The central idea is the system defences (Figure 6). All risky organisations hold defences, barriers and safeguards. An adverse event occurs when all defences are penetrated successively by a chain of failures: an accident trajectory. The organisational approach concentrates on improving the organisation, or system, in which individuals act.
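The accident-trajectory idea can be sketched as a chain of layered defences, each with its own holes: an adverse event requires a combination of failures that passes through every layer in succession. The layers and failure conditions below are invented for illustration:

```python
# Illustrative sketch of an accident trajectory through layered defences
# (after Reason, 1990); the layers and their holes are invented.

# Each defence layer is modelled as the set of failure conditions
# ("holes") that it does NOT stop.
DEFENCES = [
    {"procedure_skipped"},                    # procedural barrier
    {"alarm_ignored", "procedure_skipped"},   # alarm system
    {"supervisor_absent"},                    # supervision
]

def adverse_event(failures):
    # The trajectory reaches the process only if some failure condition
    # slips through every single layer.
    return all(layer & failures for layer in DEFENCES)

print(adverse_event({"procedure_skipped"}))                       # False
print(adverse_event({"procedure_skipped", "supervisor_absent"}))  # True
```

A single hole is harmless; it is the alignment of holes across all layers that lets the trajectory through, which is why the approach targets latent weaknesses in every layer rather than single acts.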
Figure 6: The system defences and their weaknesses (modified from Reason, 1990).

Holes in the defences arise for two reasons (Reason, 1990): active failures and latent conditions. Active failures are unsafe acts committed by people at the sharp end; they have a direct impact. Latent failures, or latent conditions, originate from decisions taken at the blunt end. They are removed in both time and space from the acts at the sharp end. The latent conditions shape the environment a practitioner acts in.

When an adverse event occurs, the central question in the organisational approach is not who blundered, but how and why the system defences failed. The aim is to identify and repair latent conditions before an adverse event occurs, because the specific forms of active failures are often hard to foresee, according to Reason (2000). The organisational approach covers all layers in the organisation, including managers, policy decision-makers, designers and suppliers, whereas the individual behaviour models mainly concern designers of human-machine systems.

The organisational approach is not focused on human behaviour in particular, but on the constraints created by the organisation, which shape the possibilities for the practitioners to act (Cook and Woods, 1994). These conditions indirectly influence human behaviour.

CONCLUSIONS AND DISCUSSION

Looking back at the discussed approaches, the conclusion is that present strategies aim at improving the conditions under which individuals work. Rasmussen (1986) sketched the influences in a man-machine system (Figure 7). According to this diagram, the direct information, data and orders, as well as the criteria and preferences formed by the organisational policy, define the context for decision-making, which forms a part of the mental information processing. All functions presented in Figure 7 are normally used by humans in various professional contexts. Nevertheless, the idea is that human action is based on a top-down prediction drawn from opinions or intents, motives and preferences (Rasmussen, 1986); causal bottom-up arguments play a derived role. Important information for modelling behaviour during critical events is therefore knowledge of the values and norms of the work environment. Reason (2000) and Cook and Woods (Cook et al., 1998) mentioned this in similar terms. Yet completing this view into a human behaviour model still seems an open task.

The information processing and adaptive behaviour approaches aim at the direct environment of the individual operator by improving human-machine systems, such as user-interfaces, expert, alarm and decision support systems. The behaviour models, either the in-homeostatic information processing models or the adaptive contextual models, serve as a starting point. Detected deviations from this modelled behaviour guide the search for causes and improvements.

The organisational approach aims at improving the broader organisation in which individuals act, to avoid and mitigate the effect of active failures. It distracts attention from the individuals at the sharp end and seeks improvements by changing the organisation. The possible threats in the organisation are identified and repaired, indirectly changing the attitude and the values of the organisation. Looking at Figure 7, all the areas bearing on decision-making are covered by these approaches, except the relation between values (top left) and criteria: the subjective value formation (rectangle at the top).
Reported failures, near failures and critical incidents form a useful source for investigations of human performance. Implicitly, the discussed approaches look for causes that provoked the inaccurate human action, in order to find solutions that limit the number of critical events. However, it is often difficult, if not impossible, to pinpoint the causes of inaccurate human decisions.

The relations between inaccurate human actions and their causes are simply too complex to allow easy causal reasoning. Besides, the starting point is that one cannot avert critical events: taking a possible cause away does not imply that the incident will never surface again.

Looking for the causes of behaviour deviations therefore does not seem a solution for modelling behaviour during critical events. The challenge faced when modelling human decision-making is to discover how people decide when facing complex circumstances, which may occur during critical events.

In the coming two years of research, the complementary knowledge for modelling this complex behavioural appearance will be sought by looking for the constant parameter that directs decision-making, instead of looking for the variables. A theory will be developed based on the notion of a norm, which will be used as a set-point and reference for the human operational behaviour model.

The norm correlates with a coordinating social order (a social perspective) that prescribes how one should and could act. The assumption is that this order is the leading motive according to which professionals, and indeed anyone, act under any circumstance. The functional relation between the values and norms of the organisation and their influence on behaviour is necessary to get a grip on how this leading motive manifests itself in a change of behaviour when humans face critical events.

Figure 7: Diagram sketching the complex influences in a man-machine system (adapted from Rasmussen, 1986).

REFERENCES

Ashby, W.R. (1956). An Introduction to Cybernetics. (London: Chapman & Hall), pp. 1-295.

Bainbridge, L. (1992). Difficulties and Errors in Complex Dynamic Tasks.

Bainbridge, L. (1997). The Change in Concepts Needed to Account for Human Behavior in Complex Dynamic Tasks. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans 27, 351-359.

Cook, R.I. and Woods, D.D. (1994). Operating at the Sharp End: the Complexity of Human Error. In Human Error in Medicine, S.M. Bogner, ed. (Lawrence Erlbaum Associates), pp. 255-310.

Cook, R.I., Woods, D.D., and Miller, C. (1998). A Tale of Two Stories: Contrasting Views of Patient Safety. Chicago, Illinois: National Patient Safety Foundation.

Hoffmann, M.I. (1998). FAME Operator Component based on the Contextual Model. Master's thesis, Department of Man-Machine Systems, Delft University of Technology, The Netherlands.

Hollnagel, E. (1997). Context, Cognition, and Control. In Co-operation in Process Management - Cognition and Information Technology, Y. Waern, ed. (London: Taylor & Francis).

Hollnagel, E. (in press). Cognition as Control: A Pragmatic Approach to the Modelling of Joint Cognitive Systems.

Keyser, V. de and Woods, D.D. (1990). Fixation Errors: Failures to Revise Situation Assessment in Dynamic and Risky Systems. In Systems Reliability Assessment, A.G. Colombo and A.S. de Bustamante, eds. (Dordrecht: Kluwer Academic Publishers), pp. 231-252.

Rasmussen, J. (1986). Information Processing and Human Machine Interaction; an Approach to Cognitive Engineering. (New York: Elsevier Science Publishers B.V.).

Rauterberg, M. (1999). Activity and Perception: an Action Theoretical Approach. Systematica 14, 1-6.

Reason, J. (1990). Human Error. (New York: Cambridge University Press).

Reason, J. (1996). A systems approach to organizational error. Ergonomics 39, 1708-1721.

Reason, J. (2000). Human error: models and management. British Medical Journal 320, 768-770.

Wickens, C.D. and Huey, B.M. (1993). Workload Factors. In Workload Transition: Implications for Individual and Team Performance, C.D. Wickens and B.M. Huey, eds. (Washington, D.C.: National Academy Press), pp. 1-284.

Woods, D.D. and Hollnagel, E. (1987). Mapping cognitive demands in complex problem solving worlds. Int. J. Man-Machine Studies 26, 257-275.

Woods, D.D., Johannesen, L.J., Cook, R.I. and Sarter, N.B. (1994). Behind Human Error: Cognitive Systems, Computers, and Hindsight. (Columbus: CSERIAC).

Xiao, Y. and Mackenzie, C.F. (1995). Decision Making in Dynamic Environments: Fixation Errors and Their Causes. Proc. Human Factors and Ergonomics Society 39th Annual Meeting, pp. 469-473.