EFFECT OF CUE TYPE ON SITUATION AWARENESS
by
DEBRA GIPSON JONES, B.S., M.S.I.E.
A DISSERTATION
IN
INDUSTRIAL ENGINEERING
Submitted to the Graduate Faculty
of Texas Tech University in
Partial Fulfillment of
the Requirements for
the Degree of
DOCTOR OF PHILOSOPHY
Approved
Accepted
December, 1996
ACKNOWLEDGMENTS
Completion of this degree would not have been possible without the support of my husband Eric, my daughter Regan, my parents Walter and Susan Gipson, and my in-laws Fred and Janelle Jones. Further moral support and encouragement during this course of study were provided by Dave Kaber, Beverly Wiley, and Terry Wilson.
Contributions to the procedures and analysis of this study were made by Dr. Mica Endsley, Dr. M.M. Ayoub, Dr. Patricia DeLucia, Dr. Jim Smith, and Dr. Jeff Woldstad. This project required the utilization of the FAA facilities in Oklahoma City, as well as participation and assistance from members of the FAA: Dr. Carol Manning, who authorized the use of the facilities and assisted in the development of the cue types; Henry Mogilka, who spent numerous hours developing the scenarios, served as subject matter expert, and assisted in the running of the study; and Faith Amell, who assisted in the development and fine tuning of the scenarios. Further assistance in running the study was given by Armida Rosiles from Texas Tech.
This research was supported by a fellowship from the Air Force Office of Scientific
Research.
TABLE OF CONTENTS

ACKNOWLEDGMENTS ii
ABSTRACT v
LIST OF TABLES vii
LIST OF FIGURES viii

CHAPTER

I. INTRODUCTION 1
    Naturalistic Decision Making 4
    Recognition-Primed Decision Model 4
    Situation Assessment 5
    Decision Cycles 6
    Situation Awareness 7
    Theories 7
    SA and Other Cognitive Constructs 13
    SA and workload 13
    SA and attention 15
    SA and working memory 17
    Summary 20
    Situation Awareness and Mental Models 20
    Theories of Mental Model Construction 22
    Mental models and schema 22
    Rule-based mental models 24
    Similarities of interest between the two theories 27
    The Role of Mental Models in SA 27
    Schema 29
    Conceptual coherency 29
    Schema relatedness 32
    Consistency 33
    Relevancy 37
    Summary 38
    Conclusion 39

II. METHOD 43
    Approach 43
    Experimental Design 44
    Purpose 44
    Variables 44
    Hypothesis 45
    Task 45
    Scenario Development 46
    Misidentification of aircraft type 46
    Flight path error 47
    Communication error 49
    Scenario Design 51
    Subjects 51
    Subject Instructions 54
    Operator Response 55

III. RESULTS AND DISCUSSION 56
    Bizarre cues 56
    Irrelevant cues 57
    Unexpected Cues 58
    Absence of expected cues 59
    Analysis 59
    Hypothesis 1 59
    Hypothesis 2 60
    Discussion of Hypothesis Analysis 60
    Error Category: Misidentification of Aircraft type 60
    Evaluation of Alternate Explanations 60
    Analysis of the Effect of Individual Differences in Error Detection 62
    Workload 62
    Subject Characteristics 62
    SA Error Analysis 63

IV. CONCLUSION 67

REFERENCES 71

APPENDIX
A. SUMMARY OF SUBJECTS' DATA 77
B. CONTINGENCY TABLE ANALYSIS 79
ABSTRACT
Situation Awareness (SA) is a vital element of decision making in dynamic environments. As such, SA errors can impede and degrade decision making performance. One particularly troublesome SA error is the representational error, which occurs when information is interpreted using the wrong mental model (resulting in an inaccurate understanding of the situation) when that information should instead serve as a cue that the wrong mental model is in effect.
This dissertation investigates which characteristics of information are likely to cause a person to adjust a mental model rather than fall prey to a representational error. Since the literature on this issue is sparse, findings from the schema literature were used as a starting point for investigating this question. From the schema literature, two hypotheses were formulated involving the effect of cues on SA: (1) schema bizarre information will impact SA more than schema irrelevant information, and (2) schema unexpected information will impact SA more than the absence of schema expected information. A high fidelity simulation of an air traffic control task was used to test these hypotheses. Certain misinformation was provided to the controllers, and cues to this error were then provided in the form of schema bizarre or schema irrelevant cues, and schema unexpected cues or the absence of schema expected cues. The controllers were expected to ascertain from these cues that the current mental model was not adequate to account for the cue (thus, the cue signified that an error had been committed) and that the error needed correcting. If the significance of the information was comprehended, an overt action was required.
A contingency analysis of operator response showed two things. First, in accord with the hypothesis, schema bizarre cues seemed to impact SA more than schema irrelevant cues. Second, the hypothesis that a difference exists between responses to schema unexpected cues and the absence of schema expected cues was not supported. These results provide an indication of the types of information that affect SA. Enhancing SA by emphasizing the type of information to which an operator is naturally less inclined to respond is one approach to improving system design and thereby performance.
LIST OF TABLES

2.1 Misidentification of aircraft type 48
2.2 Flight path error 50
2.3 Communication error 52
2.4 Scenario A, orders 1 and 2 53
2.5 Scenario B, orders 1 and 2 53
3.1 Summary of responses 56
3.2 Bizarre cue results 57
3.3 Irrelevant cue results 57
3.4 Unexpected cue results 58
3.5 Absence of expected cue results 59
3.6 Summary of responses and workload rating 62
A.1 Summary of subjects' data 78
LIST OF FIGURES

1.1 Neisser's (1976) view of the perceptual cycle 10
1.2 Cue category definitions 41
3.1 SA Error Taxonomy 63
CHAPTER I
INTRODUCTION
Optimal operator performance is the ultimate goal of any system design. One important step in creating an environment conducive to optimal performance is understanding the decision processes the operator is likely to use. Understanding decision processes enables designers to present operators with the type of information that they actually use and to present this information in the most efficient manner. Although many real world decisions involve dynamic situations where the status of individual elements is constantly changing, the majority of the decision literature concentrates on relatively static decisions where the alternatives and consequences are, to some degree, known. Research in this area has centered around the Expected Utility (EU) model and its variants (Pitz & Sachs, 1984). The EU model is a prescriptive model that attempts to explain how people should make decisions when dealing with choices among risky prospects. According to this model, people will make choices that maximize preference (Schoemaker, 1982) or value (Raiffa, 1968). Although this idea sounds simple enough (i.e., that people will make decisions that bring them the most gain), the fact that most empirical evidence does not support the EU model indicates that its attempts to capture this idea have not been successful (Schoemaker, 1982). Consequently, a multitude of variations of the model have been developed in an attempt to account for the numerous situations where the original model's predictions fail (e.g., a version of the multiattribute theory which attempts to incorporate the effects of regret [Bell, 1982], the Information-Integration theory which tries to account for how people combine or integrate diverse pieces of information [Shanteau, 1975], and the use of satisficing to deal with constraints imposed on the decision maker [March, 1977]).
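For reference, the standard formulation (a textbook statement of the model rather than one drawn from the sources cited above) expresses the expected utility of an act $A$ with possible outcomes $x_1, \ldots, x_n$ occurring with probabilities $p_1, \ldots, p_n$ as

$$EU(A) = \sum_{i=1}^{n} p_i \, u(x_i),$$

where $u(\cdot)$ is the decision maker's utility function; the model then prescribes choosing the act with the highest $EU(A)$.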
Despite the numerous models developed under classical decision theory, these models do not satisfactorily describe how people make "real world" decisions. A major problem with classical models is that their assumptions are frequently violated in real world contexts. Some of these frequently violated assumptions include (1) goals can be isolated, (2) utilities can be assessed independent of context, (3) probabilities can be accurately estimated, (4) choices, goals, and evidence are clearly defined, and (5) the utilities of outcomes are independent of other outcomes (Klein & Calderwood, 1991).
In response to the failings of traditional decision theory to account for decision processes in "real world" situations, four schools of thought have developed (Beach & Lipshitz, 1993). The first school suggests that classical decision theory is correct and any behavior that does not follow this theory is irrational. Obviously this philosophy is flawed; if people were indeed irrational decision makers, the species would long ago have died out as a result of numerous irrational decisions. Another group believes that classical decision theory should be modified to account for behavior. This group's philosophy in many ways simply extends the use of EU model variants to explain deviant behavior, and its application to real world settings at this point is still questionable. Still another school recognizes that classical decision theory is not universally applicable, but holds that it is useful for the development of decision aids. This attitude provides a realistic view of the contributions of traditional decision theory. In addition to the formulation of decision aids, decision tree analysis and probabilistic forecasting (both developed from traditional decision theory) have made realistic contributions to society, particularly in management domains (Ulvila & Brown, 1982). Finally, another group believes that classical decision theory is not a good standard by which to judge decision making. The belief that traditional decision analysis fails to account for real world decision making has led to the development of a new field within decision theory. This field, frequently referred to as Naturalistic Decision Making (NDM), has a descriptive philosophy in that it attempts to understand and describe how people actually make real world decisions.
This type of descriptive approach seems a promising way to provide designers with pertinent information regarding the type of information an operator actually needs to make decisions and the most useful format in which to present this information. However, since NDM theory recognizes that different types of decisions are likely to evoke different strategies, numerous models have been developed to explain decision processes in various situations. Understanding the decision models relevant to a particular type of decision and investigating similarities between these models is a useful starting point for gaining insight into decision processes. One type of decision that is particularly relevant for system design is that of an expert in a dynamic situation. Thus, this review examines several NDM theories of expert decision making in dynamic situations, compares the major similarity between these models (i.e., situation awareness), and investigates the components of situation awareness (e.g., mental models, schema) in order to gain insight into factors that might affect this concept.
Naturalistic Decision Making
Although NDM is a relatively young field, numerous models have been developed that fall within this category. These models address issues such as uncertain dynamic environments, action/feedback loops, time stress, and multiple players (Orasanu & Connolly, 1993), any or all of which can realistically affect the decision process. One advancement of this type of decision research over classical decision theory is the recognition that different decision strategies are useful in different situations (Lipshitz, 1993). For example, Beach and Mitchell's (1987) Image Theory focuses on how people make important personal decisions without the use of decision aids, whereas Pennington and Hastie's (1993) Explanation-Based model addresses instances where a person must make a decision based on an intermediate summary representation of information (e.g., a juror). Although NDM models cover a variety of decision types, the models considered here are those that offer insight as to how experts involved in dynamic situations make decisions. Among these types of models are Klein's Recognition-Primed Decision (RPD) model of rapid decision making, Noble's model of situation assessment, and Connolly's model of decision cycles. Each is briefly described below.
Recognition-Primed Decision Model
This decision model, developed by Klein and his associates (Klein, 1993; Klein & Calderwood, 1991; Klein & Crandall, 1992), has two stages. During the first stage, situation assessment, a plausible course of action is generated in response to the need for a decision. Four components are important during this stage: (1) understanding the goals that can be accomplished, (2) increasing the salience of important cues, (3) forming expectations and using them to check the accuracy of the situation assessment, and (4) identifying the typical actions to take (Klein, 1993).
The second stage of the RPD model is mental simulation. During this stage the person consciously enacts a sequence of events and uses this "mental simulation" to evaluate the course of action selected during the first stage. If this course of action is found not to meet the necessary or desired goals, another course of action must be generated and the process repeated (Klein & Crandall, 1992). Thus, a major function of this stage is to "create a hypothetical model or system, and to 'observe' it in action" (Klein & Crandall, 1992, p. 26) in order to detect potential problems or obstacles. In real world decisions, the first stage of this model, situation assessment, is of particular importance since the second stage of the model is sensitive to time pressure and may be shortened or even skipped when time is critical.
Situation Assessment
Noble's Situation Assessment Model involves several processes. First, concrete information on the situation is combined with background information and previous knowledge in order to form an initial representation of the situation. Next, this representation implies certain expectations, which are tested by comparing it with additional information. If the expectations do not match the new information, the representation is refined or rejected. Finally, people may observe that the current situation is similar to a previous situation and that what worked in that situation may work again in the new situation (Lipshitz, 1993). This model emphasizes the reiterative nature of the decision process.
Decision Cycles
Connolly's decision cycles model addresses the issue that decisions made in a dynamic environment cannot be analyzed as isolated instances (Connolly, 1982; Lipshitz, 1993). This model has three domains (the actual world, the decision maker's subjective image of this world, and the decision maker's values) and two cycles (the perceptual and the decisional cycle). Within the decision cycle, Connolly identifies two types of decisions: action-last, or tree-felling, and action-first, or hedge clipping. The first type of decision, tree-felling, exemplifies the type of decision that is made in one 'fell swoop' following a period of deliberate planning. This type of decision making is possible when goals are well defined and a clear way of achieving the goals is available. Hedge clipping decisions, on the other hand, are made incrementally because the goals are unclear and the future is uncertain (i.e., the final shape of the 'hedge' is unknown). With hedge clipping, less effort is required to react to feedback, and this type of decision making is more efficient than forming an elaborate, exhaustive plan, especially since the plan can change. Hedge clipping rather than tree-felling decisions are the norm in dynamic situations (Connolly, 1982). This model also requires a reiterative process in that the situation is re-evaluated after each action.
Situation Awareness
Although the aforementioned theories have promise, the extent to which they have been tested is still somewhat limited. Still, in addition to testing or studying the particulars of the various theories, insight can be gained by examining common themes across the theories. One such recurrent theme is the importance of the person's understanding of the situation to the decision process. This understanding of the situation is generically referred to as situation awareness (SA). Although some form of SA is a key ingredient in several NDM models, SA and decision quality are not synonymous. For example, a person who has temporarily lost SA can still make a good decision, if only by luck. On the other hand, what seems like a good decision based on the situation may turn out to be a poor decision if the situation was misinterpreted (Endsley, 1994). Thus, although SA is an important component of many decision models, it is separate from the decision point (i.e., action selection) (Endsley, 1988a).
Theories
Although the intense study of SA as an essential construct is a fairly recent development (within the last 10 years), SA is not a new phenomenon; the military, for example, has used the idea at least as far back as World War I (Press, 1986, as cited in Endsley, 1988a). The definitions created by the military for SA have ranged from extremely generic ("an assessment of the situation based on the best possible information" [Waddell, 1979, as cited in Stiffler, 1987, p. 1]) to somewhat more descriptive definitions such as "the ability to envision the current and near-term disposition of both friendly and enemy forces" (AMRAAM OUE Tactics Analysis Methodology Briefing, 1983, as cited in Stiffler, 1987). The acknowledgment that SA is the "single most important factor in mission success" (AMRAAM OUE Lessons Learned Briefing, 1983, as cited in Stiffler, 1987; also Endsley, 1988a) has prompted renewed interest in understanding SA and its relation to system design as well as to decision making.
Despite a surge in research into SA, attempts to explain the phenomenon are still quite varied and no single definition enjoys universal acceptance. One comprehensive theory defines SA as "the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future" (Endsley, 1988b, as cited in Endsley, 1988a). This definition breaks SA down into measurable components that can be applied across a variety of domains. The "elements" relevant to SA are task specific, and by simply redefining the relevant elements and the relevant time and space, the definition applies equally well to a nuclear power plant trainee and a highly trained fighter pilot. An important feature of this theory is that it makes a solid distinction between SA and situation assessment: situation assessment can be thought of as a process in which the individual attempts to achieve a particular state of knowledge (i.e., SA) (Endsley, 1995b). SA is a state of knowledge that must be built up over time. Although SA can be lost very quickly, it is rarely, if ever, acquired instantaneously (Endsley, 1993b). This temporal aspect of SA is also important to a person's ability to project the state of the environment in the near future. Thus, this theory of SA recognizes the criticality of the temporal dimension of SA. Additionally, this theory recognizes that the relevance of elements may vary across time (though the person must be aware of all relevant elements in order to determine which are of higher or lesser importance) (Endsley, 1993c) and that SA is, in many contexts, highly spatial in nature (Endsley, 1989, 1995b).
Another theory, put forward by Sarter and Woods (1991), defines SA as "all the knowledge that is accessible and can be integrated into a coherent picture, when required, to assess and cope with a situation" (p. 55). As does Endsley's theory, this theory of SA stresses the importance of the temporal nature of SA. It emphasizes that self-contained recurrent situation assessments are not sufficient for achieving or maintaining SA since problems can build up over time. In some cases, initial signs of minor deviations may not be critical. However, over time these deviations may evolve into a serious problem. Sarter and Woods (1991) cite the following example.
A good example of a subtle problem that can build up over time to become a major threat is the case of a vacuum-driven gyro failure resulting in false attitude and heading indications. The pilot does not get any direct indication or notification of the problem but rather has to infer it from observing attitude and heading changing over time. If the pilot fails to realize the problem (e.g., while flying in the clouds), he or she will try to compensate for the indicated apparent deviations from level flight. In the extreme case, this can result in a loss of control as it becomes impossible for the pilot to recover from the resulting "true" attitude of the plane. (pp. 47-48)
Thus, temporal awareness is an essential component of SA that "requires the diagnosis of problems that are caused or influenced by precursors in the past as well as the prognosis and prevention of potential future problems based on the analysis of currently available data" (Sarter & Woods, 1991, p. 49). In order to achieve temporal awareness, accurate, comprehensible system feedback is required. Without system feedback to give the operator information regarding the current state of the system (and thereby allowing the operator to either see directly or to infer changes), achieving and maintaining temporal awareness (and therefore SA) is not possible. This fact points out the necessity of providing the operator with feedback in an efficient manner and in a useful format.
Another theory with similarities to Endsley's theory is that suggested by Tenney, Adams, Pew, Huggins, and Rogers (1992). Like Endsley's theory, this theory makes a distinction between situation assessment (a process) and situation awareness (the product of the process). Tenney et al. describe situation assessment as a process involving a series of cognitive activities carried out in order to reach a state of awareness (i.e., in order to achieve situation awareness). They use Neisser's theory of the perceptual cycle as the basis for explaining SA (see Figure 1.1).
[Figure 1.1: Neisser's (1976) view of the perceptual cycle (from Tenney et al., 1992, p. 4). The cycle links schema, exploration, and the objects (available information) in the environment: exploration samples the available information, which in turn modifies the schema, which direct further exploration.]
According to Tenney et al., using Neisser's theory as a framework for understanding SA has several advantages. First, it addresses the issue of knowledge versus perception:
. . . knowledge, in the form of schemas, or mental models, leads to anticipation of certain kinds of information and directs attention and exploratory movements to particular aspects of the available information. The information that the perceiver picks up from the environment, or samples from the available information, in turn, modifies, or updates, what the perceiver knows about the immediate surroundings and influences what is known about the world in general. (Tenney et al., 1992, p. 3)
Using this approach, SA can be described as the context in which the flow of events is interpreted and which allows the person to attend to particular information at a particular level of abstraction for a particular task. The latter part of this description of SA is reminiscent of Fracker's (1988) earlier work in which he defines SA as "the knowledge that results when attention is allocated to a zone of interest at a level of abstraction" (p. 103).
Another advantage of utilizing Neisser's theory of perception as a framework for explaining SA (according to Tenney et al., 1992) is that it emphasizes the importance of comprehension by making meaning an essential part of the perceptual cycle. In other words, simply perceiving information is not enough for SA; the information must be understood. This aspect of Tenney et al.'s theory reflects Endsley's earlier work stating that understanding the perceived information is a necessary component of a theory of SA.
Finally, Tenney et al. state that using this approach to explain SA is advantageous because it can handle the "anticipatory process" (i.e., the ability to "stay ahead of the system") that is generally recognized as essential for adequate SA. (Again, this theory reflects earlier work by Endsley [1988a, 1989, 1993b, 1994, 1995b].) According to this theory, anticipatory processes direct exploratory behavior and can occur at different points on the continuum between perception and cognition:
. . . at the basic perceptual level, anticipation means a readiness to perceive certain information when the optical flow¹ specifies an impending collision or the occlusions of another aircraft by cloud cover. At the level of higher-order perceptual strategies, anticipation means focusing attention efficiently, as when an experienced driver (or taxiing pilot) focuses attention on a point at a distance from the vehicle rather than directly in front of it. At the cognitive level, anticipation means the consideration of possible outcomes. (Tenney et al., 1992, p. 4)
¹ The availability of information in the optical flow is a fundamental principle of Gibson's theory of perception, from which Neisser draws many of his ideas. Gibson's theory states that everything a person needs to accurately perceive the environment is constantly available in the optic array (Gibson, 1979).
In a somewhat different approach to explaining SA, Smith and Hancock (1995) define SA as "adaptive, externally directed consciousness" (p. 138). According to this theory, a person "possesses" SA if (1) a statable external goal with performance specifications exists, and (2) that person's performance is reasonably successful in meeting those expectations. In this theory, SA specifies what information is needed to solve a problem in a particular environment, and in doing so directs behavior toward achieving the goal. Like Tenney et al., Smith and Hancock utilize Neisser's perceptual cycle, but the addition of the invariant to the middle of the cycle distinguishes the two. The invariant "forms the linkage among information, knowledge, and action that produces competent behavior" (p. 141). Smith and Hancock's belief that SA is not a cognitive structure separates it from other SA theories. The major difference between Smith and Hancock's theory and the others is that Smith and Hancock believe that SA helps build a person's representation of the environment, whereas the other theories of SA indicate that SA is the person's representation of the environment. The view that SA is a cognitive construct is more prevalent in the literature, and SA will be treated as a cognitive construct here.
SA and Other Cognitive Constructs
In order to be a useful construct, SA must maintain its independence from other frequently investigated cognitive constructs. Even so, understanding the relationship between SA and these other constructs is important to the development of a comprehensive theory of SA. Other cognitive constructs frequently intermixed with discussions of SA include workload, attention, and working memory.
SA and workload. To define workload, Wickens (1992a) states that workload refers "to the interaction between pilot [or operator] and his or her task, insofar as the pilot's [or operator's] limited information processing capacities (resources) [are] concerned" (p. 1). Similarly, Selcon, Taylor, and Koritsas (1991) note that "there is general agreement that workload is a multidimensional concept composed of behavioral, performance, physiological, and subjective components (Hart, 1987) resulting from interaction between a specific individual and the demands imposed by a particular task" (p. 62). Although the degree of commonality between workload and SA is unclear (Selcon et al., 1991), the two concepts do have similar features (Wickens, 1992b). Mainly, both are inferred mental constructs in that they cannot be directly observed in behavior. Although workload and SA cannot be measured directly, their suspected effects on performance can be measured: excessive workload leads to decreased performance, whereas lost or decreased SA leads to decreased performance (Wickens, 1992b).
Selcon et al. (1991) found evidence to support the idea that workload and SA are closely related in a study comparing NASA TLX scores with Situational Awareness Rating Technique (SART) ratings. In order to provide these ratings, Royal Air Force "pilots were asked to rate a videotape of an air combat flight simulation sequence" (p. 62) using each of the techniques. The NASA TLX is a "multidimensional rating procedure for the subjective assessment of workload" developed and evaluated at NASA Ames Research Center. This workload scale provides an overall "workload score based on a weighted average of ratings on six sub-scales: mental demand, physical demand, temporal demand, performance, effort, and frustration" (p. 62). (For a more complete treatment of this scale, see Hart, 1987.) The SART scale, on the other hand, is a 10-dimensional scale whose ten constructs are organized into three major groupings (demand on attentional resources, supply of attentional resources, and understanding). (For more detailed information, see Taylor, 1989.) This study found that the NASA TLX and SART ratings were significantly correlated, indicating a direct relationship between the two constructs.
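For concreteness, the overall TLX score is a weighted average of the six subscale ratings; in the standard weighting procedure for the scale (a detail not reproduced in the passage above), each rating $r_i$ (0-100) is weighted by the number of times $w_i$ its subscale is judged the more important member of the 15 possible pairwise comparisons, so that

$$\text{Overall workload} = \frac{\sum_{i=1}^{6} w_i r_i}{15}, \qquad \sum_{i=1}^{6} w_i = 15.$$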
Endsley (1993a), however, asserts that workload and SA are independent constructs. She suggests that the significant correlation between NASA TLX and SART ratings found by Selcon et al. (1991) was a result of elements traditionally considered part of workload being measured within SART. These components, assessments of demand and supply of attentional resources, make up two of the three major components of the SART scale. The third component, understanding the situation, is geared toward the more prevalent definitions of SA (Endsley, 1993a). Thus, at least a portion of the NASA TLX and the SART scales was measuring the same thing, so the significant correlation between the two is not surprising. Endsley offers additional support for the idea that workload and SA are independent constructs. She suggests that a continuum exists among four possible extremes: (1) low SA and low workload, in which the person does not know what is going on and is not actively trying to find out; (2) low SA and high workload, in which SA is lost because an excessive workload forces the person to focus on only a subset of the required information; (3) high SA and high workload, in which the person is working hard but is also maintaining accurate SA; and (4) high SA and low workload, in which high SA is maintained with relatively low workload because of the ease with which the required information is acquired; this last situation is ideal and is the ultimate design goal. Other examples of ways in which SA and workload diverge are provided in Endsley (1993a).
SA and attention. The term "attention" has a variety of meanings. In the present context, attention refers to the process of allocating one's cognitive resources to various inputs. Many models of attention, frequently called capacity models, suggest that attention is a limited resource. Thus, since different tasks require different amounts of attention, the number of tasks that can be performed at once depends on the attention demand of each task (Ellis & Hunt, 1993).
Although the exact relationship between attention and SA is not clearly understood, Fracker (1989a, 1989b) suggests that SA may begin with a limited supply of attention to be distributed across the relevant elements of a situation. Fracker operationally defines SA as how aware a person is of the aspects of the task situation that should influence the response decision. The task domain chosen by Fracker in the studies examined was aviation. In this domain, Fracker indicates that two of the most important questions a pilot needs answered are (1) which aircraft are friendlies, foes, or neutrals (FFN awareness) and (2) where each aircraft is at any point in time (spatial awareness). Thus, in these studies, Fracker measured SA in terms of FFN awareness and spatial awareness.
Fracker (1989a, 1989b) theorized that since attention is limited, more attention may be allocated to some elements than to others depending on the priority a person assigns to each element. The priorities should be formed depending on "the degree to which each element threatens or contributes to successful task completion" (Fracker, 1989a, p. 1396). He further suggests that "as the number of elements having a high priority increases, the amount of attention that the person can allocate to each decreases" (Fracker, 1989a, p. 1396). In the task domain examined by Fracker, the highest priority would go to threat aircraft, the next to friendly aircraft (which had the potential to assist), and the next to neutral aircraft. Thus, he suggests that as the number of enemy aircraft is increased, the attention allocated to friendly and neutral aircraft should be reallocated to threat aircraft, up to some limit. According to his theory, "FFN awareness must precede and be independent of such prioritization. Therefore, if the number of aircraft are held constant while the number of enemy aircraft is allowed to trade-off with the number that are neutral, then spatial awareness should be affected by the number of enemy aircraft but FFN awareness should not" (Fracker, 1989a, p. 1396).
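As a purely illustrative sketch (this formula is not given by Fracker), the limited-capacity assumption can be made concrete by allocating a fixed attentional budget $C$ across situation elements in proportion to their assigned priorities $w_i$:

$$a_i = C \cdot \frac{w_i}{\sum_j w_j},$$

so that introducing additional high-priority elements necessarily reduces the attention $a_i$ available for each element already being monitored.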
Fracker (1989a, 1989b) performed a series of experiments to test his theory. He devised a task wherein the subject participated in a simulated air battle. At various points the simulation was stopped and the subjects were asked to indicate the FFN identity or spatial location of a particular aircraft. The results of these studies showed that spatial awareness was best for the enemy aircraft threatening task success, spatial awareness for friendlies was in the middle, and spatial awareness for neutral aircraft was the poorest. Fracker's interpretation of these results indicates that when the number of enemy aircraft increased, the increased attentional demand was met by re-allocating attention away from low priority neutrals rather than from higher priority friendlies. Thus, his results seem to support a model in which the limited supply of attention allocated to elements depends on their ability to contribute to or threaten task success. Additionally, Fracker's results suggest that if attentional demands exceed a limit, SA will suffer. This belief is echoed by Endsley (1988a, 1989, 1993b, 1995b) in her theory of SA.
SA and working memory. An important distinction in SA theory is that SA is not merely another term for information in working memory (Endsley, 1989, 1993b, 1995b; Sarter & Woods, 1991). Sarter and Woods (1991) state two reasons SA goes beyond working memory. First, SA involves the temporal dimension discussed earlier. The comprehensive active model required for temporal awareness goes beyond the limits of working memory since working memory can hold only "a restricted amount of information for a short period of time" (p. 52). Second, "preattentively processed information which is unconscious until brought into working memory by means of attention allocation can contribute to situation awareness" (p. 52).
In order to understand the relationship between working memory and SA, one must first understand theories regarding working memory. One theory defines working memory as "a brain system that provides temporary storage and manipulation of the information necessary for such complex cognitive tasks as language comprehension, learning, and reasoning" (Baddeley, 1992). This theory, developed primarily by Baddeley, holds that working memory is comprised of three subcomponents: (1) the central executive, (2) the visuospatial sketch pad, and (3) the verbal/phonetic loop. The "central executive" is responsible for allocating resources to the other two components of working memory, which act as slave systems (Wickens, 1992a). According to Baddeley (1992), the verbal/phonetic loop "stores and rehearses speech-based information and is necessary for the acquisition of both native and second-language vocabulary," while the visuospatial sketch pad "manipulates visual images" (p. 556). This theory implies that working memory is limited by the amount of resources available for allocation by the central executive.
Other, more recent, theories of working memory involve the idea of activation (Just & Carpenter, 1992; Cantor & Engle, 1993). These theories suggest that information in long term memory must somehow be activated or stimulated above some threshold in order to reside in working memory and to be operated on by various cognitive processes. Additionally, activation models share the belief that the amount of activation available in long term memory (LTM) is limited and this limit constrains working memory (Just & Carpenter, 1992; Cantor & Engle, 1993).
The theories just described differ with respect to the underlying mechanisms involved in memory, but they share the idea that working memory can act as a constraint on certain cognitive functions. Since some people seem to have a "better" working memory than others, the question arises as to what accounts for these differences. Although the amount of the limited activation (or resources) available may vary between individuals, the major differences between people with "good" working memory and people with "not as good" working memory appear to relate to the strategies used to overcome the limitations of working memory (Just & Carpenter, 1992; Cantor & Engle, 1993). For example, one strategy for overcoming these limits involves the way information is represented. Cantor and Engle (1993) suggest that people with an apparently larger WM capacity are better able to integrate information into a single representation. Similarly, Just and Carpenter (1992) suggest that individual differences in WM result from differences in the efficiency of mental processes. Thus, when information is integrated, or represented as one model, the amount of activation used up is that of one unit, whereas if each bit of information is represented individually, more units are required (an idea similar to "chunking"). Additionally, Ericsson and Charness (1994) note that differences in WM vary widely between experts and novices. They propose that "experts form an immediate representation of the problem that systematically cues their knowledge, whereas novices do not have this kind of orderly and efficient access to their knowledge" (p. 734). Thus, one critical aspect of experts' working memory appears to be the way information is indexed in LTM as opposed to the amount of information that is stored. In all of the above cases, the implication is that people who have more effective strategies for dealing with the constraints of WM perform better; however, the exact strategies used are domain specific (Ericsson & Charness, 1994).
Since WM in large part provides the means of performing many cognitive functions, the task of achieving and maintaining SA must be done, at least in part, by working memory. Therefore, the fact that WM has a limited capacity suggests that working memory can constrain SA, even to the point of being a major bottleneck to achieving and maintaining SA (Endsley, 1988a, 1989, 1995b). This constraint seems to be particularly problematic for novices who do not have effective working memory strategies. In the absence of other mechanisms, the entire job of processing information from the environment as well as the task of integrating this information into existing knowledge must be done in working memory. Additional demand is placed on working memory when another part of SA, projecting the future status of the system, is an important part of the task (Endsley, 1994). As suggested above, people more experienced at a particular task will have developed effective working memory strategies that reduce the demand on working memory, thereby increasing the person's ability to maintain SA.
Summary. In short, SA can be affected by the various cognitive constructs considered. First, workload and SA are independent constructs that influence each other. Second, excessive demands on attention can result in decreased SA. Finally, adequate WM capacity is essential to achieving and maintaining SA.
Situation Awareness and Mental Models
In addition to the relationship between SA and the above constructs, other less measurable components of SA are common across theories. One such component is the importance of mental models to SA (Endsley, 1988a, 1989, 1993b, 1995b; Sarter & Woods, 1991; Tenney et al., 1992). However, despite the fact that the term "mental model" is frequently found in the literature, the concept conveys different meanings to different people. One early use of the mental model concept is Craik's intuitive idea that an inner mental representation has the same 'relation-structure' as the phenomena it is intended to represent (Johnson-Laird, 1983). Since then, attempts to define the term have varied from general descriptions such as "human beings understand the world by constructing working models of it in their minds" (Johnson-Laird, 1983), to attempts at operational definitions such as "mental models are the mechanisms whereby humans are able to generate descriptions of system purpose and form, explanations of system functioning and observed system states, and predictions of future system states" (Rouse & Morris, 1986). An additional problem when using the term mental model is that the term can refer to numerous scenarios. For example, the term might refer to an operator's knowledge of the environment or to a designer's assumptions and knowledge about the operator (Hollnagel, 1988). In the current context, the term mental model refers to the former.
Despite different definitions of the notion, the characteristics related to mental models have similarities across domains. For example, Johnson-Laird (1994) suggests that "among the key properties of models is that their structure corresponds to the structure of what they represent (like a visual image)" (p. 190). Another characteristic generally accepted across theories is that mental models must be able to predict future states of the environment (Holland et al., 1986; Johnson-Laird, 1983). Other characteristics of the model are more dependent on the context in which the mental model is employed (e.g., to explain deductive reasoning, system understanding, or inductive reasoning). In order to understand the concept of a mental model and its role in situation awareness (and thereby in decision making), an understanding of some of the theories of how mental models are constructed (schema related and rule-based) is essential.
Theories of Mental Model Construction
Regardless of the definition or the particular characteristics of a mental model, an important aspect of the development of mental model theory is the construction of the model. Two theories of mental model construction are considered here. The first examines the relationship between mental models and schema, while the second examines rule-based mental models.
Mental models and schema. Before comparing mental models to schema, a definition of schema is necessary. Schema are organized representations of a body of knowledge (constructed from past experience with objects, scenes, or events) which consist of a set of expectations (usually unconscious) about what things look like and/or the order in which these things occur (Mandler, 1979). In the context of a particular situation, the schema instantiated is the one that best fits the current information (Wilson & Rutherford, 1989). Subsequent data is interpreted in accordance with this schema, and if future information is found to conflict with the schema selected, another schema must be instantiated. The decision as to which new schema is instantiated depends on the cues provided by the particular way in which the new information conflicted with the old schema. This aspect of schema theory is similar to the rule-based mental model theory which will be discussed later.
As far as the relationship between schema and mental models is concerned, Rumelhart describes a mental model "as the total set of schemata instantiated at the time" (Wilson & Rutherford, 1989, p. 624), whereas Johnson-Laird suggests that schema provide the procedures from which mental models are constructed (Wilson & Rutherford, 1989). According to the theory involving the use of schema to construct mental models, schema represent the typical, or default, characteristics of the instance in question and are represented mentally by their most characteristic members (i.e., by a prototype) (Johnson-Laird, 1983). Schema are also believed to underlie a person's ability to form images, which are considered to be views of a mental model. Images, which can be developed through perception or imagination, "represent the perceptible features of the corresponding real-world objects" (Johnson-Laird, 1983, p. 157). Thus, according to this theory, mental models and images are functionally different. As just alluded to, images differ from mental models in that they "give rise to an obvious subjective experience, whereas this characteristic is wholly irrelevant to mental models, which need not possess any immediately 'pictorial' attributes" (Johnson-Laird, 1981, p. 167).
Understanding the precise difference between mental models and schema is difficult because the terms are frequently used differently in the literature. In some cases the terms are treated as distinct concepts, while in other cases they are used interchangeably. This lack of consistency adds to the difficulty of communicating ideas regarding the role of mental models and schema in situation awareness. However, the previously mentioned definitions of schema suggest that mental models and schema are not distinct concepts; instead, they fall on a continuum (Wilson & Rutherford, 1989). Differences do exist, however, between the two (thereby distinguishing their points on the continuum). One useful distinction between the two concepts, according to this viewpoint, is that mental models are believed to be creations of the moment, whereas schemata are believed to be stored and activated (Wilson & Rutherford, 1989). This difference can be stated in the following manner: "the major difference between mental models and schemata is that the latter are taken to be data structures in memory, which can be activated, whereas the former are regarded as the utilization of such information in a computationally dynamic manner" (Wilson & Rutherford, 1989, p. 624).
Rule-based mental models. According to Holland, Holyoak, Nisbett, and Thagard
(1986), in order to be a useful construct, mental models must have two main characteristics: (1) they must be able to generate predictions about the environment even though the information is incomplete (i.e., they must have default values), and (2) they must be easy to refine, which requires that old information is maintained and new information is integrated into the model. According to this theory, mental models are constructed through the simultaneous activation of a relevant set of rules. For any given situation, a range of models can be constructed which must then compete for the right to represent the environment.
The construction of a mental model involves two major steps. First, the world is divided into simple categories of elements (e.g., fast moving objects). This step in the theory is described as the categorization function, P. This model is homomorphic in that elements of the world are mapped to elements in the model in a many-to-one fashion (Holland et al., 1986). This many-to-one mapping allows the user to develop a simplified model suitable for a particular purpose; thus, the user ignores details that are not relevant to the purposes of the model.
The next step in this theory involves the introduction of a model transfer function, T'. The purpose of this function is to describe how categories in the environment map to categories within the user's representation, or "state". In order to be valid, this model must be commutative in that mapping from the state to the environment is equivalent to mapping from the environment to the state. In complex environments, this model may not be completely valid. Thus, the theory incorporates the idea of q-morphisms, which allows erroneous predictions to be corrected. A q-morphism is simply a model with layered transition functions. The higher layers in the model have broad categories and associated default expectations. These default expectations will be used to make predictions unless an exception is found. When an exception is found, a lower layer of the model is activated and the transfer function associated with this layer is enacted in order to account for the exception. Hence, a model is constructed through progressive refinements of this q-morphism. Holland et al. (1986) describe the process in this manner:
The initial layer of the model will divide the world into broad categories that allow approximate predictions with many exceptions. Each additional layer in the hierarchy will accommodate additional exceptions while preserving the more global regularities as default expectations. The induction process will be guided by failures of the current model. ... Failed expectations will serve as triggering conditions for the generation of new, more specialized rules. In a complex environment the process of model refinement by the cognitive system is unlikely ever to be completed. (p. 36)
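In symbols (a sketch of the condition described above; Holland et al.'s own notation is only partially reproduced in this chapter, and T is used here for the environment's transition function), the commutativity requirement on a fully valid model can be written as

$$P(T(e)) = T'(P(e)) \quad \text{for every environment state } e,$$

that is, categorizing the current state and then applying the model transition T' yields the same category as letting the environment change under T and then categorizing the result. A q-morphism relaxes this requirement, enforcing it only as a default at the top layer and deferring exceptions to lower, more specialized layers.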
Thus, model refinement is guided by two opposing processes: (1) the need for accurate
predictions, which favors more specialized rules, and (2) the need for efficient predictions,
which favors more general rules.
This model contains the previously stated essential characteristics. First, it allows the system to make predictions based on incomplete knowledge of the environment. Second, during the refinement process the rules that represent useful probabilistic regularities are retained as defaults. In this account, all events in the environment are treated as equivalent unless features (i.e., failed expectations) suggest that further differentiation is required (e.g., one treats a male employee in a restaurant as a waiter unless there are signs suggesting that he is the host or the busboy). The adequacy of a q-morphism is judged relative to the available alternatives. If another model, based on radically different categories, is found to be less complex and to make a broader range of predictions, the original model will be abandoned in favor of this more effective model. However, if no such rival model exists, a weak q-morphism is preferred to no model at all (Holland et al., 1986).
Finally, this theory suggests three possible types of failures of the mental model:
1. The current model is not valid with respect to some aspect of the environment it is intended to predict. This error is most likely during the early stages of learning, and the person will have little confidence in the model. In this case the most appropriate response is to recategorize the environment and then build a new transfer function.
2. Not all of the current model's predictions are accurate. In this case, the appropriate corrective action would be to refine the model.
3. Refining the model is difficult because constructing more specific categories that successfully predict the transition function for exceptions to the model becomes increasingly difficult. "In this case, the most appropriate inductive response is simply to estimate the degree of uncertainty associated with the model transition function T'" (p. 37).
Similarities of interest between the two theories. These two theories (schema-related
and rule-based) of how mental models are constructed contain numerous similarities. One
such interesting similarity is that both models capture the idea that a mental model must contain default values in order to make predictions in the face of incomplete information. This notion of default values becomes particularly interesting when dealing with mental models of dynamic situations. Another similarity, and an important characteristic of mental models in general, is the ability to predict the future state of the system. Finally, an important aspect of both mental model theories is that the mental model can be wrong, even after it has been refined (Wickens, 1992b).
The Role of Mental Models in SA
The concept of mental models plays an important role in SA theory. One important aspect of SA theory is that it be able to predict future states of the system. The question as to how mental models assist in this endeavor, as well as how they affect performance, has already received attention. For example, Mogford and Tansley (1991) assessed the importance of mental models for air traffic controllers. In this context, the authors defined a mental model as "a hypothetical construct which refers to an operator's ideas about a system and what it is controlling" (p. 1). These investigators state that "the existence of the mental model is demonstrated by the controller's ability to project the future positions, altitudes, and speeds of aircraft in order to anticipate possible conflicts" (p. 1). The findings of this study support the notion that less relevant information is either not considered at all in the model or falls out of the mental model when workload increases. In this experiment, controllers participated in a simulated air traffic control scenario. Twenty minutes into the scenario, the simulation was stopped and the controller was asked to recall various aspects of the scene (using a map). These aspects included location, call numbers, altitudes, and airspeeds. The results showed that airspeed was recalled poorly. In talking to the air traffic controllers, the investigators found that the controllers had been taught that separating aircraft based on speed is an unwise strategy. This fact makes airspeed a less relevant item, and its poor recall supports the prediction that less relevant items will not be included in the simplified mental model.
The holes in this experiment highlight some of the problems that exist in mental model research. Although a definition of mental models was offered, a hypothesis formulated, and results obtained to support the hypothesis, this experiment revealed very little about how mental models are constructed or how the person uses them. The problem of understanding mental models is a large one, and even Rouse and Morris (1986) admit that some aspects of mental models may never be fully understood.
In another experiment investigating the role of mental models, Matsuo, Matsui, and Tokunaga (1991) used a directory dialing task to understand how past knowledge affects mental model formation. They believe that when people are faced with a novel service or product, they "tend to fall back on [a] mental model based on an analogical relationship with a familiar product or service" (p. 548). Additionally, Matsuo et al. (1991) investigated whether using a metaphor (i.e., a similar mental model) was an effective teaching technique. In this second case, the experimenters drew a metaphor between conventional mail and electronic mail to teach subjects how to use the electronic mail system, in order to test the effectiveness of using metaphors in training. From these two experiments, Matsuo et al. (1991) conclude that mental models are constructed from the user's operating knowledge acquired from past experience and that using metaphors to instruct is an effective technique.
The need to make predictions regarding future states of the system, even in the absence of complete information, emphasizes the importance of default values within the mental model. As discussed earlier, one theory states that these default values are contained in the schema being activated. Some SA models generally use this terminology (i.e., schema) when discussing the use of long term memory stores to relieve working memory demands and to direct attention (Endsley, 1988a, 1989, 1993b, 1995b; Fracker, 1988). Consequently, the nature of schema and how the use of schema affects mental models and SA becomes an issue.
Schema
Conceptual coherency. As mentioned earlier, schema are organized representations in memory. This type of organized representation is important to conceptual coherence. Conceptual coherence is the degree to which ideas, items, or concepts are "glued" together or related (Murphy & Medin, 1985). Concepts whose features are connected by structure-function relationships or by causal schemata of one sort or another will be more coherent than those that are not. Additionally, concepts that have no interaction with the rest of the knowledge base (i.e., that cannot be related to a schema or other organizing process) will be unstable and probably forgotten sooner (Murphy & Medin, 1985).
The importance of conceptual coherence is further emphasized in a study by Biederman, Glass, and Stacy (1973). This study examined two hypotheses. The first suggested that the speed at which a single object can be detected in a real world scene would be reduced when the scene was jumbled compared to when it was coherent. The second hypothesis stated that the jumbling would be most disruptive when the target object was not in the scene but had a high probability of occurring in the scene (e.g., a refrigerator in the kitchen). The experimental stimuli were created from pictures of naturally occurring scenes (e.g., a kitchen, a desk top, etc.). Each picture was cut into six equal sized pieces using one horizontal and two vertical cuts. For the jumbled scenes, these picture sections were rearranged. The results of the study supported both of the hypotheses, demonstrating, among other things, the importance of conceptual coherency in interpreting information. Since maintaining SA involves continually interpreting the effect new information will have on system functioning, understanding the manner in which operators maintain conceptual coherency with respect to a system must be considered if the incoming information is to be presented in the most efficient manner possible.
Schema are frequently the mechanism whereby conceptual coherence is obtained. Thus schema can function in one of five operations during the memory process (Brewer & Nakamura, 1984):
1. Attention: schema can influence the amount of attention allocated to a particular type of information.
2. Framework: schema provide a frame that has slots that can accept a range of values.
3. Integration: during schema instantiation, old schema based information becomes integrated with new episodic information.
4. Retrieval: schema may guide the memory search for schema related episodic information.
5. Editing: people apparently follow a conversational maxim that one should not tell someone information that is completely obvious.
Of these five possible functions of schema, research has shown that framework, integration, and retrieval play a major role in schema, but attention and editing output do not (Brewer & Nakamura, 1984). Thus, according to these results, the speculations that (1) schema direct attention to a particular type of information (e.g., schema relevant/irrelevant/etc.), causing that information to be remembered better, and (2) highly schema-related information is not reported due to a conversational maxim not to report the obvious are not supported by the evidence. (These results obtained by Brewer and Nakamura are based on their extensive review of the related literature.)
As far as SA theory is concerned, the implication of these findings raises several questions. First, since these results were obtained using relatively static stimuli, would the result be different if dynamic stimuli were used (i.e., would schema play a role in directing attention if the situation were dynamic)? Second, if schema do not direct attention toward certain types of information, what accounts for the fact that some types of information are remembered better than others? Some possible influencing factors include the saliency of the information with respect to its independent characteristics (e.g., physical characteristics), the importance of particular information to current goals (determined by the current mental model), and a retrieval advantage for certain types of information. Finally, are some types of information remembered better regardless of the saliency of their independent characteristics or their importance to goals; that is, is one type of schema-related information remembered better across all conditions than others? If so, what type and why? Understanding the mechanism by which certain types of information are remembered better is essential to understanding how different types of information impact SA. Questions regarding the effect of schema relatedness on memory are addressed in the following section.
Schema relatedness. When studying schema, researchers are frequently interested in how well items are remembered based on their relationship with the schema. In general, an item may be schema consistent (i.e., expected as part of the schema), schema inconsistent (i.e., not expected as part of the schema), schema relevant (i.e., important to the functioning of the schema), schema irrelevant (not important to the functioning of the schema), or schema bizarre (does not conform to the schema and is extremely unexpected as part of that schema). To illustrate this relationship, consider the schema invoked when dining at a restaurant. According to this schema, the waitstaff would be schema relevant/consistent, plants in the periphery of the dining room would be schema irrelevant, the absence of eating utensils would be schema inconsistent, and a tiger jumping from table to table would be schema bizarre. The terminology involved in schema research can be confusing, and the results of various studies frequently seem to contradict one another. However, by distinguishing between schema consistent/inconsistent and schema relevant/irrelevant stimuli, the confusion lessens. Unfortunately, many studies either use the terminology almost interchangeably or do not distinguish between schema irrelevant and schema inconsistent stimuli, thereby making it difficult to interpret results and compare them across studies. Nonetheless, categorizing the results of studies based on a description of the stimuli they report using (i.e., consistent/inconsistent or relevant/irrelevant) allows for more useful comparisons of results.
Consistency. An example of a typical schema study is demonstrated by an experiment performed by Brewer and Treyens (1981). This study sought to examine place memory in a naturalistic incidental learning situation; consequently, a room designed to resemble a graduate student's office provided the experimental stimuli. Since the study took place at a university, the use of a graduate student's office was a reasonable choice. At the start of the study, 61 objects in the room were rated on two dimensions: saliency and schema expectancy. The saliency measure was based on two factors: (1) saliency intrinsic to the object (e.g., a skull and crossbones), and (2) saliency derived from the schema context (e.g., a tire in the graduate student's office). A negative correlation between saliency and schema expectancy was found. For the memory experiment, subjects were asked to wait in the office for about 35 seconds while the experimenter "checked" to make sure the previous subject was finished. The subject was then taken to another room and asked to perform one of three tasks: (1) draw the room, (2) provide a written recall then perform a verbal recognition, or (3) perform a verbal recognition only. (In verbal recognition, subjects are given books containing lists of object names and are asked to rate how certain they were that they had seen the objects in the experimental room.) The subjects were asked to describe the room in such a manner that a person unfamiliar with the room could go in and locate an object.
The results of the experiment give strong support for the integration of old and new information into the schema because subjects frequently inferred that typical objects were present when in reality they were not (e.g., a window). The inferred objects were invariably highly schema consistent objects. Additionally, a strong positive correlation was found between schema expectancy and recall and between saliency and recall. Thus, the schema consistent items were remembered well, as were schema bizarre items.
Other studies, however, contradict these findings. For example, in a review, Rojahn and Pettigrew (1992) suggest a slight memory advantage for schema inconsistent information. Similarly, a study by Pezdek, Whetstone, Reynolds, Askari, and Dougherty (1989) found that inconsistent items were remembered better than consistent items. This study used a daycare and a graduate student's office to study whether schema consistent or inconsistent items were recalled/recognized better. In all conditions tested (including instruction manipulations, item number manipulation, and type of recognition test manipulation [appearance information versus inventory information]), items that were inconsistent with expectations were significantly better recalled and recognized than items consistent with expectations. This result is called the "consistency effect." As with Brewer and Treyens, however, an increase in false alarm rates in the delayed test conditions as well as for consistent items in the immediate test condition suggests that schema integration occurred.
Bower, Black, and Turner (1979) also studied how schema affected recall. This study comprised a series of experiments involving the use of scripts, a specific kind of schema. Scripts are event schemata that have a consistent order of occurrence of events (e.g., eating at a restaurant or attending a lecture) (Mandler, 1979). Among the findings of this study were that the organization of people's knowledge about stereotypical activities was fairly consistent across people; that people tended to recall text in the stereotypical (canonical) order even when it was presented to them in another order; that subjects comprehended a statement more quickly when it was first primed with an action that occurred earlier in the script; and that, when recalling scripts that included interruptions (i.e., schema inconsistent information), the interruptions were recalled more than script actions, which in turn were recalled more than irrelevancies.
Pezdek et al. (1989) discuss three possible explanations of the consistency effect. The first possible explanation involves the von Restorff effect, which indicates that an item that stands out from its background will be remembered better than an item that is similar to its background. This effect is believed to occur either because the isolated item's uniqueness creates a stronger memory trace or because the isolated item receives less interference during the making of the memory trace. This explanation does not adequately handle the current results, however, since it suggests that the larger the ratio of isolated items to similar items, the smaller the memory advantage for the isolated items. This ratio was manipulated by Pezdek et al. (1989), but it did not influence the consistency effect.
Another possible explanation for the consistency effect involves a retrieval advantage for inconsistent items. According to this explanation, inconsistent items are kept in working memory longer than consistent items because they are difficult to understand within the given context. Due to this additional time in working memory, inconsistent items are processed more and therefore become associated with more items than do consistent items, and these additional associations can subsequently serve as retrieval cues. However, the retrieval explanation only accounts for the consistency effect with recall since it suggests that the effect is localized at retrieval; recognition would not be affected by this type of localization. Thus, this explanation cannot account for the consistency effect with recognition.
The third possible explanation for the consistency effect discussed by Pezdek et al. (1989) involves encoding differences for inconsistent and consistent items. Pezdek et al. (1989) cite two studies (Friedman, 1979, and Loftus & Mackworth, 1978) that found that when presented with a scene, subjects tend to fixate earlier, more often, and for longer on inconsistent/unexpected items. Furthermore, Friedman (1979) found qualitative differences between the way subjects process expected versus unexpected items: "subjects rarely noticed missing, new, or physical changes in consistent items, whereas they almost always noticed these changes in inconsistent items" (cited in Pezdek et al., 1989, p. 593). This difference may be a result of using a feature detection versus feature analysis processing model on different types of items. Feature detection, which usually operates on schema consistent items, involves activating an appropriate scene frame for processing the picture; schema consistent items are therefore processed only enough to activate the appropriate scene frame. Feature analysis, on the other hand, involves distinguishing the current scene from other scenes and therefore usually operates on schema inconsistent items, so the schema inconsistent item is encoded in more detail than the schema consistent item. Because this encoding explanation accounts for the consistency effect on both recall and recognition tests, it is a more promising explanation than the previous two.
Relevancy. In a study involving recall of scripted activities, Graesser, Woll, Kowalski, and Smith (1980) used a series of "Jack stories" to determine if subjects could distinguish between actions actually presented in the stimulus script and actions presented during the memory test phase. They predicted that memory discrimination would be better for actions that were atypical and that no memory discrimination would occur for actions that were very typical of generic scripts. This study found that although both recognition and recall memory were initially better for atypical activities (as suggested in the hypotheses), the rate of forgetting was greater for the atypical activities than for the typical activities. However, this study did not distinguish between schema relevant/irrelevant and schema consistent/inconsistent actions, thereby decreasing the comparability of the results. Still, one particularly useful result was that the rate of forgetting was greater for atypical activities. From this result, Graesser et al. (1980) concluded that generic scripts play an increasingly critical role in directing retrieval processes as the retention interval increases.
The results of a study by Goodman (1980) and Goodman's interpretation of these results provide support for Graesser et al.'s conclusion. Goodman's study looked at memory for typical versus atypical events using an action schema (a schema organized around a single act) represented in a picture. The findings supported the notion that high relevant information was expected, was represented in prototypical form, and was closely connected to the action theme. Consequently, the high relevant information could be easily reconstructed and retrieved. The low relevant information, on the other hand, was less expected and was represented separate from the schema in memory. Thus, this type of information was retained well, but the person's ability to link its presence to the action represented was lost because it was not retrieved with the schema. This second conclusion concurs with Graesser et al.'s suggestion that atypical information is remembered less after a time delay because the generic script directs the retrieval process; consequently, the atypical information is not likely to be retrieved since it is not stored with the schema.
Summary. In general, research seems to indicate two facts. First, both recall and recognition are greater for schema inconsistent items than for schema consistent items. Second, information that is schema relevant is remembered well, as is schema bizarre information, while schema irrelevant information is not remembered as well. The applicability of this research to SA is limited, however, since most of the research on schema was performed in a static manner (i.e., observing a room or viewing a picture), and it involved recall/recognition of information rather than utilization of or response to information. However, since schema are used in the development of mental models, which in turn are vital to SA, exploring the effect of the schema relatedness (i.e., relevancy or consistency) of information on SA provides a promising avenue for gaining more insight into the development and maintenance of SA.
Conclusion
In general, SA theory recognizes that in a dynamic situation, when a mental model is found to be wrong it will be replaced with one that more accurately explains the cues present in the situation (Endsley, 1988a, 1989, 1993b, 1995b; Fracker, 1988). Mental models, which can be defined as the utilization of schema (default characteristics of a particular instance stored in memory) in a computationally dynamic manner (Wilson and Rutherford, 1989), are particularly important in developing the higher levels of SA: Level 2 (comprehending the situation) and Level 3 (predicting future system states). A person's mental model affects the way the person interprets the significance of incoming information as well as how the person utilizes this information to predict future states of the environment. When a person uses the wrong mental model for a particular circumstance, the significance of information can be misjudged and the relationship between the person's understanding of the situation and the reality of the situation can be distorted. One such case involves representational errors. Representational errors occur when the wrong mental model is used to interpret information (resulting in an inaccurate understanding of the situation) rather than the information being used to adjust the mental model to reflect the reality of the situation. For example, in an incident involving the U.S.S. Vincennes, a representational error occurred which resulted in the shootdown of an Iranian airliner. At the time of the incident, Iran was exhibiting hostile behavior and several U.S. ships had already been attacked by gunboats. The Vincennes was on patrol in the Persian Gulf when it detected an aircraft leaving Iran. The aircraft's flight profile was such that it indicated a possible threat: (1) its altitude was low, (2) its speed was fast, (3) its flight plan did not match any known civilian flight plan, (4) it was squawking a transponder code that had been previously associated with Iranian fighters, and (5) it was heading toward the Vincennes. At the same time, personnel on the Vincennes were tracking Iranian gunboats headed for other allied ships. After considering all these factors, the Vincennes crew decided the aircraft was hostile and fired on it. The decision to fire on the aircraft was correct considering the manner in which the cues were interpreted. Although an alternate explanation existed for at least some of these cues (e.g., the altitude was low but possible for an airliner, and the speed was fast but possible), the context in which the crew was working (i.e., known hostile activity) contributed to the representational error.
Understanding why, in cases like the Vincennes incident, people misinterpret information instead of realizing that the information indicates that their perception of reality is not correct (i.e., that their SA is faulty and that their mental model needs adjusting) is an important issue. Thus, this study investigates the types of factors that might influence a person to adjust a mental model; that is, what information characteristics spur a person to revamp or completely abandon the current mental model in light of the new information rather than interpreting the information in the context of the current mental model. Rather than trial and error, a possible approach to this question was gleaned from the literature regarding schema. Although these studies were all done under static conditions, they provide a reasonable starting point for investigating the types of information that will cause a person to realize that the current mental model needs adjusting. Thus, based on the schema literature, the following hypotheses were developed to examine the effect of the schema relatedness of a cue on SA (see Figure 1.2 for cue type definitions).
Hypothesis: Information that is seemingly schema bizarre is more likely to signal to the operator that a problem exists with the mental model than is information that is seemingly schema irrelevant.

Cue Category Definitions

Schema bizarre cues: Cues that indicate that something is either not possible or only extremely remotely possible according to the operator's mental model.

Schema irrelevant cues: Cues that are actually relevant, but seem unimportant since the wrong mental model is operating.

Schema unexpected cues: Cues that are within the realm of possibility, but not likely according to the current mental model.

Absence of schema expected cues: Cues that are not present, but are so prototypical according to the mental model that their absence in this particular case is not readily apparent.

Figure 1.2: Cue category definitions
Rationale. Research suggests that schema bizarre information is remembered better than schema irrelevant information (Brewer & Nakamura, 1984; Brewer & Treyens, 1981). This issue is important because frequently discrepant information initially does not possess characteristics that classify it as bizarre; it is simply seen as irrelevant. Thus, the mental model is not adjusted as early as possible.
Hypothesis: The presence of schema unexpected cues is more likely to indicate a problem with the mental model than the absence of schema expected cues.
Rationale. Research has shown that current information is frequently integrated with the schema prototype (Brewer & Treyens, 1981; Pezdek et al., 1989). This integration might lead to the false belief that schema consistent information is present when in reality it is not (e.g., as a result of inadequate scan patterns, expectancy, etc.). Additionally, some studies show that people are less likely to notice changes in schema expected items than schema unexpected items (Friedman, 1979, cited in Pezdek et al., 1989).
CHAPTER II
METHOD
Approach
Addressing issues relating to schema and mental models requires that the subjects have a strong background of information and experience from which to draw. Thus, appropriately investigating these hypotheses necessitated the use of experts. However, waiting for an expert to make an error (and then examining that error) is not plausible, and trying to force the expert to make an error is unlikely to be successful. Consequently, the task was designed so that the errors were induced by sources other than the experts' actions. Instead, the experts were expected to ascertain from the cues provided that the current mental model was not adequate to account for the cue, that the cue signified that an error had been committed, and that the error needed correcting.
In order to test the schema related hypotheses, certain types of information were presented to the subject and the effect that information had on the subject's SA was measured. Although numerous methods for measuring SA have been proposed, an imbedded task technique was selected as the best way to measure SA in relation to a person's ability to avoid representational errors. Imbedded task measures provide objective measures of operator performance on a specific task. A task is designed so that if a person understands the significance of a particular piece of information and adjusts his/her SA, an overt, measurable action is required. This measure therefore provides information on whether a certain type of information will impact SA by assuming a direct link between SA and behavior. If the significance of the information is not understood and SA is not adjusted, no behavior will occur. A major disadvantage associated with this method is that it cannot adequately assess the multidimensional concept of SA. This limitation was not an issue since the aim of the study was to measure the effect of specific cues on SA related to that cue only (i.e., no global assessment of SA was required). A second limitation of this method is that it assumes that subjects will choose to respond in a predictable way to the cues. For this reason, the cues were selected carefully in conjunction with a subject matter expert to ensure that an overt response would clearly be required for a given cue.
Experimental Design
Purpose
The purpose of this experiment was to determine if the schema relatedness of a cue affects the likelihood that the cue will indicate to an operator that his/her current mental model of the situation is incorrect. That is, will the schema relatedness of a cue affect the probability that the operator will respond to the information and take corrective action?
Variables

Independent variable. Cue type (schema bizarre, schema irrelevant, schema unexpected, and absence of schema expected).

Dependent variable. Operator response: whether the operator took action to correct the error based on the presented cue.
Hypotheses

Hypothesis 1:
• H0: No difference exists between operator response to schema bizarre and schema irrelevant cues.
• H1: A difference exists between operator response to schema bizarre and schema irrelevant cues.

Hypothesis 2:
• H0: No difference exists between operator response to schema unexpected cues and the absence of schema expected cues.
• H1: A difference exists between operator response to schema unexpected cues and the absence of schema expected cues.
Task
An air traffic control (ATC) simulation was used to test the above hypotheses. The simulations utilized the ATC training simulators at the Federal Aviation Administration (FAA) Academy in Oklahoma City. These simulators provided a replica of the radar displays, flight plan strips, and communications with aircraft as used in the field. The simulation involved not only the controller (i.e., the subject), but also a "ghost station" whose operator ran the simulation and served as the controller of small airports within the sector, two confederates who acted as the pilots of the aircraft in the sector (each confederate could control up to 12 different aircraft at one time), and the SME, who acted as the controller of adjacent sectors and handled any unforeseen needs or requests.
Scenario Development
An experienced FAA controller serving as an instructor at the FAA Academy functioned as the subject matter expert (SME). The SME ensured that the cues were operationally relevant and incorporated these cues into realistic scenarios. When interpreted correctly, the cues indicated that an error had occurred and that corrective action was required to rectify the situation. The occurrence of these cues within the scenario was not any more or less salient than any other events in the scenario. In order to minimize any confounding effect caused by differences in the severity of the consequences of different types of errors, a within-error design was used in which all four cue types were developed for each of three types of errors. The three error categories were selected based on actual errors that can and do occur in the air traffic control environment: misidentification of aircraft type, flight plan errors, and communication errors. These categories were chosen because they are typical and because for each the controller is expected to make a specific overt corrective action. Furthermore, despite potential differences in consequence severity, any of these errors could result in a loss of legal separation. Thus, the cues, if fully understood, commanded a response. In each case, the potential for a representational error was created through the provision of a piece of misinformation. Cues of each type were then presented in each of these categories, and operator response was measured to determine if the misinformation was discovered.
Misidentification of aircraft type. This error category was based on the assumption that somewhere prior to entering the controller's airspace, an error had occurred causing the aircraft's type to be designated incorrectly. That is, the aircraft type shown on the flight strip was not in fact the correct aircraft type. For the bizarre, irrelevant, and absence of expected cues this fact was not obvious from simply looking at the flight strip since all the information on the flight strip was in accord with the aircraft type shown on the flight strip. The discrepancy for the bizarre and irrelevant cues came from the fact that the information presented on the radar screen was at odds with what was shown on the flight strip. The bizarre cue was that the aircraft's speed on radar was faster than the aircraft type shown on the strip could fly, and the irrelevant cue was that the aircraft's speed on radar was on the upper boundary of a speed that the aircraft type shown on the strip could fly. The discrepancy for the absence of expected cue occurred when the aircraft did not identify itself as a "heavy" when communicating with the controller, an event which should have occurred for an aircraft of that type. For the unexpected cue in this category, the error was evident on the flight strip as well as on the screen since the altitude shown, and subsequently flown, was not an expected altitude for that aircraft type (see Table 2.1).
Flight Path Error. The assumption in this category is that the flight path was filed incorrectly and the aircraft's actual destination was different from that shown on the flight strip. In the irrelevant, unexpected, and absence of expected cases the aircraft did not adhere to the flight plan shown on the flight strip; at some point in the scenario the aircraft deviated from the filed flight plan in order to proceed to its actual destination, providing a cue that an error had been made. The unexpected cue was that the aircraft departed and flew a route different from the one toward the destination on the flight strip. The absence of expected cue was the lack of a turn toward the destination on the flight plan.
[Table 2.1: Cue descriptions for the misidentification of aircraft type error category; table content not legible in the source.]
The irrelevant cue was the pilot questioning weather conditions along the route to the actual destination and not the destination on the flight strip. In this case, a flight path deviation also occurred later in the sequence when the aircraft turned in the direction of its actual destination. (Catching the cue at this point in the scenario was not considered a correct response since the physical deviation of the aircraft from its filed route provided a second clue.) In the case of the bizarre cue, the flight plan on the flight strip was incomplete and the aircraft's route ended in a prohibited area. The error was obscured by the fact that the flight strip indicated "Springfield radial," which is similar to the legitimate destination of "Springfield." The flight strip was obscured in this manner to decrease the possibility that the cue would be discovered before the scenario started if the controller scanned the strips (see Table 2.2).
[Table 2.2: Cue descriptions for the flight path error category; table content not legible in the source.]

Communication Error. This category was based on the assumption that the pilot misunderstands a clearance and flies the clearance he/she "thinks" was given rather than what the controller actually issued. In these cases, the pilots were instructed to simply answer the clearance with "roger" instead of repeating the clearance. The cues in this category all involved altitude: (1) bizarre cue: the pilot "thinks" he/she was given a higher altitude and climbs when given a descent (the bizarre cue is that the aircraft climbs instead of descends); (2) irrelevant cue: the pilot "thinks" he/she was given a higher altitude than was actually issued (i.e., 12,000 feet instead of 11,000 feet) and descends slowly because the pilot thinks there is less altitude to lose (the slow descent is the irrelevant cue); (3) absence of expected cue: the pilot does not "think" he/she was given a descent and maintains the current altitude (the lack of a descent is the absence of an expected event); and (4) unexpected cue: the pilot "thinks" he/she was given a lower altitude than was actually issued and descends past the given altitude (the descent past the given clearance is the unexpected event). Since the radar screen provides a planar view, the ascent or descent is not obvious from the aircraft's movement on the screen. Rather, the air traffic controller must notice the errant altitude in the data block (see Table 2.3).

[Table 2.3: Cue descriptions for the communication error category; table content not legible in the source.]
Scenario Design
Two 30 minute scenarios were developed, each containing one pair of cues (i.e., bizarre/irrelevant or unexpected/absence of expected) from each of the three error categories (misidentification of aircraft, flight plan error, communication error). Both scenarios had two versions; the second version switched the order of presentation for each pair of cues. The order in which the aircraft involved in each cue type appeared in the scenario is shown in Tables 2.4 and 2.5. (The occurrence of the cues was not necessarily in this order since controller action influenced the timing of the communication errors.)

[Table 2.4: Scenario A, orders 1 and 2; table content not legible in the source.]

[Table 2.5: Scenario B, orders 1 and 2; table content not legible in the source.]
Subjects
The subjects were experienced FAA controllers serving as instructors at the FAA Academy. Twelve subjects participated in the study. Six subjects were presented with scenario A (3 saw version 1 and 3 saw version 2) and six were presented with scenario B (3 saw version 1 and 3 saw version 2). The average age of the subjects was 43.1 years
(min = 32, max = 62, SD = 12.3), and the average number of years in air traffic control was 17.9 years (min = 7, max = 36, SD = 10.6). Their average number of years at full performance level was 13.6 years (min = 4.1, max = 29, SD = 10.5). The average number of years the subjects had been out of the field was 3.1 years (min = .33, max = 7, SD = 2.3). Three of the subjects were female, and nine were male.
Subject Instructions
The subjects were not given any specific instructions as to the manner in which they were to carry out the task. The controllers were simply asked to work a scenario, controlling traffic as they had been trained by the FAA. Prior to participating in this simulation, the subjects participated in four simulation exercises as part of another study, all of which involved normal traffic scenarios in the same sector at varying levels of workload. Thus, the subjects, in effect, all received the same amount of refresher training. Since these prior simulations were straightforward, the subjects were not alerted to the fact that errors would be purposely introduced during this exercise. Additionally, from the procedures of these previous exercises, the subjects were aware that their performance was being evaluated by the subject matter expert.
The subjects were not given any instructions with respect to the cues. The errors indicated by the cues were errors that could happen in both a training simulation and a real world setting, so their occurrence should not seem particularly unusual. FAA procedures would indicate that each of the errors should be corrected by the controller if detected.
Operator Response
If the operator acknowledged the cue and took steps to correct the error (regardless of the outcome of those steps), the cue was counted as having been discovered. If no response to the cue occurred, the error was counted as not having been discovered.
CHAPTER III
RESULTS AND DISCUSSION
A summary of the results is shown in Table 3.1. (Appendix A shows the detailed results by subject.) The totals for each cue type were different because in some cases the cue did not occur as designed (see below). Although the omission of designed cues resulted in a loss of information, these omissions were distributed across subjects and cue types, thereby minimizing the impact of the missing data points. An analysis was conducted to determine the likelihood of discovering from the given cue that an error had occurred.
Table 3.1: Summary of responses

Cue type                 Discovered    Not Discovered    Total
Bizarre                       6             11             17
Irrelevant                    0             11             11
Unexpected                    5              9             14
Absence of expected          10              8             18
Bizarre Cues
Seventeen of the bizarre cues were presented as designed. In one case, the misidentification of aircraft cue did not occur as designed: the pilot of the aircraft inadvertently used the word "Lear" in communication with the controller (in "reality" the aircraft was a Piper Malibu, not a Lear). Response to this cue was therefore omitted from the analysis. Of the cues in this category that elicited an appropriate response from the subjects, 4 (out of 6) involved flight plan errors, and 2 (out of 6) involved communication errors. None of the misidentification of aircraft type cues were discovered (see Table 3.2).
Table 3.2: Bizarre cue results

Error Category                   Discovered    Not Discovered
Misidentification of Aircraft         0               5
Flight Plan Error                     4               2
Communication Error                   2               4
Irrelevant Cues
Eleven of the irrelevant cues were presented as designed. The six irrelevant communication cues were not presented as planned in the scenario (i.e., the slow rate of descent did not occur due to a computer system inadequacy) and were therefore omitted from the analysis. The other cue that was not presented as designed was an irrelevant flight plan cue. This cue involved the pilot asking the controller for weather in a route other than the one listed on the flight plan. In this case the controller routed the aircraft in such a manner that the pilot was unable to ask the question that presented the cue. None of the cues in this category induced the controller to realize that an error had occurred (see Table 3.3).
Table 3.3: Irrelevant cue results

Error Category                   Discovered    Not Discovered
Misidentification of Aircraft         0               6
Flight Plan Error                     0               5
Communication Error                   --              --
Unexpected Cues
Fourteen of the unexpected cues were presented as designed. Of the cues that were not presented as designed, two involved flight plan cues. In one case a computer error affected the aircraft's designed route of flight; that is, the aircraft's flight plan indicated a southeast route of flight, and the cue was supposed to be that the aircraft flew a southwest route of flight instead. However, a computer error caused the aircraft to fly in a northerly direction. In the other case an overly dramatic route change occurred because the route change was input into the computer after the aircraft had passed the NAVAID; this late change caused the aircraft to appear to jump from a southeast route of flight to the southwest route of flight. Two communication cues also did not occur because in each case the aircraft was switched to the next sector before the aircraft descended below the given clearance involved in the cue. Of the cues that led to corrective action, 3 (out of 4) involved flight plan cues and 2 (out of 4) involved communication cues. None of the errors involving aircraft misidentification were discovered (see Table 3.4).
Table 3.4: Unexpected cue results

Error Category                   Discovered    Not Discovered
Misidentification of Aircraft         0               6
Flight Plan Error                     3               1
Communication Error                   2               2
Absence of Expected Cues
All of the absence of expected cues were presented as designed. Four (out of 6) of the flight plan cues elicited a response in the form of corrective action, and 6 (out of 6) of the communication cues induced the subject to take corrective action (see Table 3.5).
Table 3.5: Absence of expected cue results

Error Category                   Discovered    Not Discovered
Misidentification of Aircraft         0               6
Flight Plan Error                     4               2
Communication Error                   6               0
Analysis
A 2x2 contingency table analysis was used to analyze the data. A complete description of the calculations is given in Appendix B.
Hypothesis 1
For hypothesis one (comparing schema bizarre versus schema irrelevant cues), the contingency table analysis was significant (T = 4.94, df = 1, p < .05), indicating that a difference exists between response to schema bizarre and schema irrelevant cues. This result is in accordance with the static schema literature: bizarre cues were more readily responded to than were cues that were seemingly schema irrelevant.
Hypothesis 2
For hypothesis two (comparing the absence of expected and unexpected cues), the contingency table analysis was not significant (T = 1.24, df = 1, p > .75). Thus, the null hypothesis that no difference exists between response to the absence of schema expected and unexpected cues could not be rejected.
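As an illustration, a minimal sketch of this 2x2 contingency analysis is given below in Python. The calculation worksheet itself appears in Appendix B and is not reproduced here; the sketch simply assumes that the reported T statistic is a Pearson chi-square computed without continuity correction, which reproduces the values reported above.

    # Sketch of the 2x2 contingency analysis (assumes a Pearson chi-square
    # with no continuity correction; the original worksheet is in Appendix B).
    from scipy.stats import chi2_contingency

    # Rows are cue types; columns are [discovered, not discovered] (Table 3.1).
    hypothesis_1 = [[6, 11],    # bizarre
                    [0, 11]]    # irrelevant
    hypothesis_2 = [[5, 9],     # unexpected
                    [10, 8]]    # absence of expected

    for label, table in (("Hypothesis 1", hypothesis_1),
                         ("Hypothesis 2", hypothesis_2)):
        t_stat, p_value, df, _ = chi2_contingency(table, correction=False)
        print(f"{label}: T = {t_stat:.2f}, df = {df}, p = {p_value:.3f}")
        # Prints T = 4.94 for Hypothesis 1 and T = 1.24 for Hypothesis 2.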
Discussion of Hypothesis Analysis
Error Category: Misidentification of Aircraft Type. The fact that none of the errors in the category "misidentification of aircraft type" were discovered might suggest that this category's cues were problematic. Subjects may have viewed the cues as computer quirks or may not have paid as close attention to subtle differences as they would have in the field. These errors went undiscovered despite the fact that missing these errors in an operational setting could result in a loss of legal separation between aircraft, since the aircraft's performance characteristics would be different from what the controller expected. Excluding these data from the analysis does not greatly affect the outcome, however. That is, when the misidentification of aircraft type cues are omitted from the analysis, the contingency table analysis for hypothesis one (comparing schema bizarre versus schema irrelevant cues) is still significant (T = 3.86, df = 1, p < .05), whereas the contingency table analysis for hypothesis two (comparing the absence of expected to unexpected cues) is still not significant (T = 1.11, df = 1, p > .75).
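The reduced analysis can be sketched in the same way; the counts below are taken from Tables 3.2 through 3.5 with the misidentification of aircraft type rows removed, again assuming a Pearson chi-square without continuity correction.

    # Re-analysis with the misidentification of aircraft type cues excluded
    # (counts from Tables 3.2-3.5: flight plan and communication cues only).
    from scipy.stats import chi2_contingency

    hypothesis_1_reduced = [[6, 6],     # bizarre (flight plan + communication)
                            [0, 5]]     # irrelevant (flight plan only)
    hypothesis_2_reduced = [[5, 3],     # unexpected
                            [10, 2]]    # absence of expected

    for label, table in (("Hypothesis 1 (reduced)", hypothesis_1_reduced),
                         ("Hypothesis 2 (reduced)", hypothesis_2_reduced)):
        t_stat, _, df, _ = chi2_contingency(table, correction=False)
        print(f"{label}: T = {t_stat:.2f}, df = {df}")
        # Prints T = 3.86 and T = 1.11, matching the values reported above.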
Evaluation of Alternate Explanations. The assumption of this study was that if a piece of information (i.e., a cue) had the appropriate impact on SA, an overt action would take place as the subject tried to rectify the situation. If no response occurred, the assumption was that the significance of the information was not comprehended. In reality, two possibilities exist as to why a subject may not have responded to the cue. First, as was the assumption in this study, an SA error (most likely a representational error) occurred. Second, the subject may have detected the cue and understood its significance, but chose to ignore it. Besides the fact that the subjects were all professional controllers who volunteered to participate, the latter case is not likely for several reasons. First, if the subject chose to ignore the cue, then in all likelihood the subject would ignore other information during the scenario, which would undoubtedly degrade performance. Since the subjects knew that their performance was being evaluated, this type of behavior is unlikely. Furthermore, many of these errors would have resulted in a loss of legal separation between aircraft had they occurred in the real world. For example, controllers are responsible for ensuring that the aircraft in their sectors meet various crossing restrictions; if a restriction is not met, the controller is held responsible (barring obvious pilot compliance problems). Any operational deviation is of enormous consequence to the controller involved since it can result in being pulled from the job and sent for retraining. Thus, the types of errors that occurred in this study were likely to receive attention from the controller in order to avoid the embarrassment, even in a simulation, of a "deal" (i.e., an operational deviation).
Analysis of the Effect of Individual Differences in Error Detection
Due to the small sample size, an analysis was done to determine if individual differences contributed to error detection. This analysis utilized available information from the NASA TLX index of perceived workload questionnaire (filled out by each subject at the end of the scenario) and from the subject information sheet.
Workload. A linear regression of the portion of errors discovered on the perceived workload rating showed no significant relationship between the two variables (F = .231, df = 11, p = .641) (see Table 3.6). Thus, perceived workload does not appear to have affected the number of correct responses given by an operator.
Table 3.6: Summary of responses and workload rating

Subject (Scenario)    Discovered/Total    NASA TLX Score
2 (A1)                4/6                 70.75026
7 (A1)                2/6                 68.14818
9 (A1)                3/5                 53.84617
1 (A2)                3/6                 70.31342
4 (A2)                2/5                 69.23079
10 (A2)               4/6                 31.79484
8 (B1)                3/5                 37.49284
11 (B1)               3/6                 56.25835
13 (B1)               3/6                 55.27067
5 (B2)                3/6                 52.13677
6 (B2)                3/6                 65.07123
12 (B2)               4/6                 91.71891
Subject Characteristics. Linear regressions of the portion of errors discovered on years out of the field (F = .482, df = 10, p = .505), years in air traffic control (F = 3.129, df = 10, p = .111), years at full performance level (F = 3.215, df = 10, p = .107), and age (F = .294, df = 10, p = .601) showed no significance. Thus, the differences in the number of errors discovered were not significantly influenced by the subjects' characteristics.
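A minimal sketch of the workload regression is given below in Python, using the per-subject Discovered/Total and NASA TLX values as tabulated in Table 3.6; because the pairing of those values had to be reconstructed from the source, the sketch is offered for illustration rather than as the study's exact data set. The F ratio is derived from the squared correlation of the simple regression fit, and the same approach applies to each of the subject characteristic regressions.

    # Sketch of the workload regression: portion of errors discovered
    # regressed on the NASA TLX score (values as tabulated in Table 3.6).
    from scipy.stats import linregress

    nasa_tlx = [70.75026, 68.14818, 53.84617, 70.31342, 69.23079, 31.79484,
                37.49284, 56.25835, 55.27067, 52.13677, 65.07123, 91.71891]
    discovered = [4/6, 2/6, 3/5, 3/6, 2/5, 4/6,
                  3/5, 3/6, 3/6, 3/6, 3/6, 4/6]

    fit = linregress(nasa_tlx, discovered)
    r_squared = fit.rvalue ** 2
    # In simple regression, F for the model equals r^2 * (n - 2) / (1 - r^2).
    f_ratio = r_squared * (len(discovered) - 2) / (1 - r_squared)
    print(f"F = {f_ratio:.3f}, p = {fit.pvalue:.3f}")
    # Prints approximately F = 0.23 and p = 0.64, in line with the
    # reported result of no significant relationship.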
Neither the perceived workload nor the subject characteristics indicated any evidence
that individual differences played a role in error detection. Thus, these individual
differences do not seem to be a confounding factor in the study.
SA Error Analysis
Finally, although not every missed response could be accounted for (due to differences in subjects' verbalization and willingness to discuss the scenario after the fact), information was available on eight of the undiscovered errors. From this information, eight associated SA errors could be identified. These errors were classified according to the SA error taxonomy developed by Endsley (1995a) (see Figure 3.1).
Level 1: Failure to correctly perceive the situation
• Data not available
• Hard to discriminate or detect data
• Failure to monitor or observe data
• Misperception of data
• Memory loss

Level 2: Improper integration or comprehension of the situation
• Lack of or poor mental model
• Use of incorrect mental model
• Over-reliance on default values
• Other

Level 3: Incorrect projection of future actions of the system
• Lack of or poor mental model
• Overprojection of current trends
• Other

Figure 3.1: SA Error Taxonomy
One of the identifiable SA errors was a Level 1 SA error. Level 1 SA errors involve a failure to perceive information or the misperception of information. This error fell into the subcategory of "failure to monitor" and involved the bizarre communication cue: the subject never responded to the fact that the aircraft had climbed from 22,000 to 31,000 feet after being given a descent clearance, even though the change in the aircraft's altitude was visible in the data block the entire time.
The other seven errors were Level 2 SA errors in which the subjects failed to correctly comprehend the situation. These errors indicate that representational errors did occur in many of the cases in which no overt response to the cue occurred. Two of these errors involved a failure to comprehend the significance of information for unknown reasons (i.e., the "other" category):
• Unexpected flight plan. One subject noticed a discrepancy between the flight path and the flight strip but did not comprehend its significance.
• Irrelevant flight plan. One subject knew that the aircraft was filed to Wichita and that the pilot had asked for weather to Kansas City (the "real" destination), but did not discern the significance of the question.
The remaining five errors involved the use of an incorrect mental model. Previously, two subcategories of this classification have been identified: (1) mismatching information to one's mental model based on expectancy, and (2) using the wrong mental model for the circumstance. Although similar in nature to each other, these five errors did not fall into either of these two established subcategories of "incorrect mental model," thereby indicating that the inclusion of an additional subcategory to this classification is warranted. In these cases, the failure to correctly comprehend the situation involved a snap assessment of an event (i.e., a cue) that was seemingly of little consequence. The subject made an assumption as to the cause of the cue and then, in essence, waited for contradictory evidence. Without this type of evidence, the inaccurate assumption went unnoticed and the true error in the situation was not resolved:
• Unexpected communication.* In one case, after switching an aircraft's frequency early, the subject noticed that the aircraft had descended below the crossing restriction altitude; the subject assumed approach control "dumped that one early" (i.e., gave a lower altitude before the crossing point) rather than considering that the aircraft had not received the subject's clearance correctly (as designed in the study).
• Unexpected misidentification of aircraft type. A subject assumed that the NASA aircraft at 13,000 feet was carrying the space shuttle.
• Unexpected misidentification of aircraft type. A subject assumed that the NASA aircraft at 13,000 feet was an artifact of the simulation.
• Irrelevant flight plan. A subject assumed an aircraft was on the wrong course because a previous controller gave an incorrect clearance and the pilot did not catch it.
• Bizarre aircraft. A subject did a route of flight graphic and saw that the aircraft was behind where it should be. This subject then amended the aircraft's route and changed the route in the computer. The subject never checked to make sure that this adjustment was adequate, indicating that he/she assumed that the discrepancy was resolved. This case was the only one in which a "misidentification of aircraft type" cue was acknowledged.

* This cue was not counted in the hypothesis analysis because some uncertainty existed as to whether or not the controller was responsible for the aircraft after the frequency change was given. However, the controller's statement regarding the cue indicated that the cue was noticed, but its true meaning was not discovered. Thus, this incident is counted in the SA error analysis since it provides information about the controller's explanation of the deviant action.
Thus, in at least seven of the missed detections, the cue was perceived correctly but the true meaning of the cue was not realized. Instead, a representational error occurred in which the subject absorbed the information into the functioning, but inaccurate, mental model rather than discovering the error indicated by the cue. These representational errors occurred even though the subject had to go to lengths to explain the deviant information. For example, in one case in which a representational error caused a delay in error discovery, the subject reasoned that a flight path deviation occurred because the flight was a training flight, even though nothing in the scenario suggested or supported this rationale. The fact that the explanations generated by the subjects frequently had no basis in the current scenario did not deter the subjects from accepting the errant interpretations.
Four of the seven Level 2 SA errors involved an unexpected cue, two involved an irrelevant cue, and one involved a bizarre cue. The Level 1 SA error involved the bizarre communication cue. Although differences in subjects' verbalization and willingness to discuss the scenario after the fact prevent drawing any conclusions regarding the representativeness of these numbers with respect to the total number of undiscovered errors, these cases do provide interesting information about the types of cues that influenced the different categories of SA errors.
CHAPTER IV
CONCLUSION
The results of this study suggest several things. First, the results indicate that schema bizarre cues were more likely to cause an error to be discovered and corrected by subjects than were cues that were seemingly schema irrelevant. Thus, in line with the theory behind the use of schema, SA was impacted more by schema bizarre cues than by schema irrelevant cues. This result highlights the need for system design to (1) draw attention to items that are seemingly irrelevant and (2) help the operator differentiate between what is truly irrelevant and what might be indicative of a problem. Second, this study found no evidence to support the hypothesis that a difference exists between response to schema unexpected cues and the absence of schema expected cues. However, many models of NDM indicate that expectations such as those generated by schema play a role in the situation assessment phase of the decision process. For example, the Recognition Primed Decision model developed by Klein and his associates (Klein, 1993; Klein & Calderwood, 1991; Klein & Crandall, 1992) suggests that expectations generated from a mental model or schema are compared to actual events to check the accuracy of the situation assessment. Therefore, understanding the manner in which an operator reacts to unrealized expectations is crucial. For example, if an expectation is not met, does the operator abandon this avenue of situation assessment in order to find an interpretation that fits with the presented information, or does the operator follow some other course of action such as rationalizing why this information is not present? Furthermore, this model raises the
question as to how an operator deals with information that does not fit with generated expectations: is this information assimilated into the situation assessment, or does it cause the situation assessment to be modified? Consequently, the impact of the absence of expected information and the presence of unexpected information warrants further research.
The importance of understanding what types of cues positively impact SA (and thereby decrease the likelihood that a representational error will occur) is underscored by Kaempf et al.'s (1996) findings concerning the AEGIS combat system. Kaempf et al.'s study investigated how experienced personnel made decisions in situations characterized by time pressure, high risk, and ambiguous or missing information. They found that rather than spending a lot of time trying to decide what actions to take, the personnel were primarily concerned with developing SA. Once SA was fully developed, the appropriate actions were dictated by procedure (i.e., if this set of contingencies exists, take action A; if that set of contingencies exists, take action B). Thus, interpreting the meaning of the available information correctly and thereby developing accurate SA is essential for implementing the most appropriate course of action. The occurrence of a representational error under these circumstances has enormous consequences. For example, if this type of error occurred, unnecessary escalation of hostilities might occur (e.g., the Vincennes incident) or necessary defensive actions might not occur.
The findings of this study provided insight into the type of information that is likely to
cause a person to realize the current mental model is inadequate to explain new
information and that an adjustment in the mental model is needed (thereby avoiding a
representational error). Knowing which types of information an operator is more or less
likely to utilize enables a designer to present information to the operator more effectively.
The goal of any system design is to optimize operator performance. Enhancing SA by
emphasizing the type of information to which an operator is naturally less inclined to
respond (when he or she should) is one approach to improving system design and thereby
performance.
Generalization of this study's results, however, should be tempered by the limitations
of the study. First, this study was based on the assumption that a predictable mental
model could be created and that the conditions presented in the study were sufficient to
cause the desired mental model to be in operation. Individuals from other theoretical
viewpoints might find this assumption unsatisfactory. Nonetheless, the nature of the
representational errors identified by the SA error analysis supports this assumption; that is,
controller action (i.e., generating alternate explanations for deviant information) was in
accord with the theory behind representational errors. Second, the study involved a
simulation. Although the fidelity and realism of the simulation were quite high, the fact
that it was a simulation might have affected some of the subjects' responses. Next, the
study involved a small number of subjects and a relatively small number of data points.
Furthermore, these subjects had been out of the field from 5 months to 7 years. The
results might have been different if the subjects had all been current and working in sectors
with which they were more familiar, although in light of the kind of errors that occurred
this possibility is not likely. Finally, the study could not determine whether or not the cue
was noticed; it could only determine whether the controller took corrective action or made a
statement concerning the cue. Due to these limitations, the study should be viewed as a
preliminary study and its findings viewed conservatively until supported by other research.
Situation awareness is an important, although fairly recent, concept in dynamic
environments. As such, it has received quite varied treatment, leaving the literature with
little consensus as to what defines and affects this concept. This study drew from the
schema literature to develop hypotheses regarding how information with different
schema-related characteristics might affect SA. By utilizing a systematic approach to
understand vague components of the concept of SA, this study provides hard data regarding
a little-studied, yet crucial, aspect of SA: the manner in which information with different
characteristics influences a person to adjust a mental model. Thus, the results of this study
provide an initial assessment of the impact of information with different schema-related
characteristics on SA. This type of research is critical to understanding the types of
information that will affect SA, which in turn is critical to improving decision making.
REFERENCES
AMRAAM OUE Tactics Analysis Methodology Briefing. November 1983. SECRET. Classified by DD254, 12 Oct. 82, Contract no. F33615-81-C-1485. Declassify on OADR. Unclassified information only used from this source.
AMRAAM OUE Tactics Lessons Learned. Video tape of briefing given on 21 November 1983. SECRET-NOFORN. Classified by DD254, 29 Jan 82, Contract no. F33615-81-C-1485/P00003. Declassify on OADR. Unclassified information only used from this source.
Baddeley, A. (1992). Working memory. Science, 255, 556-559.
Beach, L. R., & Lipshitz, R. (1993). Why classical decision theory is an inappropriate standard for evaluating and aiding most human decision making. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 21-35). Norwood, NJ: Ablex Publishing Corporation.
Beach, L. R., & Mitchell, T. R. (1987). Image theory: Principles, goals and plans in decision making. Acta Psychologica, 66, 201-220.
Bell, D. E. (1982). Regret in decision making under uncertainty. Operations Research, 30(5), 961-981.
Biederman, I., Glass, A. L., & Stacy, E. W., Jr. (1973). Searching for objects in real-world scenes. Journal of Experimental Psychology, 97, 22-27.
Bower, G. H., Black, J. B., & Turner, T. J. (1979). Scripts in memory for text. Cognitive Psychology, 11, 177-220.
Brewer, W. F., & Nakamura, G. V. (1984). The nature and functions of schemas. In R. S. Wyer, Jr., & T. K. Srull (Eds.), Handbook of social cognition (Vol. 3, pp. 119-160).
Brewer, W. F., & Treyens, J. C. (1981). Role of schemata in memory for places. Cognitive Psychology, 13, 207-230.
Cantor, J., & Engle, R. W. (1993). Working-memory capacity as long-term memory activation: An individual-differences approach. Journal of Experimental Psychology: Learning, Memory, and Cognition, 19(5), 1101-1114.
Connolly, T. (1982). On taking action seriously: Cognitive fixation in behavioral decision theory. In G. R. Ungson & D. N. Braunstein (Eds.), Decision making: An interdisciplinary inquiry (pp. 42-47). Boston, MA: Kent Publishing Company.
Conover, W. J. (1971). Practical nonparametric statistics. New York: Wiley.
Ellis, H. C., & Hunt, R. R. (1993). Fundamentals of cognitive psychology. Madison, WI: Brown & Benchmark.
Endsley, M. R. (1988a). Design and evaluation for situation awareness enhancement. Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 97-101). Santa Monica, CA: The Human Factors Society.
Endsley, M. R. (1988b, May). Situation awareness global assessment technique (SAGAT). Paper presented at the National Aerospace and Electronics Conference, Dayton, OH.
Endsley, M. R. (1989). Pilot situation awareness: The challenge for the training community. Proceedings of the Interservice/Industry Training Systems Conference (I/ITSC) (pp. 111-117). Fort Worth, TX: American Defense Preparedness Association.
Endsley, M. R. (1993a). Situation awareness and workload: Flip sides of the same coin. Proceedings of the 7th International Symposium on Aviation Psychology.
Endsley, M. R. (1993b). Situation awareness in dynamic human decision making: Theory. Proceedings of the 1st International Conference on Situational Awareness in Complex Systems. Orlando, FL.
Endsley, M. R. (1993c). A survey of situation awareness requirements in air-to-air combat fighters. The International Journal of Aviation Psychology, 3(2), 157-168.
Endsley, M. R. (1994). The role of situation awareness in naturalistic human decision making. Second Conference on Naturalistic Decision Making. Dayton, OH.
Endsley, M. R. (1995a). A taxonomy of situation awareness errors. In R. Fuller, N. Johnston, & N. McDonald (Eds.), Human factors in aviation operations (pp. 287-292). England: Avebury Aviation.
Endsley, M. R. (1995b). Towards a theory of situation awareness. Human Factors, 37(1), 32-64.
Ericsson, K. A., & Charness, N. (1994). Expert performance: Its structure and acquisition. American Psychologist, 49(8), 725-747.
Fracker, M. L. (1988). A theory of situation assessment: Implications for measuring situation awareness. Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 102-106). Santa Monica, CA: The Human Factors Society.
Fracker, M. L. (1989a). Attention allocation in situation awareness. Proceedings of the Human Factors Society 33rd Annual Meeting (pp. 1396-1400). Santa Monica, CA: The Human Factors Society.
Fracker, M. L. (1989b). Attention gradients in situation awareness. In Proceedings of the AGARD AMP Symposium on Situational Awareness in Aerospace Operations (CP 478). Neuilly-sur-Seine: NATO AGARD.
Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Goodman, G. S. (1980). Picture memory: How the action schema affects retention. Cognitive Psychology, 12, 473-495.
Graesser, A. C., Woll, S. B., Kolwalski, D. J., & Smith, D. A. (1980). Memory for typical and atypical actions in scripted activities. Journal of Experimental Psychology: Human Learning and Memory, 6, 503-515.
Hart, S. G. (1987). Background, description, and application of the NASA Task Load Index (TLX). Proceedings of the Department of Defense Human Factors Engineering Technical Advisory Group. Dayton, OH.
Holland, J. H., Holyoak, K. F., Nisbett, R. E., & Thagard, P. R. (1986). Rule-based mental models. In Induction: Processes of inference, learning, and discovery (pp. 29-65). Cambridge, MA: MIT Press.
Hollnagel, E. (1988). Mental models and model mentality. In L. P. Goodstein, H. B. Andersen, & S. E. Olsen (Eds.), Tasks, errors and mental models (pp. 261-268). London: Taylor & Francis.
Johnson-Laird, P. N. (1981). Mental models in cognitive science. In D. A. Norman (Ed.), Perspectives in cognitive science. Norwood, NJ: Ablex Publishing Co.
Johnson-Laird, P. N. (1983). Mental models: Towards a cognitive science of language, inference, and consciousness. Cambridge, MA: Harvard University Press.
Johnson-Laird, P. N. (1994). Mental models and probabilistic thinking. Cognition, 50, 189-209.
Just, M. A., & Carpenter, P. A. (1992). A capacity theory of comprehension: Individual differences in working memory. Psychological Review, 99(1), 122-149.
Kaempf, G. L., Klein, G., Thordsen, M. L., & Wolf, S. (1996). Decision making in complex naval command-and-control environments. Human Factors, 38(2), 220-231.
Klein, G. A. (1993). A recognition-primed decision (RPD) model of rapid decision making. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 138-147). Norwood, NJ: Ablex Publishing Corporation.
Klein, G. A., & Calderwood, R. (1991). Decision models: Some lessons from the field. IEEE Transactions on Systems, Man, and Cybernetics, 21(5), 1018-1026.
Klein, G. A., & Crandall, B. W. (1992). The role of mental simulation in problem solving and decision making. In J. M. Flach, P. A. Hancock, J. K. Caird, & K. J. Vicente (Eds.), An ecological approach to human machine systems II: Local applications. Hillsdale, NJ: Erlbaum.
Lipshitz, R. (1993). Converging themes in the study of decision making in realistic settings. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 103-137). Norwood, NJ: Ablex Publishing Corporation.
Mandler, J. M. (1979). Categorical and schematic organization in memory. In C. R. Puff (Ed.), Memory organization and structure (pp. 259-299). New York: Academic Press.
March, J. G. (1977). Bounded rationality, ambiguity, and the engineering of choice. The Bell Journal of Economics, 587-608.
Matsuo, N., Matsui, H., & Tokunaga, Y. (1991). Forming mental models in learning operating procedures for terminal equipment. IEEE Journal on Selected Areas in Communications, 9(4), 548-554.
Mogford, R. H., & Tansley, B. W. (1991). The importance of the air traffic controller's mental model. Presented at the Human Factors Society of Canada.
Murphy, G. L., & Medin, D. L. (1985). The role of theories in conceptual coherence. Psychological Review, 92, 289-316.
Orasanu, J., & Connolly, T. (1993). The reinvention of decision making. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 3-20). Norwood, NJ: Ablex Publishing Corporation.
Pennington, N., & Hastie, R. (1993). A theory of explanation-based decision making. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 188-201). Norwood, NJ: Ablex Publishing Corporation.
Pezdek, K., Whetstone, T., Reynolds, K., Askari, N., & Dougherty, T. (1989). Memory for real-world scenes: The role of consistency with schema expectation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 15(4), 587-595.
Pitz, G. F., & Sachs, N. J. (1984). Judgment and decision: Theory and application. Annual Review of Psychology, 35, 139-163.
Press, M. (1986). Situation awareness: Let's get serious about the clue-bird. Unpublished manuscript.
Raiffa, H. (1968). Decision analysis: Introductory lectures on choices under uncertainty. Reading, MA: Addison-Wesley.
Rojahn, K., & Pettigrew, T. F. (1992). Memory for schema-relevant information: A meta-analytic resolution. British Journal of Social Psychology, 31, 81-109.
Rouse, W. B., & Morris, N. M. (1986). On looking into the black box: Prospects and limits in the search for mental models. Psychological Bulletin, 100(3), 349-363.
Sarter, N. B., & Woods, D. D. (1991). Situation awareness: A critical but ill-defined phenomenon. International Journal of Aviation Psychology, 1(1), 45-57.
Schoemaker, P. J. H. (1982). The expected utility model: Its variants, purposes, evidence and limitations. Journal of Economic Literature, 20, 529-563.
Selcon, S. J., & Taylor, R. M. (1989). Evaluation of the situational awareness rating technique (SART) as a tool for aircrew systems design. In Situational Awareness in Aerospace Operations (AGARD-CP-478) (pp. 5/1-5/8). Neuilly-sur-Seine, France: NATO AGARD.
Selcon, S. J., Taylor, R. M., & Koritsas, E. (1991). Workload or situational awareness?: TLX vs. SART for aerospace systems design evaluation. Proceedings of the Human Factors Society 35th Annual Meeting (pp. 62-66). Santa Monica, CA: The Human Factors Society.
Shanteau, J. (1975). An information-integration analysis of risky decision making. In M. Kaplan & S. Schwartz (Eds.), Human judgment and decision processes (pp. 109-137).
Smith, K., & Hancock, P. A. (1995). Situation awareness is adaptive, externally directed consciousness. Human Factors, 37(1), 137-148.
Stiffler, D. R. (1987). Exploiting situational awareness beyond visual range (Report no. 87-2370). Maxwell AFB, AL: Air Command and Staff College/EDCC.
Taylor, R. M. (1989). Situational awareness rating technique (SART): The development of a tool for aircrew systems design. In Proceedings of the AGARD AMP Symposium on Situational Awareness in Aerospace Operations (CP 478). Neuilly-sur-Seine: NATO AGARD.
Tenney, Y. J., Adams, M. J., Pew, R. W., Huggins, A. W. F., & Rogers, W. H. (1992). A principled approach to the measurement of situation awareness in commercial aviation (NASA Contractor Report 4451). Langley Research Center.
Ulvila, J. W., & Brown, R. V. (1982). Decision analysis comes of age. Harvard Business Review, 60(5), 130-141.
Waddell, D. (1979). Situational awareness. USAF Fighter Weapons Review, 27(4), 3-5.
Wickens, C. D. (1992a). Engineering psychology and human performance (2nd ed.). New York: HarperCollins Publishers, Inc.
Wickens, C. D. (1992b). Workload and situation awareness: An analogy of history and implications. Insight, 14(4), 1-3.
Wilson, J. R., & Rutherford, A. (1989). Mental models: Theory and application in human factors. Human Factors, 31(6), 617-634.
APPENDIX A
SUMMARY OF SUBJECTS' DATA
[Data tables not recoverable from the source scan. For each subject (scenario), the tables recorded whether each introduced error (misidentification of aircraft type, flight plan error, or communication error), presented with a bizarre, irrelevant, unexpected, or absence-of-expected cue, was discovered or not discovered.]
APPENDIX B
CONTINGENCY TABLE ANALYSIS
The contingency table analysis was based on the procedure given in Chapter 4.1 of
Conover's Practical Nonparametric Statistics (Conover, 1971, p. 142). This procedure has
three assumptions:
1. Each sample is a random sample.
2. The two samples are mutually independent.
3. Each observation may be categorized into either class 1 or class 2.
The data in this contingency analysis are arranged as follows.

                 Class 1    Class 2    Total
Population 1       O11        O12       n1
Population 2       O21        O22       n2
                                        N = n1 + n2
The test statistic T for a two-tailed test is given by the following formula.

T = N(O11 O22 - O12 O21)² / [n1 n2 (O11 + O21)(O12 + O22)]
Hypothesis 1:

                Discovered Error    Did Not Discover Error    Total
Bizarre                6                     11                17
Irrelevant             0                     11                11
Total                  6                     22                28
T = 28(66 - 0)² / [(17)(11)(6)(22)] = 4.94

Decision rule: Reject H0 if T > χ².95.

Since T = 4.94 and χ².95 = 3.841, T > 3.841 => reject H0.
NOTE: Generally, cell counts in this type of analysis should be at least 5, since this test
is a large-sample approximation. However, since this rule is only an arbitrary rule of
thumb and the underlying assumptions of the analysis are met, the test statistic is useful in
this case (Conover, 1971, p. 143).
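
For illustration only, the following Python sketch (not part of the original analysis) computes the statistic defined above; the helper name chi_square_2x2 is hypothetical, and 3.841 is the .95 chi-square quantile with 1 degree of freedom used throughout this appendix. Applied to the Hypothesis 1 counts, it reproduces T = 4.94.

    def chi_square_2x2(o11, o12, o21, o22):
        # Conover's (1971) statistic for a 2x2 contingency table:
        # T = N(O11*O22 - O12*O21)^2 / (n1*n2*C1*C2),
        # where n1, n2 are row totals and C1, C2 are column totals.
        n1, n2 = o11 + o12, o21 + o22
        c1, c2 = o11 + o21, o12 + o22
        return (n1 + n2) * (o11 * o22 - o12 * o21) ** 2 / (n1 * n2 * c1 * c2)

    # Hypothesis 1: bizarre cues (6 discovered, 11 not discovered) vs.
    # irrelevant cues (0 discovered, 11 not discovered).
    t = chi_square_2x2(6, 11, 0, 11)
    print(round(t, 2))  # 4.94
    print(t > 3.841)    # True => reject H0 at the .05 level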
Hypothesis 2:

                        Discovered Error    Did Not Discover Error    Total
Unexpected                     9                      5                14
Absence of expected            8                     10                18
Total                         17                     15                32
T = 32(40 - 90)² / [(14)(18)(15)(17)] = 1.24

Decision rule: Reject H0 if T > χ².95.

Since T = 1.24 and χ².95 = 3.841, T < 3.841 => H0 cannot be rejected.
Hypothesis 1 without misidentification of aircraft type:

                Discovered Error    Did Not Discover Error    Total
Bizarre                6                      6                12
Irrelevant             0                      5                 5
Total                  6                     11                17

T = 17(30 - 0)² / [(12)(5)(6)(11)] = 3.86

Decision rule: Reject H0 if T > χ².95.

Since T = 3.86 and χ².95 = 3.841, T > 3.841 => reject H0.
Hypothesis 2 without misidentification of aircraft type:

                        Discovered Error    Did Not Discover Error    Total
Unexpected                     3                      5                 8
Absence of expected            2                     10                12
Total                          5                     15                20

T = 20(10 - 30)² / [(8)(12)(15)(5)] = 1.11

Decision rule: Reject H0 if T > χ².95.

Since T = 1.11 and χ².95 = 3.841, T < 3.841 => H0 cannot be rejected.
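
As a further check, again only a sketch (with the hypothetical chi_square_2x2 helper redefined so the snippet is self-contained), the remaining three tables reproduce the values reported above (1.24, 3.86, and 1.11).

    def chi_square_2x2(o11, o12, o21, o22):
        # Same 2x2 statistic as in the earlier sketch (Conover, 1971).
        n1, n2, c1, c2 = o11 + o12, o21 + o22, o11 + o21, o12 + o22
        return (n1 + n2) * (o11 * o22 - o12 * o21) ** 2 / (n1 * n2 * c1 * c2)

    tables = [
        ("Hypothesis 2", (9, 5, 8, 10)),                       # unexpected vs. absence of expected
        ("Hypothesis 1 w/o misidentification", (6, 6, 0, 5)),  # bizarre vs. irrelevant
        ("Hypothesis 2 w/o misidentification", (3, 5, 2, 10)), # unexpected vs. absence of expected
    ]
    for name, cells in tables:
        t = chi_square_2x2(*cells)
        print(name, round(t, 2), "reject H0" if t > 3.841 else "cannot reject H0")
    # Expected output: 1.24 (cannot reject), 3.86 (reject), 1.11 (cannot reject)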