Ann Bisantz and Emilie Roth (2007). Analysis of Cognitive Work. Reviews of Human Factors and Ergonomics, 3, 1. DOI: 10.1518/155723408X299825. Published by SAGE on behalf of the Human Factors and Ergonomics Society.
CHAPTER 1
Analysis of Cognitive Work
By Ann Bisantz & Emilie Roth
Cognitive task and work analyses are approaches to the analysis and support of cognitive
work (rather than primarily physical or perceptual activities). Although a variety of methods exist for performing cognitive task and work analyses, they share a common goal of providing information about two mutually reinforcing perspectives. One perspective focuses on
the fundamental characteristics of the work domain and the cognitive demands they impose.
The other focuses on how current practitioners respond to the demands of the domain. This
includes a description of the knowledge and skills practitioners have developed to operate
effectively as well as any limitations in knowledge and strategies that contribute to performance problems. This chapter provides a broad survey of cognitive task analysis and cognitive work analysis methods. Some of the methods highlight techniques for knowledge
gathering, whereas others focus on aspects of analysis and representation. Still other techniques emphasize process outputs, such as formal (computational) models of cognitive activities or design artifacts and associated rationales. In this chapter we review specific cognitive
task and work analysis methods and describe through illustration how these methods can
be adapted to meet specific project objectives and pragmatic constraints.
Cognitive task and work analyses are approaches to the analysis and support of cognitive work (rather than primarily physical or perceptual activities). Although types of cognitive task and work analysis span a variety of perspectives and methodologies, they share
the goal of providing information about two mutually reinforcing perspectives, which will
be emphasized throughout this chapter. One perspective focuses on the fundamental characteristics of the work domain and the cognitive demands they impose. The other focuses
on how current practitioners respond to the demands of the domain. This includes a characterization of the knowledge and strategies that domain practitioners have developed that
allow them to function at an expert level as well as limitations in knowledge and strategies that contribute to performance problems.
In this chapter we summarize and highlight aspects of numerous cognitive engineering methods that share the goals of cognitive task and work analyses. The chapter provides a broad survey of alternative, often complementary, methods and, using illustrative
cases, demonstrates how these methods can be adapted and combined to meet the goals
and pragmatic constraints of real-world projects.
Historical Context and the Changing Nature of Work
In the last quarter of the 20th century, high-profile system failures (e.g., Three Mile Island,
numerous aviation accidents, and military incidents such as the July 1988 accidental
shooting down of Iran Air Flight 655 by the USS Vincennes) provided evidence regarding
the need for specific attention to the cognitive activities associated with complex system
control, as well as the impetus for research and methodological developments in these
areas. Since that time, numerous researchers and practitioners have put forth methodologies intended to explicitly identify the requirements of cognitive work so as to be able to
anticipate contributors to performance problems (e.g., sources of high workload, contributors to error) and specify ways to improve individual and team performance, be it through
new forms of training, user interfaces, or decision aids.
These methodologies stem from, and extend, a century of research and applied methodologies that have focused on the improvement of human work through systematic analysis. This tradition can be traced back to early studies in areas of scientific management
that put forward the notion that work could be decomposed into fundamental, repeatable components (Taylor, 1911). Additional advances in work measurement identified
fundamental motions in work (e.g., grasp, reach), as well as unnecessary or inefficient
motions, and developed innovative methodologies for work analysis (e.g., using motion
pictures; Gilbreth & Gilbreth, 1919).
The focus of these early methods on observable, physical work elements was well suited
to the extensively manual work of the day. Refinements and applications of time-and-motion study, such as the development of predetermined time systems (Sellie, 1992), continued through much of the 20th century, providing a framework for task analysis methods
that allowed the physical, perceptual, and cognitive demands of task components to be
compared against human capabilities.
Methods for examining cognitive work emerged as an adaptation and extension of
these techniques in response to fundamental shifts in work that were driven by advances
in automation and computerization, from primarily manual, observable activities (or routinized interactions with technology) to complex (and more hidden) cognitive activities,
such as monitoring, planning, problem solving, and deciding (Schraagen, Chipman, &
Shalin, 2000).
Analysis and Support of Cognitive Work
Analyses of cognitive work have variously been referred to as cognitive task analyses (CTAs)
or cognitive work analyses, depending on their focus and scope. Although we are sensitive
to these distinctions, we have chosen here to focus on an eclectic and purposefully broad
set of methods that share the goal of analysis and support of cognitively complex work.
Therefore, our use of the terms task analysis and work analysis should be interpreted
throughout this chapter in a general and somewhat interchangeable sense.
CTAs typically produce descriptions of domain characteristics that shape and constrain
cognitive and collaborative performance as well as descriptions of the knowledge and
strategies that underlie the performance of individuals operating in that domain. Because
CTAs are generally conducted with an applied purpose in mind, they also typically include
design recommendations regarding systems facets such as information displays, strategies
for adaptive and dynamic deployment of automation, and/or recommendations for training. Cognitive analyses have also been used to guide other aspects of complex system
analysis and design (e.g., personnel selection; manning and function allocation decisions) or as input to workload analysis and human reliability modeling.
Performing a cognitive analysis of complex human-system interaction necessarily encompasses knowledge-gathering activities to learn about the system and complexities in
question and the practitioners’ knowledge and skill that allow them to cope with system
complexity. It also requires analysis activities to synthesize and draw conclusions consistent with project goals. The output of the analysis can take various representational forms,
such as text descriptions, summary tables, diagrams, and computational models.
Although all CTA methods necessarily involve knowledge gathering, analysis, and representation of results, some CTA methods highlight techniques for knowledge gathering,
whereas others focus on aspects of analysis and representation. Still other techniques emphasize process outputs, such as formal (computational) models of cognitive activities or
design artifacts and associated rationales.
Chapter Organization
In this chapter we provide an overview of the kinds of information that CTA methods are
intended to extract and represent and a survey of specific methods available for knowledge acquisition and representation. The next section introduces two mutually informing
perspectives that are important to keep in mind when performing a CTA: the need to analyze domain characteristics that serve to shape and constrain cognitive performance, and
the need to analyze the knowledge, skills, and strategies of domain practitioners. We review
CTA methods and applications that are representative of each of these two perspectives.
Ultimately, both types of information are required to gain a full understanding of the
factors that influence practitioner performance and to identify opportunities for more
effective support.
Next, we survey knowledge acquisition, analysis, and representation methods used in
performing CTAs. We provide both an overview of knowledge acquisition techniques and
a description of ways of representing and communicating the output of CTA analyses.
We then review methods that are closely related to, and sometimes integrated with,
CTA. This includes task-analytic approaches as well as computational models of cognitive task performance.
We next return to the theme that CTA is fundamentally about uncovering the demands
of the domain and the knowledge and strategies that practitioners have developed in response. We show through illustration that specific CTA methods can be “mixed and
matched” and modified to meet the objectives and pragmatic constraints of particular
projects.
We end with a discussion of ongoing and future research directions regarding CTA
methodologies, including macroergonomic approaches, software support, and the integration of CTA methods within the larger systems design process.
MUTUALLY REINFORCING CTA PERSPECTIVES
Two mutually reinforcing perspectives are needed to fully understand the factors that
contribute to cognitive performance and opportunities for improving performance (see
Figure 1.1). One perspective involves analysis of the characteristics of a domain that
impose cognitive demands. This includes examination of the physical environment, socio-organizational context, technical system (or systems), and task situations that domain
practitioners confront. The second perspective examines the goals, motivations, knowledge, skills, and strategies that are used by domain practitioners when confronting task
situations.
Analysis of domain characteristics provides the framework for understanding the goals
and constraints in the domain, the task situations and complexities that domain practitioners are likely to encounter, the cognitive demands that arise, and the opportunities
that might be available to facilitate cognitive and collaborative performance. For instance,
analysis can identify interacting goals in the domain that can complicate practitioner
decision making; what information is available to practitioners and whether key needed
information is missing or unreliable; and, more generally, inherent performance limitations that are attributable to characteristics of the task or current technologies.
Documenting domain characteristics also defines the requirements for effective performance and support, including the information that needs to be sensed to allow operator control, constraints and interactions that should be displayed, and contexts in which automation or other aids could be effectively deployed.

Figure 1.1. A cognitive analysis requires consideration of two perspectives: examination of domain characteristics and constraints that impose cognitive demands on domain practitioners, which include components of the task, technical system, social and organizational structure, and physical environment; and examination of the goals, knowledge, skills, and strategies that domain practitioners utilize in response.
The second, complementary, perspective examines the goals, motivations, knowledge,
skills, and strategies of domain practitioners. This perspective provides insight into the
knowledge, skills, and strategies that enable domain practitioners to operate at an expert
level as well as the cognitive factors that limit the performance of less experienced individuals (e.g., incomplete or inaccurate mental models). The results can be used to identify opportunities to improve performance either through training (e.g., to bring less
experienced personnel to the level of experts) or through the introduction of systems that
more effectively support cognitive performance (e.g., eliminating the need for the expert
strategies that compensate for poor designs).
CTA researchers and practitioners have typically emphasized one perspective or the
other; some tend to emphasize the need to uncover the knowledge and skills underlying
performance (e.g., Klein, 1998) and others emphasize the need to analyze characteristics
of the domain that serve to shape cognitive and collaborative performance (Rasmussen,
1986; Rasmussen, Pejtersen, & Goodstein, 1994; Sanderson, 2003; Vicente, 1999). In the
following two sections we provide an overview of work that is representative of each of
these perspectives. It needs to be stressed that the two perspectives are clearly mutually
informing. Importantly, the demands of the tasks interact with practitioner expertise, embedded work practices, and environmental supports to make aspects of system control
more or less challenging. To effectively support system design and performance-aiding
efforts, CTAs must reveal these complex interdependencies. Ultimately, therefore, both
perspectives need to be taken into account for a full picture of the factors that influence
practitioner performance and the opportunities available to more effectively support performance (Hoffman & Lintern, 2006; Potter, Roth, Woods, & Elm, 2000).
Understanding Domain Characteristics and Constraints
In order to aid complex cognitive work, one must understand the performance-shaping
factors of the domain within which that work is performed. Human activity can be understood not only in terms of tasks, procedures, or decisions but also in terms of the constraints that restrict, and the goals that provide direction to, action.
Vicente (1990) provided a convincing argument regarding the degree to which an in-depth understanding of the environment in which humans operate is not only helpful but
necessary to make sense of and support performance in complex, unpredictable environments. Vicente quoted an example from Simon (1981) regarding an ant traveling across
a beach: Although the path taken by the ant is irregular, the complexity is a function of
the beach’s irregular surface, not of the ant. One can observe a similar example when flying at night: Whereas the city boundaries are visible from the patterns of lights, the reasons for their complexity are revealed only when one can see the underlying geography of
mountains, valleys, lakes, and rivers. Vicente (1990) described three factors that influence
the actions that an ant (or a person) will take to reach the same goal state: the state of the
system at the time the goal-directed activity begins; external, unpredictable disturbances
for which the operator must compensate; and individual differences in strategy. Thus a
successful task analysis methodology must provide a description of the work domain as
well as tasks and strategies.
Cognitive engineering methodologies have been developed to provide such a description. Roth and Woods (1988), for instance, provided a description of a competence model
necessary for the successful operation of a nuclear power plant. A competence model
characterizes essential complexities in the domain (e.g., competing goals and nonlinear
system dynamics) that experts have to manage and the strategies that are needed for accomplishing tasks in the face of these difficulties. Woods and Hollnagel (1987) provided
a more formal representation of the goals of a nuclear power plant and the functional
means available in the system to accomplish them.
Rasmussen’s abstraction hierarchy (Rasmussen, 1986; Rasmussen et al., 1994; Vicente,
1999) is a commonly adopted framework for representing a complex system at multiple
levels of abstraction—from the physical form and objects in the system; to processes,
functions, constraints, or abstract laws; to the highest-level purposes for which the system
was designed (see Figure 1.2 for an example). Key aspects of this representation include
the fact that levels differ in the manner in which they represent the system (goals vs. objects) rather than in the level of detail, and that the links between nodes represent means-ends relationships. Lower-level nodes provide the means by which higher-level goals are
accomplished, and the higher-level nodes are the reasons for the existence of lower-level
nodes. Importantly, therefore, nodes are decomposed not into the activities or human actions that are deployed to accomplish a goal or function but, rather, into the functions,
processes, and objects that are part of the system. The abstraction hierarchy has been used
in performing work domain analyses as part of a more comprehensive cognitive engineering methodology called cognitive work analysis (Vicente, 1999).
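The essentials of this representation (named levels of abstraction connected by directed means-ends links, where asking "how?" moves down the hierarchy and asking "why?" moves up) can be captured in a small data structure. The sketch below, in Python, is a hypothetical illustration rather than an implementation from the chapter; the level and node names are loosely adapted from the emergency response example in Figure 1.2.

# Minimal sketch (not from the chapter) of an abstraction hierarchy as a data
# structure: nodes sit at named levels of abstraction, and directed means-ends
# links connect a higher-level node (an "end") to the lower-level nodes (the
# "means") that realize it. Level and node names loosely follow Figure 1.2 and
# are illustrative, not a complete model.
from collections import defaultdict

LEVELS = ["functional purpose", "abstract function",
          "operations processes", "physical function", "physical form"]

class AbstractionHierarchy:
    def __init__(self):
        self.level_of = {}                 # node name -> level name
        self.means = defaultdict(set)      # end node -> nodes that realize it
        self.ends = defaultdict(set)       # means node -> nodes it serves

    def add_node(self, name, level):
        assert level in LEVELS, f"unknown level: {level}"
        self.level_of[name] = level

    def link(self, end, means):
        """Record that `means` (lower level) is a way of achieving `end`."""
        assert LEVELS.index(self.level_of[means]) > LEVELS.index(self.level_of[end])
        self.means[end].add(means)
        self.ends[means].add(end)

    def how(self, node):
        """Lower-level means by which a node is accomplished ("how?")."""
        return sorted(self.means[node])

    def why(self, node):
        """Higher-level reasons a node exists ("why?")."""
        return sorted(self.ends[node])

ah = AbstractionHierarchy()
ah.add_node("casualty management", "functional purpose")
ah.add_node("coordinating medical operations", "operations processes")
ah.add_node("mobile resources (ambulances)", "physical function")
ah.link("casualty management", "coordinating medical operations")
ah.link("coordinating medical operations", "mobile resources (ambulances)")

print(ah.how("casualty management"))            # means of the top-level purpose
print(ah.why("mobile resources (ambulances)"))  # reasons the resource exists

The how() and why() queries simply mirror the means-ends reading of the hierarchy described above: decomposition is into functions, processes, and objects of the system, not into human actions.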
Ecological interface design (EID; Burns & Hajdukiewicz, 2004; Vicente, 2002; Vicente
& Rasmussen, 1992) is a framework based on work domain analysis as well as other aspects
of cognitive work analysis that support the development of human-computer interfaces
for complex systems. Here, a work domain analysis (typically, using an abstraction hierarchy representation) identifies information requirements (associated with all levels of
the hierarchy) necessary to allow effective control under different circumstances. Additionally, the EID approach focuses on allowing operators to act whenever possible at less
effortful skill- and rule-based levels while still providing information necessary for
knowledge-based reasoning when required. Importantly, identifying information requirements associated with a system’s purposes, functions, and physical objects, compared with
requirements associated with specific tasks and activity sequences, makes it possible for
operators to reason about the system in unexpected circumstances (Vicente, 2002). EID
has been applied in a number of domains, such as nuclear power (Itoh, Sakuma, & Monta,
1995) and computer network management (Burns, Kuo, & Ng, 2003). Sanderson and Watson
(2005) applied EID principles to the design of auditory alerts in a medical environment.
Lintern (2006) applied work domain analysis to describe the goals, functions, and
physical resources of an insurgency operation in order to aid intelligence analysts. He
augmented nodes in the abstraction hierarchy with activity descriptions derived from a
scenario narrative provided by a subject matter expert. This analysis was used to develop
a prototype computer workspace to support insurgency analysis, in which information
from levels and nodes of the work domain model was instantiated in information panels on a large-format display.

Figure 1.2. Portion of an abstraction hierarchy work domain model for an emergency response system. Levels shown range from functional purpose (e.g., casualty management) through abstract function (e.g., resource balance, risk-benefit balance), general and operations processes (e.g., coordinating and implementing medical and shelter operations), and physical function (e.g., mobile resources such as ambulances and fire engines; fixed-location resources such as hospitals, 911 call centers, and fire stations; and human resources such as police, EMT, and fire personnel) to physical form (e.g., the state, location, and capacity of buildings and the number, condition, and location of mobile resources and personnel). Based on work from Bisantz, Little, and Rogova (2004).
Work domain analysis has application beyond interface design. Naikar (2006) described
the use of work domain analysis for identifying training needs and training system requirements for a fighter aircraft, for comparing and evaluating competing design proposals for a new military system, for designing team structures, and for developing training
strategies that manage human error. For instance, values and priority functions (abstract
functions) identified for a fighter aircraft, such as minimizing collateral damage, suggested the need to both explicitly train on and measure these variables. For design evaluations, the work domain analysis framework enabled physical objects, whose technical assessment is a typical evaluation step, to also be evaluated against higher-level processes, functions, and goals. Therefore, system components were evaluated not just in
terms of the degree to which they met technical performance criteria but also in terms of
their significance to the overall sociotechnical system.
Bisantz et al. (2003) incorporated work domain analysis as part of the initial design
phase for a new naval vessel. Among other things, the analysis revealed that the same
planned weapon system was to be used to support multiple potentially conflicting goals.
An implication of the analysis was that either the physical system needed to be redesigned
to eliminate the potential goal conflict or procedures would need to be put in place
reflecting how the use of that resource would be prioritized in case of goal conflict situations. Similarly, the analysis revealed how the operational processes of moving the ship
and emitting signals from sensor systems were both means associated with the function
of sensing but that their use at a particular point in time could differentially affect the
defensive and offensive purposes of the ship. This revealed a need for mutual awareness
and close communication among operators involved in the two functions.
Applied cognitive work analysis (Elm, Potter, Gualtieri, Easter, & Roth, 2003) is a comprehensive design methodology that also integrates an explicit representation of the work
domain. Here, the domain analysis results in a functional abstraction network (FAN), which
represents goals along with associated processes and system components. This network is
linked to, and provides the basis for, additional stages of analysis, including information
requirements and representation design. Potter, Gualtieri, and Elm (2003) described an
application of this methodology to military command and control in which the FAN was
used to represent abstract concepts such as “combat power” as well as high-level goals of
complying with military law and sociopolitical constraints. Subsequent stages of analysis
supported the development of innovative displays that visually represented levels of combat power to support commander decision making.
Uncovering Practitioner Knowledge and Strategies
The complementary goal of CTA is to understand and represent the knowledge of domain
practitioners and the problem-solving and decision-making strategies that they use to perform tasks. This tradition has its roots in cognitive psychology and cognitive science, in
which there was an attempt to understand the nature of expertise (Chase & Simon, 1973;
Chi, Glaser, & Farr, 1988; Hoffman, 1987).
One of the classic strategies for uncovering the basis of expert performance is to
compare the performance of experts with that of less experienced individuals. Early studies examined the performance of experts under controlled laboratory conditions to understand what made experts different from novices. For example, Chi, Feltovich, and Glaser
(1981) conducted laboratory studies comparing the performance of individuals with different levels of expertise in physics on simple tasks such as sorting and classifying different physics problems. They were able to show differences between experts and novices in
their organizational structure of knowledge.
Comparing the performance of individuals of different levels of expertise continues to
be a powerful technique for uncovering the basis of expertise. For example, Dominguez
(2001) used this approach to reveal differences between staff surgeons and residents in their
awareness of boundary conditions for safe operation in laparoscopic surgery.
Collecting think-aloud protocols is another classic strategy for understanding the nature
of expertise that has its roots in cognitive science (Ericsson & Simon, 1993). Individuals
are asked to “think aloud” as they attempt to solve problems. Protocol analyses of their
utterances and actions can be used to map the detailed knowledge and reasoning that individuals use in solving the problems. The results can be used to inform the design of training or support systems (Hall, Gott, & Pokorny, 1995; Means & Gott, 1988).
CTA methods have also been used to understand and model the process of decision
making under real-world conditions, which has come to be referred to as naturalistic
decision making (Klein, 1998). Klein and his colleagues developed a variety of structured
interview techniques to uncover how experts make decisions in high-risk, uncertain, and
time-pressured domains such as firefighting and clinical nursing (Klein, 1998). The research
has led to recognition-primed models of expert decision making. These models stress the
importance of situation recognition processes that rely on subtle cues and mental simulation processes that enable experts to make effective decisions in dynamic, time-pressured
situations.
Cognitive work can be examined at different “grains” of analysis. For some purposes,
it is appropriate to model the elemental mental processes that underlie performance (e.g.,
visual scanning, retrieval of information from long-term memory, short-term memory
storage of information, specific mental computations, attention shift). Gray and Boehm-Davis (2000) demonstrated that analyses of mental processes at the millisecond level could
inform the design of improved user interfaces. Techniques suited for microlevel analyses
include think-aloud protocols, keystroke capture, and eye movement data because they
capture detailed mental and physical activity. Examples of studies that have used this approach include an investigation by Seagull and Xiao (2001), who used eye-tracking video
data to examine the detailed visual sampling strategies of medical staff performing tracheal intubations, and a study by Luke, Brook-Carter, Parkes, Grimes, and Mills (2006), who
examined visual strategies of train drivers.
Generally, cognitive work has been analyzed at a more “macrograin” level of analysis,
sometimes referred to as macrocognition, in which the focus is on describing information-gathering, decision-making, and collaborative strategies rather than the elemental cognitive processes (Klein, Ross, Moon, Klein, & Hollnagel, 2003). Examples include a study
that examined the strategies by which railroad dispatchers managed the multiple demands
placed on track usage to maintain efficiency and safety (Roth, Malsch, Multer, & Coplen,
1999); a study that examined the strategies used by experienced hackers to attack a
computer network (Stanard et al., 2004); and research that examined how emergency ambulance dispatchers keep track of ambulances and make ambulance allocation decisions
(Chow & Vicente, 2002).
CTA methods have also been used to reveal sources of vulnerability and contributors
to error. For example, Patterson, Roth, and Woods (2001) examined the information search
strategies of intelligence analysts under simulated data overload conditions. Figure 1.3
shows the typical analysis process that the intelligence analysts used to search for and integrate information. Patterson et al. (2001) were able to identify a number of suboptimal
strategies, such as premature closure, that created the potential for incomplete or inaccurate analysis. Figure 1.4 illustrates how participant search strategies often caused analysts
to fail to locate and exploit “high-profit” documents that contained more complete and
accurate information.
Analysis of domain practitioner strategies can provide the basis for defining new system design requirements (e.g., Bisantz et al., 2003). For example, CTA methods can uncover
work-around strategies that experienced practitioners have developed to compensate for
system limitations (e.g., Mumaw, Roth, Vicente, & Burns, 2000; Roth & Woods, 1988). The
examination of these strategies can provide the basis for establishing cognitive support
requirements to guide new system design. Similarly, CTA methods can provide insight
into critical features in the current environment that are exploited by experienced domain
practitioners and that should be preserved or otherwise reproduced as new technology is
introduced (e.g., Roth, Multer, & Raslear, 2006; Roth & Patterson, 2005).
Although the discussion thus far has focused on empirical analyses aimed at providing descriptive models of the knowledge and strategies of domain practitioners in the current environment, cognitive analyses can also be performed to develop formative models
that specify the cognitive requirements for effective task performance without reference
to the actual performance of domain practitioners (Vicente, 1999). This approach is relevant when trying to analyze the cognitive demands that are likely to be imposed by a
system design that does not yet exist or to compare the impact in terms of cognitive performance requirements of alternative envisioned designs.
Decision ladders provide one formalism for representing the knowledge and information-processing activities necessary to make a decision or achieve a goal (Rasmussen, 1983).
Nehme, Scott, Cummings, and Furusho (2006) used this approach to develop information
and display requirements for futuristic unmanned systems for which no current implementations exist. They used a decision ladder formalism to map out the monitoring,
planning, and decision-making activities that would be required of operators of these
systems. Callouts were then used to specify information and display requirements in order
to support the corresponding cognitive tasks.
Empirical techniques have also been used to explore how changes in technology and
training are likely to affect practitioner skills, strategies, and performance vulnerabilities
(Woods & Dekker, 2000). Techniques include using concrete scenarios or simulations
of the cognitive demands that are likely to be confronted. Woods and Hollnagel (2006)
referred to these methods as staged-world techniques. One example is a study that used
a high-fidelity training simulator to explore how new computerized procedures and
advanced alarms were likely to affect the strategies used by nuclear power plant crews to
coordinate activities and maintain shared situation awareness (Roth & Patterson, 2005).
Figure 1.3. Typical analysis process used by intelligence analysts to search a document database and synthesize results to formulate a response to an analysis query. The depicted process includes querying with keywords and query refinement, browsing by title and/or date, selecting "key" documents, corroborating information (or resolving discrepancies) and filling gaps with support documents, and synthesizing information to construct a coherent story. Reprinted from Patterson, E. S., Roth, E. M., & Woods, D. D. Predicting vulnerability in computer-supported inferential analysis under data overload. Cognition, Technology & Work, 3, 224–237. Copyright 2001, with kind permission of Springer Science and Business Media.

Figure 1.4. Information-sampling process employed by intelligence analysts in the Patterson et al. (2001) study. The largest circle represents articles in the database. Internal circles represent articles returned from database search queries. The thick-circumference circle represents articles that were read. The filled-in circles represent which of the documents were "high profit" in the sense of containing extensive accurate information, which were "key" in the sense of being relied upon heavily by participants, and which of the key documents were also high profit. Reprinted from Patterson, E. S., Roth, E. M., & Woods, D. D. Predicting vulnerability in computer-supported inferential analysis under data overload. Cognition, Technology & Work, 3, 224–237. Copyright 2001, with kind permission of Springer Science and Business Media.
Another example is a study by Dekker and Woods (1999) that used a future incident technique to explore the potential impact of contemplated future air traffic management
architectures on the cognitive demands placed on domain practitioners. Controllers, pilots,
and dispatchers were presented with a series of future incidents to jointly resolve. Examination of their problem solving and decision making revealed dilemmas, trade-offs, and
points of vulnerability associated with the contemplated architectures; this enabled practitioners and developers to think critically about the requirements for effective performance for these envisioned systems.
Cognitive analyses that capture the knowledge and strategies of domain practitioners
have application beyond design. For example, they have been used to support development
of training (Johnson et al., 2006; O’Hare, Wiggins, Williams, & Wong, 1998; Schaafstal,
Schraagen, & van Berlo, 2000; Seamster, Redding, & Kaempf, 1997) as well as specification of proficiency evaluation requirements (Cameron et al., 2000). More recently, CTAs
have been used as a means to capture domain expertise for archival purposes. For example, government and private sector organizations have found a need to capture expert
knowledge from individuals who are about to retire so as to preserve and transmit the
corporate knowledge (Hoffman & Hanes, 2003; Klein, 1992).
CTA METHODS
CTA methods provide knowledge acquisition techniques for collecting data about the
knowledge and strategies that underlie performance as well as methods for analyzing and
representing the results. Schraagen et al. (2000) provided a broad survey of different CTA
approaches. Crandall, Klein, and Hoffman (2006) produced an excellent “how-to” handbook with detailed practical guidance on how to perform a CTA. In this section we describe
some of the most widely used knowledge acquisition and representation methods, highlighting some of the key factors that distinguish among methods.
Knowledge Acquisition Methods
Effective knowledge acquisition depends on an understanding of the factors that enable
domain practitioners to more easily access and describe their own knowledge and thought
processes. Extensive psychological research suggests that self-reports of memory and decision processes can often be inaccurate (e.g., Banaji & Crowder, 1989; Nisbett & Wilson,
1977). The key is to understand the conditions under which self-reports are likely to be
reliable. Reviews of relevant factors that contribute to accurate self-reports can be found
in Ericsson and Simon (1993), Leplat (1986), and Roth and Woods (1989).
Roth and Woods (1989) highlighted three dimensions that affect the quality of the
information obtained. One important factor is the specificity of the information being
elicited. Domain practitioners are likely to give a more accurate and complete description
of their reasoning process and the factors that influence their thinking when asked to
describe a specific example than when asked a general question, such as “How do you
generally approach a problem?” or “Can you describe your typical procedure?” (Hoffman
& Lintern, 2006). A second important factor is how similar the conditions under which
knowledge acquisition is conducted are to the actual “field” conditions in which the domain practitioners operate. The more the acquisition context allows the domain practitioner to display his or her expertise, rather than reflect on it, the more valid the results
will be. Thus domain practitioners can more easily demonstrate how they perform a task
in the actual work context than describe the task outside the work context.
The third important factor relates to the interval between when the information was
experienced or attended to by the domain practitioner and the time he or she is asked
about it. Think-aloud protocols conducted while a person is engaged in performing a task
are the most effective. Retrospective reports, in which a person is asked to describe tasks
or events that occurred in the distant past, are less likely to be reliable. When retrospective reports must be used, they can be improved by providing effective retrieval cues. For
example, Hoc and Leplat (1983) demonstrated that a cued retrospective methodology, in
which people are asked to describe how they went about solving a problem while watching a videotape of their own performance, improved the quality of the information they
provided.
A variety of specific techniques for knowledge acquisition have been developed that
draw on basic principles and methods of cognitive psychology (Cooke, 1994; Ericsson
& Simon, 1993; Hoffman, 1987). Although there are many specific knowledge acquisition
methods, fundamentally they can be classified into methods that primarily involve interviewing domain practitioners and those that primarily involve observing domain practitioners engaged in domain-relevant tasks. The next two sections describe methods that
fall into each of these classes.
Interview approaches. Interviews are among the most common knowledge acquisition
methods. Unstructured interviews are free-form interviews of domain practitioners in
which neither the content nor the sequence of the interview topics is predetermined
(Cooke, 1994). Unstructured interviews are most appropriate early in the knowledge acquisition process, when the analyst is attempting to gain a broad overview of the domain
while building rapport with the domain practitioners. More typically, CTA analysts will
use a semistructured interview approach, in which a list of topics and candidate questions
is generated ahead of time, but the specific topics and the order in which they are covered
are guided by the responses obtained (e.g., Mumaw et al., 2000; Roth et al., 1999).
Structured interview techniques utilize a specific set of questions in a specific order. A
number of structured and semistructured interview techniques for CTA have been developed. One of the most widely used structured CTA interview techniques is the critical
decision method (CDM), developed by Klein and his colleagues (Hoffman, Crandall, &
Shadbolt, 1998; Klein & Armstrong, 2005; Klein, Calderwood, & MacGregor, 1989). The
CDM is a structured approach for analyzing actual challenging cases that the domain
practitioner has experienced. It is a variant of the critical incident technique, developed
by Flanagan (1954) for analyzing critical cases that have occurred in the past. Analysis of
actual past cases provides a valuable window for examining the cognitive demands inherent in a domain. The incidents can be analyzed to understand what made them challenging and why the individuals who confronted the situation succeeded or failed (Dekker,
2002; Flanagan, 1954).
A CDM session includes four interview phases, or "sweeps," that examine a past
incident in successively greater detail: The first sweep identifies a complex incident that
has the potential to uncover cognitive and collaborative demands of the domain and the
basis of domain expertise. In the second sweep a detailed incident timeline is developed
that shows the sequence of events. The third sweep examines key decision points more
deeply by using a set of probe questions (e.g., “What were you noticing at that point?”
“What was it about the situation that let you know what was going to happen?” “What
were your overriding concerns at that point?”). Finally, the fourth sweep uses “what if ”
queries to explore the space of possibilities more broadly. For example, "what if" questions
are used to probe for potential expert/novice differences (e.g., whether someone else, perhaps with less experience, might have responded differently). The output is a description
of the subtle cues, knowledge, goals, expectancies, and expert strategies that domain experts
use to handle cognitively challenging situations. The CDM has been successfully employed to analyze the basis of expertise in a variety of domains, such as firefighting, neonatal caregiving, and intelligence analysis (Baxter, Monk, Tan, Dear, & Newell, 2005; Hutchins, Pirolli,
& Card, 2003; Klein, 1998).
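Because the CDM prescribes a fixed sequence of sweeps, each with a purpose and characteristic probes, the interview structure can be written down as a simple template. The sketch below, in Python, is a hypothetical illustration only: the sweep purposes are paraphrased and the probe questions are quoted from the description above, whereas the data layout and the decision-point note format are our assumptions, not part of the published method.

# Hypothetical sketch of a CDM interview template. The four sweeps and the
# sample probes follow the description in the text; the data layout itself is
# an illustrative assumption, not part of the published method.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Sweep:
    name: str
    purpose: str
    probes: List[str] = field(default_factory=list)

CDM_SWEEPS = [
    Sweep("Incident identification",
          "Select a complex past incident likely to reveal cognitive and "
          "collaborative demands and the basis of domain expertise."),
    Sweep("Timeline construction",
          "Build a detailed incident timeline showing the sequence of events."),
    Sweep("Deepening",
          "Examine key decision points with probe questions.",
          ["What were you noticing at that point?",
           "What was it about the situation that let you know what was going to happen?",
           "What were your overriding concerns at that point?"]),
    Sweep("What-if queries",
          "Explore the space of possibilities, e.g., whether someone with less "
          "experience might have responded differently."),
]

@dataclass
class DecisionPointNote:
    """One analyst note captured during the deepening sweep (assumed format)."""
    timeline_event: str
    cues: List[str]
    goals: List[str]
    expectancies: List[str]
    strategy: str

# Example use: iterate over the sweeps to print an interview guide.
for i, sweep in enumerate(CDM_SWEEPS, start=1):
    print(f"Sweep {i}: {sweep.name}: {sweep.purpose}")
    for probe in sweep.probes:
        print(f"   probe: {probe}")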
Concept mapping is another structured interview technique that is widely used to uncover and document the knowledge and strategies that underlie expertise (Crandall et al.,
2006). In concept mapping knowledge elicitation, the CTA analyst helps domain practitioners build up a representation of their domain knowledge using concept maps. Concept
maps are directed graphs made up of concept nodes connected by labeled links. They are
used to capture the content and structure of domain knowledge that experts employ in
solving problems and making decisions.
Whereas many structured interview techniques are conducted with a single domain
practitioner as interviewee, concept mapping is typically conducted in group sessions that
include multiple domain practitioners (e.g., three to five) and two facilitators. One facilitator provides support in the form of suggestions and probe questions, and the second
facilitator creates the concept map based on the participants’ comments for all to review
and modify. The output is a graphic representation of expert domain knowledge that can
be used as input to the design of training or decision aids. See Figure 1.5 for an example
of a concept map that depicts a meteorology expert's knowledge of cold fronts in Gulf Coast weather (Hoffman, Coffey, Ford, & Novak, 2006). It was created using a
software suite called CmapTools (Institute for Human and Machine Cognition, 2006).
Icons below the nodes provide hyperlinks to other resources (e.g., other Cmaps and digital images of radar and satellite pictures; digital videos of experts).
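Because a concept map is a directed graph whose links carry relation labels, the knowledge elicited in a session can be stored in a very simple structure. The sketch below, in Python, is a hypothetical illustration; the chapter's maps were built with the CmapTools software rather than with code, and the weather-related propositions and file name in the example are invented stand-ins for the kind of content shown in Figure 1.5.

# Hypothetical sketch of a concept map as a directed graph with labeled links.
# Each proposition is a (concept, linking phrase, concept) triple. The example
# propositions below are invented placeholders, not content from Figure 1.5.
from collections import defaultdict

class ConceptMap:
    def __init__(self):
        self.propositions = []              # (source, label, target) triples
        self.out_links = defaultdict(list)  # concept -> [(label, target)]
        self.resources = defaultdict(list)  # concept -> attached resources

    def add(self, source, label, target):
        self.propositions.append((source, label, target))
        self.out_links[source].append((label, target))

    def attach_resource(self, concept, url_or_path):
        """Mimics CmapTools-style icons that hyperlink a node to other resources."""
        self.resources[concept].append(url_or_path)

    def read_out(self, concept):
        """Render a concept's propositions as sentences for expert review."""
        return [f"{concept} {label} {target}"
                for label, target in self.out_links[concept]]

cmap = ConceptMap()
cmap.add("cold front", "can bring", "rapid wind shift")         # invented example
cmap.add("cold front", "is tracked with", "radar imagery")      # invented example
cmap.attach_resource("radar imagery", "radar_loop_example.gif") # hypothetical file

print(cmap.read_out("cold front"))

Reading the propositions back to the group as plain sentences is one way the facilitators can check that the map reflects what the domain practitioners intended.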
Other CTA methods that rely on interviews include the applied cognitive task analysis method (ACTA; Militello & Hutton, 1998) and the goal-directed task analysis method
(Endsley, Bolte, & Jones, 2003). ACTA was designed specifically to guide less experienced
cognitive analysts in performing a CTA. The goal-directed task analysis method provides
another example of a CTA method that is based on semistructured interviews. Its focus
is on deriving information requirements to support the design of displays and decision
aids intended to foster situation awareness.
Observational methods. A second common method of data collection to support cognitive task and work analyses is the observation of domain practitioners as they perform
domain tasks. Observational methods used in cognitive engineering research are informed by a number of traditions, including the case study and ethnographic approaches used in social science (Blomberg, Giacomi, Mosher, & Swenton-Wall, 1993; Hammersley & Atkinson, 1983; Lincoln & Guba, 1985; Yin, 1989) as well as industrial engineering techniques of work analysis (Salvendy, 2001).

Figure 1.5. An example of a concept map. This concept map represents the knowledge of an expert in meteorology regarding Gulf Coast weather. Figure courtesy of R. R. Hoffman, Institute for Human and Machine Cognition.
Bisantz and Drury (2005) noted that the use of observation methods can vary along a
number of key dimensions, many of which are relevant to the use of these methods for
CTA. These choices include the setting for observations, whether they are drawn from real-life or videotaped sessions, and the use of other forms of data that are collected and combined with observations.
Observations to support CTA can occur in a variety of settings, including actual work
environments, high-fidelity simulations of work environments (e.g., cockpit simulators),
and laboratories. Additionally, observations can occur during actual work, during training exercises, or while operators are performing analyst-provided work tasks. The recorded
output of observations made to support CTAs can vary from unstructured, opportunistic field notes (informed by the analysts’ expertise and goals) to more structured observations based on predetermined categories.
Observations are often made in real time as work activities are unfolding (e.g., Roth
et al., 2004). Roth and Patterson (2005) emphasized that naturalistic observations taken
in real settings allow analysts to understand the full complexity of the work environment.
This includes understanding the complexities and cognitive demands faced by domain
practitioners and the strategies developed by domain practitioners to cope with demands.
Observational studies are particularly useful for identifying mismatches between how
work is depicted in formal processes and procedures and how it is actually performed,
often revealing "home-grown" tools and work-arounds that domain practitioners generate to cope with aspects of task complexity that are not well supported (e.g., Roth, Scott,
et al., 2006). Divergence between so-called canonical descriptions of work and actual work
practice can reveal opportunities to improve performance through more effective support.
Real-time observations in actual work settings are often combined with informal interviews conducted as the task progresses. In some cases participant observation methods
are employed in which analysts participate in the work performance (often in an apprenticeship capacity). In most cases, additional forms of data are collected (e.g., objective
records of unfolding events) and combined with the observations that are made (either
in real time or from recordings) to create a rich protocol or process trace that captures the
unfolding events and task activities, thus allowing the activities of operators to be understood within the context of the task itself (Woods, 1993).
As noted by Roth and Patterson (2005), naturalistic observational studies do not rely
on the experimental design logic of controlled laboratory studies, in which situational variables are explicitly varied or controlled for. Instead, methodological rigor required for
generalization is achieved by (a) sampling broadly, including observing multiple domain
practitioners who vary in level of expertise and observing different work conditions (e.g.,
shifts, phases of operation); (b) triangulation, using a variety of data collection and analysis methods in addition to observations; and (c) employing multiple observers/analysts
with differing perspectives (when possible). As with other qualitative analysis techniques,
an important method for ensuring the validity of the observational components of a CTA
is to check the findings with domain practitioners and experts themselves.
CTAs demonstrate a rich variety of approaches in their use of observational methodologies. For instance, Patterson and Woods (2001) conducted observations that focused
on space shuttle mission control shift change and handovers during an actual space shuttle mission. They combined observations with handwritten logs and spontaneous verbalizations of the controllers (captured via audiotape), along with flight plans, to identify
handover activities that were related to fault management, replanning, and maintaining
common communicational ground.
Mumaw et al. (2000) used observational methods to study operator monitoring strategies in nuclear power plant control rooms. They conducted observational studies at multiple sites to uncover
the variety of information sources and strategies that are used by
power plant operators to support monitoring performance. Initial observations, combined with feedback from an operator who reviewed the preliminary findings, were leveraged to define more targeted observational goals for subsequent observations (e.g., to note
operator use of the interface to support monitoring, to identify ways monitoring could
become difficult), as well as to generate specific probe questions to ask operators as the
observations were taking place (e.g., asking about reasons for monitoring or about regular monitoring patterns).
Baxter et al. (2005) used targeted observations to log alarm events and caregiver interactions with equipment in a neonatal intensive care unit. Observations were used along
with interviews based on the CDM, along with analyses showing communication patterns
and written document use, to make recommendations for the design of a decision aid
intended to support the selection of ventilator settings.
Observations can also be performed under more controlled conditions. For example,
individuals may be instructed to think aloud as they perform the task to provide an ongoing verbal protocol of the task (see Bainbridge & Sanderson, 1995, for extensive details on
the collection and analysis of verbal protocol data). Observations to support CTA can also
occur in the laboratory. For instance, Gorman et al. (2004) observed functionally blind
users performing specified Internet search tasks in conjunction with a screen reader. Users
were asked to think aloud during the task in a laboratory environment, and decision models were developed to describe their activities.
Video and audio recordings can be used to capture observational data for later analysis. Video recordings of activities can be employed to support different types of analyses,
including qualitative analysis of activities (Miles & Huberman, 1984; Sanderson & Fisher,
1994). For example, Kirschenbaum (2004) observed groups of weather forecasters in either
their everyday work setting as they performed normal forecasting duties or in a simulated
shipboard forecasting center as they worked on a provided scenario. Team activities, along
with think-aloud protocols, were captured via videotape to allow for the detailed qualitative data analysis of cognitive activities related to weather forecasting.
Seagull and Xiao (2001) used video recordings on which eye-tracking data had been
superimposed to study a surgical procedure. The recordings were made from the perspective of the physician performing the procedure (wearing mobile recording and eye-tracking equipment). The eye-tracking data indicated where (in the operating room) the
physician looked throughout the procedure. The tapes were reviewed by the physician and
other subject matter experts to determine what the physician had to look at to accomplish the task, what that information would indicate, and why it was sought by the physician at that point in the task—in essence, to identify information cues and their purpose
during the task. Seagull and Xiao (2001) found that this technique provided information
regarding task strategies that they had not uncovered through other analyses. Further,
comparing the eye-tracking recordings with previously completed task analyses led to
the discovery of nonvisual information use strategies (instances in which the task analysis indicated the need for information but the cue was absent from the eye-tracking
recording).
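As a rough illustration of this kind of cross-referencing, the sketch below compares the information cues a task analysis says are required at each step against the cues actually fixated according to an eye-tracking log, flagging candidates for nonvisual information use. The step labels, cue names, and data structures are hypothetical; Seagull and Xiao's analysis was carried out by expert review of the recordings rather than by software.

    # Cross-reference cues required by a task analysis with cues actually fixated
    # (from an eye-tracking log) to flag candidate nonvisual information use.
    # All step labels and cue names below are hypothetical illustrations.

    required_cues = {   # task analysis: step -> information cues needed
        "confirm instrument placement": {"monitor image", "insufflation pressure"},
        "adjust insufflation": {"pressure readout", "flow readout"},
    }

    fixated_cues = {    # eye tracking: step -> cues the physician actually looked at
        "confirm instrument placement": {"monitor image"},
        "adjust insufflation": {"pressure readout", "flow readout"},
    }

    for step, needed in required_cues.items():
        missing = needed - fixated_cues.get(step, set())
        if missing:
            # Information the analysis says is needed but that was never fixated is a
            # candidate for nonvisual acquisition (touch, sound, memory).
            print(f"{step}: possible nonvisual cues -> {sorted(missing)}")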
Video records make it possible to collect cued retrospective explanations of task performance by the individuals who participated in the task (Hoc & Leplat, 1983). They can
also be leveraged to elicit additional knowledge from other subject matter experts. J. E.
Miller, Patterson, and Woods (2006) described a critiquing process for performing a CTA
that relies on video- and audio-recorded data of a novice performing a task. The results
are used to create a script of the novice’s performance that can then be critiqued by subject matter experts. The researchers recorded a novice completing a complex (military
intelligence analysis) task, during which the novice was asked to think aloud. Six expert
intelligence analysts were asked to read a transcript of the novice’s verbalizations while
being shown additional material (e.g., screen shots captured, documents accessed, and
handwritten notes generated by the novice during the task). Experts were asked to comment on the novice's performance as the script was presented. Audio and video recordings, along with handwritten notes of the critiquing process, were used to generate a
protocol, which was then analyzed to provide insight into how experts approach this task.
Observational data are amenable to both qualitative and quantitative analysis, depending on the type of data collected. Typically, however, observation-based CTAs lean toward
a more qualitative, thematic analysis approach to identify the work complexities, associated cognitive demands, and practitioner strategies that are the focus of a CTA. In some
cases, previous research or theories are used to structure the analysis.
For example, Patterson, Roth, Woods, Chow, and Gomes (2004) applied a structured
approach to a meta-analysis of four previously conducted observational studies of ambulance dispatch, space shuttle control, railroad dispatch, and nuclear power control. Observational data from the original studies were coded according to a set of predefined
categories related to shift change and hand-off strategies to identify common strategies
across the multiple domains. Ultimately, analyses are guided by the theoretical stance and
associated representational forms adopted by the analyst, as described in the next section.
Methods for Representing the Results of CTA
The representations used to synthesize and communicate the results of a CTA play an important (if somewhat underappreciated) role in the success or failure of any particular
analysis. Forms of information representation can shape the process of cognitive task and
work analyses as well as provide a means for communicating analysis results. Information
that is gathered through means such as observations, interviews, or document analysis
must be processed and structured in a way that reveals the complexities of the task and
work domain. These representations are useful not only in supporting analysis but also
in eliciting additional information from domain experts because the current understanding can be inspected and improved (Bisantz et al., 2003).
The variety of representational forms used to synthesize and summarize CTA results
is large, reflecting the backgrounds and inclinations of the analysts performing the work, and we will not survey these forms in detail. What can be said, however, is that the representations used during analysis, and for the presentation of results, range from thematically organized narrative outputs to highly structured graphic representations.
Narrative presentations of results are often associated with ethnographic observation
methods and are most suitable for presenting themes and conclusions that emerge from
data collection and reflection (often, a data-driven process). These can include providing
segments of think-aloud protocols, transcriptions of dialogue, or summary descriptions
of strategies that illustrate a theme (e.g., Mumaw et al., 2000; Pfautz & Roth, 2006; Roth,
Multer, & Raslear, 2006; Roth & Woods, 1988; Watts et al., 1996; Weir et al., 2007). Table
1.1 provides an example of a narrative representation that lists power plant operator strategies for extracting information about plant state.
More structured representations are often used to provide summary depictions of
selected aspects of highly complex data sets, such as timeline representations that map
the evolution of events and decisions over time (e.g., Figure 1.6, which shows the high
workload and interruptions that nurses must cope with during medication administration) and link analysis graphs that show operator movements within a workplace or communication events between individuals (e.g., Figure 1.7 and Baxter et al., 2005).
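As a simple illustration, a link analysis graph such as the one in Figure 1.7 can be derived by tallying observed communication events between roles and converting the tallies into the percentages that determine link thickness. The event log and role labels below are invented for illustration only:

    from collections import Counter

    # Hypothetical observation log: one (initiator, recipient) pair per communication event.
    events = [
        ("ATTG", "R1"), ("ATTG", "R1"), ("ATTG", "Nurse"),
        ("R1", "Tx"), ("R1", "Nurse"), ("ATTG", "R2"),
    ]

    # Count undirected links so that ATTG->R1 and R1->ATTG accumulate together.
    links = Counter(frozenset(pair) for pair in events)
    total = sum(links.values())

    for pair, n in links.most_common():
        a, b = sorted(pair)
        print(f"{a} -- {b}: {100 * n / total:.0f}% of events")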
In some cases, theoretically motivated representations impose structure on the kinds
of information that will be the focus of the analysis. For instance, the decision ladder
structure (see Figure 1.8, page 23, for an example) utilized in the suite of cognitive work
analysis methods focuses the analyst on identifying (human or automated) information-processing stages, states of knowledge, and shortcuts across those stages. The abstraction
hierarchy (means-end) formalism (see Figure 1.2) focuses analysts’ attention on intentional
and structural properties of the work domain. These representations are often presented
in graphical form as sets of interlinked nodes (Bisantz et al., 2003; Bisantz & Vicente, 1994;
Lintern, 2006; Naikar, 2006; Vicente, 1999), but they can also be represented in tabular format to support additional annotation (e.g., see Vicente, 1999, pp. 199–200).
Other forms of representation have a more bottom-up focus. For example, concept
maps (see Figure 1.5) enable analysts to represent knowledge about a domain (e.g., gathered through interviews with domain practitioners) in a way that is structured by the
practitioners’ conceptualization of objects and relationships in a domain, rather than by
predefined categories specified by a theoretical framework.
In many cases, the structure of information representation is intimately linked with the
knowledge acquisition or analysis methodology itself. Outputs from the cognitive work
analysis methodology, such as the abstraction hierarchy and decision ladders noted previously, are examples. Another example is the decision requirements table that is associated with the CDM. The decision requirements table documents key decisions, cues, and
Table 1.1. Example Narrative Representation
Strategies That Maximize Information Extraction From Available Data
Operators have developed strategies that can be used to maximize the information they
extract from the plant state data available to them.
Reduce noise. Operators displayed a variety of alarm management activities designed
to remove noise so that meaningful changes could be more readily observed. The following
are examples of these activities.
(a) Clear alarm printer. At shift turnover, operators clear the printer of all alarms generated
on the previous shift….
(b) Cursor alarms (i.e., delete the alarm message from the screen before the alarm
actually clears, but do not disable it) when they are considered to be unimportant....
Enhance signal. This action increases the salience or visibility of an indicator....
Note. Excerpt taken with permission from Mumaw, R. J., Roth, E. M., Vicente, K. J., and Burns,
C. M. (2000), pp. 47–48.
Figure 1.6. A timeline representation of nurse activities illustrating the high number of interruptions
(indicated by arrows) that nurses must cope with during medication administration (Patterson, Cook,
& Render, 2002). Reprinted with permission from the American Medical Informatics Association.
Figure 1.7. Link analysis showing hypothetical communication links among personnel in a
hospital emergency department. Large circles represent individual caregivers (different physicians and nurses), and small circles represent groups of caregivers of a particular type, with
which the individuals hypothetically communicated. Labels indicate types of caregivers (i.e.,
ATTG = attending physician; R1/R2 = first- or second-year resident; Tx = transporter). The
thickness of the links represents the frequency (in terms of percentage of communication
events) of communication between a particular caregiver and other caregiver types. For a
related study see Fairbanks, Bisantz, and Sunm (2007).
strategies used in making the decision; specific challenges that complicate the decision-making process; and potential pitfalls and errors to which less experienced practitioners
are prone (Crandall et al., 2006). The applied cognitive work analysis methodology (Elm
et al., 2003) produces multiple, linked information representations (graphical and tabular) that connect information-gathering activities to display design in order to provide clear
design traceability. The information representations include a functional abstraction network representing goals and associated processes and system components, cognitive work
requirements stemming from the goals and processes, associated information requirements, and, finally, requirements
for information representation.
[Figure 1.8 node annotations: “Assessing movement with active sonar: Does it move ping-to-ping? Is movement consistent with Doppler information? Is signal shape consistent with movement?” and “Continued observation over time to gather information regarding movement, allow evidence to build.”]
Figure 1.8. Example decision ladder model showing part of a task of detecting and identifying
submarines based on sensor data. Small nodes represent information-processing stages that
are not part of this task. Reprinted from the International Journal of Human-Computer Studies,
58, A. M. Bisantz, E. M. Roth, B. Brickman, L. Gosbee, L. Hettinger, and J. McKinney, Integrating cognitive analyses in a large-scale system design process, 177–206. Copyright 2003,
with permission from Elsevier.
In other cases, analysts have developed or adopted ad hoc representations to suit their
particular project. Examples include the graphical representations of intelligence analyst
search strategies developed by Patterson et al. (2001; see Figures 1.3 and 1.4), abstraction
hierarchy representations annotated with activity elements (Lintern, 2006), a schematic
showing relative physical locations and types of communication links among NASA mission controllers (Watts et al., 1996), and cross-referenced functional matrices utilized by
Bisantz et al. (2003; see Figure 1.9) to link system function decompositions to associated
higher-level cognitive activities and to display areas that would support those functions
and activities.
From a practical standpoint, analysts will typically use a variety of representations in
an opportunistic way, choosing complementary capabilities to focus on and highlight key
aspects of the analysis.
Figure 1.9. Cross-linked functional matrices showing links from ship functions to cognitive
functions to display requirements. Reprinted from the International Journal of Human-Computer Studies, 58, A. M. Bisantz, E. M. Roth, B. Brickman, L. Gosbee, L. Hettinger, and
J. McKinney, Integrating cognitive analyses in a large-scale system design process, 177–206,
Copyright 2003, with permission from Elsevier.
RELATED APPROACHES
As noted in the beginning of the chapter, methods for analyzing complex cognitive work
have historical roots in a number of disciplines; these methods are also informed by, and
may be performed in concert with, a number of related analysis and modeling techniques.
Here we review three such approaches that are closely related to CTAs. We describe how
task-analytic methods that focus primarily on documenting observable or better-defined
work activities may be usefully combined with CTA techniques, particularly in large projects; how cognitive modeling techniques may be used to represent what is learned from
CTA knowledge acquisition activities in a form that can be used to generate specific predictions about human performance; and, finally, how participatory approaches from
human-computer interaction may complement CTA methods.
Task-Analytic Approaches
Other task analysis methodologies may be useful in the analysis of cognitive work, although
they do not traditionally focus on expertise and task and environmental complexities, as
do the methods described previously.
Hierarchical task analysis (HTA) is a well-known and often-utilized task analysis technique that represents tasks through a goal-subgoal-activity decomposition (Annett, 2003;
Kirwan & Ainsworth, 1992; Shepherd & Stammers, 2005). As with the CTA techniques
described thus far, information to support an HTA can be drawn from a number of
sources, including interviews with subject matter experts, document analysis, and observation (Stanton, 2001). Tasks are described in terms of the operations (activities) that
achieve the task goals and in terms of the plans that indicate the order and preconditions
necessary for executing the activities. For instance, a plan may specify operations that need
to be performed iteratively until a stopping condition is met or may indicate that some
operations are optional, based on a condition.
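One concrete way to hold such a decomposition, including its plans, is as a nested data structure. The fragment below is a hypothetical sketch loosely patterned on the literature-search example shown later in Figure 1.10; it is not a reproduction of that analysis:

    # A hypothetical HTA fragment as nested dictionaries; "plan" entries record ordering,
    # stopping conditions, and optional operations for the subgoals beneath them.
    hta = {
        "goal": "0. Find papers using an electronic database",
        "plan": "Do 1, then repeat 2-3 until enough relevant papers have been found",
        "subgoals": [
            {"goal": "1. Formulate search terms",
             "plan": "Do 1.1; do 1.2 only if too many hits are returned",
             "subgoals": [
                 {"goal": "1.1 Enter keywords"},
                 {"goal": "1.2 Add limiting terms (optional)"},
             ]},
            {"goal": "2. Run search and scan result titles"},
            {"goal": "3. Retrieve and file promising papers"},
        ],
    }

    def print_hta(node, depth=0):
        """Print the goal decomposition as an indented outline with plans annotated."""
        print("  " * depth + node["goal"])
        if "plan" in node:
            print("  " * depth + f"  [plan: {node['plan']}]")
        for child in node.get("subgoals", []):
            print_hta(child, depth + 1)

    print_hta(hta)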
The level of detail is flexible, depending on the analyst’s needs. For instance, Kirwan
and Ainsworth (1992, p. 11) noted that when one is analyzing how people interact with
or control systems, the level of analysis must capture details of the interaction (e.g., read
information from screen, enter a control action); however, for applications such as training support, the level of detail of the analysis should be guided by the likelihood that an
error would be made, combined with the cost of such an error. The descriptive component of an HTA (the goal decomposition with related plans) is typically represented in
an annotated tree structure (see Figure 1.10) and can be augmented with an analysis of
potential failure modes (and thus the information, knowledge, and/or skills required to
alleviate these) associated with activities or plans (Annett, 2003).
HTA has been used in a variety of applications, such as specifying training requirements
(Annett, Cunningham, & Mathias-Jones, 2000; Shepherd & Kontogiannis, 1998), identifying error potential (Shryane, Westerman, Crawshaw, Hockey, & Sauer, 1998), analyzing
the fit between emergency medical technician tasks and a portable computer designed to
aid those tasks (Tang, Zhang, Johnson, Bernstam, & Tindall, 2004), and modeling tasks
as input to the iterative design of user interfaces (Tselios & Avouris, 2003).
Another goal decomposition approach, operator function modeling (OFM), has been
used to model human interaction with complex systems (Mitchell, 1987; Mitchell & Miller,
1986). In this technique, system goals, subgoals, and operator activities are represented
as a set of interconnected nodes. Each node (corresponding to a goal or activity) has an
associated state space and next-state transition diagram showing how nodes change states
in response to external inputs or the states of higher-level goals. For instance, a node could
represent the activity of “control cell phone ring modality,” with states corresponding to
“audible signal on” and “vibrate signal on.” One would transition among these states based
on higher-level goals (e.g., work uninterrupted by noise) or the situational context and
associated external demands (e.g., a concert begins). Operator function models have been
instantiated to support human performance in a number of contexts, including training
support using intelligent tutoring in satellite ground control (Chu, Mitchell, & Jones, 1995)
and display design for information retrieval (Narayanan, Walchli, Reddy, & Balachandran, 1997).
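The state-space idea can be made concrete with a minimal sketch of the ring-modality example. The transition table and triggering events below are assumptions introduced only for illustration; a full operator function model would also link this node to the higher-level goal nodes whose states gate these transitions:

    # Minimal next-state transition table for a "control cell phone ring modality" node.
    # States and triggering events are hypothetical.
    transitions = {
        ("audible signal on", "goal: work uninterrupted"): "vibrate signal on",
        ("audible signal on", "concert begins"): "vibrate signal on",
        ("vibrate signal on", "concert ends"): "audible signal on",
    }

    state = "audible signal on"
    for event in ["goal: work uninterrupted", "concert ends"]:
        state = transitions.get((state, event), state)  # remain in place if no transition applies
        print(f"after '{event}': {state}")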
Figure 1.10. Example of a hierarchical task analysis for the task of finding papers using an
electronic database.
Although not typically considered CTA techniques, methods such as HTA and OFM
can provide a framework that allows the identification of areas for CTA analysis, may allow
some aspects of tasks identified through a CTA to be specified in more detail, or may provide information that is complementary to that derived from other forms of analysis.
C. A. Miller and Vicente (2001), for example, demonstrated how performing an HTA in
addition to a work domain analysis of a thermal-hydraulic microworld provided information to support display design that complemented the results from the work domain
analysis (specifically, information related to executing task procedures). Shepherd and
Stammers (2005) noted that it is important to recognize that techniques such as HTA are
not in opposition to those labeled CTA and that the choice is not exclusive; rather, methods should be chosen and applied in a way that accomplishes the necessary analysis.
These methods could be used as a framework to represent an overall task and to identify task aspects such as planning, decision making, or fault diagnosis, which can then be
explored using CTA analyses. For instance, Chrenka, Hutton, Klinger, and Anastasi (2001)
described a tool in which an operator function model of a complex system is used as an
organizing framework against which cognitively challenging components can be identified
and categorized for additional focus using extensive CTA methods (such as those described
previously). Tang et al. (2004) used an HTA to describe emergency medical technician tasks
and then performed a GOMS (goals, operators, methods, selection rules) analysis of some
of the cognitively demanding tasks.
Lee and Sanquist (2000) described a CTA method that augments an operator function
model by specifying the cognitive activities, information-processing demands, task inputs
and outputs, and task and environmental demands associated with the functions and activities specified in an OFM. For example, for a target identification function, they identified cognitive activities such as identification, task inputs such as a potential threat seen
on a radar screen, information-processing requirements such as perception and long-term
memory, a task output of a restricted set of objects to monitor, and external demands such
as the number of targets and their rate of change. Raby, McGehee, Lee, and Nourse (2000)
applied this method to aid in display design for snowplow operators.
Cognitive Modeling Approaches
Another technique that can complement or augment CTAs, as described previously, is to
develop a formal (often computer-based) model that represents the knowledge and information processes that are presumed to be required for cognitive task performance (Card,
Moran, & Newell, 1983; Gray & Altmann, 2001; Ritter & Young, 2001). Cognitive models
provide a means to represent what is learned from CTA knowledge acquisition activities
in a form that can be used to generate specific predictions about the performance of humans when confronted with different situations (e.g., when using different displays or
support systems to perform the same cognitive tasks).
There are a variety of approaches to cognitive modeling. Some types of cognitive models, such as the GOMS family of models (John & Kieras, 1996), utilize a cognitively oriented goal decomposition approach that falls under the broad class of task-analytic
methods. Other types of cognitive models are computational models that simulate the cognitive processes that
are hypothesized to underlie task performance. We summarize some
of the most prominent approaches. Comprehensive reviews can be found in Pew and
Mavor (1998), Ritter et al. (2003), Chipman and Kieras (2004), and Gray (2007).
The GOMS family of models represents one of the most accessible and widely used cognitive modeling approaches (John & Kieras, 1996). GOMS models provide a formalism
for decomposing and representing tasks in terms of the person’s goals; elemental mental
and physical operators that combine to achieve goals (e.g., pressing a key, retrieving a piece
of information from memory); available methods, which are sequences of operators that
can be used to accomplish the goals; and selection rules that specify which methods to use
in different situations.
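A toy rendering of these four elements might look like the following. It is not any published GOMS tool, and the operator times are invented; it is included only to show how methods compose operators and how a selection rule chooses among methods:

    # Toy GOMS-style fragment: operators with assumed unit times, methods as operator
    # sequences, and a selection rule that chooses a method based on context.
    OPERATOR_TIME = {"press-key": 0.28, "point": 1.10, "retrieve-from-memory": 1.20}  # seconds

    METHODS = {
        "delete-word-by-keys": ["retrieve-from-memory"] + ["press-key"] * 6,
        "delete-word-by-mouse": ["point", "press-key"],
    }

    def select_method(context):
        # Selection rule: prefer the mouse method when the hand is already on the mouse.
        return "delete-word-by-mouse" if context.get("hand_on_mouse") else "delete-word-by-keys"

    def predicted_time(method):
        return sum(OPERATOR_TIME[op] for op in METHODS[method])

    for context in ({"hand_on_mouse": True}, {"hand_on_mouse": False}):
        method = select_method(context)
        print(f"{context} -> {method}: predicted {predicted_time(method):.2f} s")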
GOMS models are particularly suited for modeling well-understood routine tasks (Gray
& Altmann, 2001). They provide an analytic means to predict task performance times,
learning times, and workload (Gray & Boehm-Davis, 2000; Kieras, 1998). They have been
successfully used to evaluate the adequacy of a user interface design as well as to compare
alternative designs (Kieras, 1998). Design aspects that can be checked with a GOMS model
include whether methods are available for all user goals that have been specified, whether
there are efficient methods for common user goals, and whether there are ways to recover
from errors (Chipman & Kieras, 2004).
Network models are another common approach for cognitive modeling. A prominent
example is the family of IMPRINT (IMproved Performance Research INTegration Tool)
models that are used by the U.S. Army to predict the performance of military systems
(Booher, 2003). Network models decompose tasks into elemental subtasks that are combined in a network representation to predict performance. Each elemental subtask typically has an associated performance-time estimate and a probability-of-success parameter (often represented as distributions). Monte Carlo simulations are performed to generate statistical distribution predictions for overall task performance times,
learning times, and/or workload measures.
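The core computation is straightforward to sketch. The three-subtask serial network, its time distributions, and its success probabilities below are invented solely to illustrate the Monte Carlo step and do not correspond to any IMPRINT model:

    import random

    # Hypothetical serial network: (name, mean time in s, sd, probability of success) per subtask.
    subtasks = [("detect", 2.0, 0.5, 0.98), ("decide", 4.0, 1.0, 0.95), ("act", 1.5, 0.3, 0.99)]

    def run_once():
        total_time = 0.0
        for _name, mean, sd, p_success in subtasks:
            total_time += max(0.0, random.gauss(mean, sd))  # sample a completion time
            if random.random() > p_success:                 # subtask failed; task ends here
                return total_time, False
        return total_time, True

    times, successes = [], 0
    for _ in range(10_000):
        t, ok = run_once()
        times.append(t)
        successes += ok

    times.sort()
    print(f"mean time {sum(times) / len(times):.2f} s, "
          f"90th percentile {times[int(0.9 * len(times))]:.2f} s, "
          f"success rate {successes / 10_000:.3f}")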
There are also computer-based models that attempt to simulate the actual mental processes (sensory, perceptual, cognitive, and motor activities) that are presumed to underlie
human cognitive performance. Examples include COGNET (COGnition as a NETwork
of Tasks; Zachary, Ryder, Ross, & Weiland, 1992), MIDAS (Man-Machine Integration
Design and Analysis System; Laughery & Corker, 1997), and OMAR (Operator Model
Architecture; Deutsch, 1998). Included in this class are models built using cognitive architectures that embody psychological theories of human cognitive performance. Cognitive
architectures with an extensive research base include SOAR (Laird, Newell, & Rosenbloom,
1987), ACT-R (Adaptive Control of Thought—Rational; Anderson et al., 2004; Anderson
& Lebiere, 1998), and EPIC (Executive-Process/Interactive Control; Kieras & Meyer, 1997;
Kieras, Woods, & Meyer, 1997).
Cognitive models have been successfully used to develop fine-grained models of routine task performance as a way to explore the impact of different interface designs (Gray
& Boehm-Davis, 2000; Kieras, 2003). Examples include evaluation of telephone information operator workstations (Gray, John, & Atwood, 1993), the efficiency of alternative cell
phone menu structures (St. Amant, Horton, & Ritter, 2007), and the design of commercial
computer-aided design systems (Gong & Kieras, 1994). In each case, the models successfully predicted substantial differences in performance times as a function of system design.
More recently, cognitive models, particularly cognitive simulations built on cognitive
architectures, have been applied to more complex cognitive tasks and tasks that involve
multiperson communication and coordination. For example, the NASA Aviation Safety
and Security program had five teams develop cognitive simulation models of pilots performing taxi operations and runway instrument approaches with and without advanced
displays (Foyle, Goodman, & Hooey, 2003; Foyle et al., 2005). The models utilized different cognitive architectures to illuminate different aspects of pilot cognitive performance
and contributors to error.
For example, Byrne and Kirlik (2005) used ACT-R to model pilots’ scanning behavior
and to explore the impact of the structure of the environment on errors. Deutsch and Pew
(2004) utilized D-OMAR to examine the impact of expectations and habits on potential
for error. Lebiere et al. (2002) employed a hybrid model that integrates IMPRINT with
ACT-R to explore the impact of differences in individual cognitive, perceptual, and motor
abilities as well as changes in the environment on performance and error.
Boehm-Davis, Holt, Chong, and Hansberger (2004), as part of a separate project, utilized ACT-R to examine crew interaction during the descent phase of flight. They created
separate models for each of two pilots (one flying and one not), which they ran jointly
under different conditions. They manipulated the level of expertise and task load and
showed an impact on performance and error, including differences in situation awareness
between the two pilots and crew miscommunications. Other examples of cognitive modeling for complex, dynamic domains include a cognitive simulation model of nuclear
power plant operator performance during emergencies (Roth, Woods, & Pople, 1992) and
cognitive modeling of submarine officers (Gray & Kirschenbaum, 2000).
Cognitive models provide an effective means of establishing the adequacy of a cognitive analysis. A cognitive model can be used to establish that the knowledge and processing assumed to underlie human performance in a particular task are in fact sufficient to
generate the observed behavior. For example, Roth et al. (1992) developed a cognitive
simulation of dynamic fault management in nuclear power plant emergencies. The cognitive simulation provided an objective means for establishing some of the cognitive activities required to handle the emergency event successfully. As such, it provided a tool for
validating and extending the CTA that was performed based on discussions with instructors, review of procedures, and observations of crews in simulated emergencies.
Cognitive models not only provide a formal means for representing the results of a CTA;
they can also generate new insights into the cognitive contributors to performance. For
example, Byrne and Bovair (1997) developed a cognitive model that embodied a theory
of memory activation to explain a common type of human error called a postcompletion
error. It has often been observed that task steps that need to occur after a person’s main
goal has been achieved are prone to omission errors (e.g., people regularly forget to take
the original sheet out of the copier or to take their bank card out of the automatic teller
machine). Byrne and Bovair (1997) built a computer model based on a theory of memory that exhibited that behavior. This model served both to strengthen the validity of the
theory and to illuminate the reason for the error.
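The flavor of such an account can be conveyed with a deliberately crude sketch: if a pending step draws much of its activation from the main goal, then a step scheduled after that goal has been satisfied can drop below the retrieval threshold and be omitted. The parameter values and step list below are invented; this is a caricature of the idea, not Byrne and Bovair's (1997) model:

    # Caricature of goal-sourced activation: a remaining step's activation is its own
    # baseline plus activation supplied by the main goal while that goal is unsatisfied.
    BASELINE, GOAL_SOURCE, THRESHOLD = 0.3, 1.0, 0.8

    steps = ["insert card", "enter PIN", "take cash", "take card back"]
    main_goal_satisfied_after = "take cash"   # the user's goal is the cash, not the card

    goal_active = True
    for step in steps:
        activation = BASELINE + (GOAL_SOURCE if goal_active else 0.0)
        status = "executed" if activation >= THRESHOLD else "OMITTED (postcompletion error)"
        print(f"{step}: activation {activation:.1f} -> {status}")
        if step == main_goal_satisfied_after:
            goal_active = False   # once the main goal is satisfied it no longer sources activation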
Another example of using a model to illuminate the psychological basis of an observed phenomenon was provided by Kieras and Meyer (2000). It has been repeatedly
observed that when people have to suddenly take over a function from an automated system, performance is initially degraded. This is referred to as automation deficit. Kieras and
Meyer (2000) developed a cognitive model based on psychological theory that exhibited
similar behavior, thus providing a theoretically grounded account of the phenomenon.
These two examples illustrate the use of cognitive models as a way to build and test
cognitive theories to explain observed performance. The ultimate aim is to build cognitive models that have sufficient theoretical grounding that they can be generalized across
applications and domains. Although they are not examples of CTAs aimed at a specific application, they illustrate ways to illuminate the cognitive contributors to performance.
Related Approaches From Human-Computer Interaction
As with CTA, methods within the human-computer interaction and software design communities have been developed that focus the requirements-gathering, development, and
design processes on users in the context of their work and tasks. Those who employ participatory analysis and design techniques take the view that for software systems to be successful, the ultimate users of the systems need to be directly involved in all phases of the
design process and empowered to make design decisions (Bodker, Kensing, & Simonsen,
2004; Clement & Van den Besselaar, 1993; Greenbaum & Kyng, 1991; Mueller, Haslwanter, & Dayton, 1997; Schuler & Namioka, 1993). Participatory design includes a variety of hands-on techniques and methods that tend to involve small groups of designers
and users performing activities such as paper prototyping and brainstorming. (For an
extensive set of examples, see Mueller et al., 1997.)
Contextual inquiry (Beyer & Holtzblatt, 1998), a comprehensive analysis and design
process involving users, encompasses a number of activities that in some ways correspond
to those conducted during a CTA. Information about a work domain is gathered through
observations and interviews, and specific models are generated that allow work processes,
communication patterns, task steps, workplace objects and layout, cultural practices, organizational factors, and workplace artifacts to be documented, shared, and used as input
to a design process. For example, work flow models, though not emphasizing the cognitive activities or domain complexities typically identified in a CTA, provide a means of
representing people along with the types of communication and coordination activities
that occur between them.
Scenario-based design (Carroll, 1995, 2000) emphasizes the development and analysis
of user interaction scenarios that describe work activities. Scenarios are “concrete, narrative descriptions of activity that the user engages in when performing a specific task, a
description sufficiently detailed so that design implications can be inferred and reasoned
about” (Carroll, 1997, p. 396). Scenarios can support a number of functions during a
design process. For example, they can facilitate discussion among designers and users
regarding current activities and how new technology could be used (during requirements
gathering); they can provide the basis for tasks during testing and evaluation; and they
can be used in training to demonstrate system functionality to users in a meaningful way
(Carroll, 1997). In similar ways, scenarios are often integrated into CTA methodologies
(e.g., to guide discussion with subject matter experts; to generate concrete tasks for thinkaloud protocols).
ADAPTING METHODS TO PROJECT OBJECTIVES
AND CONSTRAINTS
The foregoing review of CTA and related methods makes clear that a large toolkit of methods is available to an analyst attempting to characterize cognitive work in a particular setting. As we have tried to emphasize, what is fundamentally important in performing a CTA
is to capture (a) the domain characteristics and constraints that define the cognitive requirements and challenges and (b) the knowledge, skills, and strategies that underlie both
expert performance and the error-vulnerable performance of domain practitioners.
The selection and timing of particular CTA methods will depend on the goals and pragmatic constraints of the specific situation: What kind of information and level of detail
are needed? How much time is available? What kind of access is available to domain
experts? What is the nature of the work, and does it lend itself to observation?
The choice of CTA method (or methods) will be strongly guided by analysis objectives.
If the goal of the analysis is to identify “leverage points” where new technology could have
significant positive impact, then techniques that provide a broad-brush overview of cognitive and collaborative requirements and challenges in a domain, such as field observations and structured interviews, can be very effective. If the goal is to develop training
programs or to produce assessment protocols to establish practitioner proficiency (e.g.,
for accreditation purposes), then methods that capture the detailed knowledge and skills
(e.g., mental models, declarative and procedural knowledge) that distinguish practitioners at different levels of proficiency (e.g., the CDM and process trace approaches) can be
particularly useful. On the other hand, if the goal is to develop a computer model that
simulates the detailed mental processes involved in performing a task, then techniques
such as think-aloud verbal protocol methods may be most appropriate.
The particular set of techniques selected will also be strongly determined by the pragmatics of the specific local conditions. For example, access to domain practitioners is often
limited. In those cases, other sources of domain knowledge (e.g., written documents)
should be leveraged to maximize productive use of time with domain experts. In some
cases, observing domain experts in actual work practice (e.g., using ethnographic methods or simulator studies) may be impractical; in those cases, structured interview techniques (e.g., concept mapping) and critical incident analysis may be the most practical
methods available. In other cases, domain experts may not be accessible at all (e.g., in highly
classified government applications), in which case, it may be necessary to look for surrogate experts (e.g., individuals who have performed the task in the past) or analogous
domains to examine.
Several CTA studies serve to illustrate the impact of analysis goals and local pragmatics on the selection of CTA methods. For example, researchers interested in uncovering
mismatches between the prescribed approach to task performance and actual work practice tend to use field observations because they provide a direct window on actual practice
(e.g., Patterson, Cook, & Render, 2002). However, field observations are not always a practical option. Field observations are impractical when studying work that happens privately or over a long span of time (e.g., planning or design tasks that can span multiple
days and involve solitary work that is not externally observable). Field observations are
also inefficient for the study of rare events, the occurrence of which cannot be reliably
predicted (e.g., response to emergencies). In those situations other CTA approaches are
required.
The CDM was developed partly in response to the need to study expertise in situations
in which field observation was not a practical option (Klein, Calderwood, & Clinton-Cirocco, 1986). As described in Crandall et al. (2006), Klein and his associates initially
attempted to study the decision making of firefighters by “shadowing” them and getting
them to think aloud (Klein et al., 1986). However, they quickly discovered that fires are
relatively rare occurrences and that asking individuals to think aloud is not a practical request in high-stress, time-pressured conditions such as firefighting. Thus, the CDM grew
out of a need to tailor methods to the demands of the knowledge acquisition conditions.
Another example of the need to adapt methods to deal with local pragmatics arises
in domains such as intelligence analysis and information assurance analysis, in which
security concerns prevent analysts from discussing actual past cases. Researchers have had
to come up with ingenious new methods to enable domain practitioners to express their
expertise. Patterson et al. (2001) dealt with the challenge by having intelligence analysts
work on analogous unclassified information search and integration tasks. This enabled
them to uncover analysts’ search strategies in the face of data overload conditions. D’Amico,
Whitley, Tesone, O’Brien, and Roth (2005) faced similar hurdles in trying to study how
information assurance analysts detect and pursue network attacks. They overcame the
security concern issues by asking the analysts to create hypothetical scenarios that shared
critical characteristics with actual cases they encountered. This enabled the research team
to uncover critical challenges that arise in the domain and the strategies that expert information assurance analysts have developed to handle them without needing to analyze
actual cases.
Although we have focused on specific CTA methods, it should be emphasized that CTA
is fundamentally an opportunistic bootstrap process (Potter et al., 2000). In most cases,
multiple converging CTA methods are employed. The selection and timing of specific
CTA methods depend on local constraints. The key is to develop an understanding of both
the characteristics of the domain that influence cognitive and collaborative performance
and the knowledge and strategies that domain practitioners possess.
Typically, the cognitive analyst might start by reading available documents that provide background on the field of practice (e.g., training manuals or policy and procedure
guides). This background knowledge will raise questions that can then be pursued through
field observations and/or interviews with domain practitioners. In turn, these may point
to complicating factors in the domain that place heavy cognitive demands on the user and
create opportunities for user error. They may also highlight discrepancies between how work is “supposed to be done” and how it “actually gets done.” These, in turn, can point to opportunities to improve performance and reduce the disconnect between prescriptions and actual practice through improved training or support systems. Further observations and/or
interviews, perhaps with different domain practitioners at different locations, can then be
conducted to build on and test the generality of initial, tentative insights.
When the results of using multiple methods, domain practitioners, and sites reinforce
each other, confidence in the adequacy of understanding is increased. If differences are
found, it signals the need for analysis revision. The research logic employed is similar to
the rationale that underpins grounded theory (Glaser & Strauss, 1967).
RESEARCH FRONTIERS
CTA research is continuing on several fronts. Some of these fronts have been described in
earlier sections and include the development of new knowledge acquisition methods and
variants, the development of new computational modeling tools, and the expansion of
psychological theory on the cognitive and collaborative processes of individuals and teams.
Here we focus on three research trends that are particularly salient: (a) CTAs as applied
to multiperson teams and organizations (i.e., macroergonomic and macrocognition applications); (b) development of software tools to support the CTA endeavor; and (c)
integration of CTA results into the systems engineering process, particularly to support
human-system integration issues that arise as part of large first-of-a-kind design efforts.
Macrocognition and Macroergonomic Applications of CTA
Over the past few years there has been growing interest in applying CTA methods to multiperson units (Klein, 2000; Klein et al., 2003). This includes understanding the cognitive
and collaborative processes that underlie small team performance as well as the distributed cognitive processes that span organizational- and managerial-level boundaries. The
term macrocognition was coined to capture the need to study this higher-level, distributed
aspect of cognition (Klein et al., 2003). This move has coincided with growing interest in
analyzing and supporting the design of large, complex sociotechnical systems and systems
of systems (e.g., military command and control systems, railroad operations, health care
systems) that fall under the umbrella of macroergonomics (Hendrick, 2007 [chapter 2, this
volume]; Hendrick & Kleiner, 2001).
Examples of cognitive activities that underlie multiperson performance include communication patterns that foster shared situation awareness, shared mental models, and
problem-solving and decision-making strategies that lead to resilient team performance
(or the converse: brittle performance subject to error). Team CTA methods are relevant to
the analysis and design of team and organizational structures (e.g., Naikar, Pearce, Drumm,
& Sanderson, 2003), the development of support systems for distributed multiperson performance (e.g., O’Hara & Roth, 2005), and the development of team and organizational
training (e.g., Salas & Priest, 2005).
Generally, CTA studies of distributed cognitive processes have used variants of standard CTA interview and observation techniques. For example, Klein, Armstrong, Woods,
Gokulachandra, and Klein (2000) employed the CDM to examine the role of common
ground in supporting coordination and replanning in distributed military teams. Roth,
Multer, et al. (2006) employed a combination of field observation and semistructured interviews to examine informal cooperative strategies developed by railroad workers (including train crews, roadway workers, and dispatchers). They documented a variety of informal
communication strategies that served to foster shared situation awareness across the distributed organization, which contributed to efficiency, safety, and resilience to error of railroad operations.
A. Miller and Xiao (2006) used semistructured interviews to examine resource allocation strategies employed across organizational levels in a trauma hospital to cope with high
patient demand pressures. They interviewed individuals at different managerial levels
(surgical unit medical director, anesthesia staff and nursing staff schedulers, and charge
nurses) to understand scheduling and decision-making strategies at different levels of the
work organization and how they combine in a nested fashion to achieve organizational
resilience in the face of variable-tempo resource demands.
New CTA methods have also emerged that are specifically intended to analyze the
knowledge and strategies that underlie multiperson performance. These include methods
to analyze team knowledge (Cooke, 2005), to measure shared situation awareness (MacMillan, Paley, Entin, & Entin, 2005), to elicit and represent communication and coordination
patterns (Harder & Higley, 2004; Jentsch & Bowers, 2005), and to understand the distributed decision-making strategies and information requirements (Klinger & Hahn,
2003, 2005).
Software Tools to Support CTA Capture and Dissemination
Currently there is a paucity of software tools specifically tailored to the capture and dissemination of CTA results. Generally, cognitive analysts rely on standard text-processing
and drawing tools to document CTA results. However, these tools are limited in their ability to support knowledge maintenance, update, and reuse. This is a particular drawback
in the case of large projects that span multiple years and that involve data collection across multiple domain practitioners and sites by multiple cognitive analysts.
Some efforts have been made to develop software tools to support cognitive analysts
in capturing, integrating, and disseminating CTA results. These include the Work Domain
Analysis Workbench developed by Skilton, Cameron, and Sanderson (1998), the CmapTools
software suite created at the Institute for Human and Machine Cognition (2006), and the
Cognitive Systems Engineering Tool for Analysis (CSET-A; Cognitive Systems Engineering
Center, 2004). However, most systems to date have been developed as part of research and
development efforts and are limited in robustness.
Integrating Cognitive Requirements Into the Systems
Engineering Process
Another important research frontier is the development of methods and tools for more
effectively integrating cognitive and domain analyses into large-scale system design projects (e.g., next-generation ships or process control plants). Human-system integration spans
a wide range of activities throughout a system life cycle (Booher, 2003). It includes initial
concept development, hardware and software specification, function allocation, staffing and
organization design, procedures and training development, and testing activities. Although
CTA methods are clearly applicable, there has been growing recognition of the need to
develop more systematic methods and tools for integrating the results of CTA into the
systems development process (Osga, 2003; Pew & Mavor, 2007).
A number of cognitive engineering methods have emerged that incorporate cognitive
and work domain analyses as core activities. These include decision-centered design (Hutton,
Miller, & Thordsen, 2003), cognitive work analysis (Vicente, 1999), applied cognitive
work analysis (Elm et al., 2003), situation awareness–oriented design (Endsley, Bolte, &
Jones, 2003), use-centered design (Flach & Dominguez, 1995), and work-centered design
(Eggleston, 2003).
There are also a number of successful examples of the application of cognitive and work
domain analysis in systems development. These include a redesign of the weapons director station in an advanced surveillance and command aircraft (Klinger & Gomes, 1993),
design of crew composition for a new air defense platform (Naikar et al., 2003), design of
next-generation navy ships (Bisantz et al., 2003; Burns, Bisantz, & Roth, 2004), and design
of integrated visualizations to support dynamic mission monitoring and replanning for
an airlift service organization (Roth, Stilson, et al., 2006). There is a need for further work
in developing ways to better integrate cognitive and domain analyses into large-scale systems engineering, as well as for more examples of successful integration efforts.
CONCLUSIONS
Rather than representing a single technique or procedure, CTA comprises a wide range of
theoretical perspectives, data collection methods, and analysis and representational choices.
This rich diversity of approaches is held together by a common goal of understanding and
supporting complex cognitive work. CTAs necessarily involve examination of both the characteristics and demands of the work domain and the knowledge and strategies
that domain practitioners have developed in response to domain demands. The survey
of CTA, cognitive work analysis, and related methods presented in this chapter demonstrates the wide diversity of available methods and how they can be combined and adapted
to meet the goals and pragmatic constraints of real-world projects.
REFERENCES
Anderson, J. R., Bothell, D., Byrne, M. D., Douglass, S., Lebiere, C., & Qin, Y. (2004). An integrated theory of
the mind. Psychological Review, 111, 1036–1060.
Anderson, J. R., & Lebiere, C. (1998). The atomic components of thought. Mahwah, NJ: Erlbaum.
Annett, J. (2003). Hierarchical task analysis. In E. Hollnagel (Ed.), Handbook of cognitive task design (pp. 17–35).
Mahwah, NJ: Erlbaum.
Annett, J., Cunningham, D., & Mathias-Jones, P. (2000). A method for measuring team skills. Ergonomics, 43,
1076–1094.
Bainbridge, L., & Sanderson, P. (1995). Verbal protocol analysis. In J. R. Wilson & E. N. Corlett (Eds.), Evaluation
of human work (pp. 169–201). London: Taylor & Francis.
Banaji, M., & Crowder, R. (1989). The bankruptcy of everyday memory. American Psychologist, 44, 1185–1193.
Baxter, G. D., Monk, A. F., Tan, K., Dear, P. R. F., & Newell, S. J. (2005). Using cognitive task analysis to facilitate
the integration of decision support systems into the neonatal intensive care unit. Artificial Intelligence in
Medicine, 35, 243–257.
Beyer, H., & Holtzblatt, K. (1998). Contextual design: Defining customer-centered systems. New York: Morgan
Kaufmann.
Bisantz, A. M., & Drury, C. G. (2005). Applications of archival and observational data. In J. R. Wilson & N.
Corlett (Eds.), Evaluation of human work (3rd ed., pp. 61–82). Boca Raton, FL: Taylor & Francis.
Bisantz, A. M., Little, E., & Rogova, G. (2004). On the integration of cognitive work analysis within a multisource information fusion development methodology. In Proceedings of the Human Factors and Ergonomics
Society 48th Annual Meeting (pp. 494–498). Santa Monica, CA: Human Factors and Ergonomics Society.
Bisantz, A. M., Roth, E. M., Brickman, B., Gosbee, L., Hettinger, L., & McKinney, J. (2003). Integrating cognitive analyses in a large-scale system design process. International Journal of Human-Computer Studies, 58,
177–206.
Bisantz, A. M., & Vicente, K. J. (1994). Making the abstraction hierarchy concrete. International Journal of
Human-Computer Studies, 40, 83–117.
Blomberg, J., Giacomi, J., Mosher, A., & Swenton-Wall, P. (1993). Ethnographic field methods and their relationship to design. In D. Schuler & A. Namioka (Eds.), Participatory design: Principles and practices (pp. 123–157).
Mahwah, NJ: Erlbaum.
Bodker, K., Kensing, F., & Simonsen, J. (2004). Participatory IT design: Designing for business and workplace realities. Cambridge, MA: MIT Press.
Boehm-Davis, D. A., Holt, R. W., Chong, R., & Hansberger, J. T. (2004). Using cognitive modeling to understand
crew behavior. In Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting (pp. 99–103).
Santa Monica, CA: Human Factors and Ergonomics Society.
Booher, H. R. (2003). Handbook of human system integration. Hoboken, NJ: Wiley.
Burns, C. M., Bisantz, A. M., & Roth, E. M. (2004). Lessons from a comparison of work domain models: Representational choices and their implications. Human Factors, 46, 711–727.
Burns, C. M., & Hajdukiewicz, J. (2004). Ecological interface design. Boca Raton, FL: CRC Press.
Burns, C. M., Kuo, J., & Ng, S. (2003). Ecological interface design: A new approach for visualizing network management. Computer Networks: The International Journal of Computer and Telecommunications Networking
43(3), 369–388.
Byrne, M. D., & Bovair, S. (1997). A working memory model of a common procedural error. Cognitive Science,
21, 31–61.
Byrne, M. D., & Kirlik, A. (2005). Using computational cognitive modeling to diagnose possible sources of aviation error. International Journal of Aviation Psychology, 15, 135–155.
Cameron, C. A., Beemsterboer, P. L., Johnson, L. A., Mislevy, R. J., Steinberg, L. S., & Breyer, F. J. (2000). A
cognitive task analysis for dental hygiene. Journal of Dental Education, 64(5), 333–351.
Card, S., Moran, T., & Newell, A. (1983). The psychology of human-computer interaction. Hillsdale, NJ: Erlbaum.
Carroll, J. M. (1995). Scenario-based design: Envisioning work and technology in system development. New York:
Wiley.
Carroll, J. M. (1997). Scenario-based design. In M. Helander, T. K. Landauer, & P. V. Prabhu (Eds.), Handbook
of human-computer interaction (pp. 383–405). Amsterdam: Elsevier Science.
Carroll, J. M. (2000). Making use: Scenario-based design of human-computer interactions. Cambridge, MA:
MIT Press.
Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.
Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representations of physics problems by
experts and novices. Cognitive Science, 5, 121–152.
Chi, M. T. H., Glaser, R., & Farr, M. L. (Eds.). (1988). The nature of expertise. Mahwah, NJ: Erlbaum.
Chipman, S. F., & Kieras, D. E. (2004). Operator centered design of ship systems. Presented at the Total Ship
Symposium, Gaithersburg, MD. Available: http://www.cs.cmu.edu/~bej/CognitiveModelingForUIDesign/
ChipmanKieras04.pdf
Chow, R., & Vicente, K. J. (2002). A field study of emergency ambulance dispatching: Implications for decision
support. In Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting (pp. 313–317).
Santa Monica, CA: Human Factors and Ergonomics Society.
Chrenka, J., Hutton, R. J. B., Klinger, D. W., & Anastasi, D. (2001). The cognimeter: Focusing cognitive task
analysis in the cognitive function model. In Proceedings of the Human Factors and Ergonomics Society 45th
Annual Meeting (pp. 1738–1742). Santa Monica, CA: Human Factors and Ergonomics Society.
Chu, R. W., Mitchell, C. M., & Jones, P. M. (1995). Using the operator function model and OFMspert as the
bases for an intelligent tutoring system: Towards a tutor/aid paradigm for operators of supervisory control
systems. IEEE Transactions on Systems, Man, and Cybernetics, 25, 1054–1075.
Clement, A., & Van den Besselaar, P. (1993). A retrospective look at participatory design projects. Communications
of the ACM, 36(4), 29–37.
Cognitive Systems Engineering Center. (2004). Cognitive Systems Engineering Tool for Analysis (Version 1.0)
[Computer software]. Pittsburgh, PA: ManTech International.
Cooke, N. J. (1994). Varieties of knowledge elicitation techniques. International Journal of Human-Computer
Studies, 41, 801–849.
Cooke, N. J. (2005). Measuring team knowledge. In N. Stanton, A. Hedge, K. Brookhuis, E. Salas, & H. Hendrick (Eds.), Handbook of human factors and ergonomics methods (pp. 52.51–52.55). Boca Raton, FL: CRC Press.
Crandall, B., Klein, G. A., & Hoffman, R. R. (2006). Working minds: A practitioner’s guide to cognitive task analysis. Cambridge, MA: MIT Press.
D’Amico, A., Whitley, K., Tesone, D., O’Brien, B., & Roth, E. (2005). Achieving cyber defense situational awareness: A cognitive task analysis of information assurance analysts. In Proceedings of the Human Factors and
Ergonomics Society 49th Annual Meeting (pp. 229–233). Santa Monica, CA: Human Factors and Ergonomics
Society.
Dekker, S. W. A. (2002). The field guide to human error investigation. London: Ashgate.
Dekker, S. W. A., & Woods, D. D. (1999). To intervene or not to intervene: The dilemma of management by exception. Cognition, Technology & Work, 1, 86–96.
Deutsch, S. (1998). Interdisciplinary foundations for multiple-task human performance modeling in OMAR. In
Proceedings of the 20th Annual Meeting of the Cognitive Science Society (pp. 303–308). Mahwah, NJ: Erlbaum.
Deutsch, S., & Pew, R. (2004). Examining new flight deck technology using human performance modeling. In
Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting (pp. 108–112). Santa Monica,
CA: Human Factors and Ergonomics Society.
Dominguez, C. (2001). Expertise and metacognition in laparoscopic surgery: A field study. In Proceedings of the
Human Factors and Ergonomics Society 45th Annual Meeting (pp. 1298–1302). Santa Monica, CA: Human
Factors and Ergonomics Society.
Eggleston, R. G. (2003). Work-centered design: A cognitive engineering approach to system design. In Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting (pp. 263–267). Santa Monica, CA:
Human Factors and Ergonomics Society.
Elm, W. C., Potter, S. S., Gualtieri, J. W., Easter, J. R., & Roth, E. M. (2003). Applied cognitive work analysis: A
pragmatic methodology for designing revolutionary cognitive affordances. In E. Hollnagel (Ed.), Handbook
of cognitive task design (pp. 357–382). Mahwah, NJ: Erlbaum.
Endsley, M., Bolte, B., & Jones, D. G. (2003). Designing for situation awareness: An approach to user-centered design.
New York: Taylor and Francis.
Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press.
Fairbanks, R. J., Bisantz, A. M., & Sunm, M. (2007). Emergency department communication links and patterns.
Annals of Emergency Medicine, 50, 396–406.
Flach, J. M., & Dominguez, C. (1995). Use-centered design: Integrating the user, instrument, and goal. Ergonomics in Design, 3(3), 19–24.
Flanagan, J. C. (1954). The critical incident technique. Psychological Bulletin, 51, 327–358.
Foyle, D. C., Goodman, A., & Hooey, B. L. (2003). An overview of the NASA aviation safety program (AvSP)
system-wide accident prevention (SWAP) human performance modeling (HPM) element. In D. C. Foyle,
A. Goodman, & B. L. Hooey (Eds.), Proceedings of the 2003 Conference on Human Performance Modeling
of Approach and Landing with Augmented Displays (pp. 1–13). Moffett Field, CA: NASA.
Foyle, D. C., Hooey, B. L., Byrne, M. D., Corker, K. M., Deutsch, S., Lebiere, C., et al. (2005). Human performance models of pilot behavior. In Proceedings of the Human Factors and Ergonomics Society 49th Annual
Meeting (pp. 1109–1113). Santa Monica, CA: Human Factors and Ergonomics Society.
Gilbreth, F., & Gilbreth, L. (1919). Applied motion study. London: Sturgis and Walton.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. New York:
Aldine.
Gong, R., & Kieras, D. E. (1994). A validation of the GOMS model methodology in the development of a specialized, commercial software application. In Proceedings of CHI (pp. 351–357). New York: Association for
Computing Machinery.
Gorman, M. E., Militello, L. G., Swierenga, S. J., & Walker, J. L. (2004). Internet searching by ear: Decision flow
diagrams for sightless Internet users. In Proceedings of the Human Factors and Ergonomics Society 48th Annual
Meeting (pp. 243–247). Santa Monica, CA: Human Factors and Ergonomics Society.
Gray, W. D. (Ed.). (2007). Integrated models of cognitive systems. New York: Oxford University Press.
Gray, W. D., & Altmann, E. M. (2001). Cognitive modeling and human-computer interaction. In W. Karwowski
(Ed.), International encyclopedia of ergonomics and human factors (Vol. 1, pp. 387–391). New York: Taylor
& Francis.
Gray, W. D., & Boehm-Davis, D. A. (2000). Milliseconds matter: An introduction to microstrategies and to
their use in describing and predicting interactive behavior. Journal of Experimental Psychology: Applied, 6,
322–335.
Gray, W. D., John, B. E., & Atwood, M. E. (1993). Project Ernestine: Validating a GOMS analysis for predicting
and explaining real-world task performance. Human-Computer Interaction, 8, 237–309.
Gray, W. D., & Kirschenbaum, S. S. (2000). Analyzing a novel expertise: An unmarked road. In J. M. C. Schraagen,
S. F. Chipman, & V. L. Shalin (Eds.), Cognitive task analysis (pp. 275–290). Mahwah, NJ: Erlbaum.
Greenbaum, J., & Kyng, M. (1991). Design at work: Cooperative design of computer systems. Mahwah, NJ: Erlbaum.
Hall, E. P., Gott, S. P., & Pokorny, R. A. (1995). A procedural guide to cognitive task analysis: The PARI methodology. Brooks Air Force Base, TX: Air Force Materiel Command.
Hammersley, M., & Atkinson, P. (1983). Ethnography: Principles in practice. London: Tavistock.
Harder, R., & Higley, H. (2004). Application of thinklets to team cognitive task analysis. In Proceedings of the
37th Hawaii International Conference on System Science (pp. 1–9). New York: Institute of Electrical and
Electronics Engineers.
Hendrick, H. W. (2007). Macroergonomics: The analysis and design of work systems. In D. Boehm-Davis (Ed.),
Reviews of human factors and ergonomics (Vol. 3, pp. 44–78). Santa Monica, CA: Human Factors and Ergonomics Society.
Hendrick, H. W., & Kleiner, B. M. (2001). Macroergonomics: An introduction to work system design. Santa Monica,
CA: Human Factors and Ergonomics Society.
Hoc, J. M., & Leplat, J. (1983). Evaluation of different modalities of verbalization in a sorting task. International
Journal of Man-Machine Studies, 18, 283–306.
Hoffman, R. (1987, Summer). The problem of extracting the knowledge of experts from the perspective of
experimental psychology. AI Magazine, 8, 53–67.
Hoffman, R., Coffey, J. W., Ford, K. M., & Novak, J. D. (2006). A method for eliciting, preserving, and sharing
the knowledge of forecasters. Weather and Forecasting, 21, 416–428.
Hoffman, R., Crandall, B., & Shadbolt, N. (1998). Use of the critical decision method to elicit expert knowledge:
A case study in the methodology of cognitive task analysis. Human Factors, 40, 254–276.
Hoffman, R., & Hanes, L. F. (2003). The boiled frog problem. IEEE Intelligent Systems, 68–71.
Hoffman, R., & Lintern, G. (2006). Eliciting and representing the knowledge of experts. In K. A. Ericsson, N.
Charness, P. Feltovich, & R. Hoffman (Eds.), Cambridge handbook of expertise and expert performance (pp.
203–222). New York: Cambridge University Press.
Hutchins, S. G., Pirolli, P., & Card, S. (2003). Use of critical analysis method to conduct a cognitive task analysis of intelligence analysts. In Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting
(pp. 478–482). Santa Monica, CA: Human Factors and Ergonomics Society.
Hutton, R. J. B., Miller, T. E., & Thordsen, M. L. (2003). Decision-centered design: Leveraging cognitive task analysis in design. In E. Hollnagel (Ed.), Handbook of cognitive task design (pp. 383–416). Mahwah, NJ: Erlbaum.
Institute for Human and Machine Cognition. (2006). CmapTools knowledge modeling kit. Retrieved November
6, 2006, from http://cmap.ihmc.us/
Itoh, J., Sakuma, A., & Monta, K. (1995). An ecological interface for supervisory control of BWR nuclear power
plants. Control Engineering Practice, 3, 231–259.
Jentsch, F., & Bowers, C. (2005). Team communication analysis. In N. Stanton, A. Hedge, K. Brookhuis, E. Salas,
& H. Hendrick (Eds.), Handbook of human factors and ergonomics methods (pp. 50.51–50.55). Boca Raton,
FL: CRC Press.
John, B. E., & Kieras, D. E. (1996). Using GOMS for user interface design and evaluation: Which technique?
ACM Transactions on Computer-Human Interaction, 3, 287–319.
Johnson, S., Healey, A., Evans, J., Murphy, M., Crawshaw, M., & Gould, D. (2006). Physical and cognitive task
analysis in interventional radiology. Clinical Radiology, 61(1), 97–103.
Kieras, D. E. (1998). A guide to GOMS model usability evaluation using NGOMSL. In M. Helander (Ed.),
Handbook of human-computer interaction (pp. 391–438). Amsterdam: Elsevier.
Kieras, D. E. (2003). Model-based evaluation. In J. Jacko & A. Sears (Eds.), Handbook for human-computer interaction (pp. 1139–1151). Mahwah, NJ: Erlbaum.
Kieras, D. E., & Meyer, D. E. (1997). An overview of the EPIC architecture for cognition and performance with
application to human-computer interaction. Human-Computer Interaction, 12, 391–438.
Kieras, D. E., & Meyer, D. E. (2000). The role of cognitive task analysis in the application of predictive models
of human performance. In J. M. C. Schraagen, S. F. Chipman, & V. L. Shalin (Eds.), Cognitive task analysis
(pp. 237–260). Mahwah, NJ: Erlbaum.
Kieras, D. E., Woods, D., & Meyer, D. E. (1997). Predictive engineering models based on the EPIC architecture
for a multimodal high-performance human-computer interaction task. Transactions on Computer-Human
Interaction, 4, 230–275.
Kirschenbaum, S. S. (2004). The role of comparison in weather forecasting: Evidence from two hemispheres. In
Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting (pp. 306–310). Santa Monica,
CA: Human Factors and Ergonomics Society.
Kirwan, B., & Ainsworth, L. K. (1992). A guide to task analysis. London: Taylor & Francis.
Klein, G. A. (1992). Using knowledge elicitation to preserve corporate memory. In R. R. Hoffman (Ed.), The
psychology of expertise: Cognitive research and empirical AI (pp. 170–190). Mahwah, NJ: Erlbaum.
Klein, G. A. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Klein, G. A. (2000). Cognitive task analysis of teams. In J. M. Schraagen, S. F. Chipman, & V. L. Shalin (Eds.),
Cognitive task analysis (pp. 417–429). Mahwah, NJ: Erlbaum.
Klein, G. A., & Armstrong, A. A. (2005). Critical decision method. In N. Stanton, A. Hedge, K. Brookhuis, E.
Salas, & H. Hendrick (Eds.), Handbook of human factors and ergonomics methods (pp. 35.31–35.38). Boca
Raton, FL: CRC Press.
Klein, G. A., Armstrong, A. A., Woods, D. D., Gokulachandra, M., & Klein, H. (2000). Cognitive wavelength: The
role of common ground in distributed replanning (Tech. Rep. AFRL-HE-WP-TR-2001-0029). Wright-Patterson
Air Force Base, OH: U.S. Air Force Research Laboratory.
Klein, G. A., Calderwood, R., & Clinton-Cirocco, A. (1986). Rapid decision making on the fire-ground. In Proceedings of the Human Factors and Ergonomics Society 30th Annual Meeting (pp. 576–580). Santa Monica,
CA: Human Factors and Ergonomics Society.
Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE
Transactions on Systems, Man, and Cybernetics, 19, 462–472.
Klein, G. A., Orasanu, J., Calderwood, R., & Zsambok, C. E. (Eds.). (1993). Decision making in action: Models
and methods. Norwood, NJ: Ablex.
Klein, G. A., Ross, K. G., Moon, B. M., Klein, D. E., & Hollnagel, E. (2003, May–June). Macrocognition. IEEE
Intelligent Systems, 18, 81–85.
Klinger, D. W., & Gomes, M. G. (1993). A cognitive systems engineering application for interface design. In
Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting (pp. 16–20). Santa Monica,
CA: Human Factors and Ergonomics Society.
Klinger, D. W., & Hahn, B. B. (2003). Handbook of team CTA. Fairborn, OH: Klein Associates.
Klinger, D. W., & Hahn, B. B. (2005). Team decision requirement exercise: Making team decision requirements
explicit. In N. Stanton, A. Hedge, K. Brookhuis, E. Salas, & H. Hendrick (Eds.), Handbook of human factors
and ergonomics methods (pp. 52.51–52.55). Boca Raton, FL: CRC Press.
Laird, J. E., Newell, A., & Rosenbloom, P. S. (1987). SOAR: An architecture for general intelligence. Artificial
Intelligence, 33(1), 1–64.
Laughery, K. R., & Corker, K. M. (1997). Computer modeling and simulation of human/system performance.
In G. Salvendy (Ed.), Handbook of human factors (2nd ed., pp. 1375–1408). New York: Wiley.
Lebiere, C., Biefeld, E., Archer, R., Archer, S., Allender, L., & Kelley, T. D. (2002). IMPRINT/ACT-R: Integration
of a task network modeling architecture with a cognitive architecture and its application to human error
modeling. In M. J. Chinni (Ed.), 2002 Military, government and aerospace simulation (Vol. 34, pp. 13–19).
San Diego, CA: Society for Modeling and Simulation International.
Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems,
Man, and Cybernetics—Part A: Systems and Humans, 30, 273–285.
Leplat, J. (1986). The elicitation of expert knowledge. In E. Hollnagel, G. Mancini, & D. D. Woods (Eds.),
Intelligent decision support (pp. 107–122). New York: Springer-Verlag.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.
Lintern, G. (2006). A functional workspace for military analysis of insurgent operations. International Journal
of Industrial Ergonomics, 36, 409–422.
Luke, T., Brook-Carter, N., Parkes, A. M., Grimes, E., & Mills, A. (2006). An investigation of train driver visual
strategies. Cognition, Technology & Work, 8, 15–29.
MacMillan, J., Paley, M. J., Entin, E. B., & Entin, E. E. (2005). Questionnaires for distributed assessment of team
mutual awareness. In N. Stanton, A. Hedge, K. Brookhuis, E. Salas, & H. Hendrick (Eds.), Handbook of
human factors and ergonomics methods (pp. 51.51–51.59). Boca Raton, FL: CRC Press.
Means, B., & Gott, S. P. (1988). Cognitive task analysis as a basis for tutor development: Articulating abstract
knowledge representations. In J. Psotka, L. D. Massey, & S. A. Mutter (Eds.), Intelligent tutoring systems: Lessons
learned (pp. 35–57). Mahwah, NJ: Erlbaum.
Miles, M. B., & Huberman, A. M. (1984). Qualitative data analysis. Newbury Park, CA: Sage.
Militello, L. G., & Hutton, R. J. B. (1998). Applied cognitive task analysis (ACTA): A practitioner’s toolkit for
understanding cognitive task demands. Ergonomics, 41, 1618–1641.
Miller, A., & Xiao, Y. (2006). Multi-level strategies to achieve resilience for an organisation operating at capacity: A case study at a trauma centre. Cognition, Technology & Work, 9, 51–66.
Miller, C. A., & Vicente, K. J. (2001). Comparison of display requirements generated via hierarchical task and
abstraction-decomposition space analysis techniques. International Journal of Cognitive Ergonomics, 5,
335–356.
Miller, J. E., Patterson, E. S., & Woods, D. D. (2006). Elicitation by critiquing as a cognitive task analysis methodology. Cognition, Technology & Work, 8, 90–102.
Mitchell, C. M. (1987). GT-MSOCC: A domain for research on human-computer interaction and decision aiding in supervisory control systems. IEEE Transactions on Systems, Man, and Cybernetics, 17, 553–572.
Mitchell, C. M., & Miller, R. A. (1986). A discrete control model of operator function: A methodology for information display design. IEEE Transactions on Systems, Man, and Cybernetics, SMC-16, 343–357.
Mueller, M. J., Haslwanter, J., & Dayton, T. (1997). Participatory practices in the software lifecycle. In M.
Helander, T. K. Landauer, & P. V. Prabhu (Eds.), Handbook of human-computer interaction (pp. 256–296).
Amsterdam: Elsevier Science/North Holland.
Mumaw, R. J., Roth, E. M., Vicente, K. J., & Burns, C. M. (2000). There is more to monitoring a nuclear power
plant than meets the eye. Human Factors, 42, 36–55.
Naikar, N. (2006). Beyond interface design: Further applications of cognitive work analysis. International Journal
of Industrial Ergonomics, 36, 423–438.
Naikar, N., Pearce, B., Drumm, D., & Sanderson, P. (2003). Designing teams for first-of-a-kind complex systems
using the initial phases of cognitive work analysis: A case study. Human Factors, 42, 202–217.
Narayanan, S., Walchli, S. E., Reddy, N., & Balachandran, R. (1997). Model-based design of an information retrieval system for a university library. International Journal of Cognitive Ergonomics, 1, 149–167.
Nehme, C. E., Scott, S. D., Cummings, M. L., & Furusho, C. Y. (2006). Generating requirements for futuristic
heterogeneous unmanned systems. In Proceedings of the Human Factors and Ergonomics Society 50th Annual
Meeting (pp. 235–239). Santa Monica, CA: Human Factors and Ergonomics Society.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes.
Psychological Review, 84, 231–259.
O’Hara, J. M., & Roth, E. M. (2005). Operational concepts, teamwork, and technology in commercial nuclear
power stations. In C. Bowers, E. Salas, & F. Jentsch (Eds.), Creating high-tech teams: Practical guidance on work
performance and technology (pp. 139–159). Washington, DC: American Psychological Association.
O’Hare, D., Wiggins, M., Williams, A., & Wong, W. (1998). Cognitive task analyses for decision-centered
design and training. Ergonomics, 41, 1698–1718.
Osga, G. A. (2003). Human-centered shipboard systems and operations. In H. R. Booher (Ed.), Handbook of
human systems integration (pp. 743–793). Hoboken, NJ: Wiley.
Patterson, E. S., Cook, R. I., & Render, M. L. (2002). Improving patient safety by identifying side effects from
introducing bar coding in medication administration. Journal of the American Medical Informatics Association, 9,
540–553.
Patterson, E. S., Roth, E. M., & Woods, D. D. (2001). Predicting vulnerability in computer-supported inferential
analysis under data overload. Cognition, Technology & Work, 3, 224–237.
Patterson, E. S., Roth, E. M., Woods, D. D., Chow, R., & Gomes, J. O. (2004). Handoff strategies in settings with
high consequences for failure: Lessons for health care operations. International Journal for Quality in Health
Care, 16, 125–132.
Patterson, E. S., & Woods, D. D. (2001). Shift changes, updates, and the on-call architecture in space shuttle mission control. Computer-Supported Cooperative Work, 10, 317–346.
Pew, R., & Mavor, A. S. (Eds.). (1998). Modeling human and organizational behavior: Application to military
simulations. Washington, DC: National Academy Press.
Pew, R. W., & Mavor, A. S. (Eds.). (2007). Human-system integration in the system development process: A new
look. (Committee on Human-System Design Support for Changing Technology). Washington, DC: Committee on Human Factors, Division of Behavioral and Social Sciences and Education, National Research
Council, National Academies Press.
Pfautz, J., & Roth, E. M. (2006). Using cognitive engineering for system design and evaluation: A visualization
aid for stability and support operations. International Journal of Industrial Ergonomics, 36(5), 389–407.
Potter, S. S., Gualtieri, J. W., & Elm, W. C. (2003). Case studies: Applied cognitive work analysis in the design of
innovative decision support. In E. Hollnagel (Ed.), Handbook of cognitive task design (pp. 653–678). Mahwah,
NJ: Erlbaum.
Potter, S. S., Roth, E. M., Woods, D., & Elm, W. C. (2000). Bootstrapping multiple converging cognitive task analysis techniques for system design. In J. M. Schraagen, S. F. Chipman, & V. L. Shalin (Eds.), Cognitive task
analysis (pp. 317–340). Mahwah, NJ: Erlbaum.
Raby, M., McGehee, D. V., Lee, J. D., & Nourse, G. E. (2000). Defining the interface for a snowplow lane-tracking
device using a systems-based approach. In Proceedings of the XIVth Triennial Congress of the International
Ergonomics Association and 44th Annual Meeting of the Human Factors and Ergonomics Society (pp.
3.369–3.372). Santa Monica, CA: Human Factors and Ergonomics Society.
Rasmussen, J. (1983). Skills, rules, and knowledge; Signals, signs, and symbols, and other distinctions in human
performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257–266.
Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. New York: North-Holland.
Rasmussen, J., Pejtersen, A. M., & Goodstein, L. P. (1994). Cognitive systems engineering. New York: Wiley.
Ritter, F. E., Shadbolt, N. R., Elliman, D., Young, R. M., Gobet, F., & Baxter, G. D. (2003). Techniques for modeling human performance in synthetic environments: A supplementary review. Wright-Patterson Air Force Base,
OH: Human Systems Information Analysis Center.
Ritter, F. E., & Young, R. M. (2001). Embodied models as simulated users: Introduction to this special issue on
using cognitive models to improve interface design. International Journal of Human-Computer Studies, 55,
1–14.
Roth, E. M., Christian, C. K., Gustafson, M. L., Sheridan, T., Dwyer, K., Gandhi, T. K., et al. (2004). Using field
observations as a tool for discovery: Analyzing cognitive and collaborative demands in the operating room.
Cognition, Technology & Work, 6, 148–157.
Roth, E. M., Malsch, N., Multer, J., & Coplen, M. (1999). Understanding how train dispatchers manage and
control trains: A cognitive task analysis of a distributed team planning task. In Proceedings of the Human
Factors and Ergonomics Society 43rd Annual Meeting (pp. 218–222). Santa Monica, CA: Human Factors
and Ergonomics Society.
Roth, E. M., & Multer, J. (2005). Fostering shared situation awareness and on-track safety across distributed
teams in railroad operations. In Proceedings of the Human Factors and Ergonomics Society 49th Annual
Meeting (pp. 529–533). Santa Monica, CA: Human Factors and Ergonomics Society.
Roth, E. M., Multer, J., & Raslear, T. (2006). Shared situation awareness as a contributor to high reliability performance in railroad operations. Organization Studies, 27, 967–987.
Roth, E. M., & Patterson, E. S. (2005). Using observational study as a tool for discovery: Uncovering cognitive and collaborative demands and adaptive strategies. In H. Montgomery, R. Lipshitz, & B. Brehmer
(Eds.), How professionals make decisions (pp. 379–393). Mahwah, NJ: Erlbaum.
Roth, E. M., Scott, R., Deutsch, S., Kuper, S., Schmidt, V., Stilson, M., et al. (2006). Evolvable work-centered support systems for command and control: Creating systems users can adapt to meet changing demands. Ergonomics, 49, 688–705.
Roth, E. M., Stilson, M., Scott, R., Whitaker, R., Kazmierczak, T., Thomas-Meyers, G., et al. (2006). Workcentered design and evaluation of a C2 visualization aid. In Proceedings of the Human Factors and Ergonomics
Society 49th Annual Meeting (pp. 332–336). Santa Monica, CA: Human Factors and Ergonomics Society.
Roth, E. M., & Woods, D. D. (1988). Aiding human performance: 1. Cognitive analysis. Le Travail Humain, 41,
39–64.
Roth, E. M., & Woods, D. D. (1989). Cognitive task analysis: An approach to knowledge acquisition for intelligent system design. In G. Guida & C. Tasso (Eds.), Topics in expert system design (pp. 233–264). Amsterdam:
Elsevier Science.
Roth, E. M., Woods, D. D., & Pople, H. E. (1992). Cognitive simulation as a tool for cognitive task analysis.
Ergonomics, 36, 1163–1198.
Salas, E., & Priest, H. (2005). Team training. In N. Stanton, A. Hedge, K. Brookhuis, E. Salas, & H. Hendrick
(Eds.), Handbook of human factors and ergonomics methods (pp. 44.41–44.47). Boca Raton, FL: CRC Press.
Salvendy, G. (2001). Handbook of industrial engineering: Technology and operations management. New York:
Wiley.
Sanderson, P. M. (2003). Cognitive work analysis. In J. M. Carroll (Ed.), HCI models, theories, and frameworks:
Toward a multi-disciplinary science (pp. 225–264). San Francisco: Morgan Kaufmann.
Sanderson, P. M., & Fisher, C. (1994). Exploratory sequential data analysis. Human-Computer Interaction, 9,
251–317.
Sanderson, P. M., & Watson, M. O. (2005). From information content to auditory display with ecological interface design: Prospects and challenges. In Proceedings of the Human Factors and Ergonomics Society 49th
Annual Meeting (pp. 259–263). Santa Monica, CA: Human Factors and Ergonomics Society.
Schaafstal, A., Schraagen, J. M., & van Berlo, M. (2000). Cognitive task analysis and innovation of training:
The case of structured troubleshooting. Human Factors, 42, 75–86.
Schraagen, J. M., Chipman, S. F., & Shalin, V. L. (Eds.). (2000). Cognitive task analysis. Mahwah, NJ: Erlbaum.
Schuler, D., & Namioka, A. (Eds.). (1993). Participatory design: Principles and practices. Mahwah, NJ: Erlbaum.
Seagull, F. J., & Xiao, Y. (2001). Using eye-tracking video data to augment knowledge elicitation in cognitive task
analysis. In Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting (pp. 400–403).
Santa Monica, CA: Human Factors and Ergonomics Society.
Seamster, T. L., Redding, R. E., & Kaempf, G. L. (1997). Applied cognitive task analysis in aviation. Burlington,
VT: Ashgate.
Sellie, C. N. (1992). Predetermined motion-time systems and the development and use of standard data. In G.
Salvendy (Ed.), Handbook of industrial engineering (2nd ed., pp. 1639–1698). New York: Wiley.
Shepherd, A., & Kontogiannis, T. (1998). Strategic task performance: A model to facilitate the design of instruction. International Journal of Cognitive Ergonomics, 2, 349–372.
Shepherd, A., & Stammers, R. B. (2005). Task analysis. In J. Wilson & E. N. Corlett (Eds.), Evaluation of human
work (3rd ed., pp. 129–158). Boca Raton, FL: CRC Press.
Shryane, N. M., Westerman, S. J., Crawshaw, C. M., Hockey, G. R. J., & Sauer, J. (1998). Task analysis for the
investigation of human error in safety-critical software design: A convergent methods approach. Ergonomics,
41, 1719–1736.
Simon, H. A. (1981). The sciences of the artificial (2nd ed.). Cambridge, MA: MIT Press.
Skilton, W., Cameron, S., & Sanderson, P. M. (1998). Supporting cognitive work analysis with the Work Domain
Analysis Workbench (WDAW). In Proceedings of the Australasian Computer Human Interaction Conference (pp.
260–267). Los Alamitos, CA: IEEE Computer Society.
St. Amant, R., Horton, T. E., & Ritter, F. E. (2007). Model-based evaluation of expert cell phone menu interaction. ACM Transactions on Computer-Human Interaction, 14, 1–24.
Stanard, T., Lewis, W. R., Cox, D. A., Malek, D. A., Klein, J., & Matz, R. (2004). An exploratory qualitative study
of computer network attacker cognition. In Proceedings of the Human Factors and Ergonomics Society 48th
Annual Meeting (pp. 401–405). Santa Monica, CA: Human Factors and Ergonomics Society.
Stanton, N. A. (2001). Hierarchical task analysis. In W. Karwowski (Ed.), International encyclopedia of ergonomics and human factors (pp. 3183–3190). Boca Raton, FL: CRC Press.
Tang, Z., Zhang, J., Johnson, T. R., Bernstam, E., & Tindall, D. (2004). Integrating task analysis in software usability evaluation: A case study. In Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting
(pp. 1741–1745). Santa Monica, CA: Human Factors and Ergonomics Society.
Taylor, F. W. (1911). The principles of scientific management. New York: Norton.
Tselios, N. K., & Avouris, N. M. (2003). Cognitive task modelling for system design and evaluation in nonroutine task domains. In E. Hollnagel (Ed.), Handbook of cognitive task design (pp. 303–330). Mahwah,
NJ: Erlbaum.
Vicente, K. J. (1990). A few implications of an ecological approach to human factors. Human Factors Society
Bulletin, 33(11), 1–4.
Vicente, K. J. (1999). Cognitive work analysis. Mahwah, NJ: Erlbaum.
Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors, 44, 62–78.
Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions
on Systems, Man, and Cybernetics, SMC-22, 589–606.
Watts, J. C., Woods, D. D., Corban, J. M., Patterson, E. S., Kerr, R. L., & LaDessa, C. H. (1996). Voice loops as
cooperative aids in space shuttle mission control. In Proceedings of the 1996 ACM Conference on Computer
Supported Cooperative Work (pp. 48–56). New York: ACM.
Weir, C. R., Nebeker, J. J. R., Hicken, B. L., Campo, R., Drews, F., & LeBar, B. (2007). A cognitive task analysis
of information management strategies in a computerized provider order entry environment. Journal of
the American Medical Informatics Association, 14(1), 65–75.
Woods, D. D. (1993). Process tracing methods for the study of cognition outside of the experimental psychology laboratory. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in
action: Models and methods (pp. 228–251). Norwood, NJ: Ablex.
Woods, D. D., & Dekker, S. W. A. (2000). Anticipating the effects of technological change: A new era of dynamics for human factors. Theoretical Issues in Ergonomics Science, 1, 272–282.
Woods, D. D., & Hollnagel, E. (1987). Mapping cognitive demands in complex problem-solving worlds. International Journal of Man-Machine Studies, 26, 257–275.
Woods, D. D., & Hollnagel, E. (2006). Joint cognitive systems: Patterns in cognitive systems engineering. Boca Raton,
FL: CRC Press.
Yin, R. (1989). Case study research: Design and methods. Newbury Park, CA: Sage.
Zachary, W., Ryder, J., Ross, L., & Weiland, M. (1992). Intelligent human-computer interaction in real time, multitasking process control and monitoring systems. In M. Helander & M. Nagamachi (Eds.), Human factors in
design for manufacturability (pp. 377–402). New York: Taylor & Francis.