
EXPLORATIONS OF COGNITIVE AGILITY:
A REAL TIME ADAPTIVE CAPACITY
by
DARREN J. GOOD
Submitted in partial fulfillment of the requirements
for the degree of Doctor of Philosophy
Department of Organizational Behavior
CASE WESTERN RESERVE UNIVERSITY
AUGUST, 2009
CASE WESTERN RESERVE UNIVERSITY
SCHOOL OF GRADUATE STUDIES
We hereby approve the thesis/dissertation of Darren Good, candidate for the Ph.D. degree*.

(signed) Richard Boyatzis (chair of the committee)
         David Kolb
         Sandra Russ
         Peter Whitehouse

(date) June 26, 2009

*We also certify that written approval has been obtained for any proprietary material contained therein.
This dissertation is dedicated to my wife, Rachel, who made a commitment to me, which
will always surpass in meaning what is on the pages to follow.
TABLE OF CONTENTS

Chapter I. Introduction ... 11
   i. Foundations of Cognitive Agility ... 14
   ii. Cognitive Intelligence and Adaptability ... 21
   iii. Focused Attention and Adaptability ... 22
   iv. Cognitive Openness and Adaptability ... 26
   v. Cognitive Flexibility and Adaptability ... 34
   vi. Cognitive Agility and Adaptability ... 41
Chapter II. Methods ... 43
   i. Study Overview ... 43
   ii. Research Design ... 44
   iii. Participants ... 45
   iv. Procedure ... 46
   v. Measures ... 48
   vi. Hypotheses ... 60
Chapter III. Results ... 65
   i. Descriptive Statistics ... 65
   ii. Reliability Estimates ... 69
   iii. Hypotheses 1-11 ... 70
   iv. Hypotheses 12-14 ... 74
   v. Factor Analysis ... 76
   vi. Hypotheses 15-24 ... 77
   vii. Hypotheses 25-27 ... 83
   viii. Post Hoc Results ... 89
Chapter IV. Discussion of Results ... 102
   i. Discussion of Hypotheses 1-11 ... 104
   ii. Discussion of Hypotheses 12-14 ... 107
   iii. Discussion of Hypotheses 15-24 ... 117
   iv. Discussion of Hypotheses 25-27 ... 122
   v. Discussion of Post Hoc Analysis ... 126
   vi. Limitations ... 130
Chapter V. Conclusion ... 135
   i. Implications ... 136
      1. Research ... 136
      2. Practice ... 142
Appendices ... 151
   A. Informed Consent ... 151
   B. Invitation Letter to Participants ... 155
   C. Invitation Letter to External Raters ... 156
   D. Thank You Completion Letter ... 159
Bibliography ... 160
LIST OF TABLES

Table 1: Measures for the Study ... 47
Table 2: Descriptive Statistics for the Dependent Variable ... 66
Table 3: Descriptive Statistics for Independent Variables of CA ... 67
Table 4: Descriptive Statistics for Intelligence Measures ... 68
Table 5: Scale Reliabilities ... 69
Table 6: Reliabilities for Performance Measures ... 70
Table 7: Correlation Matrix ... 71
Table 8: Pattern Matrix ... 77
Table 9: Regression Analyses for Individual Performance Measures ... 79
Table 10: Regression Analyses for Individual Self Reports ... 80
Table 11: Regression Analyses for Individual Other Rater Reports ... 82
Table 12: Regression Analyses for Cognitive Agility Performance Score ... 85
Table 13: Regression Analyses for Cognitive Agility Self Report Score ... 86
Table 14: Regression Analysis for Individual Perf Measures Beyond Controls ... 87
Table 15: Regression Analysis for Individual Self/Other Measures Beyond Controls ... 88
Table 16: Regressions for Cognitive Agility Perf Measure Beyond Controls ... 91
Table 17: Regressions for Cognitive Agility Self Report Measure Beyond Controls ... 93
Table 18: Regressions for Cognitive Agility Other Report Measure Beyond Controls ... 94
Table 19: Regressions for All Measures Beyond Controls ... 98
Table 20: Correlation Matrix – Women ... 99
Table 21: Correlation Matrix – Men ... 99
Table 22: Regression Analysis for Gender Moderation ... 101
Table 23: Multi Trait Multi Method Matrix ... 108
LIST OF FIGURES

Figure 1: Cognitive Agility Formative Construct ... 20
Figure 2: Networked Fire Chief ... 48
Figure 3: Go No Go ... 55
Figure 4: Card Rotations Test ... 59
LIST OF ABBREVIATIONS

ACS   Attentional Control Scale
AUT   Alternate Uses Test
BWVT  Basic Word Vocabulary Test
DDM   Dynamic Decision Making
LMS   Langer Mindfulness Scale
MAI   Metacognitive Awareness Inventory
NFC   Networked Fire Chief
Explorations of Cognitive Agility:
A Real Time Adaptive Capacity
ABSTRACT
By
DARREN J. GOOD
This dissertation proposes and tests a new construct applicable to real-time
adaptation. Cognitive agility is a formative construct that measures the individual ability
to exhibit cognitive flexibility, cognitive openness and focused attention. This research
seeks to demonstrate whether the formative construct of cognitive agility predicts
adaptive performance in a dynamic decision-making microworld. One hundred eighty-one undergraduates performed three consecutive trials, each of increasing difficulty and cognitive demand, in a microworld computer game called Networked Fire Chief (NFC). The changes within and between trials required participants to adapt their strategies flexibly, using both cognitive openness and focused attention, in order to score highly. The individual variables that
form cognitive agility, as well as the formative construct, explain unique variance beyond
measures of general intelligence on the total score of adaptive performance. Most
notably, the cognitive agility construct explains unique variance beyond general
intelligence in each of the respective methods of measurement (R²Δ = 11% for performance measures and R²Δ = 6% for both the self reports and the other rater reports). The
results indicate a novel combination of abilities that may further the study of real-time
adaptability.
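The incremental variance claims above rest on hierarchical regression: a baseline model containing only the intelligence measures is fit first, the cognitive agility predictors are then added, and R²Δ is the resulting gain in explained variance. The sketch below, using NumPy only, illustrates how such an R²Δ can be computed; the variable names and simulated data are placeholders for illustration, not the study's actual measures or results.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X (an intercept column is added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# Illustrative data only: two intelligence controls, three cognitive agility
# predictors, and one adaptive performance score for n participants.
rng = np.random.default_rng(0)
n = 181
g = rng.normal(size=(n, 2))                    # e.g., vocabulary and spatial scores
agility = rng.normal(size=(n, 3))              # openness, focus, flexibility scores
performance = g @ [0.4, 0.3] + agility @ [0.2, 0.2, 0.2] + rng.normal(size=n)

r2_controls = r_squared(g, performance)                    # Step 1: intelligence only
r2_full = r_squared(np.hstack([g, agility]), performance)  # Step 2: add agility
print(f"R2 change = {r2_full - r2_controls:.3f}")
```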
CHAPTER I
INTRODUCTION
Members of today’s organizations face rapid change. This is caused in part by
technological advancements (Hollenbeck and McCall, 1999; Ilgen & Pulakos, 1999;
Pulakos et al., 2002; Thach & Woodman, 1994), increased competition (Edwards and
Morrison, 1994), shorter product lifecycles (Bowonder and Miyake, 2000), the
boundaryless nature of careers (Arthur, Inkson, & Pringle, 1999; Boyatzis & Kram; Hall,
2002), cultural complexity (Cascio, 2003; Sanchez and Levine, 2001), globalization
(Black, 1990; Noe and Ford 1992) and an increase in mergers and acquisitions (Kinicki
and Latack, 1990). An additional level of complexity comes from the knowledge
worker’s need to self-develop in order to stay competitive (Boyatzis and Kram, 1999;
Hall & Mirvis, 1995; London and Mone, 1999; Senge, 1990). Taken together, this
combination of environmental elements creates a dynamic context of increased
complexity with which an individual must contend (Milliken, 1990; Weick, 1995).
These conditions foster a continual increase in the load of information processing
and dynamic decision-making demands that individuals face (Farhoomand and Drury,
2002; Kozlowski et al., 2001; Mendelson and Pillai, 1998; Schroder et al., 1967). While
these demands are more obvious in times of punctuated change, such as a shift in job
role, they are often less obvious within our daily routines at work. Yet, with the
constancy of change, individuals are often managing micro-momentary processes
(Porac, Thomas, & Baden-Fuller, 1989; Weick, 1995), such as dealing with the numerous
ongoing distractions that crowd one’s attention at work (e.g. ringing phones, influx of
emails, coworkers asking questions, information scraps and a list of tasks to be
accomplished). In essence, individuals at work have to manage the big changes along
with the continual ones (Kegan, 1994).
A common way that scholars suggest responding to both types of demands is
through individual adaptability, a construct that has been written about with heightened
frequency (Baird and Griffin, 2006; Goleman, Boyatzis and Kram, 1998; Edwards and
Morrison, 1994; Hesketh, 1997; Ilgen & Pulakos, 1999; O’Connel et al., 2008; Pulakos et
al., 2000; Smith et al., 1997). Adaptability is generally referred to as an ability to change
when necessary. However, scholars who explicitly contribute to the study of adaptability
do not often address the complexity inherent in the cognitive aspects of adaptive
performance, as it pertains to real-time dynamic tasks.
Understanding an individual’s ability to be adaptive at the cognitive level may be
a vital starting point to successfully navigating dynamic environments (Glynn, 1996). In
order to be adaptive in a real time dynamic context, one must create a new understanding
of information in the environment (Kozlowski, et al., 2001; Zaccaro, 2001), allow it to
alter the course of thinking when necessary (Mitroff et al, 2002; Savickas, 1997), and
remain focused on relevant information (Lustig et al., 2001). The purpose of this paper is
to propose and test a new construct: cognitive agility, a potential cognitive ability that can
support micro-momentary, or real-time, adaptive performance within the scope of a
single dynamic decision making task. One such context that has proved useful to
researchers studying cognition is the dynamic decision making microworld (DDM).
Microworlds (real-time interactive computer based simulations) mirror the complexity
seen in real life (Diehl and Sterman, 1995; Gonzalez et al, 2005) as they present real-time
decisions, a changing environment (Brehmer, 1992), nonlinear relationships and feedback
delays (Sterman, 2000). Dynamic decision making microworlds offer an experimental
alternative to the “paper and pencil” tests that are so often used in the assessment of
“dynamic, interactive, and time oriented phenomena” (DiFonzo, Hantula and Bordia,
1998, p. 280). Therefore, this study predicts that cognitive agility will lead to better
performance in such a dynamic decision making microworld. The variables that form
cognitive agility will be measured using multiple methods (performance measures, self
reports and other rater reports) in an attempt to demonstrate construct validity (Campbell
and Fiske, 1959).
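To make these defining features concrete, the toy loop below sketches a stripped-down dynamic decision task: decisions must be made every tick, the environment keeps changing on its own, and the effect of each decision arrives only after a delay. It is a schematic illustration under assumed dynamics, not the Networked Fire Chief simulation used in this study.

```python
import random

def run_microworld(decide, ticks=30, delay=3, seed=1):
    """A toy dynamic decision task: fires keep igniting on their own, and the
    suppression a player orders only takes effect several ticks later."""
    random.seed(seed)
    fires, pending, damage = 2.0, [0.0] * delay, 0.0
    for t in range(ticks):
        fires += random.uniform(0.0, 1.5)         # the environment changes by itself
        fires = max(0.0, fires - pending.pop(0))  # delayed feedback from past choices
        damage += fires                           # unattended fires accumulate damage
        effort = decide(t, fires)                 # the real-time decision for this tick
        pending.append(effort)                    # its effect arrives `delay` ticks later
    return damage

# Two illustrative strategies: a fixed response versus one that adapts to the state.
fixed = run_microworld(lambda t, fires: 0.7)
adaptive = run_microworld(lambda t, fires: min(2.0, 0.5 * fires))
print(f"damage with a fixed strategy:    {fixed:.1f}")
print(f"damage with an adaptive strategy: {adaptive:.1f}")
```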
This dissertation begins with a review of conceptualizations of adaptability and its
relation to the proposed formulation of the construct of cognitive agility.
Theoretical Foundations of Cognitive Agility
This section provides the theoretical background for the construct of cognitive
agility. Specifically, the section will demonstrate the relevance the construct has for real
time adaptive performance and how the name agility helps provide a novel perspective relative to other existing constructs.
Environmental influences on cognition (like novelty and complexity) are not fixed
but instead more perceptual (Haynie, 2005; Hilton, 1995; Neuberg, 1989; Schwarz, 1996;
Tetlock, 1992), suggesting individual differences in how information is attended to,
filtered, encoded and interpreted (Neisser, 1967). Given that today’s organizational
knowledge worker faces an increase in demands on judgment and decision-making skills
(Howard, 1995; Ilgen & Pulakos, 1999; LePine et al., 2000), an investigation into
cognitive-related constructs that support adaptability in real-time dynamic environments
is needed. This dissertation aims to partially fulfill that need.
Adaptability is often used as an overarching term to describe a set of individual
behaviors, leading to adaptation (Briscoe and Hall, 1999). Therefore, it becomes
important to unbundle the constituent concepts and seek greater clarity in both the
concepts and measures. Aspects of adaptability have been interpreted as parts of various
aspects of a person, including personality traits (Morrison, 1977; Mumford, et al., 1993;
Murphy, 1989), emotional intelligence (Boyatzis and Goleman, 1996; Goleman et al.,
1998; Goleman et al., 2002; Bar-On, 1995), learning style (Kolb, 1984; Kolb et al.,
2000), and cognitive style (Martinsen & Kaufman, 1999; Sternberg, 1997). Others admit
that, regardless of their definition or approach, adaptability is in part a cognitive
capability (Earley & Ang, 2003; Fiedler & Garcia, 1987; Haynie et al., 2008; Helson,
1964; Kozlowski, 1998; Mumford et al., 2000; Showers & Cantor, 1985; Tetlock, 1990).
In this dissertation I claim that Cognitive Agility is an emergent cognitive ability
necessary for adaptive performance within a single real-time dynamic context.
Adaptability is a broad ability to support change. Cognitive agility is a specific cognitive
ability that leads to increased performance in a context that requires a series of individual
adaptations. While adaptability is viewed generally as the capability to make a successful
change in either cognition, behavior or emotion in response to anticipated or actual
environmental shifts (LePine et al 2000), it is possible to further categorize it along the
dimensions of time and task. For instance, one can adapt in the moment, to real-time
tasks (Gonzalez, 2004; Lerch and Harter, 2001), or over a longer period of time, as in
adjusting well to a new job (Ashford and Taylor, 1990; Morrison, 1977). Also, one can
adapt within a particular task that is changing (Canas et al. 2003) or across various tasks
that make up a dynamic context, like adapting well to the introduction of a new
technology (Edmondson, Bohmer, and Pisano, 2001).
There have been numerous studies predicting abilities or characteristics of
individual adaptability (Pulakos et al., 2000), most of which, at least implicitly, track
adaptations across a longer time horizon and across multiple tasks. Fewer studies have
focused on the micro-momentary cognitive aspects of adapting within a real time
dynamic decision-making task (exceptions include Canas et al., 2003; LePine et al.,
2000). Therefore further inquiry into individual abilities that may lead to real time
adaptive performance is a necessary addition to the adaptability literature.
Adaptive Performance
Real time adaptive performance within a task (e.g. how well an individual performs
within a changing task - Kozlowski, et al 2001; LePine et al., 2000), likely requires a
range of unique skills and abilities.
A real-time dynamic decision task context is one in
which change, novelty, ambiguity and complexity are prevalent (Brehmer, 1992;
Mumford et al., 1993). Such a context presents the individual with a series of continuous
decisions to be made, each of which presents various task-related tradeoffs (Tversky et
al., 1988). In order to be adaptive in such a context one must be flexible enough to create
a new understanding of information in the environment (Smith, et al., 1997; Sparrow,
1994) and allow it to alter the course of his or her thinking when necessary (Mitroff et al.,
2002; Savickas, 1997). Altering the course of thinking requires the flexibility to override
a dominant or automatic response (Clark, 1996; Rende, 2000) in favor of a more
appropriate one.
This flexibility is naturally preceded by the ability to notice stimuli of consequence.
Kaplan and Simon stated that, “One of the distinguishing characteristics of insightful
problem solvers is that they are good noticers.” (1990, p. 396). Conversely, an inability to
encode vital elements of the task environment can result in a failure to adapt (Holland et
al., 1986; Combe & Greenley, 2002). Yet, attention is a scarce resource with regard to
decision-making within a real-time task (Simon, 1978). Therefore, noticing too much
information or allowing all information to dictate course change would limit one’s
adaptive performance (Brunstein & Olbrich, 1985; Kuhl & Kazen-Saad, 1988). As such
“distractibility” is a central concern of decision making within tasks, as frequent attention
capture can lead to a loss of decision effectiveness (Anderson, 1983). This loss of
effectiveness may be caused by distractions that lead to future relevant pieces of data
being missed (Shapiro and Raymond, 1944). Consequently, while it is important to
notice relevant stimuli for task adaptability, adaptive decisions are also supported by the
capacity to avoid less relevant environmental data (Shanteau, 1988).
Cognitive agility is a newly proposed construct that seeks to simultaneously synthesize and evolve the current conceptualizations of adaptability, adaptive performance, and
flexibility. Specifically, cognitive agility represents an individual cognitive ability to
flexibly operate with cognitive openness and focused attention. The following section
provides theoretical reasoning for the choice of agility as a construct name as opposed to
variations of adaptability or flexibility (i.e. other terms more commonly used to suggest a
range of appropriate and variable individual behavior).
There are multiple ways in the literature to describe the ability to make an
appropriate change in response to the environment. Adaptability as described throughout
this work is perhaps the most common naming convention (Pulakos et al., 1999). Yet,
adaptability is often operationalized as performance in a task that is complex, novel or
ambiguous (LePine et al.,2000; Mumford et al., 1993), regularly having been referred to
as adaptive performance (Allworth, 1997, 1998; Kozlowski et al., 2001). Therefore, describing adaptability as an ability leading to adaptive performance becomes tautological and potentially confusing when trying to build a deeper knowledge of the construct. Within the cognitive literature, some have simply labeled such adjustments as
‘cognitive adaptability,’ which means an ability to change decision frameworks or
knowledge to meet the environmental needs (Haynie, 2005). What this conceptualization
may be missing with regard to the current study are the cognitive needs of the particular
context. In real-time dynamic tasks, the speed of change is a necessary component, along
with trying to identify what the individual is changing to and from in terms of orientation
and decision frameworks. The context of a real-time dynamic task is different from adapting on a longer time horizon, across multiple tasks, and therefore requires a different
set of abilities (constructs) that may apply directly.
Cognitive flexibility is another commonly used construct to signify the
appropriate cognitive adjustment according to situational needs. Like adaptability, it is
often defined as an appropriate shift in behavior (Zaccaro et al., 1992). Yet its definitions
range from being described as creative (Isen et al., 1987) to having an awareness and
belief that situations have alternative options (Martin and Anderson, 1998); thus, making
it hard to pin down an agreed upon definition. Flexibility is often used synonymously
with adaptability (Calarco & Garvis, 2006; Canas et al., 2003; Goleman, 1998; Grattan et
al., 1999; Griffin and Hesketh, 2003; Murphy and Jackson, 1999; Pulakos et al., 2000),
further confusing the two constructs. Statements such as, “Cognitive flexibility is defined
as a person’s willingness to be flexible and adapt to the situation” are not at all
uncommon (Martin and Anderson, 1998, p. 1). Again, using the terms 'adapt' and 'flexible' together, in place of one another, or describing flexibility as being flexible, makes the understanding of the constructs less clear.
Cognitive flexibility is a construct whose conceptualization has not been fully
agreed upon. Yet across the diverse definitions is a theme of intelligently adapting to
one’s environment (Berg & Sternberg, 1985), through various forms of shifting,
restructuring or expanding cognition. It is suggested here that the ability to control one’s
thinking and change the decision strategy is just one aspect of cognition that allows for
more adaptive performance in dynamic contexts (Canas et al., 2003). Yet cognitive flexibility alone does not adequately describe what an individual is actually changing about his or her cognition. The particular change is likely relevant to specific environmental
contexts. The elements encountered in a real-time dynamic context likely require a
particular integration of flexibility with other necessary dimensions.
There is a need for identifying unique clusters of ability that support adaptability
in specific contexts. Therefore the word agility was chosen as it represents integration,
coordination and a balance of multiple abilities amidst changing conditions. It suggests
an ability to do so quickly, which lends itself to creating a proper fit with the speed
oriented requirements of adapting in a real time dynamic task. At the organizational level,
agility describes the capacities of the firm to respond quickly to continual and ongoing
environmental changes (Kodish et al., 1995). Therefore cognitive agility is a unique
construct at the individual-cognitive level that is predicted to lead to adaptive
performance in the specific context of a real-time dynamic task.
As a formative construct, cognitive agility includes the variables of cognitive
openness, focused attention and cognitive flexibility. These three variables represent a
cluster of abilities that operate in unison within a task that demands dynamic real-time
updates. The emergent formative construct of cognitive agility is shown in Figure 1.
Each of these variables is considered in greater detail below.
Figure 1: Cognitive Agility Formative Construct
[Diagram: cognitive openness, cognitive flexibility, and focused attention combining to form cognitive agility.]
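Because cognitive agility is specified formatively, the construct score is built up from its three components rather than treated as their common cause. One minimal way to operationalize such a composite, assuming for illustration equal weights on standardized component scores, is sketched below; the study's actual scoring procedure may differ.

```python
import numpy as np

def formative_composite(openness, focus, flexibility, weights=(1/3, 1/3, 1/3)):
    """Form a cognitive agility index from its three components: each component is
    z-scored so all contribute on a common scale, then combined with fixed weights."""
    components = np.column_stack([openness, focus, flexibility])
    z = (components - components.mean(axis=0)) / components.std(axis=0)
    return z @ np.asarray(weights)

# Illustrative component scores for five participants (arbitrary numbers).
openness = np.array([3.2, 4.1, 2.8, 3.9, 3.5])
focus = np.array([0.61, 0.74, 0.55, 0.80, 0.68])
flexibility = np.array([12.0, 15.0, 9.0, 14.0, 11.0])
print(formative_composite(openness, focus, flexibility))
```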
This study focuses on the real-time aspect of adaptability within a particular
dynamic task. The task chosen for this study is a dynamic decision-making microworld –
which will be reviewed in greater detail in the Methods section. In the particular
microworld used (the Networked Fire Chief), as well as others, intelligence has been shown to
be a predictor of success (Ackerman, 1992; Brehmer & Dörner, 1993; Gonzalez et al.,
2005). This study attempts to support these findings and extend them by demonstrating
that cognitive agility impacts adaptive performance beyond intelligence in a microworld.
The following section provides a review of the variables being tested in this study and
demonstrates how each is related to adaptive performance in a dynamic decision making
context. Subsequent sections include the methods of testing the individual variables and
the construct of cognitive agility, the presentation of results, and a discussion of study
outcomes, limitations and implications for future research and practice.
Intelligence and Real-Time Adaptability
Intelligence is the ability to balance the demands of the situation to adapt with
success (Sternberg, 1999). General intelligence (g) refers to the individual capability to
“broadly” comprehend one’s environment and effectively plan a response (Gottfredson,
1997). Others relate intelligent thought to one’s ability to choose the most effective
strategies in novel scenarios (Frensch & Sternberg, 1989). Both interpretations are
supported as higher levels of g are associated with greater results in novel tasks (Hartigan
and Widgor, 1989; Hunter & Hunter, 1984), and, as task complexity increases, a stronger
relationship is found between performance and ability (Ackerman, 1988; Kyllonen,
1985). Furthermore, individuals with higher intelligence are shown to be more able to
modify attentional processes than those with lower intelligence (Shafer, 1979; Shafer &
Marcus, 1973). Multiple forms of intelligence have shown a predictive capacity in
determining outcomes in dynamic decision-making scenarios (Gonzalez et al., 2005) and
g has been shown to predict adaptive performance in particular (LePine et al., 2000;
Pulakos et al., 2002; Zaccaro, 2001).
The Dynamic Decision Making scenario used in this study requires one to adapt
in a novel and complex environment (Sterman, 2000). Because past dynamic decision
making studies have shown that intelligence predicts success in the changing context(s)
(Ackerman 1992; Gonzalez et al., 2005; Rigas, Carling, and Brehmer, 2002; Schoppeck,
1991), it should predict higher performance in the particular dynamic decision making
environment – a changing scenario with conflicting goals and numerous options to
choose among (Brehmer, 1995; Omodei et al, 2001).
Focused Attention and Real-Time Adaptability
Focused attention, in general, describes the ability to attend to relevant stimuli and
ignore distracting ones (Lustig et al., 2001). It serves the individual in being able to filter
out information (Kahneman & Treisman, 1984). By bringing focus to essential stimuli,
while suppressing awareness of opposing distractions (Van Zomeren & Brouwer, 1994;
Kahneman, 1973), one can limit incoming and potentially distracting information, instead
of searching for novelty (Lustig et al., 2001). Focused attention is one of many attentional capacities cited to describe the ways that information is selected and filtered (Neisser, 1966).
Various other attentional components, including selective attention and sustained
attention, share similarities or relationships and support the understanding of focused
attention as researched here. These complementary terms will be explored, as will a
brief overview of attention in order to frame the manner in which focused attention
operates within the framework of adaptive performance.
All cognitive processes utilize attention to some degree (Kahneman & Treisman, 1984). Attention is a resource and, like all resources, it is limited. A wide range of conceptualizations regarding models of attention exists (Goldhammer & Moosbrugger,
2006). Human attention, defined generally as stimulus selection (Gibson, 1969), also
includes maintenance, focus and resource allocation and the processes through which
sensory input is “transformed, reduced, elaborated, stored, recovered, and used”
(Neisser, 1966). Coull (1998) characterized attention as "... the appropriate allocation of
processing resources to relevant stimuli" (p. 344). Attention is an idea that encompasses
many aspects of the human information processing experience.
Scholars often describe the perception of attention through metaphors, which
assist in understanding its operations. Some metaphors of attention include: a filter
(Broadbent, 1958), effort (Kahneman, 1973), a resource (Shaw and Shaw, 1978), a
control process of short-term memory (Shiffrin and Schneider, 1977), orienting (Posner,
1980), a spotlight (Tsal, 1983), a gate (Reeves and Sperling, 1986), a zoom lens (Eriksen
and St. James, 1986), and a selective channel (LaBerge and Brown, 1989). For instance,
in the zoom lens metaphor, focused attention is viewed as a relatively swift process in
which there is a rich output in a limited area (Eriksen & St. James, 1986). In the gate
metaphor, attention allows information to enter, but to do so an individual must
cognitively shut one stream of information in order to open another (LaBerge, 2004). The
metaphors used to describe attention inform and structure our models of
conceptualization (Gibbs, 1994).
Metaphors for attention have led to several important models which help deepen the
conceptualization of attention. Posner serves as a key figure in the conceptualization of
attentional models (Posner & Boies, 1971; Posner & Rafal, 1987; Posner & Petersen,
1990). In his view, the multi-component model of attention separates the capacities of
alertness and selectivity (Posner & Boies, 1971). Posner proposed that within selectivity
there is a separate 'executive' branch of the attentional system, which is accountable for
focusing attention on chosen aspects of the environment (Posner & Petersen, 1990).
Sturm and Zimmermann extended Posner’s model to show a hierarchy of
attention with both upper and lower levels (Sturm and Zimmermann, 2000). The upper
level contains intensity and selectivity. Intensity is associated with alertness, sustained
attention, and vigilance. Similar to Posner’s model, the selectivity portion of this level
includes focused or selective attention, visual spatial selective attention, attentional
switching, and divided attention (Sturm and Zimmermann, 2000; Van Zomeren and
Brouwer, 1994). Each of these models describes focused attention as a selective process
controlled by an executive system. Yet they do not describe the process of holding this
attention to filter out irrelevant information.
There are several terms used to describe the process of holding attention. For
instance, sustained attention, or vigilance, refers to the ability of observers to maintain a
high level of mental concentration over extended periods of time (Rose et al., 2002).
DeGangi and Porges (1990) suggest a three-stage model of sustained attention. In their
model, sustained attention runs the course of attention getting, attention holding, and attention releasing. In this model, attention is held by information that is
novel or complex. This however suggests that the executive system has less conscious
control of sustaining selection or focus, as it is dependent on environmental cues.
In contrast, James wrote of sustained attention as a fleeting activity: "There is no
such thing as voluntary attention sustained for more than a few seconds at a time"; "what
is called sustained voluntary attention is a repetition of successive efforts which bring
back the topic to the mind" (James, 1890, vol. 1, p. 420). James describes the struggle of
bringing attention to a stimulus and speaks about an ongoing interference with information
that is irrelevant to current intention.
The ability to limit incoming information is necessary. Although Coull (1998)
characterized attention as "... the appropriate allocation of processing resources to
relevant stimuli" (p. 344), this assumes a simplification, and does not account for an
individual’s degree of skill or ability to regulate and deflect incoming data, which is
focused attention.
Focused attention (Schneider et al., 1984) generally describes the ability to attend
only to relevant stimuli and to ignore distracting ones (Lustig et al., 2001). Posner
proposed that there is a separate 'executive' branch of the attentional system, which is
responsible for focusing attention on selected aspects of the environment (Posner &
Petersen, 1990). Focused attention can be thought of as part of selective attention, which
is the capacity to bring focus or attend to essential stimuli, while suppressing awareness
of opposing distractions (Van Zomeren & Brouwer, 1994; Kahneman, 1973). It is this
last point that subtly differentiates focus from selection. Selective Attention describes the
choice of possible stimuli to focus on and not the reduction or filtering out of irrelevant
stimuli (Kahneman & Treisman, 1984).
The incapacity to deny entry to irrelevant information, or to prevent needless information processing, marks a failure of focused attention. Focusing attention can fail due to a
situational occurrence that disrupts ongoing attention (Shiffrin and Schneider 1977,
Yantis, 1993; Theeuwes 1994). This disruption can ‘grab’ attention and lead to future
relevant pieces of data being missed (Shapiro and Raymond, 1944) or the dilution of
attentional resources to accomplish the intended goal.
The use of attentional resources is vital to understanding real time adaptation. In
dynamic situations, one must be able to manage the effects of incoming information
within the current course of cognitive action. With ongoing changes to the environment,
focusing attention can fail due to a situational occurrence that disrupts ongoing attention
(Theeuwes 1994; Yantis, 1993). Adaptive performance within a task requires that focus
be paid to relevant data, which provide ongoing cues to adjust activity accordingly. If
focused attention were low, then the individual may quickly find a dilution of his or her
attentional resources in relation to accomplishment of an intended task. In essence, he or
she may become overwhelmed by information, thus not following a coherent decision
path toward adapting well with the changes (Anderson, 1983).
Cognitive Openness and Real-Time Adaptability
Cognitive openness as a concept is associated with several streams of existing literature, each of which is linked to real time adaptability. There are no existing
instruments to measure, explicitly, the cognitive aspects of being open. Yet, the construct
of openness in the psychological literature is most often cited as ‘openness to
experience,’ as used in the literatures of Big Five Personality Trait Measurements (Costa
& McCrae, 1985). Still other constructs such as creativity (Gough, 1979), curiosity
(Littman, 2005), and mindfulness (Langer, 1989), while they do not always employ the
term openness per se, also highlight the importance of “being open” or employing related
qualities of cognition in respect to dynamic contexts. Therefore, the theoretical links
between adaptability and aspects of openness to experience, curiosity, creativity and
mindfulness are established here to support the choice of terminology and measurement.
Each of these constructs and its application to real time adaptive performance is briefly reviewed below.
In the psychological literature the term “openness” is most often attributed and
understood in the context of The Big Five Personality Trait called ‘Openness to
Experience’ (Costa & McCrae, 1985; McCrae & Costa, 1985). This use of the term
openness was first measured by Costa & McCrae (1985) using the NEO Personality
Inventory, a 240 item self report (NEO PI-R; Costa & McCrae, 1992). Openness in this
context is used to describe a personality type with a proclivity toward sensation seeking
and an interest in the non-routine. Since open individuals are comfortable with ambiguity,
tend to like intellectual problems and seek novelty (Costa & McCrae, 1992; LePine et al,
2000; King et al., 1996), they are more likely to adapt appropriately to changing
conditions (Blickle, 1996).
Measures of ‘openness to experience’ on the Big 5 Personality Inventory (e.g.
NEO-PI), indicate individuals who score highly on this dimension as being receptive to
new ideas, experiences and perspectives. McCrae and Costa (1997) state that ‘‘openness
is seen in the breadth, depth, and permeability of consciousness, and in the recurrent need
to enlarge and examine experience’’ (p. 826). Openness as used in the Big Five is often
conceptually related to creativity to the point that scholars use the term creativity to refer
to openness (Digman, 1990; Matthews & Deary, 1998).
Creativity is the ability to produce novel and appropriate solutions to problems
(Amabile, 1996; Lubart, 1994; Jackson & Messick, 1973; Mackinnon, 1962; Sternberg &
Lubart, 1996). Creativity has been conceptualized as a personality characteristic
(Spearman, 1923, 1927), a thought process (Ward, 1994), an outcome (Amabile, 1996),
and has been measured in a variety of ways (Davis, 1971). Creative individuals and
creative processes are associated with the adjustments necessary in adapting to change
(Mumford et al., 1991).
As a personality characteristic, creative individuals are often regarded as
independent, curious, open-minded, artistic, attracted to novelty (Davis, 1992) and
possessing the ability to think divergently (Guilford, 1960). As a process, creativity is
often differentiated into the consecutive stages of preparation, incubation, illumination
and verification (Lubart, 1994). As an outcome, creativity is viewed in terms of the value
of a product (Bailin, 1988).[1] Creative individuals are open to new ideas and are able to establish associations, distinctions, and unique perspectives that less creative individuals cannot. In a real time dynamic context, creative individuals are more able
to find ways of approaching a complex situation (Mumford et al., 1993).
One important connection creativity has with cognitive openness is the ability for
divergent thinking. Divergent thinking provides the capacity for the creation and
inclusion of new ideas (Wallach & Kogan, 1965; Hyman, 1964; Torrance, 1974;
Guilford, 1967; Guilford, 1960). The cognitive aspects of creativity have been strongly
associated with the concept of divergent thinking (Baer, 1993; Scratchley & Hakstian,
2000), which is defined as an ability to generate as many responses as possible to a
stimulus (Guilford, 1950, 1957). Wallach (1970) states that divergent thinking is
dependent on the flow of ideas and the "fluidity in generating cognitive units" (p. 1240).
Divergent thinking has also been linked to a broad scanning ability that is distinct from
[1] Bailin states that "The only coherent way in which to view creativity is in terms of the production of valuable products" (1988, p. 5). While this is a widely held belief by creativity scholars, this paper will not investigate creativity as an outcome since the main focus is on the general capacity to engage with novelty and not the capacity to create a product in a particular domain.
general intelligence (Runco, 1991) and suggestive of one’s ability to see more stimuli in
an environment (Mendelsohn, 1976; Mendelsohn & Griswold, 1964, 1966). This wide
breadth of attention improves the chances that otherwise unseen stimuli will be used to
create new ideas (Mendelsohn, 1976). Cognitive openness is supported by a divergent
thinking process and ability and is therefore closely related to this dimension of
creativity.
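Divergent thinking is typically scored from open-ended responses such as those produced on the Alternate Uses Test (AUT) listed among this study's measures. The fragment below sketches two commonly reported scoring dimensions, fluency (the number of distinct responses) and flexibility (the number of distinct idea categories those responses span); the responses and category assignments shown are hypothetical and serve only to illustrate the computation.

```python
def score_divergent_thinking(responses, categories):
    """Score an Alternate Uses protocol: fluency is the number of distinct responses;
    flexibility is the number of distinct idea categories those responses fall into."""
    unique = set(responses)
    fluency = len(unique)
    flexibility = len({categories.get(r, "uncategorized") for r in unique})
    return fluency, flexibility

# Hypothetical responses to "list uses for a brick," with illustrative categories.
responses = ["doorstop", "paperweight", "bookend", "garden border", "hammer"]
categories = {
    "doorstop": "weight", "paperweight": "weight", "bookend": "weight",
    "garden border": "construction", "hammer": "tool",
}
print(score_divergent_thinking(responses, categories))  # (5, 3)
```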
Theoretical models of curiosity inform cognitive openness in that both
demonstrate a desire to engage in exploratory behavior (Voss & Keller, 1983; Peterson &
Seligman, 2004). Curiosity has been defined as a motivation toward exploratory
behavior, which results in acquisition of new knowledge (Berlyne, 1949; Collins, Littman
& Spielberger, 2004; McDougall, 1921). Curiosity has been recognized by multiple
scholars as both sensation seeking (Scroth & Lund, 1994; Zuckerman, 1979), and
exploratory behavior (Voss & Keller, 1983). It is measured as both a motivational state
(Berlyne; Day, 1969) and as an individual trait (Spielberger & Butler, 1971). It is
broadly considered an approach behavior (Loewenstein, 1994), suggesting an active
expression of openness toward stimuli. Several curiosity models demonstrate its variation
in the literature and its general association with adapting in a dynamic context (Reio
and Wiswell, 2001).
An early model known as the “curiosity drive model” suggests that individuals
require a sense of coherence in their cognitive frames and will seek out information to
keep a sense of contiguity intact. Therefore, situations of novelty or complexity, which
can create “uncertainty”, compel individuals to reduce this disruption through curiosity
seeking behavior (Berlyne, 1950; 1955). This model is very similar to Loewenstein’s
“knowledge gap/approach gradient” theory of curiosity in which individuals discover
novelty and then recognize a gap in their knowledge (1994).
Slightly different is the optimal arousal model, which states that curiosity is
simply an enjoyable experience that individuals seek (Peterson & Seligman, 2001).
The optimal arousal model suggests that there is an optimal point between being over-aroused and under-aroused. Litman and Jimerson (2004) attempt to reconcile these two prevailing
models (Curiosity as Drive and Optimal Arousal) through the interest-deprivation model
(ID). This model includes two main components: Curiosity as a Feeling of Interest (CFI), in which individuals experience curiosity as a feeling of interest, and Curiosity as a Feeling of Deprivation (CFD) (Litman & Jimerson, 2004). CFI is suggestive of
a perpetual need that is satisfied by a sense of openness. CFI, which the authors compare
to ‘openness to experience,’ is motivated by a genuine interest in wanting to know
something new. Conversely, CFD is stimulated when one perceives to be lacking some
necessary information. Individuals differ in their propensity to experience both CFI and
CFD (Litman, 2005).
In yet another model, Berlyne divided notions of curiosity into two types:
epistemic and perceptual. Perceptual curiosity is defined as "the curiosity which leads to increased perception of stimuli" (p. 180), whereas epistemic curiosity reflects a desire to
reduce knowledge gaps. Berlyne also made another delineation in terms of curiosity
between two kinds of exploratory behavior, diversive and specific (1960). In diversive
exploratory behavior, individuals are compelled by boredom and need for challenge to
“seek stimulation regardless of source of content” (p. 26). In contrast, specific
exploratory behavior is compelled by the search for depth of knowledge in a certain
domain or activity (Kashdan et al., 2004). These two modes of curiosity are hypothesized to work in conjunction, with diversive behavior leading to contact with novelty and specific curiosity attempting to resolve the uncertainty created by the new stimuli (Day, 1971; Krapp, 1999; Kashdan, 2004). Each of these models of curiosity helps to support a
mindful approach to novelty seeking (Langer, 1997).
Another exploratory mindset with a distinct similarity to cognitive openness is found in certain aspects of Langer's version of mindfulness (1989; Bodner & Langer, 2001).
Individuals with a propensity toward being mindful, with respect to openness, are able to
“1) view a situation from several perspectives, and (2) see information presented in the
situation as novel “(Langer, 1997 p.111). Like the other constructs discussed in this
chapter, mindfulness suggests a comfort with the non-routine and an open-mindedness.3
Mindfulness is a state in which a person is both aware that his/her understanding
of a situation is always subject to alternative interpretations and is willing to direct his/her
attention toward creating those other interpretations (Bodner, 2000). The mindful
individual is able to discern differences between the past and the present and
discover uniqueness when encountering cues. The mindful individual actively creates
novel distinctions, thus leading to an increase in multiple interpretations of any stimuli or
situation (Langer, 1978, 1989, 1992, 1997; Bodner & Langer, 2001). The creation of
novel distinction begins in part by the propensity of the mindful individual to approach
novelty.
[3] In contrast, Langer discusses mindlessness as the normative human attentional mode in which one's approach to information is reliant on automatic cognitive functioning. Mindlessness is a state of automaticity in which individuals behave like robots and can be considered a passive version of cognitive closedness, in contrast to openness. This is nicely captured in the familiar experience of driving home from work on the same route, a task we can accomplish without any active attention paid to the necessary turns. More subtly, the mindless state can be found in social interaction when we automatically categorize people with pre-constructed expectations, thus disregarding the sensitivity of context.
A particular aspect of mindfulness that is relevant for cognitive openness is
novelty seeking behavior (Epstein 2003, Runco 1994, Langer, 1989; Stokes, 1999) or a
preference for novelty (Houston & Mednick, 1963). Novelty seeking as a personality trait
is defined by Pearson (1970) as:
a tendency to approach versus a tendency to avoid novel experiences. It is a
disposition toward changing, new or unexpected experiences versus a disposition
to avoid these experiences. The degree of novelty in any one experience is a
function of the discrepancy between an individual’s past experience and the
present one. (p. 199).
Novelty seeking is most commonly measured through self reports (Edwards, 1966;
Garlington & Shimona; Jackson, 1967; Pearson, 1970). These self reports often identify
a series of specific desires to ‘try new things’ or ‘go to new places.’ Fast paced dynamic
decision making scenarios present the individual with tremendous novelty.
Aspects of one’s openness, curiosity, creativity and mindfulness likely affect real
time adaptive performance in a dynamic context. For instance, openness to experience
has been expressed as part of a higher order factor along with extraversion to explain
plasticity in personality (DeYoung, Peterson, and Higgins, 2002). John & Srivastava
(1999) explain that this higher-order factor is a meta-trait and includes the overarching
concerns of an organism to incorporate novel information into its organization, as the
state of the organism changes both internally (developmentally) and externally
(environmentally). Thus, openness to experience would support one in noticing and
including more novel data, supporting adaptation in a changing and dynamic context.
In general, adaptability places a ‘premium’ on creativity (LePine et al., 2000).
Creativity and specifically one’s capacity for thinking divergently allow for the creation
and inclusion of new ideas (Guilford, 1960; Torrance, 1974). As presented above,
divergent thinking is defined as an ability to generate as many responses as possible to a
stimulus (Guilford, 1950). Divergent thinking has also been linked to a broad scanning
ability that is distinct from general intelligence (Runco, 1991) and as previously stated,
divergent thinking is suggestive of one’s ability to see more stimuli in an environment
(Mendelsohn, 1976; Mendelsohn & Griswold, 1964, 1966). This wide breadth of
attention improves the chances that otherwise unseen stimuli will be used to create new
ideas (Mendelsohn, 1976).
Curiosity, as a need to reduce knowledge gaps and as an inherent interest in understanding something more deeply, seems to be foundational for adaptation. As contextual changes are encountered in a dynamic environment, new ways of behaving are
necessary. Curiosity implies "a direction of development toward differentiated
interaction patterns and more effective problem solving" (Voss and Keller 1983, p. 156).
Curiosity does not ensure successful adaptation but it starts the process of
experimentation and discovery that leads to adaptivity and innovation. Adaptability is
strongly linked to curiosity as "that factor which underlies the willingness of an
individual to expose himself to information" (Day, Langevin, Maynes, & Spring, 1972, p.
330). Adaptability within a dynamic task requires that one be open to the changes that
take place, and be able to notice new information when appropriate.
Mindful awareness allows individuals to be more adaptive (Langer 1989). One
area within mindfulness where this is most salient is in regard to novelty seeking (Bodner
and Langer, 2001). Novelty seeking from a mindfulness conceptualization and
measurement perspective includes aspects of openness, curiosity and creativity (Bodner
and Langer, 2001). In a real time dynamic context this combined resource supports one
in seeking understanding (curiosity), embracing change (novelty/openness) and searching
for ways to categorize information (creativity) that could lead to adaptive performance.
Cognitive Flexibility and Real-Time Adaptability
Cognitive flexibility is linked to adaptive performance within a real time dynamic
decision-making context (Canas et al., 2003). Cognitive flexibility is both a metacognitive and executive function that supports successful adaptation through its
underlying components of cognitive monitoring and cognitive control (Clark, 1996). This
notion of cognitive monitoring and control (Monsell, 1996; Rostan, 1994) allows one to
overcome dominant or automatic response sets (Jost, et al., 1998; Diekman, 1994), and
shift to a more appropriate contextual response (Reder & Schunn, 1999). Cognitive
flexibility, the ability to cognitively control and shift mental set (Canas, 2003; Clark,
1996; Rende, 2000), is necessary in real-time adaptive performance.
It is instructive to note at the outset that definitions of cognitive flexibility
encompass a diverse range of conceptualizations. For example, cognitive flexibility has
been described as being creative (Isen et al., 1987), attending to novel events (Barcelo &
Perianez, 2002), demonstrating divergent thinking (Guilford, 1959), intelligently adapting to one's environment (Berg & Sternberg, 1985), being open to new views within a
domain (Scott, 1967), having the ability to see distinctions (Murray et al 1990), forming
associations or interrelationships (Murray et al., 1990; Mednick, 1962), switching
strategies (van Zomeren, 1981; Showers & Cantor, 1985), shifting sets (Luchins &
Luchins, 1959), having an awareness and belief that situations have alternative options
(Martin and Anderson, 1998), displaying the ability to restructure knowledge (Spiro &
Jehng, 1990; Chi, 1997; Mumford, Baughman, Maher, Costanza & Supinski, 1997;
Perkins, 1988), and using cognitive control to override automatic responses (Canas, 2003;
Clark, 1996; Rende, 2000).
Another way cognitive flexibility has been defined, which is relevant to real time
adaptive performance, is an individual’s ability to shift attention in order to respond to
the environment in a new way (Eslinger & Grattan, 1993). This shift can either be
initiated within oneself or dictated by external requirements that necessitate a new way of
responding (Richard et al., 1993). The distinction of cognitive flexibility can be more
specifically described in the separation of reactive or adaptive flexibility and spontaneous
flexibility4 (Elinger & Grattan, 1993). And while this describes where the need for
flexibility originates it does not explain what underlying abilities are necessary to create
this attentional shift.
Using the analogy of a train track, cognitive flexibility serves as the “switch” in
determining track direction. On one level, cognitive flexibility is about cognitive
monitoring—noticing the need to switch direction. On another level, it is about cognitive
control—being able to execute the track change. These two dimensions, monitoring and
control, are what make cognitive flexibility and ultimately cognitive agility possible.
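The train track analogy can be read as a simple monitor-and-control loop: monitoring watches how well the current strategy fits the situation, and control switches strategies when the mismatch grows too large. The sketch below is purely an illustrative rendering of that analogy, with an assumed error threshold; it is not a cognitive model drawn from the literature.

```python
def monitor_and_control(strategies, feedback, threshold=0.5):
    """Walk through a stream of feedback signals; whenever the monitored error
    exceeds the threshold, control 'switches the track' to the next strategy."""
    current, switches = 0, []
    for step, error in enumerate(feedback):
        if error > threshold:                          # monitoring: notice the mismatch
            current = (current + 1) % len(strategies)  # control: execute the switch
            switches.append((step, strategies[current]))
    return switches

# Hypothetical feedback stream: larger numbers mean the current strategy fits worse.
print(monitor_and_control(["scan widely", "focus narrowly"],
                          [0.2, 0.3, 0.7, 0.1, 0.8]))
# -> [(2, 'focus narrowly'), (4, 'scan widely')]
```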
[4] The other type of flexibility, spontaneous flexibility, is similar to divergent thinking. It is broken into ideational and semantic forms: ideational refers to the number of ideas and semantic to the variety of ideas.
Cognitive monitoring and cognitive control are two aspects of regulation or
flexible use of cognition. They are encompassed within the established literatures of
cognitive control (Groborz and Necka, 2003) and metacognition (Moses & Baird,1999;
Paris & Lindauer, 1982; Pintrich et al., 1993; Nelson & Narens, 1994; Schraw &
Dennison, 1994; Flavell, 1979). For example, metacognition generates flexible cognition
if someone perceives through cognitive monitoring that they do not fully comprehend an
experience (Flavell and Wellman, 1977; Brown and Palinscar, 1982; Whimbey and
Lochhead, 1999). In such a situation, he or she may choose, through cognitive regulation, to
increase attentional focus to reduce other distractions (Hacker, 1998). Similarly,
cognitive control is an executive capacity which allows one to overcome automatic
responses in favor of more controlled consciousness, allowing contextual cognitive
adjustments (Norman & Shallice, 1986; Fernandez-Duque & Johnson, 1999).
Therefore, cognitive flexibility, as conceptualized here, is a meta-cognitive
regulative capability. Meta-cognition, which is commonly referred to as “thinking about
thinking” (Flavell, 1979), includes both the knowledge and regulation of cognitive
activity (Moses & Baird, 1999).5 The regulation component of meta-cognition is
synonymous to cognitive flexibility, as it encompasses bottom up processes like
monitoring (e.g., error detection, source monitoring) (Fernandez-Duque,et al., ,2000), and
the top-down processes of cognitive control (Fernandez-Duque, et al.,2000) which
include error correction, inhibitory control, planning and resource allocation (Nelson &
Narens, 1990; Reder & Schunn, 1996). The processes or abilities of cognitive control
[5] Cognitive Knowledge is…which is not necessary for the present examination of flexibility.
and metacognitive regulation are presented as brief reviews below to support the
theoretical development of cognitive flexibility.
Strategies used in cognitive operations that are monitored and controlled rely on
metacognition. Metacognition is essentially ‘thinking about thinking’ (Schacter, 1996;
Wyer & Srull, 1989), but has several other definitions (Van Zile-Tamsen, 1994, 1996)
and models (Flavell, 1979; Brown, 1987; Tobias, 1995). This broad ability is
underpinned by the abilities to monitor (Moses & Baird,1999; Paris & Lindauer, 1982;
Pintrich et al., 1993) and regulate thinking (Moses & Baird,1999; Nelson & Narens,
1994; Schraw & Dennison, 1994). Metacognition refers to "One’s knowledge concerning
one’s own cognitive processes and products or anything related to them (...) [and] refers,
among other things, to the active monitoring and consequent regulation and orchestration
of these processes (...), usually in the service of some concrete goal or objective” (Flavell,
1976, p. 232). Metacognition exists at a higher level of the strategy processing hierarchy
and is less domain specific than the employed cognitive strategies (Flavell, 1979), suggesting a general ability beyond expertise or task exposure.[6]
Metacognition requires an ability to monitor one’s ongoing thought processes as
they relate to accomplishing a goal (Nelson & Narens, 1990). Monitoring is used to scan
the mental landscape in order to see if attention is being properly distributed according to
the task at hand (Pintrich & Schrauben, 1992). Metacognitive awareness also responds to
whether attention allocation with regard to strategy use is working successfully
(Montague, & Bos, 1990; Tobias, 1995; Weinstein & Mayer, 1986). Monitoring works
[6] 'Knowledge about' cognitive states refers to awareness of prior knowledge about a specific task and one's ability in that task. This is more domain specific and therefore an area of metacognition that I will not explore in as much detail as regulation.
in concert with regulation, as awareness through monitoring creates ground for controlled
change (Hart, 1965; Koriat & Goldsmith, 1996; Nelson, Dunlosky, Graf, & Narens, 1994;
Vernon and Usher, 2003). A mismatch between goals and what one deduces through current
monitoring creates a subjective awareness of a need to change cognitive strategies.
The controlled component of metacognitive regulation is very similar to cognitive
control (mentioned above and described in greater detail below) or the executive function
that is said to guide so much of our conscious behavior (Posner & Snyder, 1975). Yet,
despite the obvious connections between metacognitive regulation and executive control
that exist, the two streams of literature are rarely associated (Carlson, Moses, & Hix,
1998; Garner, 1994; Hughes, 1998; Mazzoni & Nelson, 1998; Metcalfe & Shimamura, 1994).
Cognitive control is a mechanism responsible for a diverse range of psychological
functioning (Monsell, 1996; Rostan, 1994). These functions can include memory
(Baddeley & Hitch, 1974; Hasher & Zacks, 1979; Jacoby, 1991; Koriat & Goldsmith,
1996), attention (Kane et al., 2001; Norman & Shallice, 1986; Posner & Snyder, 1975;
Shiffrin, 1988; Shiffrin & Schneider, 1977), and general fluid intelligence (Conway, et
al., 2002; Engle, Tuholski, et al., 1999; Kyllonen & Christal, 1990). Cognitive control
directs thinking and behaving as it leads to self-selected goals and is responsible for
inhibiting automatic responses that may sidetrack goal achievement (Canas, 2003).
Groborz and Necka (2003) state that cognitive control is responsible for the reduction of
“chaos” in the processing of information, defining it as the “ability to suppress inadequate
responses and evoke the proper ones" (Groborz & Necka, 2003, p. 185). Cognitive
control provides the capacity to choose what to approach or to avoid in the environment
of possible stimuli.
There is an executive branch for attentional processing which is able to bring
attention to intended parts of the environment (Posner & Petersen, 1990). As stated
above, this level of control starts with intentional allocation of the attention needed for
“….maintaining temporary goals, in the face of distraction and interference and for
blocking, gating, and /or suppressing distracting events” (Engle et al., 1999 p 102). Yet it
is also responsible for overriding the automatic processing of information through
schemas (Posner, DiGirolamo, & Fernandez-Duque, 1997). In instances where there is
an injury to the executive system, individuals are reported to activate the same schema to
the point of perseveration (Milner & Petrides, 1984). In contrast, lesions to the frontal
lobe (where the executive system exists) can lead to an increase in distractibility when
schemas are activated by a host of inappropriate stimuli (Shallice, 1988). The executive
system is most effective when “under the conscious control of the subject” (Posner &
Snyder, 1975, p. 73).
James viewed cognitive control (mental control) as an activity similar to that of
physical effort (1890). He thought that, just as the physical body and its muscles can operate according to reflex, so too the mind has muscles (i.e., attention) that can operate automatically, without control or volition (James, 1890). Humans are
automatized via our currently held schemas and mental models or our top down
processing (Posner, DiGirolamo, & Fernandez-Duque, 1997). Alternatively, cognitive
control allows one to slow down the processing of information in a way that may lead to
a change in schemas (Norman & Shallice, 1986; Fernandez-Duque & Johnson, 1999) or
strategy (Reder & Schunn, 1996).
Cognitive flexibility is necessary for adaptive performance within real time
dynamic decision-making contexts (Canas et al., 2003). While cognitive flexibility is
defined as the capacity to shift cognitive set, the actual ability to accomplish this is
grounded in monitoring and control of cognition (Clark, 1996). It is this monitoring and cognitive control, used to shift strategies midstream, that leads to increased adaptivity in real-time dynamic situations (Earley & Ang, 2003). Studies of performance taken from numerous domains demonstrate that most people use many different strategies on tasks (Reder, 1982; Siegler, 1996), thus showing that strategy use is not fixed per task. The
metacognitively adept individual has been found to be more open to the information in
the environment and more able to integrate this information into existing representations
of decisions (Melot, 1998; Schraw & Dennison, 1994). This process of controlled
regulation is responsible for choosing the strategies and putting them into effect (Nelson
& Narens, 1990).
In addition, cognitive flexibility and/or cognitive control is used to override automatic responses (Canas, 2003; Clark, 1996; Rende, 2000). Cognitive regulation is necessary in making adaptive changes in processing strategy (Payne, Bettman, & Johnson, 1993), which requires elements of overriding automatic responses to familiar cues in the environment (Deikman, 1982). This requires cognitive control (Clark, 1996; Rende, 2000; Bargh, 1994; Schneider & Shiffrin, 1977), metacognition (Flavell, 1979) and an ability to override dominant logic (Ireland et al., 2003; Deikman, 1982). Deikman writes about this ability to overrule a usual response as “an undoing of the automatic processes that control perception and cognition” (p. 137). This is defined as “deautomatization,” in which one separates oneself from the automatic sequence and
increases the likelihood of choosing a more fitting reaction (Deikman, 1982). In a real
time dynamic context, one must be able to override preprogrammed responses as the
situation calls for new strategies.
Cognitive Agility and Real-Time Adaptability
Adapting successfully within a real time dynamic task context requires that one
flexibly operate, being both open and focused. An individual must be able to notice
novelty and have the capacity for the creation and inclusion of new information
(Kozlowski, et al, 2001; Zaccaro, 2001; Wallach & Kogan, 1965; Hyman, 1964;
Torrance, 1974; Guilford, 1967; Guilford, 1960). Cognitive openness thus supports one
in noticing relevant stimuli, and being more likely to consider parts of the environment
that others may miss (Chan & Schmitt, 2000; Langer, 1989). Yet, the inclusion of too
much information may indicate a failure in focused attention (Schneider et al., 1984),
causing the individual to lose the cognitive thread of what is most relevant for the given
task (Anderson, 1983; Brunstein & Olbrich, 1985; Kuhl & Kazen-Saad, 1988; Salvucci &
Taatgen, 2008). An over-inclusiveness of information can hamper individuals who see
many associations between ideas but have trouble focusing on one thing (Necka, 1999).
On the other hand, being too focused limits one’s capacity to find new and necessary
information and prevents adaptation. An over-focus can be found in those who have
‘tunnel vision’ and operate with incredible efficiency within a bounded space (Fiol &
Huff, 2007). In this sense, having the capacity to be flexible, utilizing both openness and focus, may help determine successful adaptation within a given task. One must be able to
shift between being open and focused throughout the real-time task. In other words, in
order to adapt one must be able to shift from a currently utilized thought routine, to a new
one that favors the changes to the environment (Canas, 2003; Clark, 1996). Cognitive
flexibility, the ability to cognitively control and shift mental set (Clark, 1996; Rende,
2000), then is also necessary in real-time adaptive performance. A combination of these
three capacities should support adaptive performance through a real-time changing task
context.
CHAPTER II
METHODS
Study Overview
This is an initial criterion-related predictive validity laboratory study. It examines the relationships among cognitive openness, cognitive flexibility, focused attention and two forms of cognitive intelligence, and their effects on a computer-based dynamic decision-making game. Independent variable data come from questionnaires, three performance tests, a vocabulary test using a multiple-choice format and a visual-spatial intelligence test. Furthermore, external raters are selected to complete each of the questionnaires on behalf of the participants in an attempt to further validate the construct (Campbell & Fiske, 1959). None of the three self report measures used in this study has been previously subjected to correlation comparisons with measures from external raters. The use of self report, performance scores and other rater reports provides data to properly employ analysis through a multimethod (self, other and performance), multitrait (cognitive openness, cognitive flexibility, focused attention) comparison approach (Campbell & Fiske, 1959). The dependent variable was a total performance score (i.e., adaptive performance) on three trials of the dynamic decision making game known as the Networked Fire Chief (NFC).
Design
The study had two parts. In Part I, participants were invited via email to take part in a lab experiment in exchange for a modest cash reward. The lab experiment included the collection of basic demographic information and a series of questionnaires and performance tests completed at individual computer terminals. Participants completed three questionnaires comprising subsections of the Langer Mindfulness Scale – Novelty Seeking (Langer, 2001), the Attentional Control Scale – Focused Attention (Derryberry, 1998) and the Metacognitive Awareness Inventory – Metacognitive Regulation Subscale (Schraw & Dennison, 1994). Once the questionnaires were completed, the participants were instructed to complete the following performance tests: the Stroop Color Test (Stroop, 1935), the Basic Word Vocabulary Test (Dupuy, 1974), the Go/No Go Paradigm (Zimmermann & Fimm, 2000), the Alternate Uses Test (two objects, brick and paper clip) (Guilford, 1956) and the Card Rotations Test (Ekstrom, 1976). Once these were completed, participants took part in three separate five-minute trials of a dynamic decision making computer simulation known as the Networked Fire Chief (NFC). Total test time lasted approximately one hour per participant. Testing for Part I of the study took place over a six-week period.
Once Part I was completed, Part II began with an automated email sent to the person(s) the participant had identified as knowing them very well. As part of the demographic information collected, participants were required to provide one to three email addresses of people who “know them best” to take a survey on their behalf. The survey included the same three questionnaires the participants completed in Part I, reworded to collect information about how the selected rater perceived the behavior of the participant. Other than a change to the pronouns (from “I” to “Him/Her”), the items remained exactly the same.
Participants
Undergraduate students at a mid-size Midwestern university volunteered to be included on a contact list to take part in university supported research studies. Data were collected from 185 undergraduate students. However, four participants were eliminated from the sample due to technical errors. A total viable sample of 181 participants was available for self data. A total of 152 participants provided usable data for self and other reports (i.e., they had at least one person who knew them successfully complete the reworded questionnaires on their behalf).
All participants in this study had voluntarily signed up to be on a contact list to
participate in paid research studies through the Economics department of the University.
Participants received emails inviting them to take part in the study (see Appendix A). The
overall response rate from the research contact list was 44% (420 email invitations were
sent; 195 agreed to participate) with the effective response rate being 43% (181/420 for
scores of self) and 36% (152/420) for scores of self and other, respectively. The total
sample (185) consisted of 102 males and 83 females. The sample of completed self
reports (181) consisted of 101 males and 80 females. The sample of completed self and
others (152) consisted of 89 males and 63 females. Participants were of traditional
college age ranging from 18-25 with 78% between the ages of 19-21.
Procedure
The descriptive outline of the procedures for this experiment is shown in Table 1. After agreeing to take part in the study, participants were asked to arrive at the
computer lab at the University’s School of Management at a predetermined time. Upon
arrival participants were assigned to an individual computer terminal. The principal
investigator remained present during testing to answer technical questions. All the tests
were completed on a computer. Use of the lab allowed for uniformity of experience
across participants. The computer screen displayed the following information and tests in
this order:
• A brief description of the research, instructions on how to proceed and thanking them for their participation
• A prompt to insert their log-in participant code
• A short demographic questionnaire (i.e., age, gender, use of computer games)
• A combination of items from subscales of three questionnaires (including the LMS, MAI & ACS)
• The Basic Word Vocabulary Test (BWVT)
• An online version of the Stroop Task (the name of a color written in a different color, i.e., the word RED written in the color green)
• A reaction time test using the go/no go paradigm
• The Alternate Uses Test utilizing 2 objects (brick, paper clip)
• A short computer based video recreation of how to operate the Networked Fire Chief
• A practice trial of the Networked Fire Chief
• Trial 1 of the NFC
• Trial 2 of the NFC
• Trial 3 of the NFC
Table 1. Measures for Study
(Columns: Name of Variable; Name of Measure; Number of Items; Estimated Time of Measure; Authors of Measure)

1. Cognitive Openness
   - Performance: Alternate Uses Test, 2 objects, 8 minutes, Guilford (1959)
   - Self Report: Langer Mindfulness Scale, 9 items, 5 minutes, Bodner & Langer (2001)
   - Other Rater Report: Langer Mindfulness Scale, 9 items, 5 minutes, Bodner & Langer (2001)
2. Focused Attention
   - Performance: Go No Go paradigm, 60 trials, 5 minutes, Zimmermann & Fimm (2000)
   - Self Report: Attentional Control Scale, 9 items, 5 minutes, Derryberry (1988)
   - Other Rater Report: Attentional Control Scale, 9 items, 5 minutes, Derryberry (1988)
3. Cognitive Flexibility
   - Performance: Stroop Task, 60 trials, 5 minutes, Stroop (1935)
   - Self Report: Metacognitive Awareness Inventory, 35 items, 7 minutes, Schraw & Dennison (1994)
   - Other Rater Report: Metacognitive Awareness Inventory, 35 items, 7 minutes, Schraw & Dennison (1994)
4. Intelligence
   - Verbal: Basic Word Vocabulary Test, 40 items, 10 minutes, Dupuy (1974)
   - Visual-Spatial: Card Rotations Test, 80 items, 6 minutes, Ekstrom (1976)
5. Adaptive Performance
   - Trial 1: The Networked Fire Chief, 1 trial, 5 minutes, Omodei & Wearing (1995)
   - Trial 2: The Networked Fire Chief, 1 trial, 5 minutes, Omodei & Wearing (1995)
   - Trial 3: The Networked Fire Chief, 1 trial, 5 minutes, Omodei & Wearing (1995)

TOTAL TIME = 71 minutes
Measures
Dependent Variable - Adaptive Performance in DDM
DDM Task – The Networked Fire Chief (NFC). The NFC program is used in this study to create task demands that change dynamically (Omodei & Wearing, 1995). The NFC is used as a laboratory-based microworld for this study to create a contextual change that operates across participants in a controlled fashion (Lepine et al., 2000). The NFC is a dynamic decision making environment, which presents ongoing change under a time-pressured situation (see Figure 2 – The Networked Fire Chief).
Figure 2. Networked Fire Chief Screen Shot
The NFC program allows participants to “play” the role of a “fire chief” who
must extinguish simulated forest fires. The subject extinguishes the fires as quickly as
possible using water carried by helicopters and fire trucks. The helicopters and trucks are
deployed to fires by using a drag and drop approach with a computer mouse. The user
must contend with changes in the microworld as the fire becomes more intense (size of
flames) and spreads faster depending on the wind direction and wind intensity (both
demarcated by a compass). Fire trucks and helicopters run out of water and therefore
users must monitor water levels of each fire truck and helicopter, which appear toward
the bottom of the computer screen demarcated with a water percentage (%) left in
reserve. The fire trucks and helicopters can be refilled with water at clearly marked water
supply sites on the computer screen. Users can only see ¼ of the game space at a given
time, and must click on a map on the left side of the screen to move to other views of the
game (i.e. other parts). Thus the participants have multiple demands of the DDM
environment to consider when playing the game. The program produces a performance
score (dependent variable) at the end of each trial, determined by the percentage of
landscape that remains unburned.
Three trials of the NFC program were used. The trials were preceded by an
instruction session in which a ‘moving’ demonstration was provided using the software
program Captivate. Preceding the first trial was a three minute practice session to give
the participants a chance to familiarize themselves with the system controls. The three
successive trials all lasted 5 minutes apiece. Each of the three consecutive trials was of
increasing difficulty. Trial one was virtually static with limited wind change and only a
few developing fires of low intensity. Trial two included more frequent fire development, switches in wind direction and increases in wind intensity. The third trial
incorporated more difficult conditions, with more developed fires, more frequent wind
changes, and greater intensity levels of both wind and fire.
While the NFC uses the context of fire fighting it is not meant to provide direct
external validity. Instead the NFC, as with other dynamic decision making microworlds,
intends to simulate conditions of dynamism presented to individuals in external working
conditions (Brehmer & Dorner, 1993; Funke, 1991). The reason for using the
microworld is to “map the functional relationships between the variables studied in the MW, and not necessarily the surface similarities between the MW and a particular field setting” (DiFonzo, Hantula, & Bordia, 1998, p. 282). An example of this is from
Brehmer’s study (1992), in which he used a similar fire fighting based microworld to
study “the more general problem of how subjects cope with spatio-temporal processes of
the kind exemplified by the fire fighting task” (pp. 214-215).
The Networked Fire Chief (NFC) has been used in a number of studies
measuring phenomena at both the group (McLennan et al., 2006) and the individual
decision-making level (Canas et al, 2003). The NFC has consistently demonstrated good
reliability at .70 (Omodei et al., 2001). Several studies help demonstrate validity for the
NFC as a decision making performance measurement. In one study, in the journal
Intelligence, Gonzalez et al., (2005) refute the different demands hypothesis by
measuring cognitive intelligence as a predictor of success in the NFC. This study
demonstrated a strong relationship between fluid intelligence (measured by the Ravens
Progressive Matrices) and performance on the Networked Fire Chief across 16 trials
(n=13, r=.60, p<.05). Another study outcome which supports the validity of the NFC as a naturalistic decision-making scenario appeared in the journal Behavior Research Methods. This study predicted that the core perceptual-cognitive characteristics of
naturalistic decision-making; namely speed, accuracy, efficiency, and planning would be
associated with performance on the NFC (Glaser, 1976; Klein & Peio, 1989). The results
indicate significant correlations between behavioral aspects of each of these elements and
performance on the NFC (significant correlations ranging from r=.75, p<.01 for proactive
planning to r=.36, p<.05 for the percentage of time the closest appliance was allocated
first) (Elliott et al., 2007). In a series of studies with team performance in the NFC, a
team leader’s tendency to micromanage (i.e., bypassing subordinates) led to decreases in
overall performance (r=-.41, p<.05) (Omodei et al., 1999). One way to establish validity
for the NFC is to compare performance scores to an expert’s decision-making (the action
an expert would perform in the exact situation). In a study of decision quality, experts
rated appropriateness of priorities in decision strategy (on a scale of 1 to 4 with one being
completely inappropriate and 4 being completely appropriate) at four points in a final
experimental trial (Clancy et al., 2003). The NFC has a play back function, which allows
one to go back and review the actions taken during each trial. In this case there was a
significant correlation between expert rated decision quality and performance (r = 0.62, p
< .01).
At the end of each trial an overall performance score is produced by the program, which is calculated by adding every safe (i.e., unburned) cell. Demonstrating gains over losses after an individual has interacted with a changing environment is a way of calculating adaptive performance (Baltes & Staudinger, 1996; Featherman et al., 1990). A composite of the three trials is used as adaptive performance.
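Written out, the dependent variable described above is simply the sum of the three per-trial scores, each expressed as the percentage of landscape left unburned. The notation below is introduced here only for illustration and is not the program’s own formula:

```latex
\text{Adaptive Performance} \;=\; \sum_{t=1}^{3} \text{Score}_{t},
\qquad
\text{Score}_{t} \;=\; 100 \times \frac{\text{unburned cells in trial } t}{\text{total cells in trial } t}
```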
Measures for Independent Variables
Cognitive Openness - The Novelty Seeking Subscale from the Langer Mindfulness Scale
(LMS)
The Langer Mindfulness Scale (LMS) is a 21-item self-report scored on a seven-point Likert scale (Bodner & Langer, 2001). It measures an individual’s propensity to experience mindfulness. The full scale consists of four subscales: flexibility, engagement, novelty seeking and novelty producing. Only the novelty seeking subscale was used for this study as a measure of cognitive openness.
The novelty seeking subscale contains six items. Some examples of questions that appear on the LMS Novelty Seeking Subscale are “I do not actively seek to learn new things” and “I try to think of new ways of doing things.” Validity for this scale is established with correlations with the Big Five Factor Model’s “openness to experience” (r=.50) and the Multiple Perspectives Inventory (r=.64), and a negative correlation with the Need for Cognitive Closure (r = -.20) (all measures from Bodner, 2000).
The standardized factor loadings are .53 to .62 for the novelty seeking subscale
(Bodner, 2000). The Novelty Seeking subscale measures the manner of engagement with
the environment. Those scoring highly on this subscale are more likely to look for new
information and engage in new learning opportunities.
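Because the subscale mixes positively and negatively worded items (e.g., “I do not actively seek to learn new things”), a subscale score of this kind is typically formed by reverse-coding the negative items and averaging across items. The sketch below is only an illustration of that general procedure on a 7-point scale; the column names, example data, and the choice of which item is reverse-scored are assumptions, not the study’s actual scoring file.

```python
import pandas as pd

# Hypothetical responses to a 7-point Likert subscale, one column per item.
responses = pd.DataFrame({
    "seek_new_things":   [2, 3, 1, 4],   # negatively worded item (assumed)
    "new_ways_of_doing": [6, 5, 7, 4],
    "look_for_novelty":  [5, 6, 6, 3],
})

REVERSE_ITEMS = ["seek_new_things"]
SCALE_MAX = 7

scored = responses.copy()
# Reverse-code negatively worded items: on a 1-7 scale, 8 - x flips the direction.
scored[REVERSE_ITEMS] = (SCALE_MAX + 1) - scored[REVERSE_ITEMS]

# The subscale score is the mean across items for each participant.
novelty_seeking = scored.mean(axis=1)
print(novelty_seeking)
```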
Cognitive Openness - Alternative Uses Test
The performance score of cognitive openness was assessed using the fluency
results from the Alternate Uses Test (AUT) (Guilford et al., 1978; Wallach & Kogan, 1965). The AUT requires one to list as many uses as possible for a common item (such as a brick, a rubber band, a paperclip, a newspaper, or a shoe). This is a common test of
divergent thinking which is a label given to a collection of terms which include
elaboration, fluency, flexibility, redefinition, and originality (Guilford, 1956). While
divergent tests are criticized by some for their global measurement of creativity (Sternberg, 1985), they do measure abilities related to creative achievement (Barron & Harrington, 1981). In this study participants were asked to think of as many uses as possible for a “brick” and a “paper clip” and were provided with four minutes per item to complete their lists.
The AUT has long been used as a performance measure of creativity. While
cognitive openness is not meant to be a direct measure of creativity, there are obvious
links. In fact, Openness as used in the Big Five is often so conceptually related to creativity that scholars use the term creativity to refer to openness (Digman, 1990; Matthews & Deary, 1998). Reliability for the AUT has been demonstrated from .62 to .85 (AUT Manual). For this study, validity is established with fluency scores on the AUT correlating significantly with Openness on the NEO (r=.46) (Chamorro-Premuzic,
2006), the Barron Symbolic Equivalence Test (r=.49) (Barron, 1988; Glicksohn et al.,
1993), and with greater sensitivity in a habituation process (r=.36) (Martindale et al.,
1996). The total number of responses for both objects is used to determine performance
of cognitive openness.
Focused Attention - The Attentional Control Scale - The Focus of Attention Subscale
The self report score for focused attention was assessed using the Focus of
Attention Subscale from the Attentional Control Scale (Derryberry & Rothbart, 1988). This scale was designed to assess individual differences in voluntary attentional focusing and attentional shifting. The measure has three subscales: focus of attention, shift of attention and flexible control of attention. An example of an item found in the
focus subscale is “When concentrating, I can focus my attention so that I become
unaware of what’s going on in the room around me.” High scores on the scale indicate a
self-perception of an ability to limit the influences of goal irrelevant information from the
environment. Validity for the total scale has been shown with correlations on the
Inhibitory Control Scale (r = .25), and with Trait Anxiety (r = -.50) (Derryberry & Rothbart, 1988). The three scales together have been shown through factor analysis to
measure an overall capacity for attentional control. The measure is internally consistent
(alpha =.88). This study will use the 9-item subscale of focus of attention.
Focused Attention Performance- Go/NoGo Paradigm
The Go/NoGo was used as a performance test of focused attention. This test
employed a standard go/no-go paradigm in which participants must press a response key as fast as possible when presented with a “go” stimulus. Conversely, the participant must desist from pressing a key when presented with a “no-go” stimulus. This task used 6 target
stimuli, presented in the form of shapes (squares with different textures 3x3 cm) in which
two of the squares are “go” targets and the remaining 4 are “no go” targets (See Figure 3,
Go No Go Shapes). Participants must memorize the two patterns that are to be “go”
stimuli. Then one of the six squares is presented and the participant must decide which
kind of stimuli it is (a Go or No/Go). The inter-stimulus interval varies between 1000 and
2000 ms. There are 60 trials, and the time between the presentation of the target and the response is measured and stored at the millisecond level. The ability to block out the “no
go” stimuli is an expression of focused attention ability.
The go/no go paradigm has been widely used to measure response inhibition in
both experimental and clinical studies (Garavan et al., 2002; Laurens et al., 2003). This
paradigm has been used on the Test of Attentional Performance (TAP) as a measure of
focused attention (Zimmermann and Fimm, 2000). The go/nogo paradigm demonstrates
validity, correlating with the Barrett Impulsiveness Scale (r=.40, p<.01) and perseverative
error on the Wisconsin Card Sorting Task (r = 0.46) (Keilp et al., 2005). Reliability for
go/no go has been demonstrated with split half and odd even coefficients at .998
(Zimmerman and Fimm, 2000). This study employed 60 trials and the time between the
presentation of the target and the response was measured and stored at the millisecond
level. Response times were used to measure focused attention ability.
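As an illustration of how trial-level data of this kind can be reduced to a single focused attention score, the sketch below computes the mean reaction time on correct “go” trials and counts responses to “no go” stimuli. The column names, example values, and data layout are assumptions for illustration, not the study’s actual file format.

```python
import pandas as pd

# Hypothetical layout: one row per trial with the stimulus type,
# whether the participant pressed the key, and the reaction time in ms.
trials = pd.DataFrame({
    "stimulus":  ["go", "nogo", "go", "go", "nogo", "go"],
    "responded": [True,  False,  True, True, True,   True],
    "rt_ms":     [412,   None,   455,  398,  371,    430],
})

# Correct "go" trials: a response was made to a go stimulus.
correct_go = trials[(trials["stimulus"] == "go") & trials["responded"]]

# Mean reaction time (ms) across correct go trials is one way to express
# focused attention performance (faster responding = lower score).
focused_attention_score = correct_go["rt_ms"].mean()

# Commission errors: responses to "no go" stimuli (a failure to inhibit).
commission_errors = ((trials["stimulus"] == "nogo") & trials["responded"]).sum()

print(focused_attention_score, commission_errors)
```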
Figure 3 – GO/NO/GO Shapes
Cognitive Flexibility Self Report- The Regulation Subscale of the Metacognitive
Awareness Inventory (MAI)
The Metacognitive Awareness Inventory (MAI) is a 52-item questionnaire scored on a 7-point Likert scale with two main scales: knowledge of cognition and regulation of cognition.
The knowledge section contains 17 items broken down into three subscales: declarative, procedural, and conditional. The knowledge scales measure the understanding one has of one’s cognitive resources, and how and when to use this understanding (Schraw & Dennison, 1994).
The second main scale, regulation of cognition, is composed of 35 items. The regulation section is made up of five subscales: planning, information management, monitoring, debugging, and evaluation. Schraw and
Dennison define the subscales as:
1. Planning: planning, goal setting, and allocating resources prior to learning;
2. Information management: skills and strategy sequences used on-line to process
information more efficiently;
3. Monitoring: assessment of one’s learning or strategy use;
4. Debugging: strategies used to correct comprehension and performance errors;
5. Evaluation: analysis of performance and strategy effectiveness after a learning
episode (Schraw & Dennison, 1994, pp. 474–475).
Only items from the Regulation subscale will be used to measure Cognitive Flexibility. Items were chosen based on clean factor loadings from Schraw and Dennison’s original validation of the instrument.
Examples of items from the Regulation subscale include “I am aware of what strategies I use when I study,” “I consider several alternatives to a problem before I answer,” “I change strategies when I fail to understand,” “I stop and reread when I get confused,” “I ask myself how well I accomplished my goals once I’m finished,” and “I think of several ways to solve a problem and choose the best one.”
High scores on these subscales are indicative of an ability to monitor and control strategies in goal-driven learning. These are predicted to be the underlying abilities of cognitive flexibility, as changing mental representations is necessary for strategy shifting.
The original MAI has shown strong reliability at .90 (Schraw & Dennison, 1994). The
cognitive regulation factor correlates with the Motivated Strategies for Learning
Questionnaire on the Individual Learning Strategies Scale (.72) (Pintrich et al., 1991). It
has shown consistently high validity and reliability in use across a range of task domains
(Schraw, 1998).
Cognitive Flexibility-The Stroop Task
The Stroop Task (Stroop, 1935) is a widely studied paradigm used to measure
aspects of cognitive control and flexibility. The word-color Stroop task requires the participant to respond to the font color in which an incongruent color word is presented (i.e., the word green written in red font). The participant must choose red (the font color) instead of green (the written word), which is the stronger response (MacLeod, 1991). The ability for participants to override the stronger response tendency (i.e., reading the color word
rather than responding to the color) and produce a correct response to an incongruent
Stroop stimulus requires additional time (known as the Stroop interference effect)
compared with control stimuli in which colors and words are congruent (Dyer, 1973;
Jensen & Rohwer, 1966; MacLeod, 1991).
In this study words were presented in colored font with two colored fonts below and arrows pointing to the two different colors. Participants were told to press the arrow corresponding to the font color of the stimulus presented. There were 60 total trials.
Validity for the Stroop Task has been demonstrated with a significant correlation with the Self Monitoring Scale (r=.43), with demonstrated reliability at .86 (Koch, 2003). Reaction times for the 60 trials, recorded at the millisecond level, were used to determine performance on cognitive flexibility.
Measure for Intelligence Tests
Verbal Intelligence - The Basic Word Vocabulary Test
The Basic Word Vocabulary (BWVT) test measures the number of basic words
that one knows (Dupuy, 1974). The BWVT was used as a measure of verbal intelligence.
The measure has high internal consistency (.96). The measure is highly correlated with other nationally standardized measures of verbal ability, including a median correlation of .76 between the BWVT and the verbal sections of five different nationally standardized tests, among them the Sequential Tests of Educational Progress and the School and College Abilities Tests (the STEP and the SCAT, respectively) (Dupuy, 1974). Total correct responses (out of 40) were used to measure verbal ability.
Spatial Intelligence - The Card Rotations Test
The Card Rotations Test from the ETS kit of reference tests for cognitive factors (Ekstrom et al., 1976) is used as a measure of visual-spatial intelligence. The test consists
of two sections each lasting three minutes. Subjects are asked to look at a two
dimensional shape on the left hand column of the screen. They must determine whether
the eight figures to the right are rotated within the plane or are mirror images of the
primary shape (see Figure 4 below, Card Rotations Test). Each section has 14 lines of images, which creates 112 potential correct responses per section and 224 total possibilities. The test
has shown correlation (r=.66) for monozygotic twins who were reared apart (Ekstrom et
al, 1976). Validity for this test has been demonstrated with significant correlations on the
Tower of Hanoi task (r=.34) (Miyake et al., 2004), the Raven’s Progressive Matrices
(r=.40) (Pallier, 2000), and the Short Form Test of Academic Aptitude (r=.41) (Burns
and Gallini, 1983). The total number of correct responses was used to measure visual-spatial intelligence.
Figure 4 – Card Rotations Test
Participant Data Form
The participant data form is a self report designed to gather basic data such as age,
gender, and the frequency of video game play.
Research Hypotheses
The hypotheses for this study fall into four primary groupings. The first set of hypotheses (H1-H11) predicts that significant correlations will be found between the independent variables and the dependent variable. The next set of hypotheses (H12-H14) predicts that relationships will be found between the various measures of each construct (e.g., measures of performance, self report and other rater report for cognitive openness will be significantly correlated). The next set of hypotheses (H15-H24) predicts that each of the individual variables and constructs will explain unique variance in the adaptive performance score on the NFC. Finally, a combination of Cognitive Openness, Focused Attention and Cognitive Flexibility (the formative construct of Cognitive Agility) is predicted to demonstrate unique variance in the adaptive performance score (H25-H27).
Below is a list of each of the study hypotheses:
I. Significant correlations between independent variables and dependent variable
Hypothesis 1: The total score on verbal intelligence will be significantly
correlated with the adaptive performance score on the NFC.
Hypothesis 2: The total score on visual-spatial intelligence will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 3: The performance score for Focused Attention will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 4: The performance score for Cognitive Openness will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 5: The performance score for Cognitive Flexibility will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 6: The self-report measure for Focused Attention will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 7: The self-report measure for Cognitive Openness will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 8: The self-report measure for Cognitive Flexibility will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 9: The other rater-report measure for Focused Attention will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 10: The other rater-report measure for Cognitive Openness will be
significantly correlated with the adaptive performance score on the NFC.
Hypothesis 11: The other rater-report measure for Cognitive Flexibility will
be significantly correlated with the adaptive performance score on the NFC.
II. Significant correlations between variables measuring same construct
Hypothesis 12: Cognitive Openness - Self report measure of Cognitive
Openness (LMS), the Performance score for Cognitive Openness (Alternate
Uses Test ) and the Other Rater Report of Cognitive Openness will be
significantly correlated.
Hypothesis 13: Focused Attention - Self Report measure of Focused Attention
(ACS), performance score for Focused Attention (Go/NoGo Paradigm) and
the Other Rater Report measure of Focused Attention will be significantly
correlated.
Hypothesis 14: Cognitive Flexibility - Self Report measure of Cognitive
Flexibility, the Performance score for Cognitive Flexibility (the Stroop Task)
and the Other Rater Report measure of Cognitive Flexibility will be
significantly correlated.
III. Linear and Hierarchical Regression demonstrating unique variance explained by each of the variables and constructs on adaptive performance on the NFC.
Hypothesis 15 - Intelligence (as a factor score) will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario.
Hypothesis 16: Focused attention performance score will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 17: Cognitive openness performance score will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 18: Flexibility performance score will explain significant variance
in the adaptive performance score in the dynamic decision making scenario
beyond that explained by intelligence.
Hypothesis 19: Focused Attention Self Report will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 20: Cognitive Openness Self Report will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 21: Cognitive Flexibility Self Report will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 22: Focused Attention Other Rater Report will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
Hypothesis 23: Cognitive Openness Other Rater Report will explain
significant variance in the adaptive performance score in the dynamic decision
making scenario beyond that explained by intelligence.
Hypothesis 24: Cognitive Flexibility Other Rater Report will explain
significant variance in the adaptive performance score in the dynamic decision
making scenario beyond that explained by intelligence.
IV. Linear and Hierarchical Regression demonstrating unique variance explained by the formative construct of Cognitive Agility on adaptive performance on the NFC.
Hypothesis 25: The formative performance construct (composite of all three
performance measures) of cognitive agility will explain significant variance in
the adaptive performance score in the dynamic decision making scenario
beyond that explained by intelligence.
Hypothesis 26: The formative self report construct (composite of all three Self
Report measures) of cognitive agility will explain significant variance in the
adaptive performance score in the dynamic decision making scenario beyond
that explained by intelligence.
Hypothesis 27: The formative other rater report construct (composite of all
three other rater report measures) of cognitive agility will explain significant
variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence.
CHAPTER III
RESULTS
Results are presented as relevant to each research hypothesis. In addition, a post hoc results section follows to show findings that were not predicted but were discovered during the analysis.
Descriptive Statistics
Table 2 (Descriptive Statistics for the Dependent Variable) presents the descriptive statistics for performance on the Networked Fire Chief. The descriptive statistics of the performance scores for the three NFC trials (labeled Adaptive Performance for NFC) suggest an increase in difficulty across the 3 trials, as intended. The mean score on Trial 1 of the Fire Chief was 98.24 out of 100 with a standard deviation of .685. The mean score on Trial 2 of the Fire Chief was 87.69 with a standard deviation of 2.85. The mean score on Trial 3 of the Fire Chief was 78.19 with a standard deviation of 3.23. The difference between Trials 1 and 2 was 10.5 points in performance. The difference between Trials 2 and 3 was 9.5 points in performance. Thus the mean change suggests a near uniform degree of increased difficulty, which was intended. The mean score for total trials on the NFC was 264.13 with a standard deviation of 6.07. The skewness and kurtosis scores all fall within the threshold of normality.
Table 2. Descriptive Statistics for Dependent Variable

Construct              Test            M       SD     N    Skewness  SE    Kurtosis  SE
Adaptive Performance   NFC trial 1     98.24   .685   181  .112      .181  -.665     .359
Adaptive Performance   NFC trial 2     87.69   2.850  181  .568      .181  -.118     .359
Adaptive Performance   NFC trial 3     78.19   3.227  181  .685      .181  .085      .359
Adaptive Performance   NFC All Trials  264.13  6.073  181  .702      .181  -.055     .359
Table 3 (Descriptive Statistics for Independent Variables) presents the descriptive
statistics for the independent variables that form cognitive agility (cognitive openness,
focused attention and cognitive flexibility) and the three methods of measurement to
correspond with them (Performance, Self Report and Other Report).
The performance measure for Cognitive Openness was the Alternate Uses Test (2
trials), which had a Mean of 21.6 total items listed per participant with a standard
deviation of 8.76. The self report measure of Cognitive Openness yielded a mean score of
5.34 out of 7 with a standard deviation of .829. The Other Rater Report for Cognitive
Openness yielded a mean score of 5.44 with a standard deviation of .944.
The Performance measure for Focused Attention yielded a mean score of 1361.36
milliseconds with a standard deviation of 31.20 milliseconds. The Self Report measure of
Focused Attention had a mean score of 4.0 out of 7.0 and a standard deviation of 1.15.
The Other Rater Report measure of Focused Attention had a mean score of 4.6 and a
standard deviation of 1.2.
Table 3. Descriptive Statistics for Independent Variables of Cognitive Agility

Construct            Test          M        SD       N    Skewness        SE    Kurtosis        SE
Cognitive Openness   Performance   21.6     8.76     181  1.19 (.004)1    .181  3.95 (-.071)1   .359
Cognitive Openness   Self Report   5.34     .829     181  -.492           .181  -.023           .359
Cognitive Openness   Other Report  5.44     .944     152  -.579           .197  -.382           .391
Focused Attention    Performance   1361.36  31.20    181  -1.539 (.000)1  .181  8.603 (-.075)1  .359
Focused Attention    Self Report   4.020    1.145    181  .112            .181  -.319           .359
Focused Attention    Other Report  4.584    1.204    152  -.269           .197  -.422           .391
Flexibility          Performance   1332.90  233.859  181  .653            .181  1.359           .359
Flexibility          Self Report   4.576    .801     181  -.320           .181  .218            .359
Flexibility          Other Report  4.957    .888     152  -.374           .197  .481            .391

1 Values in parentheses are the skewness and kurtosis statistics after the area (rank-based) transformation described in the text.
The Performance score for Cognitive Flexibility yielded a mean score of 1332.9
milliseconds with a standard deviation of 233.86. The Self Report measure of cognitive
flexibility had a mean of 4.6 out of 7 with a standard deviation of .80. The Other Rater Report measure of cognitive flexibility had a mean score of 4.9 and a standard deviation of .89.
Table 4 (Descriptive Statistics for Intelligence Measures) shows the descriptive
statistics for both the intelligence measures. The Card Rotations Test, which serves as
the visual spatial test of intelligence had a mean score of 170.02 out of 224. The standard
deviation was 24.58. The Basic Word Vocabulary Test had a mean score of 32.34 out of
40 with a standard deviation of 3.25.
Table 4. Descriptive Statistics for Intelligence Measures (n=181)

Construct     Test            M       SD     N    Skewness  SE    Kurtosis  SE
Intelligence  Card Rotations  170.02  24.58  181  -.828     .181  2.866     .359
Intelligence  BWVT            32.34   3.25   181  -1.047    .181  1.929     .359
The skewness measures for the independent variables are within normality (+/-1.00) for all but two measures (George & Mallery, 2003). The performance measures for openness and focused attention each have skewness and kurtosis values beyond +/-1.00, suggesting a departure from suggested normality (George & Mallery, 2003; Morgan, Griego, & Gloeckner, 2001). An area transformation using the Rankits formula was employed, which created z scores from the standard normal distribution (Garson, 2008). This method was successful in correcting the data to meet normality estimates. Both the raw and transformed data are shown in Table 3.
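A minimal sketch of a rank-based area transformation of this kind is shown below. It assumes the common rankit convention of mapping each rank r to the standard normal quantile at (r − 0.5)/n; the exact formula and software options used for the study’s transformation may differ, and the data here are simulated.

```python
import numpy as np
from scipy.stats import norm, rankdata

def rankit_transform(x):
    """Convert raw scores to z scores via their ranks (area transformation).

    Each value is replaced by the standard normal quantile of
    (rank - 0.5) / n, which yields an approximately normal distribution
    regardless of the shape of the original data.
    """
    x = np.asarray(x, dtype=float)
    ranks = rankdata(x)            # average ranks are used for ties
    p = (ranks - 0.5) / len(x)     # proportions strictly between 0 and 1
    return norm.ppf(p)             # inverse normal CDF -> z scores

# Example: a strongly skewed variable becomes approximately normal.
skewed = np.random.default_rng(0).exponential(scale=2.0, size=181)
z = rankit_transform(skewed)
print(round(z.mean(), 3), round(z.std(), 3))
```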
Reliability Estimates
Table 5. Scale Reliabilities (n=181 for self reports / n=152 for other reports)

Measure      No. Items  Cronbach's Alpha
Open Self    9          .759
Open Other   9          .804
Focus Self   9          .828
Focus Other  9          .890
Flex Self    35         .906
Flex Other   35         .951
Table 5 (Reliabilities) represents the reliability data for each scale. The total
questionnaire for Self and Other Rater Reports contained 58 items. As previously stated,
the items were identical with changes made only to the pronoun to reflect the rater’s
perspective (in the Other Rater Reports). Negative items were recoded for scoring
purposes. A factor analysis showed three strong factors as predicted. Five items from the
Self Report Cognitive Flexibility scale were dropped based on weak factor loadings.
Two items from the Self Report Focused Attention were dropped due to weak factor
loadings. Two items from the Self Report Openness Scale were dropped due to weak
factor loadings. One item from the Cognitive Flexibility Other Rater Report scale was
dropped due to weak factor loadings. One item was dropped from the Focused Attention
Other Rater Report due to weak factor loading. Three items were dropped from the
Cognitive Openness Other Rater Report due to weak factor loadings. The reliability
analysis was performed for all scales in the Self-Reports and Other Rater Reports. The
scale reliabilities ranged from .759 to .951. All of the Cronbach’s Alphas are very strong
and thus represent acceptable scale reliability.
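For reference, the Cronbach’s alpha values reported here are the standard internal consistency coefficient, where k is the number of items, sigma-squared of Y_i is the variance of item i, and sigma-squared of X is the variance of the total scale score. The formula is given only for orientation and is not specific to this study:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```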
Table 6 (Reliabilities for Performance Measures) presents the reliability data for each of the performance tests (independent and dependent measures). The Alternate Uses Test demonstrated an alpha of .81; the Stroop Task demonstrated an alpha of .82 and the Go/No Go paradigm was .73. Both intelligence tests (the BWVT and the Card Rotations Test) show alphas of .80. In addition, the dependent measure (NFC) demonstrated an alpha of .73 across the three trials.
Table 6. Reliabilities for Performance Measures (n=181)

Construct              Measure               Reliability Estimate
Cognitive Openness     Alternate Uses Test   .81
Focused Attention      Go No Go              .73
Cognitive Flexibility  Stroop Task           .82
Verbal Intelligence    Basic Word Voc Test   .80
Spatial Intelligence   Card Rotations        .80
Adaptive Performance   Networked Fire Chief  .80
Test of Hypotheses 1-11
This section provides the results for Hypotheses 1-11. The intercorrelations are
presented in Table 7 (Correlation Table). In general the dependent variable, Adaptive
Performance Score (Total Score of Three trials on the NFC), correlated significantly with
almost all of the independent variables. Despite this finding there are some mixed results
in terms of support for the first set of hypotheses, which predict that there will be
significant correlations between each of the individual independent variables and the
dependent variable.
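The correlations in Table 7 below are one-tailed tests for variables 1-10 (see the table note). As a sketch of how such directional tests can be computed with standard tools, the example below halves the two-tailed p-value when the observed correlation is in the hypothesized direction; the variable names and values are placeholders rather than the study’s actual data.

```python
from scipy.stats import pearsonr

def one_tailed_pearson(x, y):
    """Pearson r with a one-tailed p-value for a positive directional hypothesis."""
    r, p_two = pearsonr(x, y)
    p_one = p_two / 2 if r > 0 else 1 - p_two / 2
    return r, p_one

# Hypothetical placeholder scores (not the study's data).
verbal_iq = [28, 31, 35, 30, 33, 29, 36, 32]
nfc_total = [255, 260, 270, 258, 266, 252, 272, 263]
r, p = one_tailed_pearson(verbal_iq, nfc_total)
print(round(r, 3), round(p, 3))
```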
Table 7. Correlation Matrix

Var   1        2       3      4       5        6        7        8        9      10     11     12     13       14
1     -
2     .237***  -
3     .161*    .112+   -
4     .323***  .112+   .188** -
5     .115     -.017   -.029  .084    -
6     .087     -.026   -.016  .000    .346***  -
7     -.203**  -.155*  -.035  -.012   .536***  .449***  -
8     .221**   .134*   -.056  .019    .206**   .067     .145*    -
9     .159*    .018    -.057  .009    .054     .345***  .052     .236***  -
10    -.109+   .002    -.066  -.080   .100     .028     .218**   .471***  .049   -
11    .236***  .133+   -.078  .025    .225**   .114     .009     .109     .140+  -.012  -
12    .446***  .118    .101   .188*   .192*    .046     -.023    .113     .111   .122+  .161*  -
13    .414***  .032    .119   .080    .075     -.007    -.137+   .125+    .140+  -.114  -.037  .174*  -
14    .282***  .092    .052   .152**  .005     .035     -.059    .175**   .089   .038   .043   .139*  .427***  -

+ p<.1; * p<.05; ** p<.01; ***p<.001
N=181, except N=152 for vars 5-7
Variables 1-10: Pearson r (1-tailed); variables 11-13: Pearson r (2-tailed); variable 14: Kendall's tau.

Variable key:
1. Adaptive Performance
2. Cognitive Openness Performance
3. Focused Attention Performance
4. Cognitive Flexibility Performance
5. Cognitive Openness Other Rater Report
6. Focused Attention Other Rater Report
7. Cognitive Flexibility Other Rater Report
8. Cognitive Openness Self Report
9. Focused Attention Self Report
10. Cognitive Flexibility Self Report
11. Verbal Intelligence Test
12. Spatial Intelligence Test
13. Video Game Experience
14. Gender
Hypothesis 1: A significant correlation will be found between the dependent
variable and the measure of verbal intelligence.
Hypothesis 1 was upheld as the measure of verbal intelligence (BWVT) was
significantly correlated with the adaptive performance score on the NFC (r=.236,
p<.001).
Hypothesis 2: A significant correlation will be demonstrated between the
dependent variable and the measure of visual-spatial intelligence.
Hypothesis 2 was upheld as the card rotations test used to measure visual spatial
intelligence was significantly correlated with the adaptive performance score on the NFC
(r=.446, p<.001).
Hypothesis 3: A significant correlation will be demonstrated between the
dependent variable and the performance measure of focused attention.
Hypothesis 3 was upheld as the go/no go test used to measure focused attention
performance was significantly correlated with the adaptive performance score on the
NFC (r=.161, p<.05).
Hypothesis 4: A significant correlation will be demonstrated between the
dependent variable and the performance measure of cognitive openness.
Hypothesis 4 was upheld as the Alternate Uses Test used to measure performance
of Cognitive Openness was significantly correlated with the adaptive performance score
on the NFC (r=.237, p<.001).
Hypothesis 5: A significant correlation will be demonstrated between the
dependent variable and the performance measure of cognitive flexibility.
Hypothesis 5 was upheld as the Stroop Task used to measure performance of
Cognitive Flexibility was significantly correlated with the adaptive performance score on
the NFC (r=.323, p<.001).
Hypothesis 6: A significant correlation will be demonstrated between the
dependent variable and the self report measure of focused attention.
Hypothesis 6 was upheld as the focused attention subscale of the Attentional
Control scale used to measure the self report score for focused attention was significantly
correlated with the adaptive performance score on the NFC (r=.159, p<.05).
Hypothesis 7: A significant correlation will be demonstrated between the
dependent variable and the self report measure of cognitive openness.
Hypothesis 7 was upheld as the Novelty Seeking subscale of the Langer
Mindfulness inventory used as the self report measure Cognitive Openness was
significantly correlated with the adaptive performance score on the NFC (r=.221, p<.01).
Hypothesis 8: A significant correlation will be demonstrated between the
dependent variable and the self report measure of cognitive flexibility.
Hypothesis 8 was not upheld, as the cognitive regulation subscale of the Metacognitive Awareness Inventory used to measure the self report score for cognitive
flexibility reached a near significant correlation with the adaptive performance score on
the NFC, yet demonstrated an inverse relationship (r= -.109, p=.10).
Hypothesis 9: A significant correlation will be demonstrated between the
dependent variable and the other rater report measure of focused attention.
Hypothesis 9 was not upheld, as the measure of focused attention as rated by an external rater was not significantly correlated with the dependent variable (r=.087, p>.1).
Hypothesis 10: A significant correlation will be demonstrated between the
dependent variable and the other rater report measure of Cognitive Openness.
Hypothesis 10 was not upheld as the measure of cognitive openness as rated by an
external rater was not significantly correlated with the dependent variable (r=.115, p>.1).
Hypothesis 11: A significant correlation will be demonstrated between the
dependent variable and the other rater report measure of cognitive flexibility.
Hypothesis 11 was not upheld as the measure of cognitive flexibility as rated by
an external rater was significantly correlated with the dependent variable, yet
demonstrated an inverse relationship (r=-.203, p< .01).
In summary of Hypotheses 1-11, only Openness Other Report and Focused
Attention Other Report did not correlate significantly or reach near significance with the
Fire Chief Performance score. The Other Rater Report measure of Cognitive Flexibility
and the Self Report measure of Cognitive Flexibility each had negative correlations with
Adaptive Performance (Cognitive Flexibility Other r=-.203, p=.01; Cognitive Flexibility
Self, r= -.109, p=.10).
Results for Hypotheses 12-14
As previously stated the corresponding correlation table is Table 7, in which the
results for these hypotheses can be found. There are some mixed results in terms of
support for this set of hypotheses, which predict that there will be significant correlations between measures of the same trait (e.g., openness self report, openness other report and openness performance).
Hypothesis 12 - Self-report measure of Cognitive Openness (LMS), the
Performance score for Cognitive Openness (Alternate Uses Test) and the Other
Rater Report of Cognitive Openness will be significantly correlated.
In support of hypothesis 12, the Self Report for Cognitive Openness is correlated
with the Other Rater Report for Openness (r=.21, p<.01). In addition the Cognitive
Openness Performance and Cognitive Openness Self Report correlate at a near significant
level (r=.11, p<.1). However the other rater report for Cognitive Openness and the
Cognitive Openness Performance score are not significantly correlated (r=-.017, p>.1).
Therefore this hypothesis is not fully supported as not all measures are significantly
intercorrelated.
Hypothesis 13 - Self Report measure of Focused Attention (ACS), performance
score for Focused Attention (Go/NoGo Paradigm) and the Other Rater Report
measure of Focused Attention will be significantly correlated.
In support of hypothesis 13, the Self Report for Focused Attention and the Other Rater Report for Focused Attention are significantly correlated (r=.35, p<.001). Yet neither the Self Report nor the Other Rater Report is significantly correlated with the performance score for focused attention. In summary, Hypothesis 13 is not fully supported, as the performance measure does not appear to share a relationship with the self and other rater reports for the focused attention construct.
Hypothesis 14 - Self Report measure of Cognitive Flexibility, the Performance
score for Cognitive Flexibility (the Stroop Task) and the Other Rater Report
measure of Cognitive Flexibility will be significantly correlated.
In support of hypothesis 14, the Self Report for Cognitive Flexibility and the Other Rater Report for Cognitive Flexibility are correlated significantly (r=.22, p<.01). Yet the performance score for cognitive flexibility does not show any relationship to either the self report or the other rater report. Therefore this hypothesis is not fully supported by the results.
In summary, hypotheses 12-14 can only be partially supported by the results. Support is evident in that the scores on the three Self Reports (for cognitive openness, cognitive flexibility and focused attention) and the corresponding Other Rater Reports are significantly correlated (e.g., the Flexibility Self and Other Rater Reports correlate at r=.22, p<.01). Yet the story is not quite as clear for the correlations between the Self and Other Reports and the corresponding trait performance scores for each variable. In this case only the Openness Performance measure correlates with a corresponding construct Self Report score (r=.11, p<.1).
Factor Analysis
Given the proposed formative nature of the cognitive agility construct, one way to
further assess construct validity prior to regression analysis is through a factor analysis
(Rossiter, 2002). A principal component analysis with a promax rotation was conducted
with three factors. The number of factors was specified since the theory and hypotheses
suggested that the factors would form across method by construct (e.g. Openness
Performance, Openness Self Report and Openness Other Rater Report would form a
single factor). The factor analysis shows three solid factors, with the exception of Focused Attention Self Report, which loads weakly (.331) on factor 2. The results of this
analysis appear in Table 8. Due to the formative nature of the proposed construct this
variable was left in the analysis, as its removal would violate the structure of the
proposed construct (Jarvis et al., 2003). The pattern matrix shows three factors, yet the
loadings are not consistent with the original hypotheses. Rather than loading by construct
the variables loaded by method. However this is to be expected in view of results from
the correlation matrix (Table 7) in which the only performance score to significantly
correlate with a questionnaire was Openness Performance and Openness Self Report
(r=.11, p<.1). As shown in Table 8, three factors emerge by method: Performance, Other
Rater Reports and Self Reports. These results help in grouping the variables for adequate
regression analyses.
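A minimal sketch of this analysis step is shown below, assuming the nine measures sit in a pandas DataFrame (one column per measure, one row per participant) and using the third-party factor_analyzer package; the column names and simulated data are placeholders, not the study’s actual dataset or analysis code.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package (assumed available)

# Hypothetical stand-in data: 181 participants x 9 measures (column names illustrative).
rng = np.random.default_rng(0)
columns = [
    "open_perf", "focus_perf", "flex_perf",
    "open_other", "focus_other", "flex_other",
    "open_self", "focus_self", "flex_self",
]
measures = pd.DataFrame(rng.normal(size=(181, 9)), columns=columns)

# Principal components extraction with an oblique (promax) rotation,
# forcing a three-factor solution as specified by the hypotheses.
fa = FactorAnalyzer(n_factors=3, method="principal", rotation="promax")
fa.fit(measures)

# Pattern matrix of loadings (rows = measures, columns = factors).
loadings = pd.DataFrame(fa.loadings_, index=columns)
print(loadings.round(3))
```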
Table 8. Pattern Matrix (N=181, except N=152 for other reports)

Variables                 Component 1 (Other)  Component 2 (Self)  Component 3 (Perf)
Openness Performance      -.188                .286                .574
Focus Performance         .030                 -.153               .651
Flexibility Performance   .118                 -.074               .702
Openness Other Report     .759                 .063                .079
Focus Other Report        .762                 -.010               .048
Flexibility Other Report  .815                 .045                -.101
Openness Self Report      .033                 .860                .066
Focus Self Report         .234                 .331                .069
Flexibility Self Report   -.010                .775                -.174
Due to the unpredicted negative correlations that the cognitive flexibility self report and other rater report have with the dependent variable, it will not be possible to create a proper composite score. Therefore a reliability analysis was run to obtain the alpha scores for factors 1 and 2 established through the factor analysis. The Other Rater Reports had a Cronbach’s Alpha of .703, which suggests a strong grouping. The Self Report measures had an Alpha of .501, which is weak yet adequate. Both scores are within the acceptable threshold. The performance measure does not require a reliability score since it will be combined into a composite score or formative construct (Jarvis, 2003).
Test of Hypotheses 15-24
Hypotheses 15-24 are tested through a series of regressions. Regression results
are found below in Tables 9–17. The tables include R2, R2Δ and β. In addition, collinearity diagnostics are included with both Tolerance and VIF results.
Table 9 (Regression Analyses for Individual Performance Measures) shows the
regression chart for all steps testing Hypotheses 15-18. Standard linear regression models were used to measure the impact of each variable on overall performance on the
dependent variable. Prior to testing Hypothesis 15, the two intelligence scores were formed into a factor score, since both are well validated instruments (Hair et al., 1998).
Combining the verbal and spatial measures of intelligence is used to create a general
cognitive intelligence factor reflecting both crystallized (verbal) and fluid intelligence
(spatial) (Cattell, 1963, 1943; Horn, 1985, 1998).
Hypothesis 15 predicted that the intelligence factor would explain significant
variance in the total adaptive performance score in the dynamic decision making
scenario.
In Model 1 within Table 9, intelligence is regressed on the dependent variable.
Hypothesis 15 was supported, as significant variance was found (R2 =.20, p<.001).
Hypotheses 16-18 sought to measure whether the individual performance
variables of cognitive openness, focused attention and cognitive flexibility would
demonstrate significant variance beyond the intelligence factor when regressed on the
adaptive performance score in the dynamic decision making scenario (NFC). Regression
analysis was used to study the additive effects of the independent variables beyond
intelligence.
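The blockwise logic of these tests can be illustrated with a brief statsmodels sketch, shown here only to make the procedure concrete (it is not the software used for the study); the column names are hypothetical.

import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("study_data.csv")  # hypothetical data file

# Step 1: intelligence factor only; Step 2: add one cognitive measure
step1 = sm.OLS(df["nfc_performance"],
               sm.add_constant(df[["int_factor"]])).fit()
step2 = sm.OLS(df["nfc_performance"],
               sm.add_constant(df[["int_factor", "focus_performance"]])).fit()

r2_change = step2.rsquared - step1.rsquared          # R2 gained by the added measure
f_change, p_value, _ = step2.compare_f_test(step1)   # incremental F test
print(round(r2_change, 3), round(f_change, 2), round(p_value, 4))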
Table 9. Regression Table for Intelligence and Individual Performance Measures
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2a   Model 2b   Model 2c   Model 3
Intelligence Factor: R            .45        .45        .45        .45        .45
Intelligence Factor: R2           .20***     .20***     .20***     .20***     .20***
Focus Performance (β)                        .16**
Openness Performance (β)                                .17**
Flexibility Performance (β)                                        .27***
Cognitive Agility (β)                                                         .35***
R2                                .20***     .22***     .23***     .27***     .31***
R2Δ                                -         .02***     .03***     .07***     .11**
FΔ                              44.86***   25.73***   26.26***   32.90***   16.02***
Tolerance                                   1.00        .973       .981       .959
VIF                                         1.00       1.028      1.020      1.043
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
The corresponding regression results for Hypotheses 16-18 are also shown in
Table 9.
Hypothesis 16 states that the focused attention performance score will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2a shows that Hypothesis 16 was
upheld, as focused attention demonstrates unique variance beyond the intelligence factor
(ΔR2 =.02, p<.05; β=.16). The results of the collinearity diagnostics demonstrate a
Tolerance score of 1.00 and a VIF of 1.00.
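Collinearity diagnostics of this kind can be reproduced with the variance_inflation_factor routine in statsmodels, as in the sketch below; the DataFrame and column names are hypothetical, and Tolerance is simply the reciprocal of the VIF.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("study_data.csv")  # hypothetical data file
X = sm.add_constant(df[["int_factor", "focus_performance"]])

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)
    print(name, "VIF =", round(vif, 3), "Tolerance =", round(1 / vif, 3))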
Hypothesis 17 states that the cognitive openness performance score will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2b shows that Hypothesis 17 was
supported, as the openness performance measure demonstrated unique variance beyond
the intelligence factor (ΔR2 =.03, p<.001; β=.17).
Hypothesis 18 states that the flexibility performance score will explain significant
variance in the adaptive performance score in the dynamic decision making scenario
beyond that explained by intelligence. Hypothesis 18 was supported, as Model 2c
(Cognitive Flexibility) shows unique variance beyond the intelligence factor (ΔR2 =.07,
p<.001; β=.27).

Table 10. Regression Results for Individual Self Reports
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2a   Model 2b   Model 2c
Intelligence Factor: R            .45        .45        .45        .45
Intelligence Factor: R2           .20***     .20***     .20***     .20***
Focus Self Report (β)                        .09
Openness Self Report (β)                                .16**
Flexibility Self Report (β)                                       -.07
R2                                .20***     .21        .23**      .21
R2Δ                                -         .01        .03**      .01
FΔ                              44.86***    1.69       5.68**     1.08
Tolerance                                    .979       .973       .992
VIF                                         1.022      1.028      1.008
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
The corresponding regression results for Hypotheses 19-21 are shown in Table
10 (Regression Table for Individual Self Reports). Hypothesis 19 states that the Focused
Attention Self Report will explain significant variance in the adaptive performance score
in the dynamic decision making scenario beyond that explained by intelligence. Model
2a shows that Hypothesis 19 was not supported, as the self report for focused attention did
not demonstrate unique variance beyond the intelligence factor (ΔR2 =.01, p=.195;
β=.08).
Hypothesis 20 states that the Cognitive Openness Self Report will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2b shows that Hypothesis 20 was
supported, as the openness self report measure demonstrated unique variance beyond the
intelligence factor (ΔR2 =.03, p<.05; β=.16).
Hypothesis 21 states that the Cognitive Flexibility Self Report will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2c shows that Hypothesis 21 was
unsupported, as the self report for cognitive flexibility did not demonstrate unique
variance beyond the intelligence factor (ΔR2 =.01, p=.30; β= -.07).
The corresponding regression results for Hypotheses 22-24 are shown in Table
11 (Regression Table for Individual Other Rater Reports). Hypothesis 22 states that the
Focused Attention Other Rater Report will explain significant variance in the adaptive
performance score in the dynamic decision making scenario beyond that explained by
intelligence. Model 2a shows that Hypothesis 22 was not supported, as the other rater
report for focused attention did not demonstrate unique variance beyond the intelligence
factor (ΔR2 =.00, p=.58; β=.04).
Hypothesis 23 states that the Cognitive Openness Other Rater Report will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2b shows that Hypothesis 23 was
unsupported, as the openness other report measure did not demonstrate unique variance
beyond the intelligence factor (ΔR2 =.00, p=.89; β=-.01).
Table 11. Regression Table for Individual Other Rater Reports
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2a   Model 2b   Model 2c
Intelligence Factor: R            .45        .45        .45        .45
Intelligence Factor: R2           .20***     .20***     .20***     .20***
Focus Other Report (β)                       .04
Openness Other Report (β)                              -.01
Flexibility Other Report (β)                                      -.20***
R2                                .20***     .20        .20        .24***
R2Δ                                -         .00        .00        .04***
FΔ                              44.86***     .30        .02       7.78***
Tolerance                                    .925       .989      1.00
VIF                                         1.081      1.011      1.00
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181; N=152 for Other Reports
Hypothesis 24 states that the Cognitive Flexibility Other Rater Report will explain
significant variance in the adaptive performance score in the dynamic decision making
scenario beyond that explained by intelligence. Model 2c shows that Hypothesis 24 was
partially supported, as the other rater report for cognitive flexibility did demonstrate
unique variance beyond the intelligence factor (ΔR2 =.04, p<.001; β= -.20). Yet the
direction of the β suggests an inverse relationship, which was not explicitly or implicitly
predicted by the original hypothesis.
Hypotheses 25-27
Based on the factor analysis and the theoretical basis of cognitive agility, formative
constructs based on method were sought. Formative constructs are a composite of
multiple measures (MacCallum and Browne, 1993). Since multiple measures (i.e.
Cognitive Openness, Focused Attention and Cognitive Flexibility) create cognitive
agility, a formative or composite measure is ideal (Jarvis et al., 2003). Unfortunately,
given the negative correlations that both of the Cognitive Flexibility scales (Self Report
and Other Rater Report) have with the Dependent Variable, the Self Report and Other Rater
Reports could not be made into a composite or formative measure, since the negative
correlations would dilute the power of the measure. The performance scores each had
positive correlations with the dependent variable, and therefore Cognitive Openness,
Focused Attention and Cognitive Flexibility were formed into a formative factor to be
used in the regression analysis, while Self Reports and Other Rater Reports were entered
into separate blocks in the regression analyses. As mentioned earlier, reliability scores
are unnecessary for the Performance scores since the composite is meant to be measuring
different facets of the same construct.
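For illustration, a performance-method composite of this kind can be formed by standardizing the three performance scores and summing them, as in the brief Python sketch below; the column names are hypothetical and the study's exact scoring syntax may differ.

import pandas as pd

df = pd.read_csv("study_data.csv")  # hypothetical data file
perf = df[["openness_perf", "focus_perf", "flexibility_perf"]]
z = (perf - perf.mean()) / perf.std(ddof=0)    # put the three scores on a common metric
df["cognitive_agility_perf"] = z.sum(axis=1)   # formative (composite) score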
Hypothesis 25 sought to measure whether the performance-method formative
construct of cognitive agility would demonstrate significant variance beyond the
intelligence factor when regressed on the adaptive performance score in the DDM.
Theory helps support the creation of a formative construct for cognitive agility.
Formative constructs are a composite of multiple measures (MacCallum and Browne,
1993). The formative construct for cognitive agility was created by taking a composite
score of cognitive openness, focused attention and cognitive flexibility (Jarvis et al.,
2003). Results are presented within Table 12 (Regression Analysis for Cognitive Agility
Performance Measure). A regression analysis was run to show how the formative
construct measure of cognitive agility influenced the adaptive performance score on the
NFC. In Table 12, Model 1 provides the factor of the Intelligence Tests (Spatial and Verbal).
The addition of the cognitive agility formative construct, entered listwise, provided 11.0%
unique variance on the NFC beyond the intelligence factor (ΔR2 =.11). The total R2 in
this model is .31 and is significant (p=.000). The beta weight shows that the measure of
cognitive agility was positively related to the adaptive performance score on the NFC
(β=.35).
Table 12. Regression Table for Cognitive Agility Performance Measure
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2
Intelligence Factor: R            .45        .45
Intelligence Factor: R2           .20***     .20***
Cognitive Agility (β)                        .35***
R2                                .20***     .31***
R2Δ                                -         .11**
FΔ                              44.86***   16.02***
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
Hypothesis 26 sought to demonstrate that the formative self report construct
(a composite of all three Self Report measures) of cognitive agility would explain significant
variance in the adaptive performance score in the dynamic decision making scenario
beyond that explained by intelligence. As mentioned earlier, given the negative
correlations that both of the Cognitive Flexibility scales (Self Report and Other Rater
Report) have with the Dependent Variable, the Self Report and Other Rater Reports could
not be made into a composite or true formative measure, since the negative correlations
would dilute the power of the measure. Therefore a linear regression was used, entering
all three self report measures into a single block listwise. This method allows each of
the individual measures to interactively demonstrate impact on the dependent variable
rather than have the cognitive flexibility measure dilute the results with a negative
correlation. Results for this regression are found in Table 13. Hypothesis 26 was upheld
as the addition of the self reports accounted for 6% unique explained variance in the
adaptive performance score on the dynamic decision making scenario (ΔR2 =.06,
p=.006).
A closer examination of the results, though, reveals more nuanced findings.
The focused attention self report within Model 2 does not add any unique
variance to the overall model (β= .05, p=.49). The cognitive flexibility self report does
add significant unique variance within the model, yet as discussed previously, it retains
an inverse relationship to the dependent variable (β= -.19, p=.01). The cognitive
openness self report adds unique variance within the model as predicted (β=.24, p=.002).
Table 13. Regression for Cognitive Agility Self Reports
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2    Tolerance   VIF
Intelligence Factor: R            .45        .45
Intelligence Factor: R2           .20***     .20***
Focus Self Report (β)                        .05        .924      1.082
Openness Self Report (β)                     .24**      .709      1.411
Flexibility Self Report (β)                 -.19**      .751      1.331
R2                                .20***     .26**
R2Δ                                -         .06**
FΔ                              44.86***    4.35**
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
Hypothesis 27 sought to demonstrate that the formative other rater report
construct (a composite of all three Other Rater Report measures) of cognitive agility would
explain significant variance in the adaptive performance score in the dynamic decision
making scenario beyond that explained by intelligence. Like the self reports, the other
rater report for cognitive flexibility has a negative correlation with the Dependent
Variable. Therefore a linear regression was used, entering all three other rater report
measures into a single block listwise. Table 14 shows the results of this regression
analysis. Hypothesis 27 was upheld as the total model explained 6% unique variance on
the dependent variable (ΔR2 =.06, p=.002). Like the self reports, the other reports have
more nuanced results beyond the total change in R2. The focused attention other rater
measure is significant in this model (β=.14, p=.05). The Cognitive Openness other rater
measure is near significant in this model (β=.13, p=.10). The Cognitive Flexibility other
rater measure retains significance but its relationship remains negatively associated with the
dependent variable in this model (β=-.32, p=.000).
Table 14. Regression for Cognitive Agility Other Reports
Dependent variable: NFC-DDM Adaptive Performance

Variable                        Model 1    Model 2    Tolerance   VIF
Intelligence Factor: R            .45        .45
Intelligence Factor: R2           .20***     .20***
Focus Other Report (β)                       .14*       .778      1.286
Openness Other Report (β)                    .13+       .626      1.592
Flexibility Other Report (β)                -.32***     .606      1.649
R2                                .20***     .26**
R2Δ                                -         .06**
FΔ                              44.86***    5.14**
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181; N=152 for Other Reports
Table 21. Summary of Hypotheses Support

Hypotheses 1-11: Significant Correlations Between Independent Variables and Dependent Variable
Hypothesis                      Supported   Unsupported   Partial/Mixed Support
1. Verbal Intelligence              √
2. Spatial Intelligence             √
3. Focus Performance                √
4. Openness Performance             √
5. Flexibility Performance          √
6. Focus Self Report                √
7. Openness Self Report             √
8. Flexibility Self Report                                        √
9. Focus Other Report                           √
10. Openness Other Report                       √
11. Flexibility Other Report                                      √
Hypotheses 12-14: Significant Correlations Between Variables Measuring the Same Construct
Hypothesis                                      Supported   Unsupported   Partial/Mixed Support
12. Correlations Between Focus Measures                                           √
13. Correlations Between Openness Measures                                        √
14. Correlations Between Flexibility Measures                                     √
Hypotheses 15-24: Unique Variance Explained by each of the Variables (beyond intelligence)
Hypothesis                      Supported   Unsupported   Partial/Mixed Support
15. Intelligence Factor             √
16. Focus Performance               √
17. Openness Performance            √
18. Flexibility Performance         √
19. Focus Self Report                           √
20. Openness Self Report            √
21. Flexibility Self Report                     √
22. Focus Other Report                          √
23. Openness Other Report                       √
24. Flexibility Other Report                                      √
Hypotheses 25-27: Unique Variance Explained by Cognitive Agility by Method
Hypothesis                      Supported   Unsupported   Partial/Mixed Support
25. Performance                     √
26. Self Reports                    √
27. Other Reports                   √
Post Hoc Analysis
There were two additional variables collected as potential control variables.
The previous microworld studies used as foundational literature for this study do not
investigate the effects of Gender or Video Game Experience (Gonzalez et al., 2005;
LePine et al., 2000). Yet it made intuitive sense that the dependent variable resembles
a video or computer game. Therefore a question about the amount of weekly video game
experience was included in the demographic questionnaire. Gender was collected as basic
demographic information. Males play more video games than females and generally
perform better on them (Brown et al., 1997), so it was expected that those two
variables may be related.
As evidenced from the correlation table (Table 7 – Correlation Matrix), both
gender and video game experience showed a significant relationship to the dependent
variable. Video game experience was significantly correlated with the adaptive
performance score from the NFC (r=.41, p<.001). As well, gender was significantly
correlated with the NFC (r=.28, p<.001). The two potential control variables, Gender
and Video Game Experience, were also significantly correlated with each other (r=.427,
p=.000). This suggests that both measures may be potential control variables to support
further explanation of performance variation on the NFC.
Several other interesting relationships emerge from the correlation matrix
regarding gender, which was collected as a potential control variable. The Cognitive
Flexibility Performance score was significantly correlated with gender (r=.152, p=.01).
Also, Openness Self Report and Gender were significantly correlated (r=.175, p=.01).
Finally, Gender and the test of visual-spatial intelligence were significantly correlated
(Gender and the Card Rotations Test, r=.139, p<.05).
Regressions were performed to see if the independent variables retained their
effect on the dependent variable beyond Gender and Video Game Experience. Table 14
(Regression Analysis of Individual Performance Measures beyond Controls) shows
the results of these operations as they pertain to the individual performance measures. In
the first model, the two control variables each explain unique variance in the adaptive
performance score on the NFC decision making scenario. The two measures entered
listwise within the same step explain about 22% of the variance (ΔR2 =.22, p<.001).
According to the regression results, males perform better than females on the NFC (β=.24,
p=.001). Video game experience also demonstrated unique variance (β=.34, p<.001).
The intelligence factor was then entered in a separate block in the regression analysis.
The intelligence factor still retains significance beyond gender and video game
experience (β=.40, p<.001). The intelligence factor explains approximately 16% unique
variance beyond the control variables. This is approximately 4% less variance than
intelligence accounted for in the regression without the control variables. Next, the
performance measure for focused attention explains approximately 1% variance at near
significant levels (ΔR2 =.01, β=.11; p=.07) beyond the controls and intelligence factor.
The measure of cognitive openness performance still remains significant beyond controls
and intelligence, accounting for about 2% explained variance (β=.15; p=.016). The
measure of cognitive flexibility performance also remains significant, accounting
for 8% of explained variance (β=.30, p<.001).
Table 14. Regression Analysis of Individual Performance Measures beyond Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3a   Model 3b   Model 3c
Video Game Experience and Gender: R     .47        .47        .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***     .22***     .22***
Intelligence Factor (β)                            .39***
  Tolerance / VIF                                  .960 / 1.041
Focus Performance (β)                                         .11+
  Tolerance / VIF                                             .985 / 1.015
Openness Performance (β)                                                 .15**
  Tolerance / VIF                                                        .964 / 1.037
Flexibility Performance (β)                                                         .30***
  Tolerance / VIF                                                                   .926 / 1.079
R2                                      .22***     .38***     .39+       .40**      .46***
R2Δ                                      -         .16***     .01+       .02**      .08***
FΔ                                    25.52***   44.08***    3.19+      5.94**    27.32***
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
Next, a similar set of regressions was run to measure the additive effect of the
individual self report and other report measures beyond the controls and intelligence.
Table 15 presents these results. In this case only the measures that were significant
beyond intelligence from the previous set of regressions were included in this analysis.
Cognitive openness self report no longer remains significant after the control variables
and the intelligence factor (β=.087, p=.15). The cognitive flexibility other rater report
remains significant, accounting for 2% of unique variance beyond the controls and
intelligence (β=-.14, p=.017).
Table 15. Regression Table for Significant Self and Other Reports with Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3a   Model 3b
Video Game Experience and Gender: R     .47        .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***     .22***
Intelligence Factor (β)                            .39***
  Tolerance / VIF                                  .975 / 1.026
Flexibility Other Report (β)                                 -.14**
  Tolerance / VIF                                             .981 / 1.020
Openness Self Report (β)                                                 .09
  Tolerance / VIF                                                        .939 / 1.065
R2                                      .22***     .38***     .39**      .39
R2Δ                                      -         .16***     .01**      .01
FΔ                                    25.52***   44.08***    3.19**     2.05
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181; N=152 for Other Report
As with prior steps, the next set of analyses used linear and hierarchical
regressions to test the additive impact of cognitive agility beyond the controls and
intelligence. The formative construct of cognitive agility for the performance measures
remains significant, accounting for 9% unique variance (β=.31, p<.001). This result is
found in Table 16 – Regression Analysis of Cognitive Agility Performance Measure
beyond Controls.
Table 16. Regression Analysis of Cognitive Agility Performance Measure beyond Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3
Video Game Experience and Gender: R     .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***
Intelligence Factor (β)                            .39***
  Tolerance / VIF                                  .946 / 1.057
Cognitive Agility Performance (β)                             .31***
  Tolerance / VIF                                             .935 / 1.070
R2                                      .22***     .38***     .47***
R2Δ                                      -         .16***     .09***
FΔ                                    25.52***   44.08***   29.57***
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
Table 17. Regression Analysis for Cognitive Agility Self Report with Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3
Video Game Experience and Gender: R     .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***
Intelligence Factor (β)                            .39***
Focus Self Report (β)                                         .01
Openness Self Report (β)                                      .15**
Flexibility Self Report (β)                                  -.13+
R2                                      .22***     .38***     .40
R2Δ                                      -         .16***     .02
FΔ                                    25.52***   44.08***    1.80
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
Since a composite score was not feasible for the self and other rater reports, the
three corresponding measures were entered listwise into a single step within a regression.
The results of the three self report measures indicate an overall model that is no longer
significant (ΔR2 =.02, p=.148). When looking at the individual measures within the
model there remains some variation as to the significance of each. The self report for
focused attention remains a nonsignificant predictor of performance (β=.008, p=.89). The
self report for cognitive openness remained significant in this model (β=.15, p=.038).
The cognitive flexibility self report, which was not significant in the model as a singular
variable beyond intelligence, reaches near significance in this model (β= -.13, p=.07).
Table 17 demonstrates these results.
Table 18. Regression Analysis for Cognitive Agility Other Report with Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3
Video Game Experience and Gender: R     .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***
Intelligence Factor (β)                            .39***
Focus Other Report (β)                                        .09*
  Tolerance / VIF                                             .771 / 1.296
Openness Other Report (β)                                     .11
  Tolerance / VIF                                             .612 / 1.633
Flexibility Other Report (β)                                 -.24**
  Tolerance / VIF                                             .584 / 1.712
R2                                      .22***     .38***     .41*
R2Δ                                      -         .16***     .03*
FΔ                                    25.52***   44.08***    3.60**
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181; N=152 for Other Reports
The other rater report version of cognitive agility accounted for nearly 4% unique
variance beyond the controls and intelligence (ΔR2 =.037, p=.014). The Focused Attention
other rater report reaches near significance in this model (β=.09, p<.1). Cognitive
openness, which was near significant in the prior model, no longer retains
significance (β=.11, p=.23). Cognitive Flexibility remains a significant variable in this
model (β=-.24, p=.002). Table 18 shows these results.
Next, a hierarchical regression was run including the controls, the intelligence
factor, the composite performance measure, the three self reports and the three other rater
reports. The results are displayed in Table 19. The total model has an R2 of .499
and is significant (p=.000). The composite measure, self reports and other reports
(entered in a single step listwise) explain 12% unique variance beyond the controls and
intelligence. The only measures that remain significant are the Cognitive Flexibility Other
Rater Report (β= -.19, p=.008) and the Composite Performance measure (β=.275,
p=.000). The Self Report Cognitive Openness results suggest a near significant
relationship to the dependent variable in this model (β=.12, p=.074).
Table 19. Regression Analysis for All Measures with Controls
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3
Video Game Experience and Gender: R     .47        .47        .47
Video Game Experience and Gender: R2    .22***     .22***     .22***
Intelligence Factor (β)                            .39***
Focused Attention Self Report (β)                            -.01
  Tolerance / VIF                                             .785 / 1.274
Openness Self Report (β)                                      .15+
  Tolerance / VIF                                             .688 / 1.453
Flexibility Self Report (β)                                  -.05
  Tolerance / VIF                                             .730 / 1.369
Focused Attention Other Report (β)                            .10
  Tolerance / VIF                                             .667 / 1.500
Cognitive Openness Other Report (β)                           .06
  Tolerance / VIF                                             .596 / 1.679
Cognitive Flexibility Other Report (β)                       -.19**
  Tolerance / VIF                                             .550 / 1.818
Performance Cognitive Agility (β)                             .28***
  Tolerance / VIF                                             .854 / 1.170
R2                                      .22***     .38***     .50***
R2Δ                                      -         .16***     .12***
FΔ                                    25.52***   44.08***    5.88***
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181; N=152 for Other Rater Reports
Table 20. Correlation Matrix - FEMALE
Var    1    2    3    4    5    6    7    8    9    10    11    12
1.NFC
2.OPEN_PERF
.135
3.FOCUS_PER
-.144
.093
-
+
.073
.074
-.134
.069
.056
-.051
.320**
-.110
-.095
.619***
.596**
-.183
-.110
.205
.040
.129
-.159
.074
.444***
.284*
.286**
-
4.FLEX_PER
-
.193
-
-
5.OPEN_OTHER
.051
-.173
6.FOCUS_OTHER
.000
-.181
7.FLEX-OTHER
-.080
8.OPEN_SELF
.215
.000
9.FOCUS_SELF
-.046
-.016
10.FLEX_SELF
-.030
.027
-.185
-.059
.273*
.137
.202
.473***
.129
-
11.INT_FACT
.307**
.144
-.078
.067
.094
.055
.032
.033
.098
.086
-
12.VIDEO_GAME
.068
-.045
.006
.112
.068
.093
-.061
-.076
.003
.079
.209
+
-.217
+ p<.1; * p<.05; ** p<.01; ***p<.001
+
-.209
+
-
-
Variables 1-10, pearson r (1 tailed) N=101, except N=89 for vars 5-7
Table 21. Correlation Matrix - MALES
Var    1    2    3    4    5    6    7    8    9    10    11    12
1.NFC
2.OPEN_PERF
.255**
-
3.FOCUS_PER
.254**
.112
4.FLEX_PER
.590***
5.OPEN_OTHER
.152
.071
.016
.103
6.FOCUS_OTHER
.110
.080
-.070
.092
.375***
7.FLEX-OTHER
-.273**
-.110
.060
.047
.482***
.319**
8.OPEN_SELF
.136
.198*
-.011
.215*
.213*
.064
.175
-
9.FOCUS_SELF
.210*
.019
.077
.043
.269**
-.087
+
-.027
-.005
-.086
-.012
-.071
.234*
10.FLEX_SELF
-.198
*
.189
+
.304**
-
.195
-
+
-
11.INT_FACT
.512***
.162
.053
.371**
.385**
.132
-.032
12.VIDEO_GAME
.355***
.042
.087
.162
.028
-.102
-.097
+ p<.1; * p<.05; ** p<.01; ***p<.001
Variables 1-10, pearson r (1 tailed) N=101, except N=89 for vars 5-7
.170
.470***
.194
+
-.021
-.020
-
+
-.102
-
-.007
.256*
*
.110
.191
-
A final analysis was performed to examine potential differences in the relationships
between the predictor variables and the dependent variable broken out by Gender. This is
not a theoretically driven analysis but one performed ad hoc, based solely on the
significant correlation between Gender and adaptive performance on the Networked Fire
Chief (r=.23, p<.001). Tables 20 and 21 display the results of this analysis. Table 20
shows the correlations for the female group only and Table 21 shows the male group only.
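The split-sample correlations in Tables 20 and 21 amount to computing each predictor's correlation with NFC performance separately within each gender group, as in the short Python sketch below (the data file and column names are hypothetical; this is not the study's actual code).

import pandas as pd

df = pd.read_csv("study_data.csv")  # hypothetical data file
for label, group in df.groupby("gender"):
    print(label)
    # correlation of every numeric variable with NFC adaptive performance
    print(group.corr(numeric_only=True)["nfc_performance"].round(3))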
Overall, the results from the two tables show that the predictive power of the
independent variables emerges largely from the male group rather than the female group
or a balance of the two. For the male group, most notably, each of the performance
measures shows a higher correlation with the dependent variable than in the overall
correlation matrix found in Table 7 (Cognitive Openness, r=.25, p<.05; Focused Attention,
r=.25, p<.05; Cognitive Flexibility, r=.59, p<.001). The Cognitive Flexibility Other Rater
Report had a stronger negative correlation for the males (r=-.273, p<.05). The Cognitive
Openness Self Report was no longer significant for the males (r=.13, p=.17), while the
Focused Attention Self Report (r=.21, p<.05) and the Cognitive Flexibility Self Report
(r=-.19, p<.05) are more significant (these comparisons are made against the total
correlation matrix in Table 7). The correlations for the women are distinctly less
significant: not a single predictor variable reaches significance other than the
intelligence factor (variable 11 in Table 20).
Based on these initial findings, a regression analysis was run as a test of
moderation. Given the gender-driven correlations, the regression below (Table 22) shows
how gender moderates the relationship between Cognitive Agility and the Networked
Fire Chief. Model 1 shows the impact of the Cognitive Agility formative performance
measure, Model 2 shows the impact of Gender, and Model 3 shows the impact of the
interaction term – Gender X Cognitive Agility. The test shows a moderating relationship
for Gender (R2Δ =.05, p<.001).
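The moderation test in Table 22 corresponds to a hierarchical regression in which the Gender X Cognitive Agility product term is entered after the main effects. The Python sketch below illustrates one way such a model could be specified; the variable names are hypothetical, and predictors are often mean-centered before the product is formed.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("study_data.csv")  # hypothetical data file
# center the composite before forming the interaction term
df["ca_c"] = df["cognitive_agility_perf"] - df["cognitive_agility_perf"].mean()
model = smf.ols("nfc_performance ~ ca_c * gender", data=df).fit()
print(model.summary())  # main effects plus the Gender X Agility interaction term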
Table 22. Regression Results for Moderator Variable Gender/Agility
Dependent variable: NFC-DDM Adaptive Performance

Variable                              Model 1    Model 2    Model 3
Cognitive Agility Performance: R        .42        .42        .42
Cognitive Agility Performance: R2       .17***     .17***     .17***
Gender (β)                                         .35***
Gender X Cognitive Agility (β)                                .27***
R2                                      .17***     .30***     .35**
R2Δ                                      -         .12***     .05***
FΔ                                    38.58***   32.23***    5.68**
+ p<.1; * p<.05; ** p<.01; ***p<.001; N=181
CHAPTER IV
DISCUSSION
Discussion of Results
This chapter provides a discussion of the empirical results. As noted throughout,
there is a need to better understand the cognitive abilities necessary for successful
adaptation. The results of this study demonstrate the need for greater knowledge
regarding the abilities that support adaptability in a real time dynamic context. Of
particular interest here is the formative construct of cognitive agility and whether it
predicts success in a dynamic decision making exercise beyond measures of intelligence.
Discussion Overview
The primary research question asks whether there is a relationship between the
combination of cognitive openness, focused attention and cognitive flexibility, and
adaptation within a changing task condition. Specifically, the following general
hypotheses were made:
1. There would be a significant relationship between each of the independent variables
and the dependent variable (Hypotheses 1-11).
2. There would be a positive relationship between trait measures of each construct
(e.g. the multiple measures of Openness would be significantly correlated across the
methods of Self, Other and Performance) (Hypotheses 12-14).
3. Building off of Hypotheses 1-11, each of the independent variables was intended to
predict performance outcomes in the dependent variable beyond intelligence
(Hypotheses 15-24).
4. Lastly, it was hypothesized that each of the methods of observation – self reports,
other rater reports and performance tests (e.g. the composite measure of Openness
Performance, Focus Performance and Flexibility Performance) – would demonstrate
unique variance beyond intelligence tests when regressed on the DDM (Hypotheses 25-27).
Overall, it was predicted that a real time dynamic decision-making task requires one to
operate flexibly with cognitive openness and focused attention in order to successfully
adapt in real-time.
Summary of Results
Overall, the results of this study support the idea that cognitive agility is important
to discuss within a real time dynamic decision-making task context. The overall adaptive
performance score on the dynamic decision-making microworld (the Networked Fire
Chief or NFC) significantly correlates with most of the independent variables and the
potential control variables (gender and video game experience). The only variables that
do not significantly correlate with the dependent variable are the Other Rater Reports:
Focused Attention and Cognitive Openness. However, there are mixed results regarding
the impact that each of the individual independent variables has in predicting adaptive
performance on the dynamic decision making microworld beyond intelligence. Yet
perhaps of greatest significance, cognitive agility, as measured for each method (self
report, other rater report and performance tests), accounts for unique variance beyond the
measures of intelligence.
Discussion of Results for Hypotheses 1-11
Overall, the first set of hypotheses produces strong results. As discussed above, all
but two of the independent variables correlate significantly with the dependent variable
(adaptive performance on the dynamic decision making microworld). Despite mostly
significant correlations, two of these correlations inversely relate to the dependent
variable. The inversely related variables present the greatest cause for concern regarding
the current study and present an interesting point for discussion.
One intriguing finding that was not predicted was the set of negative correlations found
between the Self and Other Questionnaires for Cognitive Flexibility and the Dependent
Variable (NFC) (Self report r=-.11, p<.1; Other report r=-.20, p<.01). The self
report/other rater report Cognitive Flexibility measures are taken from the metacognitive
regulation subscale of the Metacognitive Awareness Inventory (Schraw and Dennison,
1994). Metacognitive regulation accounts for control of cognitive operations leading to
greater flexibility (Fernandez-Duque et al., 2000; Nelson & Narens, 1994). It is predicted
that metacognitive regulation (i.e. monitoring and control over one’s cognition) leads to
increased adaptive performance in a real time dynamic context. Research shows that
metacognitive capacities are linked to expert decision making (Chi et al., 1991),
successful problem solving (Clements, 1986; Mayer, 1998) and increased adaptability in
uncertain and dynamic contexts (Earley & Ang, 2003). In addition, metacognitive
capacities have been associated with higher performance in real time activities like
aviating (Valot, 2002) and driving (Lesch & Hancock, 2004). Therefore it was predicted
that in an environment of change, being able to regulate one’s cognition to meet the
demands of the context would lead to higher performance.
The results from this study do not support the predictions made regarding the
Cognitive Flexibility self/other report measures. Metacognitive regulation has been
predictive of success in dynamic contexts but has not been widely tested as a self report
measure in a true real time dynamic decision making microworld (Haynie, 2005).
Dynamic decision making microworlds require a series of continual and real-time rapid
decisions, which may represent a completely distinct class of performance, separate from
single decisions (Sitkin & Pablo, 1992). Perhaps the ongoing and constant decisions that
must be made require one to be less “regulated” in how one thinks through each decision.
In such a dynamic context, speed of operation complements accuracy of decisions.
Strong metacognitive regulatory tendencies may increase a participant’s likelihood to
monitor and evaluate his/her ongoing perception of success and failure (Jacobs & Paris,
1987; Schraw & Dennison, 1994). Past studies indicate that as one detects errors, one
tends to slow down speed of operation in order to increase accuracy (Rabbit, 1966; Rabbit
& Rodgers, 1977; Robertson, Manly, Andrade, Baddeley, & Yiend, 1997). Therefore in
this context, participants with strong metacognitive regulation may be more engaged in
online monitoring and evaluation and therefore more aware of possible errors. This
awareness may slow down processing at a time when doing so may cause performance
decrements. This performance deficit may also be linked to literature on intuitive thought,
which suggests that such metacognitive activity creates interference with necessary
unconscious material (Baylor, 2001; Kuhn, 1989; Mandler, 1995). Given the short time
horizon of the microworld used here (15 minutes), slowing down and controlling
cognitive operation may lead to a decrease in performance.
Two independent variables do not significantly correlate with the dependent
variable. These variables are the Cognitive Openness other rater report (r=.115, p>.1)
and the Focused Attention other rater report (r=.087, p>.1). Therefore, how an external
rater views a ratee’s tendency to be cognitively open or focused is of little to no
consequence to performance in the context of a real time dynamic decision making
microworld. In contrast, both of the corresponding self reports demonstrate significant
correlations with the dependent variable (Cognitive Openness self report, r=.221, p<.01;
Focused Attention self report, r=.159, p<.05). In this case self-perception relates more
strongly to adaptive performance than the other’s perception. Ashford (1989) stated that
“Individuals often know more about their abilities...than do observers because the abilities
may not yet become manifest in actual behavior” (p. 141). Since items from both of these
measures describe some elements of potentially private experience (e.g. “He/She is very
curious” and “When he/she needs to concentrate and solve a problem, he/she has trouble
focusing their attention”), there may be some important internal experience that is not fully
conveyed to an external party.
The remainder of the variables demonstrate significant correlations with the
dependent variable. The correlations ranged from .446 (p<.001) for the visual-spatial
intelligence measure to .159 (p<.05) for the Focused Attention self report. This is
suggestive of moderate to strong relationships between the independent variables and the
dependent variable. Overall, there are some potential measurement issues – namely the
inverse correlations between measures of Cognitive Flexibility (in the self and other rater
reports) and the dependent variable. Yet most measures correlate significantly with the
dependent variable, suggesting promise regarding measurement choice.
Discussion of Results for Hypotheses 12-14
The results do not fully support each of the hypotheses generated by the
multitrait-multimethod (MTMM) model that was followed (Table 23 shows the MTMM
matrix results). Table 23 is simply another way of examining the correlation matrix
with regard to the variables that form cognitive agility. Table 23 shows the correlations
between traits, between methods, across traits and across methods (with the reliabilities in
parentheses). Multitrait-multimethod (MTMM) models help validate constructs by
suggesting that correlations between traits and methods will be significant (Campbell and
Fiske, 1959) and that correlations will be greatest within trait, even across methods.
While it was anticipated that significant correlations would be found within traits and
between methods, the results do not fully support this. The limited support for these
hypotheses is seen most strongly in the lack of significant correlations between
performance measures and the corresponding self and other rater reports.
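The monotrait-heteromethod values examined here can be read directly off the correlation matrix; the brief Python sketch below shows how such entries might be pulled out programmatically (the data file and column names are hypothetical).

import pandas as pd

df = pd.read_csv("study_data.csv")  # hypothetical data file
corr = df.corr(numeric_only=True)
pairs = [("openness_perf", "openness_self"),
         ("focus_perf", "focus_self"),
         ("flexibility_perf", "flexibility_self")]
for a, b in pairs:
    # same trait measured by two different methods
    print(a, "vs", b, "r =", round(corr.loc[a, b], 3))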
Only the Cognitive Openness performance measure (the Alternate Uses Test)
shows a significant correlation to a corresponding questionnaire (see Table 23 and Table
7 for the full Correlation Matrix). In this case there is a small, yet near significant correlation
between the Cognitive Openness Performance measure and the Self Report for Cognitive
Openness (r=.11, p<.1). Neither the Focused Attention nor the Cognitive Flexibility
Performance measures correlate with any of the corresponding self and other rater report
measures for each trait. These findings suggest that the performance tests do not measure
the same constructs measured by the questionnaires. Since none of these performance
tests has previously been measured alongside the chosen questionnaires, these results offer
significant findings to the extant literature.
Table 23. MTMM Output
(scale reliabilities in parentheses on the diagonal)

                        Self Reports            Other Raters            Performance
Traits                OP1    FA1    FL1       OP2    FA2    FL2       OP3    FA3    FL3
Self Reports
  OP1                (.76)
  FA1                 .24   (.83)
  FL1                 .47    .05   (.91)
Other Raters
  OP2                 .21    .05    .10      (.80)
  FA2                 .07    .35    .03       .35   (.90)
  FL2                 .15    .05    .22       .54    .45   (.95)
Performance
  OP3                 .13    .02    .00      -.02   -.03   -.16      (.81)
  FA3                -.06   -.06   -.07      -.03   -.02   -.04       .11   (.70)
  FL3                 .02    .01   -.08       .08    .00   -.01       .11    .19   (.82)
There are several possible explanations for the lack of relationship between the
questionnaires and performance tests. One possible reason for this measurement
difference may be the fact that the performance tests measure ability while the
questionnaires measure behavioral practices or different aspects of perceived
performance. It is not unusual to find that explicit self-reports do not correlate strongly
with implicit reaction time tasks (Blair, 2001; Dovidio, Kawakami, & Beach, 2001).
There are several potential reasons for this. One may be due to Greenwald and Banaji’s
(1995) definition of implicit representation as “introspectively unidentified traces of past
experience” (p.5). This indicates that the absence of correlations is a result of differences in
awareness regarding implicit representation (Greenwald and Banaji, 1995). Another
explanation for correlation differences can be taken from Wilson, Lindsey, and
Schooler’s (2000) dual attitudes model. This model suggests that implicit measurement happens
automatically when an individual is confronting the stimuli, but explicit judgments are
subject to motivational and cognitive limitations in the retrieval process. Thus
correlations tend to be lower when individuals engage in very deliberate versus
spontaneous retrieval processes (Florack, Scarabis, & Bless, 2001).
Another explanation may be that the performance tests measure completely
different constructs than the questionnaires. For instance, the other rater report for the
Cognitive Openness measure (LMS) measures perceived preferences for engaging with
novelty and idea generation. While it may be assumed that this preference correlates
with the ability to produce ideas in the Alternate Uses Test, it is possible that a significant
difference exists between the actual capacity to produce novel associations and an external
rater’s perception of another’s ability or orientation toward seeking novelty.
The Focused Attention performance test (Go-No-Go Paradigm) and the Focused
Attention questionnaire (Attentional Control Scale) may be measuring different aspects
of attention. For instance the performance test measures ability to limit the impact of
irrelevant visual information and make a quick decision (Garavan et al., 1999). The
Focused Attention questionnaire measures perceived ability to maintain focus or
concentration in the face of other information from the environment (Derryberry and
Reed, 2002; Derryberry and Rothbart, 1988). Several of the items involve limiting
incoming auditory data while remaining focused on a task such as “My concentration is
good even if there is music in the room around me.” The questionnaire does ask about
general concentration and blocking out distracting thoughts but nothing of a purely
visual-attentive condition. Therefore it is possible that these two tests are tapping
different aspects of a similar construct or perhaps completely different constructs
altogether.
The Cognitive Flexibility performance task (Stroop Task) and the corresponding
Cognitive Flexibility questionnaire (the MAI) may also be measuring different aspects of
a construct, or different constructs altogether. The Stroop Task has been used in many
ways to measure different constructs. For instance, the Stroop Task has been used to
measure cognitive flexibility (Kalmijin et al., 2002) and executive control (Rubinstein et
al., 2001). The scale for the Cognitive Flexibility measure in this study is the
Metacognitive Regulation subscale of the Metacognitive Awareness Inventory (Schraw
and Dennison, 1994).7 The Stroop Task measures an ability to overcome an automatic
response by measuring how quickly and accurately the participant can identify the color a
word is written in as opposed to the word itself (which is the more dominant response)
(MacLeod, 2005; Galotti, 1999). In this way it demonstrates an aspect of cognitive
flexibility associated with overcoming automaticity (MacLeod, 2005; Galotti, 1999). The
questionnaire (MAI) measures the degree to which participants perceive their cognitive
regulation of planning, thinking, and evaluation of tasks (Schraw and Dennison, 1994).
Someone who pays very close attention to the regulation of his or her cognitive
processes should score highly on this scale. While items ask how often a participant
assesses strategies when the demands of the task require it (such as “When performing a
task, I frequently assess my progress against my objectives”), the scale does not explicitly
ask questions about overcoming a routinized response set. Therefore, a different aspect of
cognitive flexibility may be captured by the two measures. The Stroop Task, like the
Focused Attention Performance test (Go/No Go), is a measure that is entirely visual in
nature and tests reaction speed. The MAI measures other cognitive processes relating to
the amount of attention one allots to task assessment and the perceived “regulation” one
employs to change when necessary.

7 There are not any well aligned cognitive flexibility scales measuring the capacity to
overcome cognitive set (to the best of the researcher’s knowledge); therefore this scale
was used as an alternative measure representing the regulative aspect of being cognitively
flexible.
The Self and Other reports within each trait are significantly correlated. This
makes theoretical sense given that the questions were the same with the exception of the
rewording for other raters to address the ratee (e.g. changed from “I” to “He/She”). The
correlations between corresponding self/other ratings range from .206 to .345. This is
consistent with past studies that demonstrate an average correlation of .30 for self/other
ratings on behavioral assessments (Church, 1997).
The Self Report and Other Rater measures of Cognitive Openness show the
weakest correlation of the three constructs. The correlation, though significant, is only of
moderate strength (.20, p<.01). This suggests that, though related, there is still a difference
in how one perceives one’s interest in seeking novelty and how that interest manifests in
presentation to others. Similar to the explanation above, this questionnaire is mostly
about orientation and not about ability or behavior. For instance, one can be highly
interested in ideas or very curious without ever demonstrating this preference to others.
A self/other comparison of this instrument has not been performed in prior research. This
finding adds further validity to the existing instrument (LMS), as a significant correlation
is present.
The self report and other rater measures of Focused Attention show a stronger
correlation (r=.35, p<.001). This suggests a moderate to strong relationship between
self-perceptions of focused attention and the focus perceived by the raters. This stronger
correlation could be attributed to the behavioral indications that individuals supply (e.g.
expressing annoyance) when they get distracted, especially by noise (Jones & Davies,
1984; Kjellberg et al., 1996). Since the raters are individuals the ratees know well, and
both are undergraduate students, the raters may know the kinds of study conditions that
the ratee prefers to work in. For instance, if the ratee does not work well in a loud coffee
shop and requires a quiet library, this may be shared knowledge between the rater and
ratee, which could positively impact related scoring. Again, this scale has not previously
been measured by an external rater, so this adds further validity to the instrument (ACS).
The self report and other rater measures of Cognitive Flexibility show a significant
yet moderate relationship (.218, p<.01). The agreement between self and other scoring
on this measure may again make sense due to the student population. Some of the items
in this questionnaire may be more evident as experienced behaviors seen by the rater. For
instance, the amount of “cognitive regulation” one allots to tests or projects prior to
beginning, during or in evaluation may be shared between close friends in a university
setting. An example would be when students speak directly after a test, trying to evaluate
how they did and what they could have done differently. For example the item “I ask
myself how well I’ve accomplished my goals once I’ve finished” may be quite evident
within the context of a school-based relationship. Yet the reason for only a moderate
correlation, as opposed to a high correlation, could be due to other items, which may be
of a more private-cognitive nature. Most items require one to answer the degree to which
one is ‘monitoring’ or ‘adjusting’ his/her cognitive activity as it is happening. Some
example items that capture this private experience are “I stop and re-read when I get
confused” and “I find myself analyzing the usefulness of a given strategy while engaged
in a given task.” This type of item may be more difficult for raters to accurately assess.
In the process of sharing and comparing results, parts of one’s private experience of
thinking may remain private. Again this scale has not been tested with an external rater
prior to this study; thus the results here add further validity to the measure.
As anticipated, the results demonstrate both convergent and discriminant validity
across the majority of the questionnaire measures for self and other. Convergent and
discriminant validity suggests that each of the constructs here is unique, as measures
correlate more strongly within traits than across traits. The one exception is the Cognitive
Flexibility Other Rater measure and the Cognitive Openness self measure, which show a
small yet significant correlation (.15, p<.05). A potential reason for this may have to do
with the items on the Cognitive Openness Self scale that ask about interest in ideas, for
example “I do not actively seek to learn new things” (reverse scored). The Cognitive
Flexibility other rater questionnaire asks how much the ratee questions assumptions and
considers options, such as “He/She stops and goes back over new information that is not
clear”. The Cognitive Openness scale in part tests openness to ideas (assumptions) and
interest in seeking out new ways of doing things. Some items from the Cognitive
Flexibility questionnaire also capture an interest in trying to gain additional
understanding. This similarity may account for the small intercorrelation among the
measures.
While there are significant correlations between self/other ratings, stronger
correlations are instead found within method and across trait. The performance tests in
general show the smallest within method correlations. The Cognitive Openness
performance score correlates with both the Focused Attention and Cognitive Flexibility
scores at near significant levels (r=.11, p=.10). The small and near significant correlation
between cognitive openness and cognitive flexibility is inconsistent with past findings.
Generally, creative individuals record slower reaction times to the Stroop Task
(Kwiatkowski, Vartanian, & Martindale, 1999). In addition creativity has been linked to
weaker attentional control which is associated with the Stroop Task (Dewing & Battye,
1971; Dykes & McGhie, 1976; Mendelsohn & Griswold, 1964; Toplyn & Maguire,
1991). Yet the small correlation between the AUT and the Stroop found here may be
explained by a quality of flexibility that is shared in both tasks. The AUT can be thought
of as a test of spontaneous flexibility (Eslinger and Grattan, 1993), or the ability to
produce an unprompted shift in one’s thinking. Creating a list of uses for an object
requires this ability, as one must choose to break set spontaneously. The Stroop Task
challenges the ability for reactive flexibility (Eslinger and Grattan, 1993), as one must
shift mental set quickly when the stimulus creates a prompt to do so. This common use of
flexibility may account for the small correlation.
Cognitive openness and focused attention may be considered opposing abilities.
For instance as attention becomes defocused, one’s cognitive network expands in nodal
associations (Finke et al., 1992; Martindale, 1991). Yet Martindale also wrote that
creative individuals (those scoring highly on the Alternate Uses Test) are able to bring
focus of attention when the task calls for it (1999), suggesting a more nuanced view of
opposing capacities. Creative individuals are more able to gather their attentional focus
to complete tasks when necessary (Dykes & McGhie, 1976). Specific to the modest
correlation between the Go-No-Go and the Alternate Uses Test, creative individuals are
more able to inhibit “peripheral” data when appropriate (Stavridou and Furnham, 1996).
The Focused Attention and Cognitive Flexibility performance scores are
significantly correlated (r=.18, p<.01). This significant correlation may be explained by
the fact that the Focused Attention and Cognitive Flexibility performance measures are
both visually based reaction time tests. While these two measures have not been related in past
studies, there have been results from studies of event related potential (ERP) which
demonstrate that the cognitive control associated with the Stroop led to increased focused
attention (Hylton et al., 2008), suggestive of shared cognitive processes between the
abilities. Yet the relatively small correlation is still suggestive of two unique
measurements. However, the intercorrelations found within the self report and other rater
reports are not as small.
The intercorrelations within the self and other reports are moderate to high. One
explanation is that the significant correlations are suggestive of a potential common
method bias (Cote and Buckley, 1987). For instance, the Cognitive Openness Self Report
correlates with the Focused Attention Self Report (r=.236, p<.01), despite there not being
any theoretical connection between the two measures. Yet the simple explanation of
common method bias may not fit, as there is not a significant correlation between the
Focused Attention Self Report and the Cognitive Flexibility Self Report (r=.087, p>.1).
The Cognitive Openness Self Report is highly correlated with the Cognitive Flexibility Self
Report (r=.471, p<.001). This correlation can be explained in part by the similarity of
items between the two measures (in regard to interest in ideas).
The Other Rater reports are all significantly intercorrelated (ranging from .346 to
.546). The Cognitive Openness Other Report is correlated with the Focused Attention
Other Report measure (r=.346, p<.001). This modest to strong correlation is somewhat
similar to the correlation of the two measures in the self reports. The Focused Attention
Other Report measure is strongly correlated with the Cognitive Flexibility Other Report
measure (r=.449, p<.001). Lastly, the Cognitive Flexibility measure is highly correlated
with the Cognitive Openness scale (r=.536, p<.001). Again, there may be some degree of
explanation attributed to common method variance and item overlap.
The exploratory factor analysis was consistent with the results explained above.
The factor analysis shows three relatively solid factors that separate by method rather
than trait. The factor analysis (Table 8) provides further evidence that the different
methods appear to be measuring different constructs, particularly when comparing
performance measures to questionnaires. While the self/other reports did correlate in
correspondence with their respective traits, the relationship by method remains stronger.
The factor analysis helps to confirm these findings in the context of all the measures.
Discussion of Results for Hypotheses 15-24
Hypotheses 15-24 measure the impact of each of the variables when regressed on
the dependent variable beyond intelligence. The outcomes of these tests produce
mixed results. Each of the multicollinearity diagnostics is in proper range, suggesting
that there is not a concern of multicollinearity amongst the independent variables in the
regressions (i.e. all tolerance numbers are close to 1.00 and all VIF numbers are well
under 4.0). In general the performance variables remain strong while the majority of the
self and other rater measures lose significance in relation to the dependent variable.
Hypothesis 15 predicts that the intelligence factor explains unique variance in
adaptive performance in the dynamic decision making environment. The results of this
study with regard to the measurement of intelligence can be summarized in the
following ways. While significant correlations have been found between intelligence
measures and performance in past microworld studies (Ackerman, 1988), intelligence has
not been able to consistently predict performance in dynamic decision making
microworlds (Brehmer, 1992; Rigas and Brehmer, 1999). This inconsistency has been
explained through the different demands hypothesis: that dynamic situations require
specific cognitive abilities that are separate from general intelligence (Putz-Osterloh,
1993; Rigas and Brehmer, 1999). These inconsistent findings are also said to be a
function of low reliability coefficients found within studies using dynamic decision
making tasks (Rigas and Brehmer, 1999). The results of the current research
demonstrate a significant predictive impact of intelligence on the dynamic decision
making performance (joining others e.g. Gonzalez et al., 2005; Rigas, Carling, and
Brehmer, 2002). Furthermore the reliability coefficient for the Networked Fire Chief
performance across the trials was .73, thus providing a fairly reliable testing scenario.
Therefore, this study adequately refutes the different demands hypothesis in that it finds
intelligence to be predictive of performance. Yet it also supports the different demands
hypothesis by showing the impact of aspects of cognitive openness, focused attention and
cognitive flexibility on the adaptive performance measure within the DDM.
In this study both forms of intelligence significantly correlate with adaptive
performance on the dynamic decision making task. The intelligence factor score also
significantly correlates with the total adaptive performance score on the dynamic decision
making task (r=.448, p<.001). The results of the regression analysis demonstrate that the
intelligence factor is a strong predictor of performance in the DDM microworld,
explaining about 20% of the variance. While other measures of intelligence have been used in dynamic decision making microworlds (Gonzalez et al., 2005; LePine et al., 2000), the present measures have not been used before. Furthermore, no decision-making microworld study has used a factor combining verbal and spatial intelligence (Cattell, 1963; Horn, 1989). These results suggest a valid intelligence measure that is predictive of performance in this dynamic decision making microworld.
Hypotheses 16-24 predict that, beyond intelligence, each of the individual methods of measurement (e.g. Cognitive Openness Performance, Cognitive Flexibility Performance, and Focused Attention Performance) adds unique variance in explaining adaptive performance in the dynamic decision making microworld. To test these hypotheses, a series of hierarchical linear regressions was performed, entering the intelligence factor in block 1 and the corresponding measure in block 2.
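The block-entry logic can be illustrated with a brief sketch. The variable names and data file below are assumptions, and the code is a generic illustration of hierarchical regression rather than the study's actual analysis script.

```python
# Illustrative sketch of the two-block hierarchical regression described above:
# the intelligence factor in block 1, one cognitive measure added in block 2,
# with the change in R-squared taken as that measure's unique contribution.
# Variable names and the data file are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cognitive_agility_scores.csv")      # hypothetical data file
y = df["nfc_adaptive_performance"]                    # Networked Fire Chief DV

# Block 1: intelligence factor only
block1 = sm.OLS(y, sm.add_constant(df[["intelligence_factor"]])).fit()

# Block 2: intelligence factor plus one measure (here, Stroop performance)
block2 = sm.OLS(y, sm.add_constant(
    df[["intelligence_factor", "stroop_performance"]])).fit()

delta_r2 = block2.rsquared - block1.rsquared
print(f"R2 block 1 = {block1.rsquared:.3f}")
print(f"R2 block 2 = {block2.rsquared:.3f}  (delta R2 = {delta_r2:.3f})")
```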
Each of the performance measures explains unique variance beyond intelligence.
The performance measure of Cognitive Openness (i.e. the Alternate Uses Test) displays a
small yet significant predictive capacity beyond intelligence (ΔR2 =.03). This suggests
that the ability to include many ideas (Wallach & Kogan, 1965) and scan an environment
more broadly (Runco, 1991) may be important in understanding performance in a real
time dynamic context. The dynamic decision making microworld requires one to pay attention to multiple parts of the environment and actively search for new information; it is therefore theoretically consistent that one's ability to create new cognitive associations is predictive of performance. The performance measure of Focused Attention (i.e. the Go/No Go Test) also demonstrates significant predictive capacity beyond intelligence (ΔR2 = .02). The dynamic decision making microworld provides the user with large amounts of data and a continual choice to react to or ignore the new information as it is presented in the changing environment. One's ability to focus attention, that is, to block out information in order to attend to a particular part of the task (Lustig et al., 2001), is necessary in the ongoing adjustment to the dynamic context. Finally, the Cognitive Flexibility Performance measure also shows a significant predictive capacity beyond intelligence (ΔR2 = .07). The ability to flexibly control cognition (Rende, 2000) is vital in the dynamic context, as one often needs to overcome dominant response sets (Jost et al., 1998) in order to vary strategy successfully (Kiesler & Sproull, 1982). Each of these individual abilities is significant in the prediction of performance across the dynamic decision making microworld (NFC) beyond the intelligence factor.
Most notable among these findings is the Cognitive Flexibility performance score, measured by the Stroop Task. The Stroop Task explains about 7% of the variance beyond intelligence. The Stroop Task had not been used with dynamic decision making microworlds prior to this study. The fast paced nature of change within the DDM microworld requires that subjects alter strategies to adjust to the changes. Therefore, the capacity to overcome a dominant or automatic response set (i.e. what the Stroop measures) may be a very necessary component of success in such an environment. The other two performance measures are also significant in predicting success in the DDM microworld. As explained above, Focused Attention (as measured by the Go/No Go) accounts for 2% unique variance beyond intelligence and Cognitive Openness (measured by the Alternate Uses Test) accounts for 3% unique variance.
hypothesized, it makes some theoretical sense that Cognitive Flexibility accounts for
more explained variance than the Focused Attention and Cognitive Openness
performance measures. Cognitive agility is conceptualized as the capacity to flexibly
operate with both openness and focus. Therefore, it is flexibility that is perhaps most
crucial in being able to balance both openness and focus. An individual who is extremely
open may notice too much information and become overwhelmed (Brunstein & Olbrich,
1985; Kuhl & Kazen-Saad, 1988). Likewise, one who is extremely focused may become
stuck on one part of the context and not notice necessary environmental signals
(Anderson, 1983). No such claim is made for the flexibility measure, namely that too much flexibility could disrupt effort. Still, each of the three performance measures demonstrates unique variance beyond intelligence, which is a contribution to the extant research performed within dynamic decision-making microworlds.
The regression analyses for the individual self-report measures do not fully support the hypotheses. Only the Self Report for Cognitive Openness demonstrates predictive capacity beyond intelligence, explaining approximately 3% unique variance on the dynamic decision making microworld. This measure, in general, asks participants the degree to which they seek novelty, are curious and investigate ideas. Therefore, in what would likely be a novel scenario (the Networked Fire Chief dynamic decision making microworld), participants rating themselves higher on this measure of Cognitive Openness may be disposed to approach the context with greater interest (Bodner and Langer, 2001). The Focused Attention and Cognitive Flexibility self reports no longer retain any predictive effect on the dependent variable beyond intelligence. This indicates that intelligence accounts for the variance in these relationships. Regardless of how one perceives one's capacity to focus attention or regulate cognition, this effect disappears once intelligence is accounted for. In the context of a real time dynamic decision making environment, only the Cognitive Openness self report remains a significant predictor of success beyond intelligence.
The regression analyses for the individual Other Report measures likewise provide only limited support for the hypotheses. Only the Cognitive Flexibility Other Rater measure is significant beyond intelligence. Yet, as with the correlations reported above, the relationship, which explains 4% of the variance, is inverse rather than positive. This finding is intriguing as it suggests that, beyond intelligence, an external perception of another's cognitive regulation has a substantial, albeit inverse, relationship to adaptive performance in the dynamic decision making microworld. The metacognitive regulation
measure used for Cognitive Flexibility includes items about monitoring and control of
cognitive activity. A potential explanation may be that external raters view ratees who
“obsess” about outcomes and processes as being more cognitively regulated. In fact
metacognitive regulation has recently been associated with obsessive-compulsive
behavior (Purdon and Clark, 2009). Given the inverse relationship, it may be that too
much cognitive control, while a necessary component of being flexible, can have a
negative impact in a fast paced real time dynamic context. Furthermore, according to
these results, one’s self report is not as meaningful as an external rater’s perception.
Overall it would appear from these results that the individual performance tests are a more valid set of predictors of success on the DV than the self or other reports. Again this makes theoretical sense based on the implicit nature of the performance tests and the fire chief exercises (Blair, 2001; Dovidio, Kawakami, & Beach, 2001). The self and other reports lack this implicit nature, and may have a lower degree of consistent predictability with the DV. Furthermore, each of the performance tests and the DV has a time intensive component, whether it be reaction time (Stroop and Go/No Go), product creation (in the case of the AUT) or decision speed (with the DV).
Discussion of Results for Hypotheses 25-27
Hypotheses 25-27 attempt to measure the Cognitive Agility formative construct grouped by method. The most significant finding within this set of regression analyses
was the Cognitive Agility performance composite score. In this case a composite score of
the performance measures for Cognitive Openness (AUT), Cognitive Flexibility (Stroop
Task) and Focused Attention (Go/No Go) was regressed on the adaptive performance
score of the Networked Fire Chief. The results indicate that the addition of the Cognitive
Agility composite performance score provides 11% unique significant variance beyond
intelligence (R2Δ=.11, p<.001). This finding suggests that in a real time dynamic
exercise, that the formative performance construct of Cognitive Agility is important in
predicting successful adaptation. This finding is especially interesting given that
research in microworlds have not previously measured the impact of additional cognitive
abilities beyond general intelligence. In other studies using microworld simulations both
conscientiousness and openness to experience (as measured by the NEO) accounted for
2% unique variance beyond intelligence (LePine et al., 2000). This gives a frame of
reference for the significance of this finding. It is also worth noting that the Cognitive
Agility composite score provides a greater degree of unique variance than any of the
individual variables that form it (i.e. Cognitive Openness, Focused Attention or Cognitive
Flexibility), suggesting an important formative conceptual addition to such real time
adaptive capacities. Lastly, the tests used in this study (the Stroop, the Alternate Uses Test and the Go/No Go) have not previously been used as predictors in a microworld domain, thus providing a novel measurement approach to such contexts.
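As an illustration of how such a composite might be formed and tested, the sketch below standardizes the three performance scores, averages them, and enters the composite after the intelligence factor. The dissertation's exact scoring rules are not reproduced here, and all names are assumptions.

```python
# Illustrative sketch: forming a Cognitive Agility performance composite (mean
# of the standardized AUT, Stroop, and Go/No Go scores) and testing its unique
# contribution beyond the intelligence factor. Names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cognitive_agility_scores.csv")      # hypothetical data file
perf = ["aut_performance", "stroop_performance", "gonogo_performance"]

# z-score each performance test, then average into a single composite
z = (df[perf] - df[perf].mean()) / df[perf].std()
df["ca_performance_composite"] = z.mean(axis=1)

y = df["nfc_adaptive_performance"]
block1 = sm.OLS(y, sm.add_constant(df[["intelligence_factor"]])).fit()
block2 = sm.OLS(y, sm.add_constant(
    df[["intelligence_factor", "ca_performance_composite"]])).fit()
print(f"delta R2 for the composite = {block2.rsquared - block1.rsquared:.3f}")
```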
As noted in the results section, it is not possible to create a formative construct for the self and other rater reports, as intended, due to the negative correlation that the Cognitive Flexibility self and other reports have with the Networked Fire Chief adaptive performance score. Instead, each of the respective methods (self and other) was entered listwise in a single regression block, preceded by the intelligence factor.
The self-reports (Cognitive Openness, Cognitive Flexibility and Focused
Attention) were entered listwise in a regression to see the combined effect beyond
intelligence. Entered together these variables account for 6% unique explained variance in the adaptive performance score on the dynamic decision making scenario (ΔR2 = .06, p = .006). Examining the individual variables in the model provides a clearer picture. In this model the Cognitive Openness Self Report is significant (β = .24, p = .002). This suggests that
Cognitive Openness as a self-report has some significant capacity to predict success in
the dynamic decision making microworld in this model, as it did as a separate variable.
This provides further evidence that self-perceived cognitive openness is an important and
significant component in such a real time dynamic environment. The Focused Attention
self report within the model does not add any unique variance to the overall model (β=
.05, p=.49) as was also demonstrated in the earlier regression with it as an individual
variable. The Cognitive Flexibility Self Report approaches significance in this model (β = -.19, p = .07). Again, the most interesting aspect of this model with regard to Cognitive Flexibility is the direction of the sign; an inverse relationship to the dependent variable was not originally predicted. A higher score on the Cognitive Flexibility scale predicts lower performance on the Networked Fire Chief microworld. The Cognitive Flexibility Self Report does not explain any significant variance when entered by itself beyond intelligence. Therefore, the full model, with Cognitive Openness and Focused Attention entered alongside it, changes the impact that Cognitive Flexibility has on the dependent variable (i.e. in combination with the other variables it gains predictive capacity). While the full model is significant, these results taken together suggest that the self report measures chosen to predict adaptive performance on the Networked Fire Chief microworld are not fully adequate.
The impact of the other rater reports also demonstrates results that do not fully support the hypothesis. As with the Self Reports, it is not possible to create a formative
construct given the negative correlation that the Cognitive Flexibility Other Rater
measure has with the Networked Fire Chief. Therefore the three measures (Cognitive
Openness Other Rater, Focused Attention Other Rater and Cognitive Flexibility Other
Rater) were entered together into a model listwise, following the intelligence factor. In
support of the hypothesis, the overall model is in fact significant, adding 6% of unique
variance (ΔR2 = .06, p = .002). While this is a supportive finding, the negative β score for Cognitive Flexibility again suggests that this finding is not in line with the intent behind the conception of the hypothesis. The Cognitive Flexibility measure as reported by others demonstrates a significant inverse predictive capacity to adaptive performance on the Networked Fire Chief (β = -.32, p < .001). This score indicates the strongest relationship of any of the individual scales (either Self or Other) to the Networked Fire Chief (the dependent variable). The Cognitive Flexibility Other Rater measure is a stronger inverse predictor in this model than it was as an individual variable entered beyond intelligence (β = -.20). Therefore in the current model, with Cognitive Openness and Focused Attention, the inverse impact of Cognitive Flexibility (as measured by a Metacognitive Regulation Subscale) is even more pronounced. In this regard it is an important finding, adding further evidence that an external perspective of how another appears to regulate cognition may be a powerful way of predicting performance in a real time dynamic situation.
The Focused Attention Other Rater Report, in the context of the full model, is more powerful in predicting success on the dynamic decision making microworld than it was as a single variable. In the case of the Focused Attention Other Report, informants' views of how focused others are have a significant impact on the adaptive performance outcome (β = .14, p = .05). Most interesting here is the difference that the external rating of Focused Attention makes within the full model compared to its role as an individual variable beyond
intelligence. As an individual variable beyond intelligence it had no significant impact in
the prediction of adaptive performance in the microworld. Yet when included with
Cognitive Openness and Cognitive Flexibility an external rater’s perception of the ratee’s
focus does demonstrate unique importance relative to the given model.
The Cognitive Openness Other Rater measure approaches significance in this model (β = .13, p = .10). As a singular measure beyond intelligence the Cognitive Openness Other Report showed no significance, but when added to the model with the other measures it approaches significance. This demonstrates that in
interaction with the other variables the external rater’s perception of a ratee’s cognitive
openness is a potentially important predictor of real time adaptive performance.
Discussion of Post Hoc Results
The post hoc analysis produces several significant findings. Most notable is the significant influence that the control variables have in explaining variance in the dependent variable (R2 = .22, p < .001). Furthermore, as individual variables both gender and video game experience demonstrate significant β scores (gender: β = .24, p = .001; video game experience: β = .34, p < .001). Past studies using microworlds either do not collect gender and gaming experience data or simply do not report the results. Only one study reports the relationship of video game experience to performance in a microworld, with a very significant correlation to the outcome variable (r = .73, p < .01; Valentine, 2005); furthermore, the sample size for that study is quite small (n = 15). Yet this variable is not normally reported in microworld studies. The same seems to be true of results related to gender. Therefore the inclusion of these variables (in a sample of this size, n = 181) in explaining performance is a potentially significant addition to the extant literature.
The intelligence factor still retains significance beyond gender and video game
experience (β=.40, p<.001). The intelligence factor explains approximately 16% of the
unique variance beyond the control variables. This is approximately 4% less variance
than intelligence accounts for in the regression without the control variables. This
suggests that intelligence is still an important predictor of adaptive performance and clearly demonstrates that the control variables are relatively independent of intelligence in relation to the dependent variable.
The performance measures remain significant or near significant after accounting for the control variables and intelligence. The Focused Attention performance measure explains approximately 1% of the variance at a near significant level (ΔR2 = .01, β = .11, p = .07) beyond the controls and intelligence factor. The measure of Cognitive Openness performance still remains significant beyond controls and intelligence, accounting for about 2% explained variance (β = .15, p = .016). The Cognitive Flexibility performance score also remains significant, accounting for 8% of explained variance (β = .30, p < .001). While
the amount of explained variance decreases slightly with each of the measures, these
results add further evidence to their unique impact in being able to predict adaptive
performance in the real time dynamic decision making microworld.
The next set of analyses tests the two individual questionnaire measures that were significant in the earlier regressions (the Cognitive Openness Self Report and the Cognitive Flexibility Other Rater Report) beyond the control variables and intelligence. In these results the Cognitive Openness Self Report no longer remains significant after accounting for the control variables and the intelligence factor (β = .087, p = .15). This suggests that the control variables likely account for the majority of the variance previously explained by
the Cognitive Openness Self Report. The Cognitive Flexibility Other Rater Report
remains significant accounting for 2% of unique variance beyond the controls and
intelligence (β=-.14, p=.017). While the explained variance is cut in half, this finding
further demonstrates the predictive capacity that an external rater’s perception of metacognitive regulation has on adaptive performance in a real time dynamic decision making
microworld.
The composite performance factor for Cognitive Agility remains a significant
predictor of adaptive performance beyond the control variables and intelligence,
accounting for 9% unique variance (B=.31, p<.001). While this model explains
approximately 2% less variance than the earlier model, it is strong evidence of Cognitive
Agility as an important predictor variable of adaptive performance. Furthermore, it
suggests a unique composite construct which maintains a relatively large predictive
ability in the dynamic decision making microworld.
The three self-reports, entered as an overall model, no longer remain significant beyond the control variables and intelligence (ΔR2 = .02, p = .148). This suggests that gender and video game experience account for the variance that was originally explained by the earlier Cognitive Agility Self Report model (prior to the inclusion of the control variables).
The other rater reports representing the variables of Cognitive Agility account for
nearly 4% unique variance beyond the controls and intelligence (ΔR2 =.037, p=.014). The
total R2 is approximately 2% less than in the earlier model without the control variables.
This suggests that beyond the control variables and intelligence, the external rater
perception is still significant in explaining variance in adaptive performance. The
Cognitive Flexibility measure as reported by others demonstrates a significant inverse relationship to adaptive performance on the Networked Fire Chief (β = -.24, p = .002). This effect is slightly smaller than in the earlier regression but still demonstrates unique importance relative to the given model. In the case of the Focused Attention Other Report, informants' views of how focused others are remain nearly significant in their relative explanatory importance (β = .11, p = .08).
Finally, a hierarchical regression was run including the controls, the intelligence factor, the composite performance measure, the three self reports and the three other rater reports. The results are shown in Table 19. The total model has an R2 of .499 and is significant (p < .001). The composite measure, self reports and other reports (entered in a single step listwise) explain 12% unique variance beyond the controls and intelligence. In this full model the only measures that remain significant are the Cognitive Flexibility Other Rater Report (β = -.19, p = .008) and the Cognitive Agility Composite Performance measure (β = .275, p < .001). The Cognitive Openness Self Report measure demonstrates a near significant relationship to the dependent variable in this model (β = .12, p = .074). Therefore, when investigating all the variables together, the R2 becomes larger but the variables that remain significant are far fewer.
Lastly, the correlation tables (Tables 20 and 21) that break out the relationships based on gender reveal exciting data for future study. The significant relationships that exist between the predictor variables and the Networked Fire Chief are largely driven by gender (as evidenced by Table 21). This would suggest a potential moderating effect of gender between cognitive agility and adaptive performance on the Networked Fire Chief. This relationship was tested and supported (Table 22), as unique variance is explained by the interaction term (Gender x Cognitive Agility) beyond the previous regression steps (ΔR2 = .05, p < .001).
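The moderation test reported in Table 22 can be illustrated with a generic sketch: main effects entered first, then the product term, with its change in R-squared taken as evidence of moderation. The gender coding, centering step, and variable names below are assumptions rather than the study's actual procedure.

```python
# Illustrative sketch of a Gender x Cognitive Agility moderation test: the
# interaction term is entered after the main effects and its delta R2 is
# examined. Gender coding and variable names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cognitive_agility_scores.csv")           # hypothetical data file
df["gender_male"] = (df["gender"] == "male").astype(int)   # assumed coding

# Center the composite before forming the product term (a common convention)
df["ca_centered"] = (df["ca_performance_composite"]
                     - df["ca_performance_composite"].mean())
df["gender_x_ca"] = df["gender_male"] * df["ca_centered"]

y = df["nfc_adaptive_performance"]
main = ["gender_male", "videogame_experience", "intelligence_factor",
        "ca_centered"]
step1 = sm.OLS(y, sm.add_constant(df[main])).fit()
step2 = sm.OLS(y, sm.add_constant(df[main + ["gender_x_ca"]])).fit()
print(f"delta R2 for the interaction = {step2.rsquared - step1.rsquared:.3f}")
```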
Limitations
This study has several major limitations that, when addressed, may provide future research opportunities. This research is an initial step in proposing a new construct, and therefore a great deal of future work toward validation is necessary, which is beyond the scope of this section. However, current study limitations are noted regarding the generalizability of results. Limitations to generalizability may be found within the study sample and the task conditions of the dependent variable.
With regard to this study, the testing scenario of the microworld greatly limits
direct generalizability of the results to decision making of individuals in organizations.
Laboratory based decision making studies do not fully capture real life decision making (Dawes, 1988). Admittedly, the choice to use a microworld is in no way meant to simulate the context of organizational experiences. Instead it is used as a way to capture the elements found throughout the dynamic experiences individuals often encounter within organizational life (Brehmer & Dorner, 1993; Funke, 1991; Omodei & Wearing, 1995).
These elements include the interaction of dynamism, complexity, time-constraints and
feedback delays (Brehmer 1992). The microworld provides a method for controlling the
task conditions across time and participant while testing for relationships to the dynamic
phenomena (Brehmer and Dorner, 1993). Microworlds are considered especially well suited for studies that investigate highly complex phenomena in a dynamic context (Brehmer & Dorner, 1993; Brehmer, Leplat & Rasmussen, 1991).
While few would dispute that experiences in organizational life have become more dynamic and complex, the results produced by this microworld based laboratory study still need to be handled with caution. The results pertaining to cognitive agility and the
individual variables which form it (Cognitive Openness, Focused Attention and
Cognitive Flexibility), are promising in being able to predict adaptive performance, yet
this outcome is within the context of a computer simulated game. Microworld simulation
performance outcomes are seldom studied alongside actual organizational performance
outcomes, making generalizability of results limited. The tension between realism and control is an old and ongoing debate in research setting selection (Berkowitz and Donnerstein, 1982; Dipboye and Flanagan, 1979; Runkel and McGrath, 1972).
Microworlds have been lauded as a method to “bridge the gap” between laboratory and
field studies (Brehmer & Dorner, 1993). Yet, the current results, while a promising
contribution to dynamic decision making, are still limited due to the computer based
format. Therefore, the measurement of cognitive agility must be further tested in more real world organizational contexts to establish additional external validity.
The sample population presents another potential limitation of the present study.
The sample in this study was drawn entirely from undergraduate students at a highly
selective university. This sample creates two possible limitations. The first is that the
relationship of the independent variables to the dependent variable is limited to
individuals who have a mean age of 20 years. This becomes a potential issue since past
research demonstrates that each of the independent variables could be impacted by age.
This age restriction may especially impact the intelligence tests and the performance
measures, as they are implicit measures prone to age related differences in neuroanatomy.
In general executive functions are impacted by age (West and Bell, 1997), as are adaptive problem solving abilities (Ridderinkhof, Span & van der Molen, 2002). In a recent study, Salthouse (2009) shows that cognitive functions, like speed of thinking and spatial visualization, peak by age twenty-two and start to decline around age twenty-seven. Specifically, the Stroop Task has shown performance decrements with increases in age (Delis et al., 2001), based in part on processing speed (Salthouse and Meinz, 1995). Focused attention is also subject to age related decreases, as inhibitory control (Kramer, Humphrey, Larish & Logan, 1994), response competition (Salthouse & Somberg, 1982), and response suppression (Fuster, 1997) all show age related decrements.
Additionally, divergent thinking has been shown to decline sharply after the age of 30
(Lehman, 1953). Also, fluid intelligence is subject to age decrements (Shilling et al.,
2002), which would impact the outcomes on the spatial intelligence measure. With the
average age of the US labor force at 40.5 years, the current sample is not representative
of those inhabiting todays’ organizations from an age – and therefore neuroanatomical
perspective.
Additionally, this current population is taken entirely from a highly selective
university. The median SAT score of an entering freshman at the university in the
current study is 1300 and the average SAT of undergraduate business majors is 1270,
ranking it number thirty-two in the country (Business Week Online, 2009). This suggests that the population being studied may not generalize to a wider population, as both age and preselection (by the University) indicate potential comparison limits.
The focus of this study is the investigation of a particular set of cognitive
capacities and therefore a population with a uniform age provides a built in control. It
has been previously noted that “…the generalizability of the results of behavioral
research is not a function of the population sampled, but rather….the external validity of
the research depends on the interaction of subject characteristics and the particular
behavioral phenomena with which one is concerned” (Christensen, 1997, p. 482).
Therefore, this sample serves as a valid initial group to study with future work extending
to both older age groups and those in organizational contexts.
A potential limitation of this study may come from some of the measures chosen.
Admittedly there is mixed theoretical and quite limited empirical support linking each of
the independent variables across methods. For example, there is not a definitive way to
measure Cognitive Flexibility. It is a complex construct which is measured by many
different performance tasks. Furthermore there is not a questionnaire that aims to
measure the construct as defined for the study. Adding to this difficulty there is virtually
no empirical support to link Cognitive Flexibility performance tasks with Cognitive
Flexibility questionnaires. As reported throughout this chapter, the results were
consistent with this limitation, as performance tests did not relate to the corresponding
questionnaires. Therefore this became a major limitation in the results and restricts
validity interpreted for the construct.
CHAPTER V
CONCLUSION
The primary purpose of this work is to propose the construct of cognitive agility, and to test whether agility predicts adaptive performance in a real time dynamic decision-making environment. As a formative construct composed of the composite of the three performance measures, cognitive agility predicts adaptive performance in the
dynamic decision making scenario beyond measures of general intelligence. This finding
suggests that the unique combination of Cognitive Openness, Focused Attention and
Cognitive Flexibility (as they are measured here), may be an important cluster of abilities
in managing real time dynamic contexts. While the performance measures were
predictive of success in the dynamic decision making task, the results of the self-reports
and other-rater reports were not as conclusive. Much of this is due to the inverse
relationship of the Cognitive Flexibility questionnaire(s) to the dependent variable. This
finding does, however, raise important questions about the roles of cognitive regulation,
cognitive control and cognitive flexibility in real time dynamic decision making contexts.
Overall this study investigates specific capabilities targeted to a specific context. Studies
of highly contextualized aspects of adaptive performance are vital to our development in
meeting the demands for research and practice in dynamic times.
Research Implications
There are several future research studies suggested below to further develop this
work beyond the current findings. Future research should test the cognitive agility
construct in other microworlds, in relationship to other relevant variables and in more
organizationally relevant contexts. Each of these will help support further internal and
external construct validation.
This study demonstrates significant relationships between the independent variables
and adaptive performance in the microworld simulation known as the Networked Fire
Chief (Omodei and Wearing, 1998). Further construct validation may be assessed
through future testing of the cognitive agility construct in alternative microworlds. In
general, dynamic decision-making microworld programs contain some defining features
that are common across platforms. These common characteristic elements include "…temporal dependencies among system states, nonlinear relationships among system variables, and a lack of user access to complete information about the system state and system structure" (Gonzalez, 2005, p. 278). These characteristics manifest in four criteria
by which Gonzalez (2005) and others (Brehmer & Dorner, 1993) rate the microworlds:
1. Dynamism - rates of change and the feedback loops that impact it (Edwards, 1962),
2. Complexity - the degree of interconnection amongst parts of the system (Brehmer & Allard, 1991),
3. Opaqueness - ready accessibility of system state to the user (Brehmer, 1992), and
4. Dynamic complexity - the combination of dynamism and complexity, which creates conditions that limit the ability of the user to gain control of the system (Sterman, 1989).
The Networked Fire Chief is a microworld that is rated as being "high" in dynamism and complexity and "moderate" in its opaqueness and dynamic complexity. This means that the Networked Fire Chief is considered to have many ongoing changes and a great deal of interconnectedness between parts of the system. Yet it is only moderately difficult in terms of assessing one's current performance and achieving control of the system.
Future studies could include the use of alternative microworlds that represent different ends of the spectrum with regard to degree of dynamism, complexity, opaqueness, and dynamic complexity. For instance, the Water Plant Purification Task (Gonzalez, 2004) is rated as 'high' across all categories, allowing for testing the effects of Cognitive Agility in a more intensive system of operation. Given the increased dynamic complexity the Water Plant Purification Task has over the Networked Fire Chief, one has to wonder whether Cognitive Agility would explain more or less of the adaptive performance score, as general intelligence may account for more variance (Ackerman, 1988; Gonzalez,
et al., 2005; Kyllonen, 1985). Another well-researched microworld is the Beer Game
(Sterman, 1989), which offers much lower dynamism and complexity than the
Networked Fire Chief, while retaining greater levels of opaqueness and dynamic
complexity (Gonzalez, 2005). The greatest differences between the Beer Game and the Networked Fire Chief are that in the Beer Game the system only changes when the participant acts upon it and the individual has more time to make decisions. These
differences allow testing of Cognitive Agility in a completely different kind of dynamic
decision making scenario in which speed may be less of a factor. Given that each of the performance tasks used to measure Cognitive Agility contains a speed component, this may significantly alter the construct's impact. Yet, the difference in speed may also alter the Cognitive Flexibility questionnaire's (the Metacognitive Awareness Inventory) impact in predicting success (as opposed to having an inverse relationship to adaptive performance on the Networked Fire Chief). Each microworld has unique characteristics, and assessing outcomes related to Cognitive Agility would help in determining which specific real-time elements are associated with it.
This study demonstrates significant relationships between the intelligence measures
and dynamic decision-making. This finding is important in light of the mixed results that
various forms of intelligence have had in relation to dynamic decision making
microworlds (Andresen & Schmidt, 1993; Brehmer & Rigas, 1997; Dorner, Kreuzig,
Reither, & Staudel, 1983; Omodei, Anastasios, Waddell, & Wearing, 1995; Putz-Osterloh
& Luer, 1981; Staudel, 1987; Strohschneider, 1986, 1991). Furthermore, this study
reveals that Cognitive Agility predicts unique variation in performance above and beyond
intelligence measures.
Another method to further validate the construct of Cognitive Agility is the testing
of additional or alternative intelligence tests. The intelligence tests in this study were
chosen to incorporate both a verbal and spatial feature (Cattell, 1963; Horn, 1989).
While the spatial test certainly taps into the measure of fluid intelligence (Ekstrom,
1976), it may also be of benefit to use a direct measure of fluid intelligence in future
studies. Fluid intelligence describes the innate capacity to adapt knowledge to deal with
novel situations (Snow & Lohman, 1984; Carpenter, Just, & Shell, 1990), and has been
shown significant in explaining professional success in complex environments
(Gottfredson, 1997). It is, therefore, an important aspect of intelligence to measure alongside Cognitive Agility when predicting adaptive performance. A common measure of fluid intelligence is the Raven's Progressive Matrices (RPM; Raven, 1962, 1977; Raven et al., 2003). The RPM has demonstrated a very strong correlation with performance on the Networked Fire Chief in a past study (r = .60, p < .05; Gonzalez et al., 2005), yet this was based on a very small sample (n = 15). Future studies of Cognitive Agility should
measure its effect on microworld programs beyond fluid intelligence. Cognitive Agility
as a predictor of adaptive performance beyond fluid intelligence would support additional
construct validation.
One underlying cognitive process that supports high fluid intelligence is the ability
to utilize working memory (Carpenter et al., 1990; Gray et al., 2003). Working memory refers to an individual's ability to maintain information when facing interference (Baddeley & Hitch, 1974). It is a limited capacity which allows one to keep information
for a temporary period of time while being able to apply it for strategic use (Baddeley,
1992). Working memory ability has been correlated with strategy adaptivity in past
studies (Schunn and Reder, 1998). Given the dynamic testing context and focused
attention component of Cognitive Agility, this may be an appropriate measure necessary
to further illustrate discriminant validity of Cognitive Agility. Working memory can be
measured with the reading span task (Daneman & Carpenter, 1980), the Digit Span Test
(Baddeley, 1986), or the Operation Span Task (Turner & Engle, 1989), each measuring a slightly different aspect of working memory.
Next, the measurement of Situation Awareness is appropriate in further validation
of Cognitive Agility. Situation Awareness measures one’s knowledge of the current state
of a dynamic context (Endsley, 1995). It is "the ability to extract and integrate information in a continuously changing environment and to use such information to direct future actions" (Wickens, 2000). Situation Awareness has been tested in other microworlds as a
way of predicting performance under specific conditions (Endsley, 1995). In a dynamic
context, placing attention on one part of the environment has an effect on understanding
other parts of the environment (Endsley, 1995). The capacities underlying Cognitive
Agility may lead to greater situation awareness of the dynamic microworld context.
Situation Awareness can be measured using the Situation Awareness Global Assessment
Technique (SAGAT) (Endsley, 1987, 1990, 1995). Another way to measure Situation
Awareness is through participant query. In this method the microworld can be paused at
random points and the participant is queried for their awareness of the current
environment (Endsley, 1995). Situation Awareness is another variable that Cognitive
Agility could be tested alongside to further internal construct validation.
Future studies should aim to further validate Cognitive Agility as a formative
construct. The literature on formative constructs suggests applying a structural equation model, relating the formative construct to an existing reflective construct that should relate to it (Diamantopoulos, 2006). This would support greater criterion and nomological
evidence for the formative construct (Diamantopoulos and Winklhofer, 2001). One
potential construct to link with cognitive agility is action control (Kuhl, 1992, 1994). Action
control is meant to measure one’s ability to maintain focus on a task without becoming
overwhelmed by competing cognitive activity. Action Control can be measured using the
Action Control Scale (Kuhl & Beckmann, 1994), which is comprised of three subscales:
(1) Preoccupation versus Disengagement (which measures concentration on an intention without becoming distracted by outside cognition), (2) Hesitation versus Initiative (which measures the intention to move toward behavioral action without hesitating), and (3) Volatility versus Persistence (which measures vigilance in completing the task).
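To make the distinction behind this suggestion explicit, the generic notation below contrasts a formative specification (the latent variable is formed by its indicators) with a reflective one (the indicators are effects of the latent variable). It is a textbook-style sketch, not a specification of this study's measurement model.

```latex
% Generic sketch of formative versus reflective measurement (notation only).
% Formative: eta (e.g. cognitive agility) is formed by indicators x_1..x_3
% (openness, attention, flexibility) with disturbance zeta.
% Reflective: each indicator of a latent variable xi is an effect of it.
\begin{align}
  \eta &= \gamma_1 x_1 + \gamma_2 x_2 + \gamma_3 x_3 + \zeta
        && \text{(formative)}\\
  x_i  &= \lambda_i \xi + \varepsilon_i
        && \text{(reflective)}
\end{align}
```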
Future studies may also assess the predictive impact of Cognitive Agility on more
widely used organizationally relevant outcomes. Potential testing could compare
Cognitive Agility results, beyond intelligence, with objective measures of organizational
performance (e.g. productivity, supervisor ratings, etc.). It is well documented that general cognitive intelligence predicts job performance (House et al., 1992; Schmidt and
Hunter, 1989). Hundreds of studies demonstrate that general cognitive ability predicts
performance in just about every job, situation and career (Gottfredson, 1986; Hunter,
1986; Ree & Earles, 1993; Schmidt & Hunter, 1981). Others have suggested that g is the
best predictor of job performance (Barrett & Depinet, 1991; Gottfredson, 1986; Herrnstein
& Murray, 1994; Hunter & Hunter,1984; Jensen, 1993; Ree & Earles, 1993; Schmidt &
Hunter, 1981, 1993). Furthermore, as jobs require more information processing, the predictive capacity of general cognitive ability for job performance increases (Wright and Mischel, 1987). Yet there is still heavy debate in this arena, as IQ scores, despite being the "best available predictor" of job performance, still account for only half the variance (Neisser et al., 1996, p. 83). Therefore, any unique variance that could be demonstrated by
cognitive agility, beyond general intelligence, would be a major step towards additional
external criterion related construct validity.
Lastly, the testing of cognitive agility in particular decision-based organizational contexts would be important. One potential context, rich with real-time adaptive performance challenges, is the daily role held by portfolio asset managers.
Managers make a series of critical dynamic decisions that have profound implications for
their personal wealth, the success of their firm and the assets of individuals invested
either directly or indirectly in the fund. The expert investment decision-making process is associated with both personality and cognitive factors (Rohrbaugh and Shanteau, 1999). Beyond these, asset managers likely must be open to new information, focused on particular elements of the financial landscape, and able to operate flexibly to avoid functional fixedness. Therefore, the study of portfolio managers presents a real world context in which cognitive agility may have a profound impact. Such a study design would control for a manager's intelligence, age, tenure and level of education (Hambrick and Mason, 1984; Shukla and Singh, 1994). Furthermore, other individual differences need to be accounted for, including confidence/overconfidence (Odean, 1998) and established cognitive abilities like pattern recognition (Brown and Soloman, 1990). Additional unique variance explained by cognitive agility may predict individual financial performance for the funds managed over set periods of time. The study could track results across a variety of time horizons, gaining additional insight into cognitive agility's predictive impact in both real time and over longer time periods.
Practical Implications
This dissertation offers a construct that helps in explaining individual differences in adaptive performance in a specific real time dynamic context. Adaptive
performance is often assessed across multiple tasks, contexts and longer periods of time
(Pulakos et al., 2000). Therefore, current conceptualizations overlook aspects of
individual cognitive capacity that may support real time adjustments within a singular
task. While results of this study are not conclusive, the knowledge generated begins to
offer support to organization members in developing specific skill capacities related to
real time adaptive performance. This capacity can be applied to individual and
organizational level development.
A primary question from a practice-oriented perspective is: can cognitive agility be
nurtured as a cognitive capacity? In order to accomplish this, each of the three variables
that form Cognitive Agility must be subject to developmental improvement. Cognitive
Openness as a performance variable is measured within the current study through a
divergent thinking test. Past research has shown that divergent thinking can be developed
and increased with training (Torrance, 1972). This suggests that with practice individuals
can begin to create more cognitive associations (Torrance, 1974), improve scanning
ability (Runco, 1991), and begin to notice more novelty (Mendelson, 1976). The abilities
of focused attention and cognitive flexibility are usually considered less amenable to development, as they are closely associated with heritable traits (Gray &
Thompson, 2004). Yet a recent study from Jaeggi and colleagues (2008) in the Proceedings of the National Academy of Sciences suggests that, contrary to past research, developing such abilities might be possible. In this study subjects engaged in "n-back" style tests over a period of time, and the results showed changes to subjects' level of fluid intelligence, an ability not previously considered amenable to improvement in adults. What is most compelling is that the training subjects participated in consisted of exercises that challenge response inhibition, the ability to ignore irrelevant information, and the monitoring of ongoing task
progress. Within this training, there exist obvious parallels to focused attention and
cognitive flexibility as described within the Cognitive Agility construct. For instance,
focused attention is the capacity to ignore irrelevant information and maintain a coherent
cognitive thread (Lustig et al., 2001). If tests that challenge “ignoring irrelevant”
information lead to increases in general fluid intelligence, it is likely that those tests also
lead to increases in the act of ignoring irrelevant data. Cognitive flexibility is the ability
to overcome an automatic response set, which is associated with response inhibition
(Clark, 1996; Rende, 2000). The ability to regulate cognition through monitoring and
control is an aspect of cognitive flexibility (Clark, 1996; Schraw and Dennison, 1994)
and is associated with metacognition. There appear to be individual differences in
metacognitive capacity (Allen & Armour-Thomas, 1999), and strong evidence to suggest
that it can be developed through training (Delclos & Harrington, 1991; Hartman &
Sternberg, 1993; Schmidt and Ford, 2003). Therefore, the ability to monitor and control
cognition can be enhanced through specific practice. Taken together, it appears plausible to increase the elements of Cognitive Agility and likely the overall construct. Such training can be an
important addition to the developmental programs organization members undertake,
especially if their jobs consist of managing real time adaptive contexts.
Such training programs may begin with cognitive agility testing. Cognitive Agility
is a cognitive ability which is difficult to enhance since it tends to be more stable
(Kozlowski, 1998). Yet with specific learning agendas this ability and the variables that
form it may become permeable (Ross and Lussier, 2000). A learning agenda will have
multiple parts. To begin with, individuals will be assessed on overall agility and the individual abilities that form it. Individualized plans will be crafted to address gaps in ability. One aspect that could be helpful is discovery learning, employed through real-time dynamic decision making microworld simulations (Kozlowski, 1998). In discovery
learning, individuals participate in the simulations in order to hypothesize and self discover the principles that lead to success in that context (London & Bassman, 1989). Discovery learning allows individuals to transfer new knowledge to novel settings (Atlas, Cornett, Lane, & Napier, 1997; Smith et al., 1997).
Another aspect of a learning agenda is to engage in deliberate practice to enhance weaknesses (Ross and Lussier, 2000). For example, an individual lower in cognitive openness may engage in structured exercises to expand awareness of novel aspects of one's lived contexts. Interactions with novelty may take the form of practicing new routines (e.g. taking a new driving route home from work), looking for new distinctions (e.g. recognizing characteristics one has never noticed in one's environment), and using additional information (e.g. prior to making a decision, thinking of additional perspectives or materials that can widen the scope).
Cognitive Agility training should also extend to individuals working in teams or
work groups. Team functioning may be enhanced by bringing awareness to each
member’s cognitive agility based strengths (i.e. focus, openness or flexibility) and areas
for improvement. Teams can structure opportunities for individuals who need
development in an area to take a lead in a particular corresponding process. For instance, an individual who has targeted focused attention as a developmental opportunity may be given the chance to be more proactive in keeping a "brainstorming" session focused.
Furthermore, individual strengths can be harnessed when necessary. For example, an
individual strong in cognitive flexibility may be in charge of preventing team functional
fixedness.
Another result of this study that may have promise in organizational effectiveness
practices is the impact that video game experience has on adaptive performance. In this
study participants who play more video games perform better on adaptive performance in
the Networked Fire Chief. Therefore, introducing video games as part of a
“play/training” opportunity for organizational members may have some benefits to
adaptive decision making in a real time context. This has been partially supported by
past studies showing that training using video games impacts differences in spatial cognition (Feng, Spence & Pratt, 2007) and spatial selective attention (Green & Bavelier, 2003). This was potentially supported in this study, as experience playing video games is associated with spatial cognition (r = .174, p < .05). Additionally, video game experience is also related to self report measures of Cognitive Openness (r = .125, p < .05) and Focused Attention (r = .14, p < .05). Video game experience had a negative correlation with the Other Rater Report of Cognitive Flexibility (r = -.137, p < .1), suggesting an inverse relationship between how others view one's cognitive regulation and the amount of
video games one plays. Therefore, some exposure to video games may be a helpful
training tool for organizations to introduce as it relates to variables that predict success in
real time adaptive performance.
Furthermore, in the current study men also perform better than women on the
measure of adaptive performance. This may be explained by the fact that men generally score higher on tests of spatial cognitive ability (Kimura, 1999; Terlecki & Newcombe, 2005; Voyer, Voyer, & Bryden, 1995), which is supported by the results of this study (r = .139, p < .05). It has been demonstrated that training through the use of video games (e.g. Medal of Honor: Pacific Assault) erases the gender difference on tests of spatial cognition (Feng, Spence & Pratt, 2007). Therefore, the use of video games may have multiple positive effects with
regard to adaptive performance in real time dynamic contexts.
Cognitive agility as a construct describes the flexibility between what some
consider conceptually opposing phenomena (cognitive openness and focused attention).
Rather than separating them, in some particular contexts, combinatorial skills may form
unique higher-order operations. A formative construct in this case may be an adequate
way to consider these phenomena. The notion of managing opposites has been written
about in organization science at multiple levels of system. In particular striking the
optimal balance between the paradox of being open and flexible along with being focused
has been a central concern in organization and leadership development for decades
(Argyris, 1970; Bennis, 1966). At the organizational level March’s work on exploration
versus exploitation describes the tendency for firms to either explore (search, innovate,
experiment) or exploit (refine and become more efficient) (March, 1991). This work has
led to the idea of “ambidexterity” which is the balance between exploration and
exploitation that organizations must employ in order to reach superior performance
(Tushman and O’Reilly, 1996). For firms to achieve ambidexterity, it may be argued that they require a body of individuals who possess aspects of abilities linked to both, with a construct such as cognitive agility providing this support. At the individual level,
leadership has been said to require the ability to employ the simultaneous use of
opposites (Kotter, 1990; Quinn, Spreitzer, & Hart, 1992; Zaccaro et al., 1991), and
managerial flexibility has been described as the ability to strike ongoing balances
between method and style (Quinn, Spreitzer, & Hart, 1992; Sloan, 1984). Rather than considering them in pure opposition, the goal is to enhance "the ability to act out a cognitively complex strategy by playing multiple, even competing roles, in a highly integrated and complementary way" (Hooijberg & Quinn, 1992, p. 164).
Therefore, a formative construct at the individual cognitive level may provide additional
understanding of particular skills in combination. In particular dynamic contexts, the
construct of cognitive agility may support individual and organizational practices.
Contributions
This work makes several core contributions to the fields of psychology and
organizational behavior. First, research in organizational behavior, and specifically on individual adaptability, has not assessed cognitive performance measures beyond intelligence that predict adaptive performance. Next, much of the writing on both adaptability and
flexibility has been far too general in its definition and description. The work in adaptive
performance is focused on longer time horizons and across multiple tasks (Pulakos et al.,
2000). This study contributes to the above literatures through:
1. The investigation of adaptive performance in very specific contexts, in real time within a singular task,
2. Demonstration that a factor score of crystallized/fluid intelligence predicts performance in a microworld program,
3. Demonstration that beyond intelligence the individual variables as well as the formative construct of Cognitive Agility predict adaptive performance,
4. Demonstration that video game experience has a predictive association to adaptive performance, and
5. Demonstration that cognitive regulation has a significant inverse relationship to performance in a real time dynamic context.
The constant turmoil associated with working as a member of the modern
organization creates conditions that have led to an increase in the demands of individual dynamic decision making (Farhoomand and Drury, 2002). These demands are associated with big changes (like a merger with another organization) and with more micro-momentary contextual management (Porac, Thomas, & Baden-Fuller, 1989). In essence,
individuals at work have to manage the big changes along with the continual ones
(Kegan, 1994).
Adaptability is often referred to as the individual level ability best suited to properly
manage oneself in relation to these changes (Edwards and Morrison, 1994; Hesketh, 1997; Ilgen & Pulakos, 1999; O’Connel et al., 2008; Pulakos et al., 2000). As such,
adaptability, the ability to perform well in dynamic contexts, has been cited as an
important and missing component to include within the larger discussion of job and task
performance (Campbell, 1999; Hesketh & Neal, 1999; London & Mone, 1999; P. R.
Murphy & Jackson, 1999). However, scholars who explicitly contribute to the study of
adaptability do not often address the complexity inherent in the cognitive aspects of
adaptive performance, as it pertains to real-time dynamic tasks. Understanding an
individual’s ability to be adaptive at the cognitive level may be a vital starting point to
successfully navigating dynamic environments (Glynn, 1996). In a real time dynamic
task context the variables that form cognitive agility and the construct itself may support
such necessary development.
In conclusion, the purpose of this study is to investigate a potential individual
capacity to adapt within a real time adaptive performance task. Specifically, the research
sought to demonstrate whether the formative construct of cognitive agility predicts
adaptive performance in a dynamic decision making microworld. Results show that
beyond measures of intelligence, the Cognitive Agility construct was able to predict
performance across three trials of increasing difficulty on the Networked Fire Chief
microworld. While these findings need to be further tested in more natural organizational
settings, the initial results are suggestive of a combination of skills that may support
performance in real-time dynamic contexts.
APPENDICES
A. Informed Consent Document
INFORMED CONSENT DOCUMENT
Exploring the Concept of Cognitive Agility
You are being asked to participate in a research study about individual cognitive agility.
You were selected as a possible participant because you submitted your name and contact
information to a University Research Participant List. Please read this form and ask any
questions that you may have before agreeing to be in the research.
Researchers at Case Western Reserve University are conducting this study.
Background Information
The purpose of this research is to explore the idea that individuals can be cognitively
agile.
Procedures
If you agree to be a participant in this research, we would ask you to complete the following
things, which should require approximately one hour of your time:
(1) A questionnaire asking you about how you learn and pay attention
(2) Three short tests measuring how you use your attention and think
(3) A short questionnaire assessing your subjective well being
(4) A vocabulary test
(5) A test of visual spatial ability
(6) A simulation game in which you are asked to prevent the spread of brush fires
(7) Finally, you will be requested to provide the researcher of this study with the name
and email address of someone whom you consider to be “close to you” (e.g., a friend,
significant other, roommate). This person will be contacted to fill out a brief email
questionnaire regarding what they perceive to be your tendencies to learn and pay
attention. This will require no more than 10-15 minutes of their time and will
require no additional time of yours. Once it is returned, their name and email
address will be removed from all files. The results of their questionnaire will not
be known to you.
Risks and Benefits to Being in the Study
This research study has no known or foreseeable risks and should be completed within a
year.
There are no direct or secondary benefits associated with your participation in this
research study.
Compensation
You will receive $20 in total for your participation in this study. You will receive $10 in
cash following your participation in the test-taking portion of this study. A check for $10
will be sent to you once the person you have asked to fill out the questionnaire on your
behalf has done so.
Confidentiality
The records of this research will be kept private at Case Western Reserve University. In
any sort of report we might publish, we will not include any information that will make it
possible to identify a participant. Research records will be kept in a locked file, and
access will be limited to the researchers, the University review board responsible for
protecting human participants, and regulatory agencies.
Voluntary Nature of the Study
Your participation is voluntary. If you choose not to participate, it will not affect your
current or future relations with the University or with your standing at the Weatherhead
School of Management. There is no penalty or loss of benefits for not participating or for
discontinuing your participation. If you do withdraw from the study while taking the
tests, you will not receive the $20 compensation. If, however, you complete the tests but
the individual you have assigned to fill out the questionnaire on your behalf does not do
so, then you will still receive half the compensation in the form of $10 cash.
Contacts and Questions
The researchers conducting this study are Darren Good and Professor Richard Boyatzis.
You may ask any questions you have now. If you have any additional questions,
concerns or complaints about the study, you may contact them at:
Darren Good: (216) 368-2189 or by email to [email protected]
Richard Boyatzis: (216) 368-2053 or by email to [email protected].
If the researchers cannot be reached, or if you would like to talk to someone other than
the researcher(s) about: (1) questions, concerns or complaints regarding this study, (2)
research participant rights, (3) research-related injuries, or (4) other human subjects
issues, please contact Case Western Reserve University's Institutional Review Board at
(216) 368-6925 or write: Case Western Reserve University; Institutional Review Board;
10900 Euclid Ave.; Cleveland, OH 44106-7230.
You will be given a copy of this form for your records.
Statement of Consent
I have read the above information. I have received answers to the questions I have asked.
I consent to participate in this research. I am at least 18 years of age.
Print Name of Participant:
Signature of Participant:
Date:
Signature of Parent or Guardian [omit if not applicable]:
Date:
Signature of Person Obtaining Consent [omit if not applicable]:
Date:
________
B. INVITATION LETTER to PARTICIPANTS -- SENT VIA EMAIL
Subject: Invitation to Participate in a Paid Research Study
Dear Student,
You are invited to participate in a research study conducted on the campus of
Case Western Reserve University at the Weatherhead School of Management. This study
is investigating an individual’s cognitive agility. If you choose to join this study you will
be compensated with $20 in cash upon its completion. Participation in this study would
require approximately one hour of your time.
If you are interested please send a reply email stating your interest. You will be
contacted by the researcher with a list of potential times the study is running and the
location.
***You have been contacted as an undergraduate student who has volunteered to
be on an e-mail list for research participation. You have the right to withdraw your name
from this list without any consequence or loss of standing with the University. Your
participation in this study is also completely voluntary. If you choose not to participate,
it will not affect your current or future relations with the University or your standing at
the Weatherhead School of Management. There is no penalty or loss of benefits for not
participating or for discontinuing your participation.***
Thank you for taking the time to read this message.
Darren Good
Principal Researcher
216 368-2189
[email protected]
C. Invitation Sent to External Raters
SUBJECT: A Survey on Behalf of XXXXXX (Participant’s name)
To: XXXX
XXXX (Participant’s Name) is participating in a research study through Case Western
Reserve University to measure Cognitive Agility. He/She has submitted your name and
email address to assist him/her in the completion of this study. You are being requested to
respond to a series of questions about XXXX’s (Participant’s Name) behavior. If you
would like to participate in this survey on behalf of XXXX (Participant’s Name), please
click on the link below which will take you to a web site with instructions on how to
answer the questionnaire. Once at the website you will be prompted to fill in the user
code listed below. The questionnaire would require no more than 15 minutes of your
time. Your participation in this study would be greatly appreciated.
Website URL
Code Number: 1234
***Your participation in this study is completely voluntary. You are in no way
obligated to participate in this study on behalf of XXXX. You will not be receiving any
monetary compensation for your participation. Your participation and responses would
remain completely anonymous and confidential. XXXX will have no knowledge
regarding your decision to participate. You can end your participation in this
questionnaire at any point after you begin without any consequence. If you decline to
participate it will have no effect on your standing or that of XXXX with Case Western
Reserve University.
If you should have any questions or concerns with this process please feel free to contact
the researchers on this study or the Case Western Institutional Review Board - Office of
Research Compliance.
Researchers
Darren Good
[email protected]
216.368.2189
Richard Boyatzis
[email protected]
216.368.2053
Case Western Institutional Review Board
Office of Research Compliance
216.368.6925
D. Thank you Letter Sent to Participants
Dear Research Participant,
Thank you for taking part in this research study on Cognitive Agility. At least one of the
people you selected to fill out a questionnaire on your behalf has done so. Therefore,
enclosed you will find $10 cash to pay you for the second part of the study. At this point
you have completed the study and will not be contacted further.
If you should have any questions or concerns please do not hesitate to contact me.
Thank you,
Darren Good
Researcher
Case Western Reserve University
Dept of Organizational Behavior
[email protected]
REFERENCES
Abelson, R. P. (1981). Psychological status of the script concept. American
Psychologist, 36, 715-729.
Ackerman, P.L. (1988). Determinants of individual differences during skill
acquisition: cognitive abilities and information processing. Journal of
Experimental Psychology: General, 117, 288-318.
Ackerman, P. L. (1992). Predicting individual differences in complex skill
acquisition: Dynamics of ability determinants. Journal of Applied Psychology, 77,
598-614.
Allen, B. A., & Armour-Thomas, E. (1993). Construct of metacognition. Journal
of Psychology, 127, 203-207.
Amabile, T. M. (1996). Creativity in context. Boulder, CO: Westview.
Argyris, C. (1970). Intervention theory and method: A behavioral science view.
Reading, MA: Addison-Wesley.
Arthur, M.B., Inkson, K., Pringle, J.K. (1999). The new careers: Individual action
and economic change. London: Sage.
Ashford, S. & Taylor, S. (1990). Understanding individual adaptation: An
integrative approach. In K. Rowland & J. Ferris, editors, Research in personnel
and human resource management.
Ashton, M. C., Lee, K., Vernon, P. A., & Jang, K. L. (2000). Fluid
intelligence, crystallized intelligence, and the Openness/Intellect factor. Journal of
Research in Personality, 34, 197–207.
Baddeley, A. (1992). Working memory. Science, 255, 556-559.
Baddeley, A. D., & Hitch, G. (1974). Working memory. In G. A. Bower (Ed.),
The
psychology of learning and motivation (Vol. 8, pp. 47– 89). New York:
Academic Press.
Baer, J. (1993). Divergent thinking and creativity: A task-specific approach.
Hillsdale, NJ: Erlbaum.
Bailin, S. (1988). Achieving extraordinary ends: An essay on creativity. Boston:
Kluwer Academic.
Baird, L. & Griffin, D. (2006). Adaptability and responsiveness: the case for
dynamic learning. Organization Dynamics, 35, 372–383.
Baker, L. (1989). Metacognition, comprehension monitoring, and the adult
reader. Educational Psychology Review, 1, 3-38.
Baltes, P. B., & Staudinger, U. M. (Eds.). (1996). Interactive minds: Lifespan perspectives on the social foundation of cognition. New York: Cambridge
University Press.
Barcelo, F., Periáñez, J.A., & Knight, R.T. (2002). Think differently: a brain
orienting response to task novelty. NeuroReport, 13, 1887-1892.
Bargh, J.A. 1994. "The Four Horsemen of Automaticity: Awareness,
Intention, Efficiency, and Control in Social Cognition." in T.K. Srull and R.S.
Wyer, Jr. (Eds.), Handbook of Social Cognition. Vol 1, 1-40. Hillsdale, N J:
Erlbaum.
Barnes, A. E., Nelson, T. O., Dunlosky, J., Mazzoni, G., & Narens, L. (1999).
An integrative system of metamemory components involved in retrieval. In D.
Gopher & A. Koriat (Eds.), Attention and Performance XVII – Cognitive
regulation of performance: Interaction of theory and application (pp. 287-313).
Cambridge, MA: MIT Press
Bar-On, R. (1995). EQ-I: The emotional quotient inventory Manual. A test of
emotional intelligence. New York, NY: Multi-Health Systems
Barrett, G.V. & Depinet, R.L. (1991). A reconsideration of testing for competence
rather than for intelligence. American Psychologist, 46, 1012-1024.
Barrick, M.R., & Mount, M.K. (1991). The big five personality dimensions and
job performance; a meta analysis. Personnel Psychology, 44, 1-16.
Baughman, W.A., & Mumford, M.D. (1995). Process-analytic models of
creative capacities: Operations influencing the combination and reorganization
process. Creativity Research Journal, 8, 37-62.
Baylor, A.L. (2001). A U-shaped model for the development of intuition by level
of expertise. New Ideas in Psychology, 19, 237-244.
Beck, A. T. (1967). Depression: Clinical, Experimental, and Theoretical Aspects.
New York: Hoeber.
Bennis, W. (1966). Changing organizations. New York: McGraw-Hill.
Berg, E.A. (1948). A simple objective technique for measuring flexibility in
thinking, Journal of General Psychology 39:15–22.
Berg, C.A., & Sternberg, R.J. (1985). A triarchic theory of intellectual
development during adulthood. Developmental Review, 6, 334–370.
Berkowitz, L., & Donnerstein, E. (1982). External validity is more than skin
deep. American Psychologist, 37, 245-257
Berlyne, D.E. (1960). Conflict, arousal, and curiosity. New York: McGraw-Hill Book
Company.
Berlyne, D. E. (1949). Interest as a psychological concept. British Journal of
Psychology, 39, 184–185.
Berlyne, D. E. (1950). Novelty and curiosity as determinants of exploratory
behavior. British Journal of Psychology, 41, 50–68.
Berlyne, D. E. (1955) The arousal and satiation of perceptual curiosity in the rat.
Journal of Comparative Physiology and Psychology 48, 238–246
Berlyne, D. E. (1958). The influence of complexity and novelty in visual figures
on orienting responses. Journal of Experimental Psychology, 55, 289–296.
Black, S. J. (1990). The relationship of personal characteristics with the adjustment
of Japanese expatriate managers. Management International Review 30, 119–134.
Blair, I. (2001). Implicit stereotypes and prejudice. In G. B. Moskowitz (Ed.),
Cognitive Social Psychology: The Princeton Symposium on the Legacy and
Future of Social Cognition (pp. 359–374). Mahwah,NJ: Erlbaum
Blickle, G. (1996). Personality traits, learning strategies, and performance.
European Journal of Personality, 10, 337-352.
Bodner, T., & Langer, E. (2001). Individual Differences in Mindfulness:
The Mindfulness/Mindlessness Scale. 13th Annual American Psychological
Society Conference, Toronto, Ontario, Canada Poster presented on June 15, 2001.
Bowonder, B., & Miyake, T. (2000). Technology management: A knowledge ecology
perspective. Int. J. Tech. Man. 19, 662-684.
Boyatzis, R. E. & Kram K. E. (1999). Reconstructing Management Education
as Lifelong Learning. Selections. 16, 17-28.
Brehmer, B. (1992). Dynamic decision making: Human control of complex systems.
Acta Psychologica, 81, 211–241.
Brehmer B. and Dörner D, (1993). Experiments with computer-simulated
microworlds: Escaping both the narrow straits of the laboratory and the deep blue
sea of the field study, Computers in Human Behavior, 9, 171-184
Brehmer, B., Leplat, J., & Rasmussen, J. (1991). Use of simulation in the study
of complex decision making. In J. Rasmussen, B. Brehmer, & J. Leplat (Eds.),
Distributed decision making: Cognitive models for co-operative work (pp. 373-386).
Brehmer, B., & Rigas, G. (1997). Transparency and the Relationship between
Dynamic Decision Making and Intelligence. Manuscript under preparation.
Briscoe, J.P. and Hall, D.T. (1999) Grooming and picking leaders using
competency frameworks: do they work? An alternative approach and new
guidelines for practice. Organizational Dynamics, 28, 37–52.
Broadbent, D. E. (1958). Perception and communication. New York, Pergamon Press.
Brown, D. R. (1953). Stimulus-similarity and anchoring of subjective scales,
American Journal of Psychology, 66, 199-214
Brown, S. J., Lieberman, D. A., Germeny, B. A., Fan, Y. C., Wilson, D. M., & Pasta,
D. J. (1997).Educational video game for juvenile diabetes: Results of a controlled
trial. Medical Informatics, 22,77–89
Brown A, Palinscar AS (1982) Inducing strategic learning from text by means
of informed self-control thinking. Topics in Learning Disabilities 2, 1-17.
Brunstein, J. C. & Olbrich, E. (1985). Personal helplessness and action control:
Analysis of achievement related cognitions, self-assessments, and performance.
Journal of Personality and Social Psychology, 48, 1540–1551.
Butler, R.A. (1957).The effect of deprivation of visual incentives on visual
exploration motivation in monkeys. Journal of Comparative & Physiological
Psychology, 48, 247–249.
Business Week (2009). The top undergraduate business programs.
http://bwnt.businessweek.com/interactive_reports/undergrad_bschool_2009/
Calarco, A., & Gurvis, J. (2006). Adaptability: Responding effectively to
change. Greensboro, NC: Center for Creative Leadership.
Campbell, D T & Fiske, D W. (1959). Convergent and discriminant validation by
the multitrait-multimethod matrix. Psychological Bulletin. 56, 81-105
Cañas, J.J.; Quesada, J.F.; Antolí, A. and Fajardo, I. (2003). Cognitive flexibility
and adaptability to environmental changes in dynamic complex problem solving
tasks. Ergonomics, 46, 482-501.
Carlson, S.M., Moses, L.J. Hix, H.R. (1998). The role of inhibitory processes
in young children's difficulties with deception and false belief. Child
Development, 69, 672-691.
Carpenter, P.A., Just, M.A., & Shell, P. (1990). What one intelligence test measures:
A theoretical account of the processing in the Raven Progressive Matrices Test.
Psychol Review, 97, 404–431
Cascio, W.F. (2003). Managing Human Resources: Productivity, Quality of Work
Life, Profits, (6th ed.), McGraw-Hill, New York, NY, pp.300.
Cattell, R.B. (1943). The measurement of adult intelligence, Psychological Bulletin
40, 153–193.
Cattell, R.B. (1963). Theory of fluid and crystallized intelligence: A critical
experiment, Journal of Educational Psychology 54, 1–22.
Chamorro-Premuzic, T. (2006). Creativity versus Conscientiousness: which is a
better predictor of student performance? Applied Cognitive Psychology, 20, 521-531.
Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1991). Categorization and representation
of physics problems by experts and novices. Cognitive Science, 5, 121-152.
Chi, M.T.H. (1997). Creativity: Shifting across ontological categories flexibly. In
Ward, T.B., Smith, S.M. and Vaid, J. (eds.), Creative thought: An investigation
of conceptual structures and processes. American Psychological Association,
Washington, DC, pp. 209–34.
Christensen, C.M. (1997). The Innovator’s Dilemma: When New Technologies
Cause Great Firms to Fail. Boston: Harvard Business School Press
Church, A. H. (1997). Managerial self-awareness in high-performing individuals in organizations. Journal of Applied Psychology, 82, 281-292.
Clark, H.H. (1996). Using Language. Cambridge: Cambridge University Press.
Clements, DH. (1986). Effects of Logo and CAI environments on cognition
and creativity. Journal of Educational Psychology, 78, 309-318
Cohen, L. J. (1981). Can Human Irrationality Be Experimentally
Demonstrated? Behavioral and Brain Sciences, 4, 317-370.
Collins, Litman, & Spielberger (2004). The measurement of perceptual
curiosity. Personality and Individual Differences, 36, 1127–1141.
Costa, P. T., & McCrae, R. R. (1985). The NEO Personality Inventory manual.
Odessa, FL: Psychological Assessment Resources.
Costa, P.T. & McCrae, R. R. (1992). The revised neo personality inventory
(NEOPI–R) and NEO Five Factor Inventory (NEO–FFI) professional manual.
Odessa, FL: Psychological Assessment Resources.
Cote, J. A., & Buckley, M.R. (1987). Estimating Trait, Method, and
Error Variance: Generalizing Across Seventy Construct Validation Studies.
Journal of Marketing Research, 26, 315-319.
Coull, J. T. (1998). Neural correlates of attention and arousal: Insights
from electrophysiology, functional neuroimaging and psychopharmacology.
Progress in Neurobiology, 55, 343–361.
Conway, Andrew R.A., Nelson Cowan, Michael F. Bunting, David J. Therriault,
and Scott R.B. Minkoff (2002). A latent variable analysis of working memory
capacity, short-term memory capacity, processing speed, and general fluid
intelligence. Intelligence 30, 163-83
Crawford, H. J. (1993). Differential Attentional Processes Inventory: Sustained and dual
attentional styles inventory that discriminates between hypnotic susceptibility level.
Unpublished manuscript
Crocker, J., Fiske, S. T. & Taylor, S. E. (1984). Schematic bases of belief change.
In J. R. Eiser (Ed.), Attitudinal judgment (pp. 197-226). New York, NY: Springer-Verlag.
Cropley, A. J. (2006), In praise of convergent thinking. Creativity Research
Journal, 18. 391-404.
Daneman, M., & Carpenter, P. A. (1980). Individual differences in working memory
and reading. Journal of Verbal Learning and Verbal Behavior, 19, 450-466.
Darwin, C. (1958). The autobiography of Charles Darwin. New York: W. W. Norton.
Davis, G. A. (1992). Creativity is forever (3rd ed.), Dubuque, IA: Kendall/
Hunt Publishing Company.
Davis, G. (1971). Instruments useful in studying creative behavior and creative
talent. Journal of Creative Behavior, 5, 162-165.
Dawes, R.M. (1988). Rational Choice in an Uncertain World. San Diego, CA:
Harcourt, Brace, Jovanovich
Day, H. I. (1971). The measurement of specific curiosity. In H. I. Day, D. E. Berlyne,
& D. E. Hunt (Eds.), Intrinsic motivation (pp. 77- 97). Toronto: Holt, Rinehart and
Winston of Canada.
Day, H. I., Maynes, F., & Spring, L. (1972). Prior knowledge and the desire
for information. Canadian Journal of Behavioural Science, 4, 330-337.
Day, H. I. (1969). An instrument for the measurement of intrinsic motivation. An
interim report to the Department of Manpower and Immigration
DeFries, J.C. Vandenberg, S.G., McClearn, G.E. , Kuse A.R. , & J.R. Wilson
(1974). Near identity of cognitive structure in two ethnic groups. Science, 183,
338–339.
DeGangi, G., & Porges, S. (1990). Neuroscience foundations of human
performance. Rockville, MD: American Occupational Therapy Association.
Delis, D.C., Kaplan, E., & Kramer, J.H. (2001). The Delis-Kaplan Executive Function
System. San Antonio: The Psychological Corporation
Dempster, F.N. (1991). Inhibitory processes: a neglected dimension of
intelligence. Intelligence, 15, 157-173.
Deikman, A.J. (1982). The observing self: Mysticism and psychotherapy.
Boston: Beacon Press.
Derryberry, D., & Rothbart, M.K. (1988). Affect, arousal, and attention as components
of temperament. Journal of Personality and Social Psychology, 55, 958-966.
Dewing, K. & Battye, G. 1971. Attentional deployment and non-verbal fluency.
Journal of Personality and Social Psychology, 17, 214-218.
DeYoung, C. G., Peterson, J. B., & Higgins, D. M (2002). Higher-order factors of
the Big Five predict conformity: Are there neuroses of health? Personality and
Individual Differences. 33, 533-552.
DeYoung, C. G., Peterson, J. B., & Higgins, D. M. (2005). Sources of Openness/
Intellect: Cognitive and Neuropsychological Correlates of the Fifth Factor of
Personality. Journal of Personality 73, 825-858.
Diamantopoulos A. (2006). The error term in formative measurement
models: interpretation and modeling implications. Journal of Modeling in
Management , 7–17.
Diamantopoulos, A.; Winklhofer, H.(2001)., Index Construction with
Formative Indicators: An Alternative to Scale Development. Journal of Marketing
Research, 38, 269-277.
Diehl, E., & Sterman, J.D. (1995)., Effects of feedback complexity on dynamic
decision making. Organizational Behavior and Human Decision Processes,62,
198–215
DiFonzo, N., Hantula, D. A., & Bordia, P. (1998). Microworlds for
experimental research: Having your (control & collection) cake, and realism too.
Behavior Research Methods, Instruments, & Computers, 30, 278-286
Digman, J. M. (1990). Personality structure: Emergence of the five-factor model.
Annual Review of Psychology, 41, 417–440.
Digman, J. M. (1997). Higher-order factors of the Big Five. Journal of Personality
and Social Psychology, 73, 1246–1256.
Dipboye, R., & Flanagan, M. (1979). Are findings in the field more generalizable than
in the laboratory? American Psychologist, 34, 141-150.
Dörner, D., Kreuzig, H. W., Reither, F., & Stäudel, T. (Eds.).(1983). Lohhausen.
Vom Umgang mit Unbestimmtheit und Komplexität. Bern, Switzerland: Hans
Huber.
Dovidio, J. F., Kawakami, K., & Gaertner, S. L. (2002). Implicit and explicit prejudice
in interracial interaction. Journal of Personality and Social Psychology, 82, 62–68
Dykes, M., & McGhie, A. (1976). A comparative study of attentional strategies
of Schizophrenic and highly creative normal subjects. British Journal of Psychiatry,
128, 50–56
Earley, P.C., & Ang, S. (2003).Cultural intelligence: Individual interactions across
cultures. Stanford, CA: Stanford University Press.
Edwards, W. (1966). Nonconservative information processing systems. University
of Michigan, Institute of Science and Technology
Edwards, J. E., & Morrison, R. F. (1994). Selecting and classifying future naval
officers: The paradox of greater specialization in broader areas. In M. G. Rumsey,
C. B. Walker, & Harris, J. H. (Eds.), Personnel selection and classification.
Hillsdale, NJ: Lawrence Erlbaum Associates.
Ekstrom, R.B., French, J.W., & Harman, H.H. (1976). Manual for Kit of Factor-Referenced
Cognitive Tests. Princeton, NJ: Educational Testing Service.
Endsley, M. R. (1995). Measurement of situation awareness in dynamic systems.
Human Factors, 37, 65-84.
Engle, R.W., Tuholski, S.W., Laughlin, J.E., & Conway, A.R.A. (1999).
Working memory, short-term memory, and general fluid intelligence: A
latent-variable approach. Journal of Experimental Psychology: General, 128,
309-331.
Eriksen, C. W., & St. James, J. D. (1986). Visual attention within and around the field
of focal attention: A zoom lens model. Perception & Psychophysics, 40, 225–240.
Eslinger, P.G., & Grattan, L.M. (1993). Frontal lobe and frontal striatal substrates
for different forms of human cognitive flexibility. Neuropsychologia, 31, 17-28.
Epstein R. 2003. Generativity theory as a theory of creativity.
Farhoomand, A. F. and Drury, D. H. (2002). Managerial Information
Overload, Communications of the ACM, 45, 127-131.
Featherman, D.L., Smith, J., & Peterson, J.G. (1990). Successful aging in a “post-retired” society. 50–93.
Feng, J., Spence, I., & Pratt, J. (2007). Playing an Action Video Game Reduces
Gender Differences in Spatial Cognition. Psychological Science, 18(10).
Fernandez-Duque, D, Baird, J. A., & Posner, M. I. (2000). Executive attention
and metacognitive regulation. Consciousness and Cognition, 9, 288–307.
Fiedler, F. E., & Garcia, J. E. (1987). New approaches to effective leadership:
Cognitive resources and organizational performance. New York: Wiley.
Fiol, M.C., & Huff, A.S. (2007). Maps for managers: where are we? Where do we
go from here? Journal of Management Studies, 29, 267-285.
Fiske, S.T., & Taylor, S.E. (1991). Social Cognition (2nd ed.). New York: McGraw-Hill.
Fiske, S. T. (1982): “Schema-Triggered Affect: Applications to Social Perception”. In
M. S. Clarke and S. T. Fiske: Affect and Cognition: The 17Th Annual Carnegie
Symposium on Cognition, pp. 55-78. Hillsdale, N.J.: Erlbaum
Fiske, S.T., and Neuberg, S.L. 1990. A Continuum of Impression Formation,
from category based to individuating processes: Influences of Information and
Motivation on Attention and Interpretation. Advances in Experimental Social
Psychology, 23, 1-74.
Flavell, J. (1979). Metacognition and cognitive monitoring: a new area of
cognitive-developmental inquiry, American Psychologist, 34, 906-911.
Flavell, J. H., and Wellman, H. M. (1977). Metamemory. In R. V. Kail, Jr., and J.
W. Hagen (eds.), Perspectives on the Development of Memory and Cognition.
Lawrence Erlbaum Associates, Publishers, Hillsdale, NJ, pp. 3-33
Florack, A., Scarabis, M., & Bless, H. (2001). When do associations matter?: The use
of implicit associations towards ethnic groups in person judgments. Journal of
Experimental Social Psychology, 37, 518-524.
Frensch, P. A., & Sternberg, R. J. (1989). Expertise and intelligent thinking: When is
it worse to know better? In R. J. Sternberg (Ed.), Advances in the psychology of
human intelligence (pp. 157-188). Hillsdale, N J: Lawrence Erlbaum.
Funke, J. (1991). Solving complex problems: Human identification and control
of complex systems. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem
solving: Principles and mechanisms (pp. 185-222). Hillsdale, NJ: Lawrence
Erlbaum Associates.
Galotti, K. M. (1999). Cognitive Psychology In and Out of the Laboratory (2nd
edition). California: Wadsworth Publishing.
Garavan H., Ross T.J., Murphy K., Roche R.A., Stein E.A. (2002). Dissociable
executive functions in the dynamic control of behavior: inhibition, error detection,
and correction. NeuroImage 17, 1820 –1829.
Garlington, W. K., & Shimona, H. E. (1964). The change seeker index: A measure of
the need for variable sensory input. Psychological Reports, 14, 919-924.
Garner, R. (1994). Metacognition and executive control. In R. B. Ruddell, M. R.
Ruddell, & H. Singer (Eds.). Models and processes of reading (fourth edition)
(pp. 715-732). Newark, DE: International Reading Association.
Garson, D. (2008). Statnotes: Topics in Multivariate Analysis. Retrieved March 20,
2008, from http://www2.chass.ncsu.edu/garson/pa765/statnote.htm.
George, D., & Mallery, P. (2003) SPSS for Windows step by step: a simple guide
and reference 11.0 update, Allyn and Bacon, Boston.
Gibbs, R.W., Jr. (1994). The poetics of mind: Figurative thought, language,
and understanding. New York: Cambridge University Press.
Gilhooly, K. J. and Green, A. K. (1989). Learning problem solving skills. In A.
M. Colley and J. R. Beech, (Eds.), Acquisition and Performance of Cognitive
Skills. Chichester, UK: Wiley.
Glicksohn J., Kapelusshnik J., Bar-Ziv J., & Myslobodsky M. (1993). A new look
at sleep position via head slant. Funct Neurol. 8: 347–350.
Glynn, M.A. (1996). Innovative genius: A framework for relating individual
and organizational intelligences to innovation. Academy of Management Review,
21, 1081-1111.
Goetz, E., Schallert, D., Reynolds, R., & Radin, D. (1983). Reading in perspective:
What real cops and pretend burglars look for in a story. Journal of Educational
Psychology, 75, 500-510
Goldberg, L. R. (1981). Language and individual differences: The search for
universals in personality lexicons. In L. Wheeler (Ed.), Review of personality and
social psychology (Vol. 2, pp. 141-165). Beverly Hills, CA: Sage.
Goldberg, L. R. (1992). The development of markers for the Big-Five factor
structure. Psychological Assessment, 4, 26-42.
Goldberg, L. R. (1990). An alternative "description of personality": The Big-Five
factor structure. Journal of Personality and Social Psychology, 59, 1216-1229.
Moosbrugger, H., & Goldhammer, F. (in preparation). FACT-II: Frankfurt Adaptive
Concentration Test.
Goleman, D. (1998). Working with Emotional Intelligence. New York: Bantam Books.
Gonzalez, C. (2004). Learning to make decisions in dynamic environments: Effects
of time constraints and cognitive abilities. Human Factors, 46, 449–460.
Gonzalez C., Vanyukov, P., & Martin, M.K. (2005). The use of microworlds to
study dynamic decision Making. Computers in Human Behavior, 21, 273-286.
Gottfredson, L. S. (Ed.) (1997). Intelligence and social policy (special
issue). Intelligence, 24, 1-320.
Gough, H. G. (1979). A creative personality scale for the Adjective Check List.
Journal of Personality and Social Psychology, 37, 1398-1405.
Gorlitz, D. (1987). Curiosity, imagination, and play: On the development of
spontaneous cognitive and motivational processes. Hillsdale, NJ: Lawrence Erlbaum
Associates.
Gray, J.R., Chabris, C.F., & Braver, T.S. (2003). Neural mechanisms of general fluid
intelligence. Nature Neuroscience, 6, 316-322.
Green, C., & Bavelier, D. (2003). Action video game modifies visual attention.
Nature, 423, 534-537.
Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: Attitudes,
self-esteem, and stereotypes. Psychological Review, 102, 4–27
Groborz, M. and Necka, E. (2003). Creativity and cognitive control: Explorations
of generation and valuation skills. Creativity Research Journal, 15, 183-197.
Guilford, J.P. (1956). Fundamental statistics in psychology and education. New
York: McGraw Hill.
Guilford, J.P. (1957). A revised structure of intellect. Report of Psychology, 19, 1-19.
Guilford, J.P. (1959). The three faces of intellect. American Psychologist, 14, 469-479.
Guilford, J.P. (1960). Frontiers in thinking that teachers should know about.
The Reading Teacher, 13, 176-182.
Guilford, J.P., Christensen, P.R. , Merrifield P.R., & R.C. Wilson (1978). Alternate
uses: manual of instructions and interpretation, Sheridan Psychological Services,
Orange, CA.
Hacker, D. J. (1998). Definitions and empirical foundations. In D. J. Hacker, J.
Dunlosky, & A. C. Graesser (Eds.), Metacognition in educational theory and
practice (pp. 1- 23). Hillsdale, NJ: Lawrence Erlbaum Associates.
Hair, J.F., Anderson, R.E., Tatham, R.L., & Black, W.C. (1998). Multivariate
Data Analysis, 4th edn.Prentice Hall, New Jersey, United States of America.
Hall, D. (1999). Accelerate executive development – at your peril! Career
Development International, 4, 237 – 239.
Hall, D.T., Mirvis, P.H. (1995). The new career contract: Developing the whole person
at midlife and beyond. Journal of Vocational Behavior, 47, 269–289
Hamilton, D.L. (Ed.). (1981). Cognitive Processes in Stereotyping and Intergroup Behavior. Hillsdale, N J: Erlbaum
Hampton, J.A., (1987). Inheritance of attributes in natural concept conjunctions.
Memory and Cognition. 15, 55–71.
Hart, J. T. (1965). Memory and the feeling-of-knowing experience. Journal
of Educational Psychology, 56, 208-216.
Hartigan, J.A., & Wigdor, A.K. (1989). Fairness in employment testing:
Validity generalization, minority issues and the General Aptitude Test Battery.
Washington, DC: National Academy Press.
Hasher, L., and Zacks, R.T. (1979). Automatic and effortful processes in memory.
Journal of Experimental Psychology: General, 108, 356-388,
Haynie, J. M. (2005). Cognitive adaptability: the role of metacognition and feedback
in entrepreneurial decision policies. Unpublished dissertation, University of
Colorado at Boulder.
Haynie, M.J., Shepherd, D., Mosakowski, E., & Earley, C.P. (2008). A
situated metacognitive model of the entrepreneurial mindset, Journal of Business
Venturing, in press.
Hebb, D.O. (1958). A Textbook of Psychology, W.B. Saunders, Philadelphia
Hedge, J.W., and Borman, W.C., (1995). Changing conceptions and practices
in performance appraisal. In Ann Howard, (Ed.), Frontiers of Industrial and
Organizational Psychology: The Changing Nature of Work, Jossey-Bass.
Helson, H. (1964). Adaptation-Level Theory. New York: Harper & Row.
Herrnstein, R. & Murray, C. (1994). The bell curve: Intelligence and class structure
in American life. New York: Free Press.
Hesketh, B. (1997). Dilemmas in training for transfer and retention. Applied
Psychology: An International Review, 46, 317-319.
Higgins, E. T., & Bargh, J. A. (1987). Social perception and social cognition.
Annual Review of Psychology, 38, 369–425
Hilton, D. J. (1995). The social context of reasoning: Conversational inference
and rational judgment. Psychological Bulletin, 118, 248 –271.
Hollenbeck, G. P., & McCall, M. W. (1999). Leadership development:
Contemporary practices. In A. I. Kraut & A. Korman (Eds.), Evolving practices in
human resource management (pp. 172-200). San Francisco, CA; Jossey-Bass.
Holmes, T.H., & Rahe R.H. 1967. The Social Readjustment Rating Scale. J. Psychom.
Res. 12, 213–18
Holyoak, K. J. (1991). Symbolic connectionism: toward third-generation theories
of expertise. In K. A. Ericsson & M. J. Smith (Eds.) Toward a general theory of
expertise: prospects and limits. Cambridge, NY: Cambridge University Press.
Horn, J. L. (1985). Remodeling old models of intelligence. In B. B. Wolman
(Ed.), Handbook of intelligence: Theories,measurements, and applications (pp.
267–300). New York. Wiley.
Horn, J. L. (1989). Models of intelligence. In R. L. Linn (Ed.), Intelligence:
Measurement, theory,
Horn, J. L. (1998). A basis for research on age differences in cognitive capabilities. In
J. J. McArdle, & R. W. Woodcock (Eds.),Human cognitive abilities in theory and
practice (pp. 57-91). Mahwah, NJ: Erlbaum.
House, R.J., Shane, S.A., & Herold, D.M. (1996). Rumors of the death of
dispositional research are vastly exaggerated. Academy of Management Review,
21, 203–24.
Houston, J. P. & Mednick, S. A. (1963). Creativity and the need for novelty. Journal
of Abnormal and Social Psychology, 66, 137-141.
Hughes, C. (1998). Finding your marbles: Does preschoolers' strategy behavior
predict later understanding of mind? Developmental Psychology, 34, 1326-1339.
Hunter, J. E., & Hunter, R. F. (1984). Validity and utility of alternative predictors of
job performance. Psychological Bulletin, 76, 72-93.
Ilgen, D. R., & Pulakos, E. D. (1999). Employee performance in today’s organizations.
In D. R. Ilgen and E. D. Pulakos (Eds.), The changing nature of work
performance: Implications for staffing, motivation, and development. San
Francisco: Jossey-Bass.
Isen, A.M., Daubman, K.A., & Nowicki, G.P. (1987). Positive affect facilitates
creative problem solving. Journal of Personality and Social Psychology, 52,
1122-1131.
Jackson, D. N. (1967). Personality Research Form Manual. New York:
Research Psychologists Press.
Jackson, P., & Messick, S. (1973). The person, the product, and the response:
Conceptual problems in the assessment of creativity. In M. Bloomberg(Ed.),
Creativity:Theory and research. New Haven, CT: College & University Press.
James, William. (1890/1950). The principles of psychology. New York: Dover
Jacobs, J.E. & Paris, S.G. (1987). Children’s metacognition about reading: Issues
in definition, measurement, and instruction. Educational Psychologist 22, 255–278.
Jacoby, L.L. (1991). A process dissociation framework: Separating automatic
from intentional uses of memory. Journal of Memory & Language, 30, 513-541.
Jarvis, C., MacKenzie, S. and Podsakoff, P., 2003. A Critical Review of
Construct Indicators and Measurement Model Misspecification in Marketing and
Consumer Research. Journal of Consumer Research. 30 (September), 199-218.
Jensen, A.R. (1993). Test validity: g versus "tacit knowledge." Current Directions
in Psychological Science, 1: 9-10.
Johnson-Laird, P. N.(1983). Mental Models. Harvard University Press, Cambridge, MA,.
Jones D.M. & Davies D.R. (1984). Individual and Group Differences in the Response
to Noise in Noise and Society, Jones, D.M.,& Chapman A.J. (Eds), John Wiley &
Sons Ltd, 125-154.
Jost, J.T., Kruglanski, A.W., & Nelson, T.O. (1998). Social metacognition:
An expansionist review. Personality and Social Psychology Review, 2, 137–154.
Kahneman, D. (1973). Attention and effort. Englewood Cliffs: Prentice Hall.
Kahneman, D., & Treisman, A. (1984). Changing views of attention and automaticity.
In R. Parasuraman & D. R. Davies (Eds.), Varieties of attention (pp. 29–61). New
York: Academic Press.
Kalmijn, S., van Boxtel, M. P., Verschuren, M. W., Jolles, J., Launer, L. J.
(2002). Cigarette smoking and alcohol consumption in relation to cognitive
performance in middle age. American Journal of Epidemiology, 156, 936-944
Kane, M.J., Bleckley, M.K., Conway, A.R.A., & Engle, R.W. (2001). A controlled
-attention view of working-memory capacity. Journal of Experimental
Psychology: General, 130, 169-183.
Kasof, J. (1997). Creativity and Breadth of Attention. Creativity Research Journal.
10, 303-315.
Kiesler, S., & Sproull, L. (1982). Managerial response to changing environments
:Perspectives on problem sensing from social cognition. Administrative Science
Quarterly, 27, 548-570
Kimura, D., (1999). Sex and Cognition. MIT Press, Cambridge, Mass.
King, L.A., Walker, L.M., & Broyles, S.J. (1996). Creativity and the five factor
model. Journal of Research in Personality, 30, 189-203.
Kinicki, A., & Latack, J. (1990). Explication of the Construct of Coping
With Involuntary Job Loss. Journal of Vocational Behavior, 36, 339-360.
Kjellberg, A., Landstrom, U., Tesarz, M., Soderberg, L., Akerlund, E. (1996) The
effects of nonphysical noise characteristics, ongoing task and noise sensitivity on
annoyance and distraction due to noise at work. Journal of Environmental
Psychology. 16, 123-136.
Koestler, A. (1964). The act of creation. New York: Macmillan.
Koriat, A. & Goldsmith, M. (1996). Monitoring and control processes in the
strategic regulation of memory accuracy. Psychological Review, 103, 490-517.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning
and development, Upper Saddle River, New Jersey: Prentice Hall.
Kolb, D. A. (1999). Learning Style Inventory, Version 3: Technical
specifications. Boston, MA: Hay Group, Hay Resources Direct. 116 Huntington
Avenue, Boston, MA
Kotter, J. P. (1990). A Force for Change: How Leadership Differs from
Management. New York: Free Press.
Kozlowski, S.W.J., Gully, S.M., Brown, K.G., Salas, E.,Smith, E.M., & Nason,
E.R. (2001). Effects of training goals and goal orientation traits on
multidimensional training outcomes and performance adaptability. Organizational
Behavior & Human Decision Processes, 85, 1-31
Krapp, A. (1999). Interest, motivation and learning: An educational-psychological perspective. European Journal of Psychology of Education, 14, 23-40.
Kuhl, J. (1992). A theory of self-regulation: Action versus state orientation, self
-discrimination, and some applications. Applied Psychology: An International
Review, 41, 95-129.
Kuhl, J. (1994b). Action versus state orientation: Psychometric properties of the
action-control scale (ACS-90). In J. Kuhl & J. Beckmann (Eds.), Volition and
personality: action versus state orientation. Toronto/Göttingen: Hogrefe.
Kuhl, J., & Beckmann, J. (1994). Volition and personality: Action versus
state orientation. Göttingen/Seattle: Hogrefe.
Kuhl, J., & Kazen-Saad, M. (1988). A motivational approach to volition: Activation
and de-activation of memory representations related to unfulfilled intentions. In
V. Hamilton, G. H. Bower, & N. H. Firjda (Eds.), Cognitive perspectives on
emotion and motivation (pp. 63–85). Dordrecht, The Netherlands: Martinus
Nijhoff.
Kuhn, D. (1989). Children and adults as intuitive scientists. Psychological Review,
96, 674-689.
Kyllonen, P. C. (1985). Dimensions of information processing speed (AFHRL-TP-84-56). Brooks Air Force Base, TX: Air Force Systems Command.
Kyllonen, P. C., & Christal, R. E. (1990). Reasoning ability is (little more than)
working memory capacity. Intelligence, 14, 389-433.
Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life.
Cambridge, MA: Harvard University Press.
Kodish, J.L., Gibson, D.V., & Amos, J.W. (1995). The development and operation of
an agile manufacturing consortium: the case of AAMRC, in: Proceedings of the
Fourth Annual Conference on Models, Metrics and Pilots, vol. 2, Atlanta, Georgia.
Kramer, A. F., Humphrey, D. G., Larish, J. F., Logan, G. D., & Strayer, D. L.
(1994). Aging and inhibition: Beyond a unitary view of inhibitory processing in
attention. Psychology & Aging, 9, 491-512
LaBerge, D. (1995). Attentional Processing: The Brain’s Art of Mindfulness.
Cambridge, MA: Harvard University Press
Langer, E. (1978). Rethinking the role of thought in social interaction. In J. Harvey,
W. Ickes, & R. Kidd (Eds.), New directions in attribution research (Vol. 2).
Hillsdale, NJ: Erlbaum.
Langer, E. (1989). Minding matters: The consequences of mindlessness-mindfulness.
In L. Berkowitz (Ed.), Advances in experimental social psychology (pp. 137–
173). San Diego, CA: Academic Press.
Langer, E. (1992). Matters of mind: Mindfulness/Mindlessness in
perspective. Consciousness and Cognition, 1, 289–305.
Langer, E. (1997). The power of mindful learning. Reading, Massachusetts:
Addison-Wesley.
Laurens, K.R., Ngan, E.T., Bates, A.T., Kiehl, K.A., & Liddle, P.F. (2003).
Rostral anterior cingulate cortex dysfunction during error processing in
schizophrenia. Brain, 126: 610–622.
Lehman, H.C., (1953), Age and Achievement, Princeton, NJ: Princeton University Press
Litman, J. A. (2005). Curiosity and the pleasures of learning: wanting and liking
new information. Cognition and Emotion, 19, 793–814.
Lewin K. (1951). Field Theory in Social Science. Harper and Row, New York.
LePine, J.A., Colquitt, J.A., & Erez, A. (2000). Adaptability to changing task
contexts: effects of general cognitive ability, conscientiousness, and openness to
experience. Personnel Psychology, 53, 563-593.
Lerch, F. J., & Harter, D. E. (2001). Cognitive support for real-time dynamic
decision making. Information Systems Research, 12, 63– 82.
Lesch, M. F., & Hancock, P. A. (2004). Driving performance during concurrent
cell-phone use: are drivers aware of their performance decrements? Accident
Analysis and Prevention, 36, 471-480.
Litman,J.A., & Jimerson,T.L.(2004).The measurement of curiosity as a feeling
of deprivation. Journal of Personality Assessment, 82, 147– 157.
London, M. & Mone, E. M. (1999). Continuous learning. In D. R. Ilgen & E. D.
Pulakos (Eds.), The changing nature of performance: Implications for staffing,
motivation, anddevelopment. San Francisco: Jossey-Bass.
Loewenstein, G. (1994). The Psychology of curiosity: A review and
reinterpretation. Psychological Bulletin, 116, 75-98.
Loewenstein, G. (1992) . The Fall and Rise of Psychological Explanations in
the Economics of Intertemporal Choice. In G. F. Loewenstein and J. Elster eds. ,
Choice over Time. New York: Russell Sage Foundation
Lubart, T.I., (1994). Creativity. In Thinking and Problem Solving, R.J. Sternberg
(ed.), New York: Academic Press, pp. 289-332
Luchins, A. S., & Luchins, E. H. (1959). Rigidity in behavior. Eugene, OR: University
of Oregon Press.
Lustig, C., May, C. P. & Hasher, L. (2001). Working memory span and the role
of proactive interference. Journal of Experimental Psychology General, 130, 199–
207.
MacCallum, R. C., & Browne, M. W. (1993). The use of causal indicators in
covariance structure models: Some practical issues. Psychological Bulletin, 114,
533-541.
Mackinnon, D. (1962). The nature and nurture of creative talent. American
Psychologist, 17, 484–495.
MacLeod, C. M. (1991). Half a century of research on the Stroop effect: An
integrative review. Psychological Bulletin, 109, 163-203.
Mandler, G. (1995) Origins and consequences of novelty. In S. Smith, T. Ward, &
R. Finke (Eds.) The Creative Cognition Approach (pp. 9-25). Cambridge, MA:
The MIT Press
March, J. G. (1991). Exploration and exploitation in organizational learning.
Organization Science 2, 71‐87.
Martin, M. M., & Anderson, C. M. (1998). The Cognitive flexibility Scale: Three
validity studies. Communication Reports, 11, 1–9.
Martin, C. L., & Halverson, C. F. (1981). A schematic processing model of sex
typing and stereotyping in children. Child Development, 52, 1119-1134.
Matthews, G., & Deary, I.J. (1998). Personality traits. Cambridge, UK: Cambridge University Press.
Mayer, R. (1998) Cognitive, Metacognitive, and Motivational Aspects of
Problem Solving. Instructional Science, 26, 49–63.
Mazzoni, G., & Nelson, T. O. (Eds.). Metacognition and cognitive
neuropsychology. Mahwah, NJ: Erlbaum.
MacLeod, C. (2005). The Stroop task in clinical research. In A. Wenzel and D. C.
Rubin (Eds.), Cognitive methods and their applications to clinical research (pp.
41-62), Washington, DC: APA
McLennan, J., Omodei, M. M., Holgate, A. M. & Wearing, A. J. (2004).
Effective fireground incident control decision making: information processing
competencies, Proceedings of the 39th Annual Conference of the Australian
Psychological Society, Sydney, pp. 186–190.
McCrae, R.R. (1987). Creativity, divergent thinking and openness to experience.
Journal of Personality and Social Psychology, 52, 1258-1265
McCrae, R. R., & Costa, P. T. (1985). Openness to experience. In R. Hogan &
W. H. Jones, Perspectives in personality (Vol. 1, pp. 145-172). Greenwich, CT:
JAI Press.
McCrae, R. R., & Costa, P. T. (1997). Personality trait structure as a human
universal. American Psychologist, 52, 509-516.
McCrae, R. R., & Costa, P. T. (1990). Personality in adulthood. New York:
Guilford Press.
McDougall, W. (1921). An introduction to social psychology. Boston: Luce.
Mednick, S. A. (1962). The associative basis of the creative process.
Psychological Review, 69, 220-232.
Mendelsohn, G.A. (1976). Associative and attentional processes in creative
performance. Journal of Personality, 44, 341–369.
Mendelsohn, G., & Griswold B. (1964). Differential use of incidental stimuli in
problem solving as a function of creativity. Journal of Abnormal and Social
Psychology, 68, 431-436.
Mendelsohn, G., & Griswold, B. (1966). Assessed creative potential, vocabulary
level, and sex as predictor of the use of incidental cues in verbal problem solving.
Journal of Personality and Social Psychology, 4, 423-431.
Mendelson, H. & Pillai, R. R. (1998). Clockspeed and informational response:
Evidence from information technology industry. Information Systems Research,
9, 415-433.
Metcalfe, J. (1993). Novelty monitoring, metacognition, and control in a
composite holographic associative recall model: Implications for korsakoff
amnesia. Psychological Review, 100, 3-22.
Milliken, F. J. (1990). Perceiving and interpreting environmental change: An
examination of College administrators’ interpretation of changing demographics.
Academy of Management Journal, 33, 42-63.
Milner, B., & Petrides, M. (1984). Behavioural effects of frontal-lobe lesions in
man. Trends in Neurosciences, 7, 403-407.
Mitroff, S. R., Simons, D. J., & Franconeri, S. L. (2002). The siren song of
implicit change detection. Journal of Experimental Psychology: Human
Perception and Performance. 28, 798 - 815.
Monsell, S. (1996). Control of mental processes. In Unsolved Mysteries of the
Mind. Bruce, V., Ed. Erlbaum, Hove, U.K. pp. 93--148.
Montague, M. & Bos, C.S. (1990) Cognitive and metacognitive characteristics of
eighth-grade students' mathematical problem solving. Learning and Individual
Differences, 2, 371–388.
Morgan, G. A., Griego, O. V, & Gloeckner G.W. (2001). SPSS for Windows:
An introduction to the use and interpretation in research. Mahwah, NJ: Erlbaum
Associates.
Moses, L. J., & Baird, J. A. (1999). Metacognition. In R. Wilson (Ed.), Encyclopedia
of cognitive neuroscience. Cambridge, MA: MIT Press.
Mumford, M. D., Baughman, W. A., Maher, M. A., Costanza, D. P., & Supinski, E.
P. (1997). Process based measures of creative problem-solving skills: IV.
category combination. Creativity Research Journal, 10, 59-71.
Mumford, M. D., Zaccaro, S. J., Harding, F. D., Jacobs, T. O., & Fleishman, E.
A. (2000). Leadership skills for a changing world: Solving complex social
problems. Leadership Quarterly, 11, 11-35.
Murphy, G.L. (1988). Comprehending complex concepts. Cognitive Science, 12,
529-562.
Murray, N., Sujan, H., Hirt, E., & Sujan, M. (1990). The influence of mood
on categorization: A cognitive flexibility interpretation. Journal of Personality
and Social Psychology. 59, 411-425.
Naylor, F. D. (1981). A state–trait curiosity inventory. Australian Psychologist, 16,
172–183.
Nęcka, E. (1999). Creativity and attention. Polish Psychological Bulletin, 30, 85-97.
Neisser, U. (1966). Cognitive Psychology. New York: Appleton-Century-Crofts.
Neisser, U., Boodoo, G., Bouchard, T.J., Boykin, A.W., & Brody, N.,
(1996). Intelligence: knowns and unknowns. American Psychologist, 51, 77–101.
Nelson, T. O. (Ed.) (1992), Metacognition: Core Readings. Needham Heights, MA:
Allyn and Bacon.
Nelson, T. O. (1996). Consciousness and metacognition. American Psychologist, 51,
102-116.
Nelson, T. O., & Narens, L. (1990). Metamemory: A theoretical framework and
new findings. In G. Bower (Ed.), The psychology of learning and motivation
(Vol. 26). New York: Academic Press.
Nelson, T. O., & Narens, L. (1994). Why investigate metacognition. In J. Metcalfe &
A. P. Shimamura (Eds.), Metacognition: Knowing about knowing. Cambridge,
MA: MIT Press.
Nelson, T. O., Dunlosky, J., Graf, A., & Narens, L. (1994). Utilization of
metacognitive judgments in the allocation of study during multitrial learning.
Psychological Science, 5, 207-213.
Norman, W. T. (1963). Toward an adequate taxonomy of personality
attributes: Replicated factor structure in peer nomination personality ratings.
Journal of Abnormal and Social Psychology, 66, 574-583.
Neuberg, S.L. (1989). The goal of forming accurate impressions during
social interactions: Attenuating the impact of negative expectancies. Journal of
Personality and Social Psychology, 56, 374-386.
Noe, R. A., & Ford, J. K. (1992). Emerging issues and new directions for
training research. Research in Personnel and Human Resources Management, 10,
345-384.
O’Connell, D., McNeely, E., & Hall, D.T. (2008). Unpacking personal adaptability
at work. Journal of Leadership and Organizational Studies 14, 248–259.
Omodei, M. M., & Wearing, A. J. (1995). The Fire Chief microworld
generating program: An illustration of computer-simulated microworlds as an
experimental paradigm for studying complex decision-making behavior. Behavior
Research Methods, Instruments & Computers, 27, 303-316.
Omodei, M. M., Anastasios, J., Waddell, F., & Wearing, A. J. (1997).
Person characteristics in dynamic decision making. Paper presented at SPUDM15.
Ormrod, J. E. (1995). Human learning. Englewood Cliffs, NJ: Prentice Hall.
Paris, S. G., & Lindauer, B. K. (1982). The development of cognitive skills
during childhood. In B. Wolman (Ed.), Handbook of developmental
psychology (pp. 333-349). Englewood Cliffs, NJ: Prentice-Hall.
Pearson, P. H. (1970). Relationships between global and specified measures
of novelty seeking. Journal of Consulting and Clinical Psychology, 34, 199-204.
Perkins, D.N. (1981) The mind’s Best work, Harvard University Press.
Perkins, D.N. (1990). The nature and nurture of creativity. In: B.F. Jones and L.
Idol, Editors, Dimensions of thinking and cognitive instruction, Lawrence
Erlbaum Associates, Hillsdale, NJ.
Piaget, J. (1952). The origin of intelligence in children. New York:
International Universities Press.
Pintrich, P. R., Marx, R. W., & Boyle, R. A. (1993). Beyond cold conceptual change: The
role of motivational beliefs and classroom contextual factors in the process of
conceptual change. Review of Educational Research, 63, 167-199.
Posner, M.I., & Peterson, S.E. (1990). The attention system of the human brain.
Annual Review of Neuroscience, 13, 25-42.
Posner, M. I. & Snyder, C.R.R. (1975). Attention and cognitive control. In R. L.
Solso (Ed.), Information Processing and Cognition: The Loyola Symposium (pp.
55-85). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Posner, M. I., DiGirolamo, G. J., & Fernandez-Duque, D. (1997). Brain mechanisms
of cognitive skills. Consciousness and Cognition, 6, 267–290.
Posner, M. I. (1980). Orienting of attention. Quarterly Journal of
Experimental Psychology, 32, 3–25.
Posner, M. I., & Rafal, R. D. (1987). Cognitive theories of attention and the
rehabilitation of attention deficits. In R. J. Meier, A. C. Benton, & L. Diller
(Eds.), Neuropsychological rehabilitation. Edinburgh: Churchill Livingstone.
Pulakos, E.D., Arad, S., Donovan, M.A. & Plamondon, K.E. (2000). Adaptability in
the workplace: Development of a taxonomy of adaptive performance. Journal of
Applied Psychology, 85, 612–624.
Putz-Osterloh, W., & Luer, G. (1981). Uber die Vorhersagbarkeit
komplexer Problemloseleistungen durch Ergebnisse in einem Intelligenztest
[On whether results from a test of intelligence can predict problem solving
performance]. Zeitschrift fur Experimentelle und Angewandte Psychologie, 28,
309–33
Quinn, Robert E., Spreitzer, Gretchen M., and Hart, Stuart. (1992). Challenging
The assumptions of bipolarity: Interpenetration and effectiveness. In S. Srivastva
and R. Fry (eds.) Executive Continuity. San Francisco, CA: Jossey-Bass
Rabbitt, P.M.A. (1966). Errors and error correction in choice reaction tasks. Journal
of Experimental Psychology, 71, 264-272.
Rabbitt, P. M. A., & Rodgers, B. (1977). What does a man do after he makes an error?
An analysis of response programming. Quarterly Journal of Experimental
Psychology, 29, 722-743.
Raven, J. C., Court, J. H., & Raven, J. (1977). Manual for Raven's
Progressive Matrices and Vocabulary Scales. London: Lewis
Raven, J., Raven, J.C., & Court, J.H. (2003). Manual for Raven's Progressive
Matrices and Vocabulary Scales. Section 1: General Overview. San Antonio, TX:
Harcourt
Raven, J. C. (1962). Coloured Progressive Matrices: Sets A, Ab, B. London:
H. K. Lewis. (Originally published 1947; revised order 1956.)
Reder, L. M., & Schunn, C. D. (1996). Metacognition does not imply awareness:
Strategy choice is governed by implicit learning and memory. In L. M. Reder
(Ed.), Implicit memory and metacognition (pp. 45–78). Hillsdale, NJ: Erlbaum.
Reder, L. M., & Schunn, C. D. (1999). Bringing together the psychometric and
strategy worlds: Predicting adaptivity in a dynamic task. In D. Gopher & A. Koriat
(Eds.), Attention and performance XVII: Cognitive regulation of performance:
Interaction of theory and application (pp. 315–342). Cambridge, MA: MIT Press.
Ree, M. J., & Earles, J. A. (1993). g is to psychology what carbon is to chemistry: A
reply to Sternberg and Wagner, McClelland, and Calfee. Current Directions in
Psychological Science, 1, 11-12.
Reeves, A., & Sperling, G. (1986). Attention gating in short-term visual
memory. Psychological Review, 93, 180-206.
Rende, B. (2000). Cognitive flexibility: Theory, assessment, and treatment. Seminars
in Speech and Language, 21: 121–133.
Ridderinkhof, K. R., Span, M. M., & van der Molen, M. W. (2002). Perseverative
behavior and adaptive control in older adults: Performance monitoring, rule
induction, and set shifting. Brain and Cognition, 49, 382-401.
Rigas, G., & Brehmer, B. (1999). Mental processes in intelligence tests and
dynamic decision making tasks. London: Erlbaum.
Rigas, G., Carling, E., & Brehmer, B. (2002). Reliability and validity of
performance measures in microworlds. Intelligence, 30, 463–480.
Robertson, I. H., Manly, T., Andrade, J., Baddeley, B. T., & Yiend, J. (1997).
Oops!: Performance correlates of everyday attentional failures in traumatic brain
injured and normal subjects. Neuropsychologia, 35, 747–758
Rokeach, M. (1960). The open and closed mind. New York: John Wiley.
Rostan, S. M. (1994). Problem finding, problem solving, and cognitive controls:
An empirical investigation of critically acclaimed productivity. Creativity Research
Journal, 7, 97-110.
Rozin, P. (1976). The evolution of intelligence and access to the cognitive
unconscious. Progress in Psychobiology and Physiological Psychology, 6, 245–280.
Rubinstein, J. S., Meyer, D. E., & Evans, J. E. (2001). Executive control of
cognitive processes in task switching. Journal of Experimental Psychology: Human
Perception and Performance, 27, 763-797.
Runco, M. A. (1991). Divergent thinking. Norwood, NJ: Ablex.
Runco, M. A. (1994). Creativity and its discontents (pp. 102–123).
Runkel, P. J., & McGrath, J. E. (1972). Research on human behavior: A
systematic guide to method. New York: Holt, Rinehart & Winston.
Salthouse, T. A. (2009). When does age-related cognitive decline begin?
Neurobiology of Aging, 30, 507-514.
Salthouse, T. A., & Meinz, E. J. (1995). Aging, inhibition, working memory, and
speed. Journal of Gerontology: Psychological Sciences, 50, 297-306.
Salthouse, T. A., & Somberg, B. L. (1982). Isolating the age deficit in
speeded performance. Journal of Gerontology, 37, 59–63
Sanchez, J. I., & Levine, E. L. (2001). Analysis of work in the 20th and 21st centuries. In
N. Anderson, D. S. Ones, H. K. Sinangil, & C. Viswesvaran (Eds.), Handbook of
Industrial, Work and Organizational Psychology (Vol. 1, pp. 71-89). Thousand
Oaks, CA: Sage Publications.
Saucier, G. (1992). Openness versus intellect: Much ado about nothing?
European Journal of Personality, 6, 381–386.
Saucier, G. (2003). Factor structure of English-language personality type-nouns.
Journal of Personality and Social Psychology, 85, 695–70
Savickas, M. L. (1997). Adaptability: An integrative construct for life-span, life-space theory. Career Development Quarterly, 45, 247-259.
Schacter, D. L. (1996). Searching for Memory: The Brain, the Mind, and the Past.
New York: BasicBooks.
Schmidt, A. M., & Ford, J. K. (2003). Learning within a learner control
training environment: The interactive effects of goal orientation and metacognitive
instruction on learning outcomes. Personnel Psychology,
56, 405-429.
Schmidt, F. L., & Hunter, J. E. (1981). Employment testing: Old theories and new
research findings. American Psychologist, 36, 1128-1137.
Schmidt, F.L., & Hunter J.E. (1989). Interrater reliability coefficients cannot be
computed when only one stimulus is rated. Journal of Applied Psychology, 74,
368–370.
Schneider, W., & Shiffrin, R. M. (1977). Controlled and automatic human
information processing: I. Detection, search, and attention. Psychological Review,
84, 1–55.
Schneider, D. J., & Blankmeyer, B. L. (1983). Prototype salience and implicit
personality theories. Journal of Personality and Social Psychology, 44, 712-722.
Schneider, W., Dumais, S. T., & Shiffrin, R. M. (1984). Automatic and controlled
processing and attention. In R. Parasuraman & D. R. Davies (Eds.), Varieties of
attention. Orlando, FL: Academic Press.
Schoppek, W. (1991). Spiel und Wirklichkeit: Reliabilität und Validität von
Verhaltensmustern in komplexen Situationen [Game and reality: Reliability and
validity of behavior patterns in complex situations]. Sprache & Kognition, 10, 15–27.
Schroder, H. M., Driver, M. J., & Streufert, S. (1967). Human information
processing (pp. 54-61). New York: Holt, Rinehart, and Winston.
Schunn, C. D., & Reder, L. M. (1998). Individual differences in strategy adaptivity. In
D. L. Medin (Ed.), The psychology of learning and motivation.
Schwarz, N. (1996). Cognition and communication: Judgmental biases, research
methods and the logic of conversation. Hillsdale, NJ: Erlbaum.
Scratchley, L. S., & Hakstian, A. R. (2000). The measurement and prediction
of managerial creativity. Creativity Research Journal, 13, 367–384
Senge, P. M. (1990). The fifth discipline: The art and practice of the
learning organization. New York: Doubleday/Currency.
Shafer, E.W. P., & Marcus, M. M. (1973). Self-stimulation alters human sensory
brain responses. Science, 181, 175-177.
Shapiro, K. L., & Raymond, J. E. (1994). Temporal allocation of visual
attention: Inhibition or interference. In D. Dagenbach & T. Carr (Eds.), Inhibitory
processes in attention, memory and language. New York, NY: Academic Press.
Shallice, T. (1988). From neuropsychology to mental structure. Cambridge
University Press, Cambridge.
Shiffrin, R. M., & Schneider, W. (1977). Controlled and automatic human
information processing: II. Perceptual learning, automatic attending and a general
theory. Psychological Review, 84, 127–190.
Showers, C. & Cantor, N. (1985). Social cognition: A look at motivated
strategies. Annual Review of Psychology, 36, 275-305.
Shiffrin, R. M. (1988). Attention. In R. C. Atkinson & R. J. Herrnstein (Eds.),
Stevens’ Handbook of Experimental Psychology. New York: John Wiley &
Sons.
Shilling, V. M., Chetwynd, A., & Rabbitt, P. M. A. (2002). Individual
inconsistency across measures of inhibition: An investigation of the construct
validity of inhibition in older adults. Neuropsychologia, 40, 605–619
Scott, W. A. (1962). Cognitive complexity and cognitive flexibility. Sociometry, 25,
405-414.
Shaw, M., & Shaw, P. (1977). Optimal allocation of cognitive resources to spatial
location. Journal of Experimental Psychology: Human Perception and
Performance, 3, 201-211
Simons, D.J., & Chabris, C.F. (1999). Gorillas in our midst: Sustained
inattentional blindness for dynamic events. Perception, 28, 1059-1074
Silverman, K. (1991). Edgar A. Poe: Mournful and never-ending remembrance. New
York: HarperCollins.
Sitkin, S.B. & Pablo, A.L. (1992). Reconceptualizing the determinants of risk
behavior. Academy of Management Review, 17, 9-38.
Smith, E. M., Ford, J. K., & Kozlowski, S. W. J. (1997). Building adaptive
expertise: Implications for training design. In M. A. Quinones & A. Ehrenstein
(Eds.), Training for a rapidly changing workplace (pp. 89–118). Washington, DC:
American Psychological Association.
Son, L. K., & Schwartz, B. L. (2002). The relation between metacognitive
monitoring and control. In T. J. Perfect & B. L. Schwartz (Eds.), Applied
metacognition (pp. 15–38). Cambridge, United Kingdom: Cambridge University
Press.
Spearman, C. (1923). The nature of intelligence and the principles of cognition.
New York: Macmillan
Spearman, C. (1927). The abilities of man. New York: Macmillan.
Spielberger, C. D., & Butler, T. F. (1971). On the relationship between anxiety,
curiosity, and arousal: a working paper. Unpublished manuscript. Florida State
University.
Spiro, R.J. & Jehng, J.C. (1990). Cognitive Flexibility and Hypertext: Theory
and Technology for the Nonlinear and Multi-dimensional Traversal of Complex
Subject Matter. Chapter 7 in D. Nix & R.J. Spiro (Eds.) Cognition, Education and
Multi-Media: Exploring Ideas in High Technology. Hillsdale, New Jersey:
Lawrence Erlbaum Associates.
Stavridou, A., & Furnham, A. (1996). The relationship between psychoticism, trait
creativity and the attentional mechanism of cognitive inhibition. Personality and
Individual Differences, 21, 143–153.
Stein, J. F. (1992). The representation of egocentric space in the
posterior parietal cortex. Behavioral and Brain Sciences, 15, 691–700.
Sterman, J. (2000). Business Dynamics: Systems Thinking and Modeling for a
Complex World. New York: Irwin/McGraw-Hill.
Sternberg, R. J. (1985). Beyond IQ: A triarchic theory of human intelligence. New
York: Cambridge University Press.
Sternberg, R., & Lubart, T. (1996). Investing in creativity. American Psychologist,
51, 677–688.
Sternberg, R.J., Lubart, T.I. (1991). An investment theory of creativity and
its development. Human Development, 34, 1–32.
Stäudel, T. (1987). Problemlösen, Emotionen und Kompetenz [Problem solving,
emotions, and competence]. Regensburg, Germany: Roderer.
Stroop, J. R. (1935). Studies of interference in serial verbal reactions. Journal
of Experimental Psychology, 18, 643-662.
Strohschneider, S. (1986). Zur Stabilität und Validität von Handeln in
komplexen Realitätsbereichen [On the stability and validity of action in complex
domains of reality]. Sprache & Kognition, 5, 42-48.
Strohschneider, S. (1991). Problemlösen und Intelligenz: Über die Effekte
der Konkretisierung komplexer Probleme [Problem solving and intelligence: On the
effects of concretizing complex problems]. Diagnostica, 37, 353-371.
Sturm, W., & Zimmermann, P. (2000). Aufmerksamkeitsstörungen [Attentional
deficits]. In W. Sturm, M. Herrmann, & C.-W. Wallesch (Eds.), Lehrbuch der
klinischen Neuropsychologie [Textbook of clinical neuropsychology] (pp. 345–
365). Lisse, Netherlands: Swets & Zeitlinger.
Swann, W. B., Jr., & Read, S. J. (1981). Self-verification processes: How we sustain
our self-conceptions. Journal of Experimental Social Psychology, 17, 351-372
Terlecki, M.S., & Newcombe, N.S. (2005). How important is the digital divide?
The relation of computer and videogame usage to gender
differences in mental rotation ability. Sex Roles, 53, 433–441.
Taylor, S. E., Crocker, J., & D'Agostino, J. (1978). Schematic bases of social
problem solving. Personality and Social Psychology Bulletin, 4, 447-451.
Tetlock, P. E. (1992). The impact of accountability on judgment and choice: Toward
a social contingency model. Advances in Experimental Social Psychology, 25,
331-376.
Thach, L. & Woodman, R.W. (1994). Organizational change and information
technology: Managing on the edge of cyberspace. Organizational Dynamics,
Summer, 31-46.
Thagard, P. (1984). Conceptual combination and scientific discovery. In P. Asquith
& P. Kitcher (Eds.), Proceedings of the Biennial Meeting of the Philosophy of
Science Association (Vol. 1, pp. 3–12). East Lansing, MI: Philosophy of Science
Association.
Theeuwes, J. (1994). Endogenous and exogenous control of visual selection.
Perception, 23, 429-440.
Thoresen, C. J., Bradley, J. C., Bliese, P. D., & Thoresen, J. D. (2004). The Big
Five personality traits and individual performance growth trajectories in
maintenance and transitional job stages. Journal of Applied Psychology, 89,
835–853.
Thompson, A. L., & Klatzky, R. L. (1978). Studies of visual synthesis: Integration
of fragments into forms. Journal of Experimental Psychology: Human Perception
and Performance, 4, 244–263
Tobias, S. (1995). Interest and metacognitive word knowledge. Journal of
Educational Psychology, 87, 399-405.
Toplyn, G., & Maguire, W. (1991). The differential effect of noise on creative
task performance. Creativity Research Journal, 4, 337-347
Torrance, E.P. (1972). Can we teach children to think creatively? Journal of
Creative Behavior, 6, 236-262
Torrance, E. P. (1974). Torrance Tests of Creative Thinking: Norms and
technical manual. Bensenville, IL: Scholastic Testing Service.
Tsal, Y. (1983). Movement of attention across the visual field. Journal of
Experimental Psychology: Human Perception and Performance, 9, 523– 530.
Tushman, M. L., & O'Reilly, C. A. (1996). The ambidextrous
organization: Managing evolutionary and revolutionary change. California
Management Review, 38, 1-23.
Turner, M., & Engle, R. W. (1989). Is working memory capacity task dependent?
Journal of Memory and Language, 28, 127-154.
Urban, K. K., & Jellen, H. G. (1996). Test for Creative Thinking - Drawing
Production (TCT-DP). Lisse, Netherlands: Swets & Zeitlinger.
Valot, C. (2002). An ecological approach to metacognitive regulation in the adult. In
P. Chambres, M. Izaute, & P. J. Marescaux (Eds.), Metacognition: Process, function
and use. Dordrecht: Kluwer Academic Publishers.
Van Zile-Tamsen, C. M. (1994). The role of motivation in metacognitive self-regulation. Unpublished manuscript, State University of New York at Buffalo.
Van Zile-Tamsen, C. M. (1996). Metacognitive self-regulation and the daily
academic activities of college students. Unpublished doctoral dissertation, State
University of New York at Buffalo.
Vernon, D., & Usher, M. (2003). Dynamics of metacognitive judgments: Pre-
and postretrieval mechanisms. Journal of Experimental Psychology: Learning,
Memory, and Cognition, 29, 339–346.
Van Zomeren, A. H., & Brouwer, W. H. (1994). Clinical neuropsychology of
attention. New York: Oxford University Press.
Voss, H. G., & Keller, H. (1983). Curiosity and exploration: theories and results.
New York, NY: Academic Press.
Voyer, D., Voyer, S., & Bryden, M.P. (1995). Magnitude of sex differences in
spatial abilities: A meta-analysis and consideration of critical variables.
Psychological Bulletin, 117, 250–270.
Wallach, M. A. (1970). Creativity. In P. Mussen (Ed.), Carmichael's handbook of
child psychology (pp. 1211–1272). New York: Wiley.
Wallach, M. A., & Kogan, N. (1965). Modes of thinking in young children. New York:
Holt, Rinehart & Winston.
Wallas, G. (1926). The art of thought. New York: Harcourt Brace.
Ward, T. B. (2001). Creative cognition, conceptual combination and the creative writing
of Stephen R. Donaldson. American Psychologist, 56, 350–354.
Ward, T. B. (1994). Structured imagination: The role of conceptual structure in
exemplar generation. Cognitive Psychology, 27, 1–40.
Weick, K. E. (1995). Sensemaking in organizations. Thousand Oaks, CA: Sage.
Weinstein, C., & Mayer, R. (1986). The teaching of learning strategies. In M. C.
Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 315-327). New
York: Macmillan.
West, R., & Bell, M. A. (1997). Stroop color-word interference and
electroencephalogram activation: Evidence for age-related decline of the anterior
attention system. Neuropsychology, 11, 421-427.
Whimbey, A., & Lochhead, J. (1999). Problem solving & comprehension (6th ed.).
New Jersey: Lawrence Erlbaum Associates.
Wickens, C. D. (2000). The trade-off of design for routine and unexpected
performance: Implications of situation awareness. In M. R. Endsley & D. J.
Garland (Eds.), Situation awareness analysis and measurement (pp. 211-225).
Mahwah, NJ: Lawrence Erlbaum.
Wilson, P. T., & Anderson, R. C. (1986). What they don't know will hurt them: The role
of prior knowledge in comprehension. In J. Orasanu (Ed.), Reading
comprehension: From research to practice (pp. 31-48). Hillsdale, NJ: Lawrence
Erlbaum.
Wilson, T. D., Lindsey, S., & Schooler, T. Y. (2000). A model of dual
attitudes. Psychological Review, 107, 101-126.
Wright, J. C., & Mischel, W. (1987). A conditional approach to dispositional constructs:
The local predictability of social behavior. Journal of Personality and Social
Psychology, 53, 1159–1177.
Wyer, R. S., & Srull, T. K. (1989). Memory and cognition in its social context.
Hillsdale, NJ: Lawrence Erlbaum.
Yantis, S. (1993). Stimulus-driven attentional capture. Current Directions
in Psychological Science, 2, 156-161.
Zaccaro, S. J., Foti, R. J., & Kenny, D. A. (1991). Self-monitoring and trait-based
variance in leadership: An investigation of leader flexibility across
multiple group situations. Journal of Applied Psychology, 76, 308–315.
Zimmermann, P., & Fimm, B. (2000). Test for attentional performance
(TAP). Herzogenrath, Germany: PSYTEST.