
International Journal of Virtual and Personal Learning Environments, 5(2), 69-86, April-June 2014 69
Fifth Graders’ Flow Experience
in a Digital Game-Based Science
Learning Environment
Meixun Zheng, The Friday Institute for Educational Innovation, North Carolina State
University, Raleigh, NC, USA
Hiller A. Spires, Friday Institute for Educational Innovation, North Carolina State University,
Raleigh, NC, USA
ABSTRACT
This mixed methods study examined 73 5th graders’ flow experience in a game-based science learning environment using two gameplay approaches (solo and collaborative gameplay). Both survey and focus group
interview findings revealed that students had high flow experience; however, there were no flow experience differences contingent upon gameplay approach. Results identified four game design features
and student personal factors (reading proficiency) that significantly impacted student game flow experience.
Students made significant science content learning gains as a result of gameplay, but game flow experience
did not predict learning gains. The study demonstrated that the game was effective in supporting students’
flow experience and science content learning. The findings indicated that the adapted game flow experience
survey provided a satisfactory measure of students’ game flow experience. The results also have implications
for educational game design, as game design features that significantly contributed to students’ flow experience were identified.
Keywords: Collaborative Gameplay, Flow Experience, Game-Based Learning, K12 Education, Science Education
INTRODUCTION
In recent years the popularity of game-based
learning has prompted increased attention to
students’ subjective gameplay experience. Re-
search in this area is important because students’
experience of playfulness/enjoyment during
gameplay has been found to motivate them to
engage in learning. Recently, researchers have
begun to examine students’ gameplay experience using Csikszentmihalyi’s (1975) flow theory. Flow describes an optimal experience of deep engagement that a person has when she/he is completely absorbed in an activity. Game flow experience studies focus on the context of gameplay and examine how game design features impact student flow experience. This type of research is especially valuable considering that educational game design and evaluation is still in its infancy. A full body of empirical literature has yet to be developed in terms of how to devise games that are both engaging and effective for student learning (Zheng, Spires, & Meluso, 2011).

DOI: 10.4018/ijvple.2014040106
Copyright © 2014, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
The phenomenon of game flow experience is not yet fully understood. Therefore, the primary purpose of this study was to investigate 5th graders’ flow experience in a 3D game-based science learning environment called Crystal Island, which was funded by NSF. A second purpose of
this study was to examine the impact of
two different gameplay approaches (solo
and face-to-face collaborative gameplay)
on students’ flow experience. Employing a
mixed methods research design (Creswell &
Plano Clark, 2006), the study was guided by
three research questions:
1. To what extent did 5th graders in a suburban elementary school in the southern US
experience game flow while playing the
Crystal Island game?
a. What were the differences in students’
game flow experience based on
two different gameplay approaches
(solo and face-to-face collaborative
gameplay)?
b. What were the differences in students’
game flow experience based on
individual differences (i.e., reading
proficiency)?
2. How did game design features impact
students’ game flow experience?
3. What was the relationship between students’ game flow experience and their
science content learning gains?
LITERATURE REVIEW AND
THEORETICAL FRAMEWORK
Research on Game
Flow Experience
Overview of Flow Theory
Flow is an emotion of complete consciousness and engagement that is experienced by
individuals who are deeply involved in doing
an enjoyable activity. People who experience
flow are often considered to be “in the zone”
(Csikszentmihalyi, 1975, 1991). Csikszentmihalyi (1991) defined flow experience as having
eight dimensions: (1) clear goals, (2) immediate and unambiguous feedback, (3) balance between the challenges of an activity and the skills required to meet those challenges, (4) merging of action and awareness, (5) concentration, (6) sense of control, (7) loss of self-consciousness, and (8) a distorted sense of time. Jackson and
Marsh (1996) proposed that autotelic experience (a state of the mind when the individual
performs an activity out of the great enjoyment
derived from performing the task itself instead
of doing it for external rewards) be added as
the 9th dimension.
In studies of flow experience during
human-computer interaction, these dimensions are often categorized into three stages,
including flow antecedents, flow experience,
and flow consequences (Ghani & Deshpande,
1994; Shin, 2006; Skadberg & Kimmel, 2004;
Webster, Trevino, & Ryan, 1993). Kiili (2005)
proposed a game flow experience scale to examine the three stages of flow experience with
college students who played a computer science
game (see Table 1). This survey was validated
in a later study with a different group of college
students (Kiili & Lainema, 2008). The present study was based on Kiili and Lainema’s (2008) game flow model.
For flow antecedents, a game with good
playability should have an easy user interface
to allow the player to find necessary functionalities. There should not be too many distracting factors that add to the players’ cognitive load. Gamefulness refers to how many opportunities there are for players to use in-game rewards, or to what extent the players can play in different ways. The game frame story is a technique used to connect background stories in the game.

Table 1. Kiili’s game flow experience model (Kiili, 2005; Kiili & Lainema, 2008)

Game-Flow Antecedents: Balance of challenge/skill; Immediate feedback; Clear goals; Playability; Gamefulness; Frame story
Game-Flow Experience: Concentration; Time distortion; Sense of control; Autotelic experience; Loss of self-consciousness
Game-Flow Consequences: Content learning gains; Exploratory behaviors
Flow Experience in the
Game Environment
Research on game players’ flow experience
is limited but promising. Existing game flow
studies have identified factors that contribute
to game players’ flow experience, including
the perceived balance of challenge and skills,
clear goals, useful feedback, good playability,
and gamefulness (Hsu & Lu, 2004; Inal &
Cagitay, 2007; Kiili, 2005; Kiili & Lainema,
2008; Zheng, Spires, & Meluso, 2011).
Unfortunately, the existing game flow experience studies have reported mixed results
when it comes to the effects of flow experience
on players’ learning gains. In a study of college
student game flow experience, Kiili (2005)
found that students experienced high flow and
that game flow experience was highly related
to their content learning outcomes. Other studies, however, have reported negative findings
with regard to the relationship between learning
outcomes and game flow experience. Kiili and
Lainema (2008) examined college students’
flow experience while playing a computer science game, and found only a loose connection
between students’ game flow experience and
learning achievement. Lee and Kwon (2005)
also studied how flow experience was connected to success in a simulation game with
100 university students. Similarly, the survey
data indicated that game flow experience was
not a significant learning predictor.
In addition to the reported mixed findings, these existing studies have been conducted largely with high school or college students. Based on a review of the literature, only two game flow experience studies conducted with elementary school students were identified. For
instance, in their game flow experience study
with 3rd graders using a qualitative interview
approach, Inal and Cagiltay (2007) found that a
balance of challenge and skill, useful feedback,
and a less cognitively demanding user interface
were factors that led to students’ game flow
experience. It was also found that the challenge
provided by the game had greater impact on
boys’ flow experience, while the game stories
had greater impact on girls’ flow experience.
Zheng and her colleagues (2011) examined
gameplay experience differences between
5th graders in the U.S. and China. The study
revealed that students’ English proficiency
significantly impacted their flow experience
during gameplay.
Research on the Effectiveness
of Game-Based Learning
Studies have demonstrated the positive impact
of educational gameplay on student content
learning (e.g., Brom, Preuss, & Klement,
2011; Gillispie, Martin, & Parker, 2009; Miller,
Chang, Wang, Beier, & Klisch, 2011; Lester,
Spires, Nietfeld, Minogue, Mott, & Lobene,
2013). Brom and his colleagues (2011) reported
that high school students who learned science
through gameplay outperformed students in the
control group in terms of knowledge acquisition and retention. Meluso and her colleagues
(2012) also reported that 5th graders who played
a 3D science learning game made significant
content learning gains. Some other studies,
however, have reported negative results in
terms of the learning effectiveness of gameplay
(e.g., Annetta, Minogue, Holmes, & Cheng,
2009; Wrzesien & Raya, 2010). For instance,
in a study about game-based science learning
with 8th graders (Spires, Turner, Rowe, Mott, &
Lester, 2010), the authors found that significant
learning differences between the two groups
were not evident.
There are also studies providing evidence
that collaborative gameplay positively impacts
students’ content area learning. Foko and Amory
(2008) found that middle school students overcame more misconceptions about the science
content when they played the game in pairs.
Howard, Morgan and Ellis (2006) also reported
that students highly valued the usefulness of
peer discussion during gameplay. Some other
researchers, however, are more conservative
in making their conclusions (e.g., González González & Izquierdo, 2012; Shih, Shih, Shih,
Su, & Chuang, 2010; Meluso et al., 2012). In particular, Shih et al. (2010) argued that the effectiveness of collaboration is highly dependent on the specific collaborative gameplay strategies used.
Summary of Literature Review
Several themes emerged from the literature
review. First, there is no consensus regarding
the learning effects of educational gameplay and
the effects of peer collaboration during game-based learning. Second, the few existing game flow studies have been conducted almost exclusively with older players (high school students and above). Third, findings also differ across studies, especially in terms of the extent to which
flow experience impacts students’ learning
outcomes. These identified gaps in the literature
have guided the current study.
RESEARCH METHOD
This research study employed an embedded
mixed methods design. Quantitative and qualitative data were collected simultaneously (see
Appendix A for the design diagram).
Participants and Research Context
Participants were 75 5th graders (males = 40, females = 35) at a public school in the southeastern
United States. The students’ ethnic categories
were: White/Caucasian (n = 49), Black/African
American (n = 7), Hispanic/Latino (n = 7),
Asian American (n = 3), and American Indian
(n = 1). The rest of the students chose “other”
as their demographic category (n = 8). Students
were randomly assigned to either the solo or the
face-to-face collaborative gameplay approach.
Only students who completed all three sessions
of gameplay and completed all items on the pre- and post-measures (see below for details) were
included in data analysis. This resulted in a final
sample of 73 students (males=38, females=35;
solo players=37, collaborative players=36).
Students played a 3D science learning game
called Crystal Island (See Figure 1). The goal
of the Crystal Island game was for students
to complete a series of tasks which focused on
landforms, map navigation, and modeling in
5th grade science. In order to complete these
tasks, students chose an avatar to interact with
the virtual environment and in-game characters.
The tasks required students to place landform
signs in the correct locations (e.g., waterfall,
delta, plateau, and volcano), take photos of
different landforms, use the map coordinates/
scales to navigate to designated locations to
collect flags, match pictures with 3D models
of the Crystal Island, and rearrange dislocated objects to recreate the Crystal Island
environment. To help students with the tasks,
the game provided a virtual tablet with a series
Figure 1. Image taken of student avatar interacting with the game characters in the Crystal
Island game
of applications preloaded with multimedia
resources that students could access anytime
during gameplay. Students received instant
feedback from the in-game characters each
time they completed a task.
Quantitative and Qualitative
Data Sources
Pre-Measures
The pre measures included a science content test and a reading proficiency measure. The science content test was developed by the Crystal Island
research team in collaboration with five content
experts (fifth grade science teachers from a
local elementary school). The content test was
developed to align with the curriculum of the
game based on the Full Option Science System
(FOSS), a research-based K12 science curriculum, and fifth grade science End of Grade
(EOG) tests. The test was piloted with another group of fifth graders and revisions were made accordingly before administering it to students in the current study. The final test consisted of
27 multiple-choice questions that assessed both
low-level recognition and higher-level application/transfer knowledge (see Appendix B). The
test-retest reliability was .83.
Students’ End-of-Grade (EOG) reading test scores, measured on a 4-point scale (Level I, II, III, and IV), were used as a measure of
their reading proficiency. Students’ reading
proficiency was examined because Crystal
Island is a narrative-centered learning game
that requires students to read and comprehend
the gameplay instructions and resources. The
EOG reading test is a reliable and valid instrument developed by the Department of Public
Instruction to assess students’ reading achievements against state standards. EOG reading test
raw scores ranging from 0 to 50 points were
converted into scale scores ranging from 321
to 374, which were then converted into the
4-point scale scores used in the current study.
The official cut-off scores were: Level 1 (equal to or less than 340), Level 2 (between 341 and 348), Level 3 (between 349 and 360), and Level 4 (equal to or higher than 361). For information on the EOG reading tests, see http://www.ncpublicschools.org/accountability/testing/eog/reading/.
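For illustration, the cut-off rule described above can be sketched as a small hypothetical helper (the conversion from raw to scale scores is omitted):

```python
# Hypothetical helper mapping an EOG reading scale score (321-374) to the
# 4-point achievement level, using the cut-off scores reported in the text.
def eog_level(scale_score):
    if scale_score <= 340:
        return 1
    elif scale_score <= 348:
        return 2
    elif scale_score <= 360:
        return 3
    else:
        return 4

print([eog_level(s) for s in (340, 341, 348, 349, 360, 361)])  # → [1, 2, 2, 3, 3, 4]
```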
Post-Measures

The post measures consisted of a science content test (identical to the pre-test) and game flow experience. Game flow experience was measured using an adapted flow experience survey and semi-structured focus group interviews.

Adapted Game Flow Survey

As discussed earlier, students’ game flow experience was measured with a game flow questionnaire adapted from an existing flow scale developed and validated by Kiili and Lainema (2008). The original questionnaire consists of two parts: (a) eighteen 5-point Likert-scale items (1 = strongly disagree to 5 = strongly agree) measuring six flow antecedents (game design features), and (b) fifteen 5-point Likert-scale items measuring 5 theorized dimensions of game flow experience, including concentration, sense of control, sense of time distortion, loss of self-consciousness, and autotelic experience. In order for 5th graders to understand the survey questions, some items were reworded (see Appendix C for flow experience items).

An exploratory factor analysis (EFA) using Maximum Likelihood factoring and an oblique rotation (i.e., Direct Oblimin) was conducted on the adapted survey (Costello & Osborne, 2005). During the first round of EFA, one item measuring students’ sense of concentration was found to be double loaded with a higher loading on the non-theorized factor. After removing this item, the 2nd EFA extracted 4 factors with Eigenvalue greater than 1.0, accounting for 56.80% of the total variance. Specifically, items measuring the two flow experience subscales of time distortion and loss of self-consciousness loaded on the same factor to form the new 4th factor. This might indicate that these two concepts were not distinct enough for 5th graders. Overall, the 4-factor game flow experience structure was interpretable, indicating that even though flow experience is theorized to have 5 subscales for adults, it had only 4 subscales for upper elementary students in this current study. These 4 subscales are concentration, sense of control, time distortion and loss of self-consciousness, and autotelic experience. The factor correlations are presented in Table 2. The reliability for each of the 4 subscales as measured by Cronbach’s alpha ranged from .77 to .84. The item-total correlations were all within the “good” range (from .44 to .74) based on Ebel’s (1965) guidelines that item-total correlation values above .40 are good. The mean inter-item correlations for these 4 subscales ranged from .39 to .64.

These results indicated that even though there are some items that warrant future work, the 4-dimension game flow experience instrument worked well with 5th graders in this current study, and provided a usable basis for subsequent data analyses of participants’ experiences with Crystal Island.
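The internal-consistency check described above can be illustrated with a minimal sketch of Cronbach’s alpha for one subscale. The Likert responses below are hypothetical, not the study’s data (the study reported alphas between .77 and .84):

```python
def variance(xs):
    """Sample variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of responses per survey item (same students, same order)."""
    k = len(items)
    item_vars = sum(variance(col) for col in items)
    totals = [sum(resp) for resp in zip(*items)]  # each student's total score
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# three hypothetical 5-point Likert items answered by five students
item1 = [4, 5, 3, 4, 5]
item2 = [4, 4, 3, 5, 5]
item3 = [5, 5, 2, 4, 4]
print(round(cronbach_alpha([item1, item2, item3]), 2))  # → 0.81
```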
Semi-Structured Focus
Group Interview
During focus group interviews, students from
both gameplay groups were asked questions
about their game flow experience based on
Table 2. Factor correlations for flow experience items (n=73)

Factor    1       2       3       4
1         1.00    .33     .48     .16
2         .33     1.00    .55     .48
3         .48     .55     1.00    .43
4         .16     .48     .43     1.00
the flow experience subscales, and questions
about Crystal Island game design features.
Focus group participants were selected based on
three criteria: First, students indicated whether
they were willing to be interviewed. Second,
based on students’ willingness, the classroom
teachers recommended those who were more
talkative. Finally, to reduce potential bias, the
researchers selected 5 solo game players and 5
collaborative game players from both genders
and different ethnic/racial backgrounds. In
qualitative research, it is common for “information rich” participants to be selected to allow
the researchers to fully understand a complex
issue (Marshall, 1996). More talkative students
were selected for the focus group interview in
the current study to provide “richer” information
to better understand the complex phenomenon
of game flow experience. Due to the design
of the study, focus group participants were
not selected based on their self-reported flow
experience levels. Students either participated
in the focus group one day prior to completing
the flow survey or immediately following the
flow survey (see below for details).
Data Collection
One week prior to gameplay, students took the
science content test. Using the school’s computer lab, students played the Crystal Island
game on three consecutive days for 40 to 50
minutes each day. On the first day of gameplay,
participants were randomly assigned to either
the solo or face-to-face collaborative gameplay
approaches. Students then viewed a background
story video about Crystal Island, followed by a
tutorial familiarizing them with game controls.
Students then played the game for 45 minutes,
progressing at their own pace. Immediately
after gameplay, 5 solo game players (girls=2,
boys=3; White Americans=3, African Americans=1, Asian Americans=1) were selected to
participate in a 30-minute focus group interview. On the 2nd day, students played the game
for 50 minutes, and at the conclusion of the gameplay session the flow experience survey was administered through Survey Monkey.
Immediately after the flow survey, 5 collaborative game players (girls=3, boys=2; White
Americans=3, African Americans=1, Latino
Americans=1) were selected to participate in a
30-minute focus group interview. On the third
day, the time for gameplay was shortened to 40
minutes to allow students time to take the post
science content test.
Solo game players worked individually.
Collaborative game players worked in dyads,
with one of them being the game driver (driving
the in-game characters around) and the other
one being the game planner (observing and
providing suggestions to help with gameplay).
Students in the dyads switched roles halfway through each gameplay session.
Note that during the game-based learning process, students did not receive supplementary classroom instruction from their teachers on the science curriculum covered in the game. The
teachers were not present in the computer lab
during gameplay sessions. Only the researchers
were present to help students get seated, log
in to the game, and solve potential technical
problems during gameplay.
DATA ANALYSIS AND FINDINGS
Data Coding
For the science content test, a total score was calculated for each student. For the flow experience survey, for each student, a mean score was
calculated for each subscale (i.e., sum of all
items measuring the same subscale divided by
the number of items measuring the subscale).
Focus group interview data was coded using a priori coding guided by flow theory in the first round, and open coding in the second round to identify additional themes mentioned by students. Interview coding results were triangulated
with the flow survey data. Since the interview
questions were constructed based on the flow
experience subscales (see Appendix D), the
corresponding codes were readily apparent to
the researcher (Creswell & Plano Clark, 2006).
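The scoring rules described above can be sketched as follows, with hypothetical responses (content-test items scored 0/1 and summed; flow items averaged within each subscale):

```python
def content_total(item_scores):
    """Total correct on the 27-item content test (items scored 0/1)."""
    return sum(item_scores)

def subscale_mean(responses):
    """Mean of one student's responses to the items of a subscale."""
    return sum(responses) / len(responses)

content = [1, 0, 1, 1] + [0] * 23  # hypothetical 27-item content test
concentration = [5, 4, 4]          # one student's three concentration items
print(content_total(content))                  # → 3
print(round(subscale_mean(concentration), 2))  # → 4.33
```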
Findings for Research
Question One
Descriptives
The results in Table 3 demonstrated that,
regardless of students’ gameplay approaches
(solo and collaborative gameplay), the mean
scores for the 4 flow experience subscales were
greater than 4 on a 5-point scale, indicating high
game flow experience for students.
The results from the flow survey were
supported by data from focus group interviews.
For the flow experience subscale of sense of
concentration, the focus group interview data
also indicated that, regardless of gameplay
approaches, students were highly concentrated
during gameplay. For the flow experience subscale of autotelic experience, students’ interview
responses also supported the survey results.
Students commented that they “played the
game for the fun of it”, an indicator of students’
intrinsic motivation. For the flow experience
subscale of sense of control, all 10 students
interviewed reported that they felt in good
control of their gameplay actions. However, 8
out of the 10 interviewed students also reported
that sometimes they felt “frustrated” because
it was difficult to control the movement of the
game characters using the keyboard.
Finally, for the 4th flow experience subscale
of time distortion and loss of self-consciousness,
both solo and collaborative game players in the
focus groups reported that they felt time went by
faster than usual. One solo player commented,
“…you were just having so much fun…and you
felt like you have played for only 10 minutes.”
One collaborative player also commented,
“Time went so fast that it is almost time to go
and I got so mad…” Regarding loss of self-consciousness in this 4th subscale, students’ focus group responses illustrated that flow experience in the Crystal Island game was characterized by a tendency to shut themselves off from the outside world. Students used phrases such
as “zoned out” and “did not hear anybody” to
describe their gameplay experiences.
Reading Proficiency and
Flow Experience
Simple regression analyses were conducted
to examine the relationship between students’
reading proficiency and their game flow experience. The results revealed that reading EOG
scores significantly predicted students’ autotelic
experience, R² = .07, β = .13, t = 2.35, p = .02.
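For illustration, a one-predictor regression of this kind reduces to ordinary least squares. The sketch below uses hypothetical EOG levels and autotelic-experience means, not the study’s data, and does not reproduce the reported coefficients:

```python
def simple_ols(x, y):
    """Ordinary least squares with one predictor; returns slope, intercept, R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_tot = sum((b - my) ** 2 for b in y)
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    return slope, intercept, 1 - ss_res / ss_tot

# hypothetical reading levels (1-4) and autotelic-experience subscale means
slope, intercept, r2 = simple_ols([1, 2, 3, 4], [4.1, 4.3, 4.4, 4.7])
print(round(slope, 2), round(r2, 2))  # slope ≈ 0.19, R² ≈ 0.96
```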
Gameplay Approaches
and Flow Experience
A 2 (Gameplay approach: solo and collaborative players) × 2 (Gender: male and female) MANCOVA analysis controlling for the effect of
reading EOG scores was conducted to examine
flow experience differences based on the two
gameplay approaches. The results failed to
reveal a main effect for gameplay approaches.
The results also failed to reveal a main effect
for gender. However, there was a significant
Table 3. Descriptive data for game flow experience based on gameplay conditions

                                        Solo (n=37)      Collaborative (n=36)   All (n=73)
Subscales                               Mean    SD       Mean    SD             Mean    SD
Attention focus                         4.51    .48      4.36    .65            4.44    .57
Sense of control                        4.20    .70      4.06    .76            4.13    .73
Loss of time and self-consciousness     4.03    .56      3.99    .51            4.01    .53
Autotelic experience                    4.58    .53      4.63    .44            4.60    .49
main effect for reading EOG score on the flow
subscale of autotelic experience, F(1, 68) = 4.21, p = .04, partial eta squared = .06.
Findings for Research
Question Two
Descriptives
Table 4 demonstrated that the mean scores of the
6 flow antecedents (game design features) were
all approaching or greater than 4 on a 5-point
scale, indicating that students thought that the
game was designed with desirable features.
As shown in Table 4, a mean score of 4.15
indicated that the challenge level of the game
tasks was appropriate for the students, which
was supported by students’ responses to the
focus group interview questions. However, for
the game design feature of clear goals, even
though a mean score of 4.04 indicated that
students knew about the goals of the game, only
3 out of the 10 students interviewed were able
to correctly articulate the goal of the game. For
the game design feature of clear and immediate
feedback, a mean score approaching 4 also
indicated that students perceived this to be a
desirable feature of the game. Qualitatively,
all students interviewed also commented that
the game did a good job in providing encouraging and useful feedback. The focus group
interviews also revealed that students easily
learned how to play the game, an indicator of
good playability. This was also consistent with
findings from the survey.
For the game design feature of gamefulness, a mean score of 3.95 indicated that this
was a desirable feature of the Crystal Island
game design. When students were asked about
their gameplay experience, both solo and collaborative players mentioned features that could
be categorized as high gamefulness. One boy
in the solo gameplay group reported that the
Crystal Island game was fun because it had
“different (difficulty) levels,” an important
feature of high gamefulness. A girl in the collaborative gameplay group also commented
that she enjoyed playing the game because
she could use the earned in-game rewards to
perform other gameplay tasks, another indicator of high gamefulness. The high mean score
for game frame story was also supported by
students’ focus group interview responses in
that students orally commented that the game
background story video was very “cool” and it
told them what they needed to do in the game.
Impact of Game Design
Features on Flow Experience
To examine the relationship between the 6 game
design features and students’ flow experience
as measured with the 4 subscales, separate
multiple regression analyses were conducted.
The results showed that 4 game design features,
including balance of challenge and perceived
skills, playability, gamefulness, and game
frame/background story, significantly predicted
students’ game flow experience (See Tables 5
through 8).
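The separate multiple regressions described above each fit one flow subscale on the six design-feature scores. A minimal ordinary-least-squares sketch is given below, using synthetic data and reporting coefficients only (the study also reports t and p values):

```python
import numpy as np

def multiple_regression(X, y):
    """Return [intercept, b1, b2, ...] via ordinary least squares."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept column
    coefs, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coefs

# synthetic check: y = 1 + 2*x1 + 3*x2 exactly, so OLS recovers [1, 2, 3]
rng = np.random.default_rng(0)
X = rng.random((20, 2))
y = 1 + 2 * X[:, 0] + 3 * X[:, 1]
coefs = multiple_regression(X, y)
print(np.round(coefs, 4))
```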
Table 4. Descriptive data for game flow antecedents across gameplay conditions

Flow Antecedents                        Mean    Std. Deviation
Challenge/skill balance                 4.15    .56
Playability                             4.03    .65
Gamefulness                             3.95    .56
Clear goals                             4.04    .63
Immediate & unambiguous feedback        3.82    .68
Game frame story                        4.14    .54
Table 5. Multiple regression of game design features on students’ attention focus

Variables                               B       t       Sig. (p)
Challenge/skill balance                 .29     2.26    .03*
Playability                             .10     .98     .33
Gamefulness                             .21     1.64    .11
Clear goals                             .05     .38     .71
Immediate and unambiguous feedback      .08     .81     .42
Game frame story                        -.003   -.02    .98

Note: R² = .30, p < .05
Table 6. Multiple regression of game design features on student sense of control

Variables                               B       t       Sig. (p)
Challenge/skill balance                 .09     .63     .53
Playability                             .48     4.02    .00*
Gamefulness                             -.02    -.16    .87
Clear goals                             .18     1.37    .18
Immediate and unambiguous feedback      .10     .91     .37
Game frame story                        .24     1.60    .11

Note: R² = .45, p < .05
Table 7. Multiple regression of game design features on time distortion and loss of self-consciousness

Variables                               B       t       Sig. (p)
Challenge/skill balance                 .13     1.13    .26
Playability                             .01     4.02    .92
Gamefulness                             .23     2.11    .04*
Clear goals                             .09     .84     .40
Immediate and unambiguous feedback      .05     .62     .54
Game frame story                        .30     2.65    .01*

Note: R² = .41, p < .05
Findings for Research
Question Three
Science Content Learning Gains
To examine group differences in science content learning gains from pre- to post-test, a 2
(Groups: solo player, collaborative player) x 2
(Test: pre, post) repeated measures analysis of
variance (RM-ANOVA) was conducted. The
results revealed a main effect of test, indicating
that students demonstrated significant increases
in correct responses to the science content test
from pre- (M = 14.64, SD = 4.97) to post-test (M
= 15.74, SD = 5.41), collapsed across gameplay
approaches, F (1,71) = 5.26, p = .03, partial eta
squared = .07. However, the RM-ANOVA failed
Table 8. Multiple regression of game design features on students’ autotelic experience
B
Variables
t
Sig. (p)
Challenge/skill balance
.19
1.78
.08
Playability
.19
2.23
.03*
Gamefulness
-.06
-.55
.59
Clear goals
.07
.71
.48
Immediate and unambiguous feedback
-.08
-1.00
.32
Game frame story
.31
2.87
.01*
Note: R = .35, p<. 05
2
to reveal a main effect of gameplay approaches,
indicating that the solo and collaborative game
players did not differ in terms of learning gains
from pre- to post-content-knowledge test, F (1,
71) = .01, p = .93.
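For a two-level within-subjects factor such as the pre/post test variable here, the RM-ANOVA main effect of test is equivalent to a paired t-test on the difference scores, with F = t². A minimal sketch of that equivalence (using hypothetical pre/post scores, not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for the post - pre difference scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical content-test scores for 8 students (illustrative only)
pre = [12, 15, 9, 14, 18, 11, 16, 13]
post = [14, 15, 11, 16, 19, 12, 17, 15]
t = paired_t(pre, post)
F = t ** 2  # equals the RM-ANOVA F(1, n-1) for the two-level test factor
```

A positive t (and a large F relative to the F(1, n-1) critical value) indicates a significant gain from pre- to post-test, mirroring the main effect of test reported above.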
Flow Experience and Content Learning Gains

Residual gain scores for the science content test (post-pre) were calculated for all students. A multiple regression analysis was then conducted to examine how flow experience predicted science learning gains. Results indicated that none of the four flow subscales significantly predicted science content learning gains.
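One common way to compute residual gain scores is to take each student's residual from an ordinary least-squares regression of post-test on pre-test scores, i.e., the gain beyond what the pre-test predicts. A sketch under that assumption, with hypothetical scores rather than the study's data:

```python
def residual_gains(pre, post):
    """OLS residuals from regressing post-test scores on pre-test scores."""
    n = len(pre)
    mean_x = sum(pre) / n
    mean_y = sum(post) / n
    sxx = sum((x - mean_x) ** 2 for x in pre)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(pre, post))
    slope = sxy / sxx                      # OLS slope
    intercept = mean_y - slope * mean_x    # OLS intercept
    return [y - (intercept + slope * x) for x, y in zip(pre, post)]

# Hypothetical scores for 6 students (illustrative only)
pre = [10, 12, 8, 15, 11, 14]
post = [13, 12, 10, 17, 14, 15]
gains = residual_gains(pre, post)
```

Unlike raw post-minus-pre differences, residual gains adjust for pre-test level; they can then serve as the outcome in a regression on the flow subscales.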
CONCLUSION AND IMPLICATIONS

Flow Experience in the Crystal Island Game

The study examined 5th graders’ flow experience in the Crystal Island game environment, its impacting factors, and the relationship between game flow experience and science learning gains. Both quantitative and qualitative data indicated that the game afforded students an enjoyable flow experience regardless of gameplay approach. The results also indicated that flow theory provided a new lens for examining students’ playful science learning experience in game environments, and that the adapted survey provided a satisfactory measure of students’ game flow experience. More importantly, the study revealed that game flow experience has only four subscales for 5th graders instead of the five theorized dimensions. As previously discussed, this might be because the concepts of time distortion and loss of self-consciousness were not distinct enough for 5th graders. It might also imply that these items need further rewording to make the two concepts clearer for young participants. Meanwhile, note that the EFA results were based on a relatively small sample size; future studies should recruit a larger sample to further validate the survey. In addition, the current study included only three items per subscale, which might not produce stable results (Costello & Osborne, 2005). Moving forward, more items need to be written for the adapted survey.

Flow Experience Impacting Factors

The study demonstrated that the Crystal Island game was designed with several desirable features, including balance between challenge and perceived skill, playability, gamefulness, and a game background story, that contributed to students’ flow experience. Future educational game designs that take these design principles into consideration may be more likely to support students’ flow experience. This is important because, although flow experience did not predict student content learning gains in the current study, some
previous studies have demonstrated that flow experience is positively related to content learning outcomes in virtual learning environments (e.g., Kiili, 2005; Skadberg & Kimmel, 2004). On the other hand, while not supported in the current study, the game design features of clear goals and unambiguous feedback have both been found to predict students’ game flow experience in several previous studies. These two factors may not have been significant predictors in this study because the game was still at a developmental stage at the time of implementation.
Besides these desirable game design features, the novelty of game-based learning might also have contributed to students’ high game flow experience, especially considering that students played the Crystal Island game for only a relatively short period of time. Students who are easily attracted by novel events may experience higher flow when playing the Crystal Island game. Future research should examine how students’ motivation in game-based learning might be sustained, for instance, through desirable design features as well as the ways in which games are integrated into the school curriculum.
Effectiveness of Collaborative Gameplay

Students in both gameplay approaches not only experienced high flow but also made significant content learning gains, supporting previous findings that well-designed games have the potential to support students’ science learning. Collaborative game players in this study, however, did not make higher content learning gains than solo game players. Shih and his colleagues (2010) asserted that for collaboration to contribute to learning, students need to engage in meaningful discussion and idea exchanges. During collaborative gameplay, however, some students were observed talking about other things, such as funny gameplay actions. Clear roles should be set for students in the gameplay dyads in order for meaningful peer collaboration to occur (Hämäläinen, Manninen, Järvelä, & Häkkinen, 2006). In particular, in future studies students need to be instructed more explicitly in how to engage in active acquisition of information together with a partner. In the current study, even though students in the gameplay dyads were told to work together to complete the game quests, no further instruction on effective collaborative gameplay was provided.
Relationship between Flow Experience and Science Content Learning

In this study, even though students across both gameplay approaches made significant content learning gains, game flow experience was not found to significantly predict content learning gains. One possible reason is that students played the game for only a relatively short amount of time, which may have limited the possibility of observing the actual impact of game flow experience on science learning gains. Graesser and his colleagues (2009) point out that there is often a tradeoff between deep learning and students’ positive emotion in virtual environments: students’ deep enjoyment does not necessarily translate into better performance, probably because their attention is focused too much on the entertainment aspect of the game. A certain degree of scaffolding might help direct students’ attention to the learning aspect of the game. For example, researchers or facilitators could monitor students’ gameplay behaviors and correct off-task behaviors.
The finding that students’ game flow experience did not predict their science learning gains also has implications for game design. While playing the Crystal Island game, students had the freedom to navigate the game environment however they liked. As a result, some of them easily went off-task by driving their characters around for fun. Therefore, for young students who lack the self-regulation skills to keep their attention on content-related game quests, it is important to take this into account when designing games in order to maintain a balance between the degree of autonomy given to students and their self-regulated gameplay behaviors. For instance, the game could be designed so that when students go off-task, they are alerted to redirect their attention to the game quests.
ACKNOWLEDGMENT
This research was supported by the National
Science Foundation under Grant DRL-0822200.
Any opinions, findings, and conclusions or
recommendations expressed in this material are
those of the authors and do not necessarily reflect
the views of the National Science Foundation.
REFERENCES
Annetta, L. A., Minogue, J., Holmes, S. Y., & Cheng,
M. T. (2009). Investigating the impact of video
games on high school students’ engagement and
learning about genetics. Computers & Education,
53(1), 74–85. doi:10.1016/j.compedu.2008.12.020
Brom, C., Preuss, M., & Klement, D. (2011). Are
educational computer micro-games engaging and
effective for knowledge acquisition at high-schools?
A quasi-experimental study. Computers & Education, 57(3), 1971–1988.
Costello, A. B., & Osborne, J. W. (2005). Best
practices in exploratory factor analysis: Four recommendations for getting the most from your analysis.
Practical Assessment, Research & Evaluation,
10(7), 1–9.
Creswell, J. W., & Clark, V. L. (2006). Designing
and conducting mixed methods research. Thousand
Oaks: Sage Publications.
Csikszentmihalyi, M. (1975). Beyond boredom and
anxiety. San Francisco: Jossey-Bass.
Csikszentmihalyi, M. (1991). Flow: The psychology
of optimal experience. New York: Harper Perennial.
Ebel, R. L. (1965). Measuring educational achievement. Englewood Cliffs, NJ: Prentice-Hall.
Foko, T., & Amory, A. (2008). Social constructivism
in games based learning in the South African context.
Paper presented at the Proceedings of World Conference on Educational Multimedia, Hypermedia and
Telecommunications 2008, 5757-5764. Retrieved
from http://www.editlib.org/p/29180
Foster, T. (2008). Games and motivation to learn
science: Personal identity, applicability, relevance
and meaningfulness. Journal of Interactive Learning
Research, 19(4), 597–614.
Fu, F., Su, R., & Yu, S. (2009). E-game flow: A
scale to measure learners’ enjoyment of elearning
games. Computers & Education, 52(1), 101–112.
doi:10.1016/j.compedu.2008.07.004
Gao, J., Yang, Y. C., & Chen, I. (2009). How digital
game-based learning can improve students’ academic
achievement and problem solving. Proceedings of
World Conference on E-Learning in Corporate,
Government, Healthcare, and Higher Education
2009, 1256-1263.
Ghani, J., & Deshpande, S. (1994). Task characteristics and the experience of optimal flow in humancomputer interaction. The Journal of Psychology,
128(4), 381–391. doi:10.1080/00223980.1994.9712742
Gillispie, L., Martin, F., & Parker, M. (2009). Effects
of the dimension-M 3D video gaming experience on
middle school student achievement and attitude in
mathematics. In I. Gibson et al. (Eds.), Proceedings
of Society for Information Technology & Teacher
Education International Conference 2009 (pp. 1462–1469). Chesapeake, VA: AACE.
González-González, C., & Blanco-Izquierdo, F. (2012). Designing social
videogames for educational uses. Computers &
Education, 58(1), 250–262. doi:10.1016/j.compedu.2011.08.014
Graesser, A. C., Chipman, P., Leeming, F., & Biedenbach, S. (2009). Deep learning and emotion
in serious games. In U. Ritterfeld, M. Cody, & P.
Vorderer (Eds.), Serious games: Mechanisms and
effects (pp. 81–100). New York, London: Routledge,
Taylor & Francis.
Hämäläinen, R., Manninen, T., Järvelä, S., & Häkkinen, P. (2006). Learning to collaborate – scripting
computer-supported collaboration in a 3-D game
environment. The Internet and Higher Education,
9(1). doi:10.1016/j.iheduc.2005.12.004
Hays, R. T. (2005). The effectiveness of instructional
games: A literature review and discussion. Retrieved
September 10, 2011, from http://stinet.dtic.mil/oai/
oai?&verb=getRecord&metadataPrefix=html&ide
Howard, C., Morgan, M., & Ellis, K. (2006). Games
and learning: Does this compute? In E. Pearson & P. Bohman (Eds.), Proceedings of the ED-Media
2006-World Conference on Educational Multimedia, Hypermedia and Telecommunications (pp.
1217–1224). Orlando, FL.
Hsu, C., & Lu, H. (2004). Why do people play online
games? An extended TAM with social influences and
flow experience. Information & Management, 41(7),
853–868. doi:10.1016/j.im.2003.08.014
Inal, Y., & Cagiltay, K. (2007). Flow experiences of
children in an interactive social game environment.
British Journal of Educational Technology, 38(3),
455–464. doi:10.1111/j.1467-8535.2007.00709.x
Jackson, S., & Marsh, H. (1996). Development and
validation of a scale to measure optimal experience:
The flow state scale. Journal of Sport & Exercise
Psychology, 18(1), 17–35.
Jackson, S. A., & Eklund, R. C. (2002). Assessing
flow in physical activity: The Flow State Scale-2
and Dispositional Flow Scale-2. Journal of Sport
& Exercise Psychology, 24, 133–150.
Kang, B., & Tan, S. (2008). Impact of digital games
on intrinsic and extrinsic motivation, achievement,
and satisfaction. In K. McFerrin et al. (Eds.), Proceedings of Society for Information Technology &
Teacher Education International Conference 2008
(pp. 1825-1832). Chesapeake, VA: AACE.
Ketelhut, D. J., Dede, C., Clarke, J., & Nelson,
B. (2006). A multi-user virtual environment for
building higher order inquiry skills in science.
Paper presented at the Annual Conference of the
American Educational Research Association, San
Francisco, CA.
Kiili, K. (2005). Content creation challenges and flow
experience in educational games: The IT-Emperor
case. The Internet and Higher Education, 8(3),
183–198. doi:10.1016/j.iheduc.2005.06.001
Lester, J., Spires, H., Nietfeld, J., Minogue, J., Mott,
B., & Lobene, E. (2014). Designing game-based learning environments for elementary science education: A narrative-centered learning perspective. Information Sciences, 264, 4–18.
Marshall, M. N. (1996). Sampling for qualitative research. Family Practice, 13(6), 522–525.
doi:10.1093/fampra/13.6.522 PMID:9023528
Meluso, A., Zheng, M., Spires, H., & Lester, J.
(2012). Enhancing 5th graders’ science content
knowledge and self-efficacy through game-based
learning. Computers & Education, 59(2), 42–57.
doi:10.1016/j.compedu.2011.12.019
Miller, L. M., Chang, C., Wang, S., Beier, M. E.,
& Klisch, Y. (2011). Learning and motivational
impacts of a multimedia science game. Computers
& Education, 57(1), 1425–1433. doi:10.1016/j.
compedu.2011.01.016
North Carolina State Department of Public Instruction. North Carolina end-of-grade reading comprehension tests: Grades 3-8. Retrieved October
10, 2013, from http://www.ncpublicschools.org/
accountability/testing/eog/reading/
O’Neil, H. F., Wainess, R., & Baker, E. L. (2005).
Classification of learning outcomes: Evidence from
the computer games literature. Curriculum Journal,
16(4), 455–474.
Papastergiou, M. (2009). Digital game-based learning
in high school Computer Science education: Impact
on educational effectiveness and student motivation.
Computers & Education, 52(1), 1–12. doi:10.1016/j.
compedu.2008.06.004
Prensky, M., & Thiagarajan, S. (2007). Digital
game-based learning. St. Paul, MN: Paragon House
Publishers.
Kiili, K., & Lainema, T. (2008). Foundation for measuring engagement in educational games. Journal of Interactive Learning Research, 19(3), 469–488.
Lee, I., & Kwon, H. J. (2005). The relations between flow, information processing strategies and performance in a computer-based simulation game. Proceedings of ED-MEDIA, Montreal, 986–992.
Rutten, N., van Joolingen, W. R., & van der Veen, J. T. (2012). The learning effects of computer simulations in science education. Computers & Education, 58(1), 136–153. doi:10.1016/j.compedu.2011.07.017
Shih, J. L., Shih, B. J., Shih, C. C., Su, H. Y., & Chuang, C. W. (2010). The influence of collaboration styles to children’s cognitive performance in digital problem-solving game “William Adventure”: A comparative case study. Computers & Education, 55(3), 982–993. doi:10.1016/j.compedu.2010.04.009
Shin, N. (2006). Online learner’s ‘flow’ experience: An empirical study. British Journal of Educational Technology, 37(5), 705–720. doi:10.1111/j.1467-8535.2006.00641.x
Skadberg, Y. X., & Kimmel, J. R. (2004). Visitors’
flow experience while browsing a web site: Its measurement, contributing factors, and consequences.
Computers in Human Behavior, 20(3), 403–422.
doi:10.1016/S0747-5632(03)00050-5
Spires, H., Rowe, J. P., Mott, B. W., & Lester, J. C.
(2011). Problem solving and game-based learning:
Effects of middle grade students’ hypothesis testing
strategies on science learning outcomes. Journal of
Educational Computing Research, 44(4), 453–472.
doi:10.2190/EC.44.4.e
Spires, H., Turner, K., Rowe, J. P., Mott, B. W., &
Lester, J. C. (2010). Effects of game-based performance on science learning: A transactional theoretical perspective. Paper presented at the American
Educational Research Association annual conference,
Denver, CO.
Sweetser, P., & Wyeth, P. (2005). GameFlow: A model for evaluating player enjoyment in games. Computers in Entertainment, 3(3), 1–24.
Vogel, J. J., Greenwood-Ericksen, A., Cannon-Bowers, J., & Bowers, C. A. (2006). Using virtual reality
with and without gaming attributes for academic
achievement. Journal of Research on Technology
in Education, 39(1), 105–118. doi:10.1080/15391
523.2006.10782475
Webster, J., Trevino, L. K., & Ryan, L. (1993). The
dimensionality and correlates of flow in humancomputer interaction. Computers in Human Behavior,
9(4), 411–426. doi:10.1016/0747-5632(93)90032-N
Wrzesien, M., & Raya, M. A. (2010). Learning in
serious virtual worlds: Evaluation of learning effectiveness and appeal to students in the E-Junior
project. Computers & Education, 55(1), 178–187.
doi:10.1016/j.compedu.2010.01.003
Zheng, M., Spires, H., & Meluso, A. (2011). Examining upper elementary students’ gameplay experience:
A flow theory perspective. In A. Mendez-Vilas (Ed.),
Education in a technological world: Communicating
current and emerging research and technological
efforts (pp. 190-198). Badajoz, Spain: Formatex
Research Center.
Meixun Zheng received her Ph.D. in Curriculum and Instruction with a concentration in literacy
education from North Carolina State University. Her research focuses on literacy, new literacies,
game-based learning, and teacher education and professional development in the 21st century.
While at NC State from 2009 to 2012, Dr. Zheng first worked as a Research Assistant, and later
as a Research Associate at The William and Ida Friday Institute for Educational Innovation. As
an Adjunct Teaching Assistant Professor, she also taught face-to-face and online teacher education courses in literacy education in NC State’s Department of Curriculum and Instruction. Dr.
Zheng is currently an instructional designer and Adjunct Assistant Professor at University of
the Pacific where she develops and facilitates faculty and instructional development programs
and conducts research in these related areas.
Hiller Spires is a Professor of Literacy and Technology in the Department of Curriculum and
Instruction; she received her Ph.D. in literacy education from the University of South Carolina.
Dr. Spires served as the founding director of The William and Ida Friday Institute for Educational
Innovation from 2002-2006 and currently serves as FI Senior Research Fellow. She received the
NC State Alumni Distinguished Graduate Professor Award in 2012. Her research focuses on the
effects of digital literacies on learning, including emerging literacies associated with gaming
environments and Web 2.0 applications. She is co-PI on the NSF-funded projects, Crystal Island
and Narrative Theatre. She has published in Journal of Educational Psychology, Cognition &
Instruction, Journal of Adolescent & Adult Literacy, Literacy Research & Instruction, among
other journals. Dr. Spires coordinates the New Literacies & Global Learning K-12 Reading
graduate program and co-directs the Friday Institute’s New Literacies Collaborative (newlit.org).
APPENDIX A
See Figure 2 for a mixed methods research design diagram.
APPENDIX B
Science Content Test (Sample Questions)
•	Which of the following is NOT a model?
	◦◦ A globe
	◦◦ A doll
	◦◦ A waterfall
	◦◦ A map
•	Which is the MOST accurate way to locate an exact position on a map?
	◦◦ Scan the map visually
	◦◦ Use the map key
	◦◦ Use the compass
	◦◦ Use the map grid
•	You need to get from Ranger Station 1 to Ranger Station 3. About how many meters long would this hike be if you stay on the road?
	◦◦ 100 meters
	◦◦ 1600 meters
	◦◦ 1200 meters
	◦◦ 700 meters
Figure 2. Embedded mixed methods research design diagram
APPENDIX C
Adapted Game Flow Experience Survey (Kiili & Lainema, 2008)
Flow Experience Items
•	Attention focus
	◦◦ The Crystal Island game really grabbed my attention.
	◦◦ It was easy for me to pay all my attention to the game.
	◦◦ I was completely concentrated in playing the game.
•	Sense of potential control
	◦◦ It was easy for me to control my playing actions in the game.
	◦◦ I was able to decide on my own playing actions and made progress in the game.
	◦◦ I was in good control of my playing actions.
•	Loss of self-consciousness
	◦◦ When I was playing, I did not care about what others thought about how well I was playing.
	◦◦ I just kept playing and was not worried about how well I was doing.
	◦◦ While I was playing the Crystal Island game, I forgot about unhappy things.
•	Sense of time distortion
	◦◦ While I was playing the Crystal Island game, I felt time seemed to go in an unusual way (either much faster or slower than usual).
	◦◦ During gameplay, I forgot about time because I really got into the game.
	◦◦ During gameplay, the time seemed to pass very fast. Suddenly, the playing session was almost over.
•	Autotelic experience
	◦◦ I really enjoyed playing the Crystal Island game.
	◦◦ I liked the feeling of playing and want to play it again.
	◦◦ I enjoyed playing the Crystal Island game because it made me feel good.
Flow Antecedents (Game Design Features) Items
•	Balance between challenge and perceived skills
	◦◦ The game tasks were challenging, but I had the skills to meet the challenge.
	◦◦ The difficulty level of the game equaled my skill level.
	◦◦ As I played the game, my skills improved and so I was able to complete more difficult tasks.
•	Good playability
	◦◦ I understood the game on the screen quickly and knew what to do without having to think.
	◦◦ I learned how to play the game very easily.
	◦◦ It was easy for me to understand how to play the game.
•	Gamefulness
	◦◦ The game did not get any more difficult as I played.
	◦◦ I was able to play in many different ways.
	◦◦ I could use the rewards (i.e., sand dollars) I gained later when I did other tasks.
•	Clear goals
	◦◦ I knew clearly what I needed to do in the game.
	◦◦ The goal of the game was very clear to me.
	◦◦ I understood the goal of the game from the very beginning.
•	Immediate and useful feedback
	◦◦ The feedback given by the game helped me know how well I was doing in the game.
	◦◦ The game provided quick feedback on how well I was playing.
	◦◦ The feedback that the game provided was very helpful.
•	Game frame (background) story
	◦◦ The game background story was part of the reason why I liked playing the game.
	◦◦ The game story made it easier for me to understand what happened in the game.
	◦◦ The game story helped me to understand what I needed to do in the game.
Peer Interaction during Collaborative Gameplay Items
•	I enjoyed playing the game with my partner.
•	My partner helped me to do better in the game.
•	I would like to play the game with my partner again.
APPENDIX D
Game Flow Experience Focus Group Interview (Sample Questions)
•	Attention focus: Did you pay all your attention to the game?
•	Sense of control: Did you feel you were able to easily control your own playing actions?
•	Time distortion: When you were playing the Crystal Island game, what was your feeling about time? Did you feel that time went very fast?
•	Loss of self-consciousness: When you were playing the Crystal Island game, did you worry about what your friends might think about how you were doing in the game?
•	Autotelic experience: Did playing the Crystal Island game make you feel good? Did you enjoy playing the Crystal Island game so much that you would like to play it again in the future?