Time to Stop Playing Games - Seriously

Matt Moncrieff, Shawn Parr, Steve Salmon
Calytrix Technologies Pty Ltd
[email protected]
Abstract. The term Serious Games¹ is finding increasing use in military circles as an attempt to distinguish between
entertainment products and those games used for training purposes. In recent years there has been a trend away from
complex and expensive purpose built military simulations, towards solutions based on commercial gaming engines.
In addition to vastly reduced development costs, commercially based games offer advantages in terms of their support
for commonly available hardware and operating systems, the maturity and currency of their graphic engines, and the
ability to access the games easily from work and home locations. Game designers also bring a wealth of experience in
their ability to develop engaging and creative games which maintain high levels of player interest, which is essential
to any training design process. The link, however, has still not been made between games and training. There is a
danger that this element of ‘training’ is moving away from a controlled and monitored environment, with built-in
feedback and error-correction mechanisms, to one where trainees operate entirely by themselves.
A catch cry amongst the providers of serious games has been ‘Serious Games for Serious Training’. Many militaries
have subscribed to this philosophy and have invested heavily in the acquisition of these games. There is, however, a
growing undercurrent of military members who are starting to question whether they are actually getting real
training value from the games in which they have invested so heavily. Though many are proficient in designing training
scenarios that replicate current conditions, there is still no method to link the trainee’s performance with specified
learning objectives. How do you really know that the trainee has achieved anything worthwhile? How do you provide
an auditable trail to show that the trainee is achieving any level of competency, and how do you record those results
with any degree of confidence? The next step for Serious Trainers is to begin to understand how games and
simulations are embedded and employed within a Serious Training environment. To be considered ‘serious’ games
must be part of a deliberate and carefully planned training activity.
1. INTRODUCTION
While the term Serious Games attempts to highlight
the differences between entertainment and
educational gaming products, it still falls short in
defining the vastly different environments in which
commercial gaming and military training take place.
As a very visible and engaging element of a training
system, simulations tend to occupy centre stage.
This is particularly so for the increasing number of
visually rich immersive environments which are
becoming available to military trainers at an ever
decreasing cost. Prime examples of serious games
include Bohemia Interactive’s Virtual Battlespace 2
(VBS2) and eSim Games’ Steel Beasts. These
simulations offer a potentially powerful and
cost-effective training tool, but there is a danger that
their intrinsic appeal outstrips the understanding of
what role they actually play in a structured training
system. The question remains: what separates a pure
game from its serious counterpart, when many of
them look the same on the surface?
1.1 What is a Serious Game?
There is no accepted definition of a serious game
although most would generally agree that it is a
‘game designed for a primary purpose other than
entertainment’. J. Purdy in his article titled Getting
Serious About Digital Games in Learning says,
‘conceptually, the real differentiator between
traditional computer based training and serious
games is that well developed serious games use solid
learning methodologies combined with modern
game design techniques to create a hybrid – a highly
engaging and entertaining learning experience’.
The trainee’s perspective should remain the same.
They should be immersed in a highly entertaining
activity that enables them to build their skills and be
rewarded for their efforts and experience.
Those responsible for designing the training
however would consider there to be large
differences between an entertainment product and a
serious game. In the first instance, the ability to
create and modify real-world based training
scenarios, terrain and the game entities to suit the
required training objective is useful in separating
serious games from entertainment products. While
many commercially based games (and some offered
for military use) offer rich and detailed
environments, realistic movement, fire and
responses and high levels of interaction, they are
quickly made redundant when the scenarios offered
can’t be rapidly altered to suit changing training
requirements.

¹ A serious game is a game designed for a primary purpose other than pure entertainment. The “serious” adjective is generally appended to refer to products used by industries like defense, education, scientific exploration, health care, emergency management, city planning, engineering, religion, and politics.
Secondly, the game must be able to support trainers
to review the performance of trainees during the
scenario and develop products to support a serious
after action review (AAR) process. Without this,
potentially valuable learning points are lost to both the
trainee and the trainer.
Thirdly, the game must be able to be integrated into
a larger training environment using standards-based
protocols, notably via the Distributed Interactive
Simulation (DIS) and High Level Architecture
(HLA) standards. Without this ability the game is
effectively a stand-alone product. However, regardless of the
capabilities of the game in use, unless it is applied
within the confines of a rigorous training structure
then the value to the training audience is rapidly
diluted.
2. CLOSING THE TRAINING LOOP
While there are many variations on the theme, most
organisations apply a process model to training
development and delivery which attempts to break
the process down into its component parts. The
majority of these models share a common flow and
structure. For the purposes of this paper we will use
Vaughn’s Model (Vaughn, 2005) as it clearly
addresses the requirement to develop specific
measures of learning as part of the design process.
Figure 1: Vaughn's Model of Instructional Systems Development
Even with the application of serious games as part of
the ‘Conduct the Training’ phase and the advantages
that those games offer, there remains a clear
requirement to embed the games within a training
construct that is informed by the desired outcomes
and measures of learning. Equally, the desired
outcomes must inform the design of the simulated
training activities and the manner in which they are
conducted.
Perhaps most important is the ability to evaluate the
trainee’s performance against a well defined set of
tasks, standards and measures which provide a clear
and measurable link from the desired training
outcomes to training activities and ultimately trainee
performance.
Regardless of where a particular activity sits on the
training continuum, if simulation is to be utilized as
part of a rigorous, well defined training environment it
must be influenced by all stages of the training design
process. Simulation is just one of a range of training
tools and techniques available to training developers.
As with any other tool, it needs to be incorporated as
part of a deliberate, structured approach to training or
it will remain as a shiny, but largely ineffective
showpiece.
3. A COMPREHENSIVE TRAINING MANAGEMENT ENVIRONMENT
In order to maximize the potential value of serious
games, a framework supporting the design,
management and measurement of training activities in
the context of pre-defined training metrics must be
established.
Critical to this process is the ability to provide a system
which can rapidly capture both objective and
subjective data, record these against a rigorously
defined set of training objectives and measures and
then provide that information to both the training
audience and those responsible for AAR and training
design review.
Training outcomes generated as part of the training
development cycle need to be articulated in terms of a
definable set of measurable metrics which should drive
the anticipated training continuum as well as the design,
sequencing and execution of the scenarios offered to the
trainee. These metrics should, at least in part, include
the following:
• A sequenced set of training activities
represented in the game scenario,
• The tasks required to be performed for each of
these activities,
• The associated standards or level of
performance required,
• Specific performance measures, and
• The underlying scoring models (subjective or
objective) to be applied.
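The metric elements listed above form a natural hierarchy: a sequenced activity contains tasks, each task carries a standard, and each standard is evidenced by measures with an attached scoring model. The paper prescribes no schema, so the Python sketch below is purely illustrative; all class and field names are invented.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical data model for the metric elements listed above.
# Names and structure are illustrative; the paper prescribes no schema.

@dataclass
class Measure:
    name: str                       # e.g. "rounds fired vs. hits"
    objective: bool                 # machine-scored, or instructor-scored
    score: Callable[[dict], float]  # scoring model applied to captured data

@dataclass
class Task:
    description: str                # task performed within the activity
    standard: str                   # required standard or level of performance
    measures: List[Measure] = field(default_factory=list)

@dataclass
class TrainingActivity:
    name: str                       # activity represented in the game scenario
    sequence: int                   # position in the sequenced set of activities
    tasks: List[Task] = field(default_factory=list)

accuracy = Measure("hit ratio", objective=True,
                   score=lambda d: d["hits"] / d["rounds_fired"])
task = Task("Engage targets from a defensive position",
            standard="At least 70% of rounds on target", measures=[accuracy])
activity = TrainingActivity("Section defence", sequence=1, tasks=[task])

print(activity.tasks[0].measures[0].score({"rounds_fired": 40, "hits": 32}))
```

Keeping the scoring model attached to the measure, rather than hard-coded in the game, is one way the metrics could drive scenario design and assessment rather than the other way around.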
Figure 2: Training Management Environment
The Training Management Environment envisages a
complete training system which moves away from the
use of simulation as a stand-alone training tool into an
environment where simulation becomes part of a
measured and carefully planned training environment.
Such a system places the simulation in the middle of a
total training environment designed to avoid the pitfalls
of using a game outside of a structured training system.
The environment outlined in Figure 2 consists of three
interrelated elements designed to achieve an end-to-end
simulation based training environment.
3.1.1 The Simulation
At the core of the training system is the simulation,
serious game or the simulation federation. While this
paper is focused on the integration of the current crop of
serious games, the same principles apply to larger
constructive simulations such as JSAF or OneSAF or
even instrumented live training environments.
Modern simulations offer a rich, safe and engaging
training activity ‘layer’ which provides the stimulus to
the training audience to achieve specific training
outcomes. This, of course, assumes that the simulated
training activities and scenarios have been designed as
part of a complete training package, as opposed to ad
hoc scenarios developed in isolation from the larger
training goals.
In order for the simulation to support a correct learning
environment it should also provide a game replay
capability which contributes to a comprehensive AAR
process. This is critical to both trainee performance
feedback as well as future training evaluation and
design activities. The AAR must be based on and
reinforce the initial learning objectives.
3.1.2 The Training Management System (TMS)
The next element of this proposed approach is the
critical, and often absent, piece which provides the clear
linkages from the game to the specified learning
outcomes through a structured and recognised set of
performance metrics.
The training management system therefore provides a
traceable path from simulated activities to the training
outcomes. Additionally, it provides the framework for
the collection and presentation of assessment generated
by the instructor as well as objective assessment drawn
directly from both the game and the overarching
learning management system.
Assessment data falls into two broad categories:
objective and subjective assessment. Objective data
consists of hard, quantifiable measures which can be
relatively easily scored. This may include such things as
rounds fired vs. actual hits; own, enemy and civilian
casualties, emergency response times, etc. Subjective
data is much harder to quantify and typically involves
the analysis of a range of actions presented to the
trainee through the simulated scenario. Observations are
made against the course of action chosen by the trainee
in light of the circumstances and information available
to them at the time.
Both objective and subjective assessments are critical to
developing an overall picture of trainee performance.
What simulation and indeed instrumented environments
offer is the ability to automatically capture elements of
the system generated objective data and to record that
data against pre-defined training tasks, standards and
measures in the training management system.
This has two beneficial outcomes: Firstly, it serves to
defuse a potentially hostile AAR process by providing
irrefutable evidence of trainee performance. This
evidence is then not clouded by personal recollections
and can be supported by artifacts taken and stored by
the training management system. Secondly, it allows the
instructor to focus on those subjective ‘grey’ areas of
assessment which require the instructor’s experience
and depth of knowledge in order to analyse courses of
action taken by the student.
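A minimal sketch of the automated capture path described above, with invented names throughout: objective game events are scored as they arrive and retained raw, so the AAR can point at stored artifacts rather than personal recollection.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: record objective game events against a pre-defined
# measure, keeping each raw event as an auditable artifact for the AAR.

@dataclass
class Assessment:
    measure: str
    events: List[dict] = field(default_factory=list)  # raw artifacts

    def record(self, event: dict) -> None:
        """Store the raw event; nothing is discarded, so evidence survives."""
        self.events.append(event)

    def hit_ratio(self) -> float:
        """Objective score derived directly from the captured events."""
        fired = sum(1 for e in self.events if e["type"] == "fire")
        hits = sum(1 for e in self.events if e["type"] == "hit")
        return hits / fired if fired else 0.0

assessment = Assessment(measure="rounds fired vs. hits")
for event in [{"type": "fire", "t": 10.2}, {"type": "hit", "t": 10.4},
              {"type": "fire", "t": 12.1}]:
    assessment.record(event)

print(assessment.hit_ratio())   # 1 hit from 2 rounds fired
```

Because the score is recomputed from retained events rather than stored as a bare number, the instructor can replay the underlying evidence during the AAR and stay focused on the subjective ‘grey’ areas.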
3.1.3 The Learning Management System (LMS)
The third element of the training environment provides
overarching management of a related series of training
activities as well as access to learning resources as part
of a structured and deliberate approach to training.
The integration of a SCORM²-compliant learning
management system offers a number of outcomes for
training designers and managers. The LMS provides the
trainee with access to relevant learning materials
(doctrine, reference material, etc) which can be
presented at key points in the simulated scenario to
guide and shape their subsequent actions as the scenario
progresses. In order to reinforce and assess the trainee’s
understanding of these learning points, the LMS
presents the trainee with a series of exams, quizzes or
surveys which provide the instructor with the
confidence that the trainee understands what is required
as the scenario unfolds. The results of these ‘exams’ are
automatically exported to the TMS and are recorded
against pre-defined training metrics.
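SCORM specifies how a course reports results to the LMS; the hand-off from LMS to TMS is left open by the paper, so everything in the sketch below (trainee codes, quiz names, metric identifiers, the mapping itself) is an assumed illustration of that export, not part of SCORM.

```python
import json

# Hypothetical LMS-to-TMS export: quiz results keyed to the pre-defined
# training metric each one evidences. All identifiers are invented.

lms_results = [
    {"trainee": "T-014", "quiz": "ROE check, phase 2", "score": 0.9},
    {"trainee": "T-014", "quiz": "Casualty drill recall", "score": 0.75},
]

# Assumed mapping from LMS quizzes to TMS metric identifiers.
metric_map = {"ROE check, phase 2": "M-07", "Casualty drill recall": "M-12"}

export = [{"trainee": r["trainee"],
           "metric_id": metric_map[r["quiz"]],
           "score": r["score"]}
          for r in lms_results]

print(json.dumps(export, indent=2))
```

The essential point is the join: a quiz score only becomes training evidence once it is recorded against a specific, pre-defined metric rather than sitting in the LMS as a free-floating grade.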
The LMS also provides the higher-level training and
competency management requirements. Overall results
of a training activity can be exported from the training
management system into the LMS and into existing
organizational competency or training progression
record management systems.
3.2 Training Architecture
From an architecture perspective the three principal
components, that is the game, the Learning
Management System and the Training Management
System, need to be integrated to provide information
flows and to increase automation of the design and
evaluation processes.
This integration is achieved in the design phase through
the marriage of training objectives in the TMS being
tested in the game scenario, with support material in the
LMS. At the technology level the game must be able to
access the LMS to allow the player to read material and
do exams at the correct point in the scenario; the LMS
must exchange results with the TMS; and the TMS must
be able to analyse data from the game to calculate
objective results and to fuse these results with instructor
observations.
Figure 3: Training System Architecture
This architecture places the game at the center of the
environment, but ensures it is supported by an LMS to
assist the student to learn and meet the training
objectives, while the TMS layer is able to monitor and
report on the outcome in a rigorous and repeatable
manner.
The use of recognised standards such as DIS and HLA
for the game layer and SCORM for the LMS layer
means that the system is able to be rapidly expanded to
incorporate additional simulations as well as linking in
to existing learning management systems.
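The game-layer interoperability these standards provide operates at the level of network packets (PDUs, in DIS terms). As a minimal illustration of what consuming that traffic involves, the Python below unpacks the 12-byte header that IEEE 1278.1 prescribes for every DIS PDU; the field layout follows the standard, while the sample datagram and function names are invented for the example.

```python
import struct

# Every DIS PDU begins with a common 12-byte header (IEEE 1278.1):
# protocol version, exercise ID, PDU type, protocol family (1 byte each),
# timestamp (4 bytes), PDU length (2 bytes), padding (2 bytes),
# all big-endian ("network" byte order).
HEADER_FORMAT = ">BBBBIHH"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 12 bytes

# PDU type 1 is the Entity State PDU, the workhorse of DIS interoperation.
PDU_TYPES = {1: "Entity State", 2: "Fire", 3: "Detonation"}

def parse_pdu_header(datagram: bytes) -> dict:
    """Unpack the common DIS header from the front of a received datagram."""
    version, exercise, pdu_type, family, timestamp, length, _pad = \
        struct.unpack(HEADER_FORMAT, datagram[:HEADER_SIZE])
    return {
        "protocol_version": version,
        "exercise_id": exercise,
        "pdu_type": PDU_TYPES.get(pdu_type, f"Other ({pdu_type})"),
        "timestamp": timestamp,
        "pdu_length": length,
    }

# Hand-crafted sample: DIS v6, exercise 1, Entity State PDU, 144 bytes long.
sample = struct.pack(HEADER_FORMAT, 6, 1, 1, 1, 0, 144, 0) + bytes(132)
header = parse_pdu_header(sample)
print(header["pdu_type"], header["pdu_length"])
```

A TMS that can read this traffic directly off the exercise network is how objective measures (fires, detonations, entity state) can be captured without modifying the game itself.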
4. A NEW TRAINING ENVIRONMENT
The successful delivery of simulation-based training
relies on a number of related elements: the fidelity and
realism of the simulation in use, the selection and use of
peripheral systems and hardware which enhance realism,
and the ability to design training scenarios which are
clearly related to desired training outcomes.
The training system proposed in this paper is focused on
providing the automation of a number of design,
recording and assessment elements, as part of an
end-to-end approach to simulated training design, execution,
assessment and review. However, this integration and
automation is designed to supplement, not supplant, the
most important element of any training activity: the
human dimension.
4.1 The Role of the Instructor
Regardless of the complexity and fidelity of the
simulation system in use there will always be the need
for an experienced instructor or instructional team to
make subjective comment against trainee decision
making and to provide guidance and feedback to the
trainee.
² Sharable Content Object Reference Model (SCORM) is a collection of standards and specifications for web-based e-learning.
The ability to provide increasing levels of realism and
automation in training environments should not be seen
as a threat to the role of an experienced instructor or
mentor. The manner in which an instructor is engaged
in the training design process and the manner in which
that instructor interacts with the trainee during the
execution and performance analysis phases of the
exercise are still vital elements in enhancing
the trainee’s perception of reality and their attainment of
training outcomes from the exercise. No serious game
on the market today (and arguably for a long time to
come) is able to provide the level of fidelity, feedback
and guidance that an experienced instructor is able to
provide.
While the current crop of serious games are generally
very good at simulating kinetic effects (including the
impact of electronic warfare), the physical operating
environment and even a range of enemy and friendly
tactical responses through the use of artificial
intelligence, they still fall a long way short of being able
to accurately represent many of the more complex and
subtle factors present on the modern battlefield. These
non-kinetic actions may include human interaction and
its impact on responses to various situations, cultural
aspects, political influences on enemy and own
behaviour, the role and impact of the media as well as
the impact of stress, fear and fatigue on trainee
performance. It is in introducing and managing these
elements to a training scenario, that the experienced
instructor plays a crucial role in providing the subtle
fidelity to the training scenario.
Far from threatening the role of the instructor, a
complete (and largely automated) training system needs
to be designed to free up the instructor to concentrate on
more complex elements of assessment and trainee
development which serious games, used in isolation, are
still incapable of providing.
Traditional military simulations can be very
heavyweight, rely on custom operating systems or restrict the
number and location of users through expensive
licensing and a reliance on dedicated networks to
operate. The advent of serious military games adapted
from their commercial counterparts is starting to break
these barriers down and allows the training audience
more regular access at work, or even from home
locations. There is great strength in allowing this to
occur as trainees can continue training at their own pace
in an environment which may be more conducive to
making errors and experimenting with different tactics
and procedures. But, there is also a very real danger that
this practice can serve to reinforce bad habits (negative
transfer of training) without the presence of an
experienced instructor and without the framework
offered by a carefully designed and monitored training
environment.
4.2 The Role of the Simulation
It is important for the training designer and instructor to
recognise that the role of simulation changes when
transitioning from simple training scenarios to more
difficult ones which provide the basis for decision
making in complex operational settings.
In simple scenarios relying on repetitive training
responses the simulation provides a near-real
environment and, through the underlying game logic,
the great majority of adjudication on trainee
performance. In these types of scenarios, performance
will generally be characterised by relatively simple
metrics such as weapons accuracy, casualties, posture,
timings, etc. The role of the instructor remains
important, particularly in the ability to analyse a range
of performance metrics and to identify opportunities for
improvement or to detect patterns of trainee behaviour
which need to be corrected.
In a simulated training environment, these types of
basic scenarios can be characterised by:
• High correlation between simulated scenarios and
the operating environment,
• Limited response options; there is generally a right
answer.
• Adherence to established doctrine, TTPs, SOPs
and ROE.
• A high percentage of objective measures in overall
assessment metrics.
• Increased opportunities for automated gathering of
assessment data (measurements of weapon
accuracy, enemy and own casualties, etc).
• Less reliance on experienced instructors to provide
immediate training performance feedback.
As the trainee or team moves through the training
continuum into more complex scenarios with uncertain
outcomes, the role of the simulation shifts away from
adjudication towards providing the stimulus and
background for exploration of a range of response
options which require a more intimate instructor-trainee
relationship and less reliance on hard mathematical
performance metrics.
These types of training scenarios can be characterised
by:
• Lower correlation between simulated scenarios
and the operational context, particularly in terms
of the human dimensions which are extremely
difficult to simulate.
• A greater range of response options. There is no
one right answer, rather a series of options which
will each have advantages and disadvantages.
• Scenarios which may challenge the bounds of
established doctrine, TTPs, SOPs and ROE.
• A high percentage of subjective measures in
overall assessment metrics.
• Reduced opportunities for automated gathering of
assessment data where simple mathematical
measures have less bearing on the training
objectives.
• A greatly increased reliance on experienced
instructors to provide context to trainee actions
based on personal experience.
The training system must, therefore, be able to capture
subjective performance metrics as well as more
objective metrics. The system must then be able to fuse
a number of sources and categories of assessment data
to provide rapid feedback to the trainee as well as allow
for more detailed training analysis activities. It should
be capable of presenting these results, capturing
evidence of training performance (artifacts) as well as
detailed instructor observations and scores against
performance, and do this for both simple and complex
scenarios.
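The fusion step described above can be sketched very simply. Everything here is invented for illustration (the paper prescribes no fusion model): each named score is normalised to 0-1, and the overall result is a weighted mean whose weights shift toward the instructor's subjective measures as scenario complexity grows.

```python
# Minimal sketch of score fusion; weights, names and values are invented.
# Objective measures are machine-scored from game data; the subjective
# measure comes from instructor observation.

def fuse(scores: dict, weights: dict) -> float:
    """Weighted mean of named scores; weights need not be pre-normalised."""
    total = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total

scores = {"hit_ratio": 0.8, "response_time": 0.9, "course_of_action": 0.6}

# Simple scenario: objective measures dominate the overall assessment.
simple = {"hit_ratio": 0.4, "response_time": 0.4, "course_of_action": 0.2}
# Complex scenario: instructor judgement carries most of the weight.
complex_ = {"hit_ratio": 0.15, "response_time": 0.15, "course_of_action": 0.7}

print(fuse(scores, simple), fuse(scores, complex_))
```

The same trainee data yields a different overall result under the two weightings, which is exactly the point: the scoring model, not the raw game telemetry, should reflect where the scenario sits on the training continuum.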
Most importantly, the simulated
scenarios played out during the training, the
performance parameters under which the trainee is
assessed, and the assessment itself must be clearly
linked to the training objectives developed as part of the
training needs analysis. These visible and traceable
linkages are not just important in determining the
effectiveness of the training, but also serve to provide
the framework which guides the instructor-trainee
relationship for any given scenario and result in much
more productive outcomes for both parties.
So, while simulation offers great opportunities to
expand the range of training activities which can be
effectively undertaken, it also brings inherent
limitations. Failure to account for these limitations as
part of a structured approach to training design can lead
to a false sense of security in terms of training outcomes
and training designers can unwittingly introduce a
significant risk of negative behaviour transfer.
5. CONCLUSION
Current military simulations reflect the direction in
which gamers have pushed the simulation market. A number
of modern militaries have invested heavily in designing
specific training scenarios for their games but have yet
to link those scenarios to defined metrics.
Ultimately it is the training design and learning
management environment in which games are employed
which gives them their validity and power. To be
effective simulated training must be structured to align
with pre-determined training objectives, outcomes and
standards. This has always been true of any training
technique or tool, but there can be a tendency for the
‘shiny’ factor of visually appealing simulations to
submerge this discipline. There needs to be a clearly
defined link from individual training activities and
associated measures of performance through to overall
training objectives. Similarly, trainers need to be able to
capture and contrast actual training performance against
those objectives to ensure that the design and conduct of
the training is hitting the mark. Serious games and a
structured training environment need to align.
Developing specific scenarios that replicate current
operations is not enough. Serious games must be able to
demonstrate a link between the trainee’s performance
and the standards and measures attached to a specific
task on the Mission Essential Task List (METL).
It is time that trainers ensured that the game is
actually meeting predefined training outcomes; that
the trainee is achieving a specified standard and that the
results are repeatable and recorded for future reference.
The games themselves are simply another training
tool. Serious Trainers need to begin to understand how
games are embedded and employed within a Serious
Training environment. Until this is done, serious games
will remain just that: games.
REFERENCES
1. Thorndike, E. L. & Woodworth, R. S. (1901) “The Influence of Improvement in One Mental Function upon the Efficiency of Other Functions,” Psychological Review, 8, 247-261.
2. Laker, D. R. (1990) “Dual Dimensionality of Training Transfer,” Human Resource Development Quarterly, 1(3), 209-224.
3. Purdy, J. (2007) “Getting Serious About Digital Games in Learning,” Corporate University Journal, Vol 1.
4. Vaughn, R. H. (2005) The Professional Trainer, 2nd Edition, Berrett-Koehler Publishers, San Francisco.