Methods of using game technology in higher
education: a review of the development of FAST,
a feedback & assessment support tool
John Twycross
Oxford Brookes University
[email protected]
http://f-a-s-t.org.uk
Abstract
An investigation into using game technology as a method of enhancing student engagement
has led to the development of a bespoke software package. It enables rapid feedback cycles,
continual assessment and peer learning. It has been designed to support a range of media
types and has been trialled with Media Technology students. The methodology has
relevance to all coursework-based assessments.
Keywords
student engagement, game-based learning, continual feedback
1. Introduction
This study addresses the needs of lecturers in traditional face-to-face teaching contexts. It is
concerned with how technology can be used in higher education to enhance student
learning. It aligns video game technology with pedagogical research aiming to suggest ways
of augmenting traditional teaching methods. The project set out to address the following
issues:
• The dichotomy between academic study and industrial practice
• The management of assessment and feedback
• Critical self-awareness
Industries “need computer science and art graduates who can hit the road running as well
as those with excellent general STEM and art skills” (Livingstone and Hope, 2010).
Academics need to develop a teaching methodology that fosters a deep approach to
learning, “providing more feedback and helping to pace student learning” (Rust, 2002).
Metacognition, the highest level of learning in Bloom’s (1984) revised taxonomy, requires
“teachers to help students make accurate assessments of their self-knowledge, not inflate
their self-esteem” (Pintrich, 2002:219).
The project started by defining games as “rule-based activities; they offer rewards, provide
feedback, record progress and rate performance” (Twycross, 2014). It responded to the
“new opportunities with game-based learning for reconsidering how we learn” (de Freitas,
2010:60) and how the role of lecturer will change “towards one of facilitator,
collaborator, producer or author” (de Freitas, 2010:57).
2. Methods
The focus was on creating a tool that graphically illustrated students’ progress. It was
intended to be part of a blended learning experience, used in the lecture theatre, studio,
laboratory, or workshop space. It would be delivered face-to-face as a catalyst to discussion
and action. Ethical considerations of the project prevented an open source development
methodology. The action research approach (Reason and Bradbury, 2006) led me to
develop the work using real student data, and so it was not released online. To avoid
data security issues the software was run locally on a laptop. Student numbers were used
to identify individuals, providing a degree of anonymity.
Core to the concept was a weekly progress review, especially in the formative weeks early
in the semester. This was informed by a “Threshold Concept” (Meyer and Land, 2003)
methodology in which the fundamental learning required to progress in the module was
documented. Activities set in practical sessions required a submission that would be
reviewed prior to the next lecture where feedback would be given to all in the cohort. This
would build over time to provide evidence of student progress. To make the review
process manageable, submissions were required to follow a specific naming convention
based on the student’s identification number, for example 1234567.jpg or 1234567.txt,
and software was developed to automatically open and display these files for review. Student
progress was recorded on a spreadsheet, which was fed into a MySQL database. This data
would then be accessed via a PHP script and displayed in a format similar to a video game
high score table.
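The file-matching step can be illustrated with a short sketch. The tool itself was built with a PHP script and a MySQL database; the Python below is only an illustrative equivalent, and the folder contents and accepted file types are assumptions.

```python
import os

# Illustrative sketch (the actual tool used PHP and MySQL): gather
# submissions that follow the naming convention described above,
# mapping each student number to their file so the software can open
# and display them in turn for review.

ALLOWED_TYPES = {".jpg", ".png", ".txt"}  # assumed media types

def collect_submissions(folder):
    """Map each student number to the path of their submitted file."""
    submissions = {}
    for name in sorted(os.listdir(folder)):
        stem, ext = os.path.splitext(name)
        # enforce the convention: the filename is a bare student number
        if stem.isdigit() and ext.lower() in ALLOWED_TYPES:
            submissions[stem] = os.path.join(folder, name)
    return submissions
```

Files that do not follow the convention are simply skipped, which is what placed the onus on students to submit work in the correct format.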
Figure 1. Software design overview
A wireframe prototype was developed. The high score system expanded into a matrix
aligned with the learning objectives for the module. It displayed students’ progress on
screen alongside their submitted work. To keep this simple and manageable, students were
graded in binary form: positive or negative. A student who was able to demonstrate the
achievement of the learning objectives would automatically receive a pass for the module. A
subsequent coursework would use traditional assessment methods to assess higher level
learning and advanced demonstration of the concepts taught. This would enable the award
of higher grades of merit or distinction.
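The pass rule described above can be stated compactly. This is a minimal sketch of the binary grading matrix logic, not the tool's actual code; the objective names are invented for illustration.

```python
# Illustrative sketch of the binary grading matrix: each student is
# marked positive (True) or negative (False) against each learning
# objective, and a student who demonstrates every objective
# automatically receives a pass. Objective names are invented.

OBJECTIVES = ["modelling", "texturing", "rigging"]

def module_result(progress):
    """progress maps objective name -> True/False for one student."""
    achieved = all(progress.get(obj, False) for obj in OBJECTIVES)
    return "pass" if achieved else "in progress"
```

Because the grade is binary per objective, the on-screen matrix stays legible and the review of each submission takes seconds rather than minutes.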
Figure 2. Screen shots of FAST, software menu and image / data display state
3. Results
The teaching tool succeeded in creating a rapid feedback cycle. Students were receiving
formative advice that was directly aligned with the grading criteria, in a manner they
could respond to and improve on without fear of failure. I had created a method of
continual assessment. This reduced stress on both teacher and students by providing a more
even workload.
Benefits to the students were evident in their module evaluations. Compared to the
previous year there was a significantly improved response to the questions relating to
feedback. The table below shows extracts of data from two modules used as a pilot.
Questions were asked as a multiple choice of A through E (strongly agree to strongly
disagree). The numeric score for each question shows the percentage of respondents
answering ‘positively’ to that question (the sum of A and B answers). The letter indicates
the median response.
Figure 3. Module evaluation data prior to using project (2011-12) and cohort using FAST system
(2012-13)
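The summary measure used in the table can be sketched as follows, with invented response data; the original analysis was done by hand on the evaluation returns.

```python
from statistics import median_low

# Sketch of the evaluation summary described above (invented data):
# each response is a letter A-E (strongly agree to strongly disagree);
# the score is the percentage answering A or B, reported alongside
# the median response letter.

def summarise(responses):
    positive = sum(1 for r in responses if r in ("A", "B"))
    pct = round(100 * positive / len(responses))
    return pct, median_low(sorted(responses))
```

For example, the responses A, B, B, C, D give a positive score of 60% with a median of B.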
By automating the display of the students’ work in progress I was able to rapidly provide
feedback. Further to this, because students were privy to the feedback given to the whole
cohort, their overall exposure to feedback was greater, even though the majority of this was
directed at their colleagues rather than themselves. Discussion on progress, problems and
future plans created a fertile learning environment. Students were encouraged to discuss
the techniques shown, resulting in an informal process of peer review. Students’ critical
reflection improved, as they were aware of the standard of work of their peers.
The images below show coursework submissions from a digital modelling module,
comparing the submissions of students equivalent in ability. This module is very challenging
and requires students new to 3D modelling to demonstrate the advanced skills of character
modelling and rigging. The work prior to the project shows errors such as missing textures,
and many students fell short of creating animation-ready work. The subsequent work shows these
issues resolved and students achieving results well beyond expectations with fully rendered
sequences.
Figure 4. Visual data comparing coursework submissions from students of equivalent ability (2011-12
and 2012-13)
These methods embraced the just-in-time workflow that both industry and students rely on.
The adoption of industrially relevant methods of assessment required a level of discipline
that put the emphasis firmly on the student to provide work regularly and in the correct
format. It was explained to the students that, in a production environment, there would be
strict naming conventions for all files and that adherence to them was a fundamental
requirement.
The competitive nature of the students became apparent immediately. Students are
generally aiming for merits and distinctions in their grades. Learner performance showed
marked improvement: high achievers were increasingly motivated by peer review, while
others, facing a steep learning curve, were supported by the transparency of the process.
Collaboration was hard to gauge quantitatively in terms of change, but it was observed that
informal discussion increased and became a positive aspect of the learning environment.
4. Conclusion
What has been developed is a visualisation tool allowing a clear and graphical representation
of student progression. Its function in terms of continual assessment is manageable. It is,
however, useful to remember that this tool is designed only to aid the documentation of
lower level learning, that is, the fundamental concepts of a module. Further assessment of
higher-level learning is currently not suited to the restrictions of this method.
This work is relevant to all disciplines with coursework that builds over a number of
weeks. A wider range of media could be facilitated by the system. A range of tools
supporting the administration of essays is already available, and it is beyond the scope of
this project to address that topic. However, in some cases it is useful to include text-based
work. We are extending the software to allow the automatic display of short texts. This
method could be applied to automatically call program functions in work submitted by
students of computer science.
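That proposed extension might look something like the sketch below. This is speculative rather than implemented: it assumes each submission is a Python file named by student number, and the entry-point name "main" is an invented convention.

```python
import importlib.util

# Hypothetical sketch of the extension suggested above: load a Python
# submission named by student number (e.g. 1234567.py) and call an
# agreed entry-point function so its result can be shown in review.
# The entry-point name "main" is an assumption, not an agreed standard.

def run_submission(path, entry_point="main"):
    """Import the student's file and call its entry-point function."""
    spec = importlib.util.spec_from_file_location("submission", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    func = getattr(module, entry_point, None)
    return func() if callable(func) else None
```

In practice, submitted code would need to be sandboxed before being executed in front of a cohort; the sketch shows only the naming-convention-driven automation.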
This project relies on established pedagogy augmented by technology. At its heart is a
framework designed to provide feedback for both teacher and learner. Student progression
is documented using a spreadsheet, a familiar and trustworthy method of logging
achievement. The face-to-face delivery to the whole cohort or on a one-to-one basis
provides an opportunity to know the students’ work in detail and manage expectations.
Interestingly, my attitude has changed through this action research process. Prior to starting
I had considered my teaching role to be that of a facilitator. The focused rapport with
students built through using this system and the structured dialogue framework led me to
redefine this as mentor. This shift in my personal perspective has evolved throughout the
project and was an unpredicted result of this investigation into using game technology in
teaching.
References:
Biggs, J. and Tang, C. (2011) Teaching for Quality Learning at University. 4th edn.
Maidenhead: Open University Press.
de Freitas, S. (2006) Using games and simulations for supporting learning. Learning, Media
and Technology, 31(4), pp. 343-358.
de Freitas, S. (2010) Learning in immersive Worlds: A review of game-based learning. JISC
Publications. Available at:
http://www.jisc.ac.uk/media/documents/programmes/elearninginnovation/gamingreport_v3.p
df (Accessed: 12 September 2012).
Gray, C. and Malins, J. (2004) Visualizing Research: A Guide to the Research Process in Art
and Design. Aldershot: Ashgate.
Kolb, D. A. (1984) Experiential learning: Experience as the source of learning and
development. New Jersey: Prentice-Hall.
Livingstone, I. and Hope, A. (2010) Next Gen. Transforming the UK into the world’s leading
talent hub for the video games and visual effects industries, NESTA. Available at:
http://www.nesta.org.uk/sites/default/files/next_gen.pdf (Accessed: 12 September 2012).
Meyer, J.H.F. and Land, R. (2003) Threshold concepts and troublesome knowledge: linkages
to ways of thinking and practicing. In: Improving Student Learning - Theory and Practice Ten
Years On. (ed. C. Rust). Oxford: Oxford Centre for Staff and Learning Development
(OCSLD), pp 412-424.
Pintrich, P.R. (2002) The Role of Metacognitive Knowledge in Learning, Teaching, and
Assessing. Theory Into Practice, 41(4), Autumn 2002. Available at:
http://www.unco.edu/cetl/sir/stating_outcome/documents/Krathwohl.pdf (Accessed: 14
February 2014).
Reason, P. and Bradbury, H. (eds.) (2006) Handbook of Action Research. London: Sage.
Rust, C. (2002) The Impact of Assessment on Student Learning: How Can the Research
Literature Practically Help to Inform the Development of Departmental Assessment
Strategies and Learner-Centred Assessment Practices? Active Learning in Higher Education,
3, pp. 145-158.
Twycross, J. (2014, in press) Ahead of the Game: Enhancing Student Engagement through
Contemporary Modes of Delivering Learning Material. Media Education Research Journal.
Whitton, N. (2010) Learning with digital games: a practical guide to engaging students in
higher education. London: Routledge.
Permission to make digital or hard copies of all or part of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed for
profit or commercial advantage and that copies bear this notice and the full citation on
the first page. To copy otherwise, to republish, to post on servers or to redistribute to
lists, requires prior specific permission.
© 2014 The Higher Education Academy