Kinesthetic Interaction - Revealing the Bodily Potential in Interaction Design
Maiken Hillerup Fogtmann
Jonas Fritsch
Karen Johanne Kortbek
Center for Interactive Spaces
Aarhus School of Architecture
Norreport 20, DK-8000 Aarhus C,
Denmark
Information and Media Studies
University of Aarhus
Helsingforsgade 14, DK-8200 Aarhus
N, Denmark
[email protected]
[email protected]
Center for Interactive Spaces
Dept. of Computer Science,
University of Aarhus
Aabogade 34, DK-8200 Aarhus N,
Denmark
[email protected]
ABSTRACT
Within the Human-Computer Interaction community there is a
growing interest in designing for the whole body in interaction
design. The attempts aimed at addressing the body have very
different outcomes spanning from theoretical arguments for
understanding the body in the design process, to more practical
examples of designing for bodily potential. This paper presents
Kinesthetic Interaction as a unifying concept for describing the
body in motion as a foundation for designing interactive systems.
Based on the theoretical foundation for Kinesthetic Interaction, a
conceptual framework is introduced to reveal bodily potential in
relation to three design themes – kinesthetic development,
kinesthetic means and kinesthetic disorder; and seven design
parameters – engagement, sociality, movability, explicit
motivation, implicit motivation, expressive meaning and
kinesthetic empathy. The framework is a tool for analyzing existing designs, as well as for developing designs that explore new ways of kinesthetic interaction.
Categories and Subject Descriptors
H.5.2 User Interfaces – Interaction Styles, Theory and Methods.
H.5.m Information Interfaces and Presentation (e.g., HCI):
Miscellaneous.
General Terms
Design, Human Factors, Theory.
Keywords
Interaction design, bodily movement, motor skills, kinesthetic
interaction, kinesthesis, kinesthetic experience, interactive
technologies.
1. INTRODUCTION
In 1986, Bill Buxton [3] described a human interacting with a computer as a being with one well-developed eye, a long right arm, uniform-length fingers and ears, but lacking legs and a sense of smell or touch. He argued for a greater involvement of
the body in Human-Computer Interaction, “(…) when compared
to other human operated machinery (such as the automobile),
today's computer systems make extremely poor use of the
potential of the human's sensory and motor systems” [3].
Today, the idea that interaction is strictly about a perceptive
human being in a sedentary body interacting with a mouse and
keyboard in front of a desktop computer is rapidly evolving to
also include bodily movement that enables the user to experience
the world through physically and socially engaging activities. In
Grudin’s historical continuity of interface design [14], the interface is pushed further and further away from the computer, and it becomes an ever greater part of a social context in physical space. This new territory comprises new design ideals and directions [7], new design domains [35], and new interaction
forms made possible through developments of both interactive
technologies, and theoretical shifts in the conception of how we
might interact, or live, with the computer [28].
In the field of Human-Computer Interaction (HCI), several
attempts have been made to meet these new challenges, both in
developing theoretical frameworks [7], and in carrying out
practical research projects [30][6] to explore the bodily potential
when designing interactive systems. However, no coherent
vocabulary has been proposed to address the bodily potential
unifying the existing work done within interaction design as well
as presenting a foundation for work to be done in the future.
The growing interest in movement-based interfaces and
interactive systems can be ascribed to an increasing focus on new
domains of interaction, such as experience centers [6], sports [31]
[16] and museums [8][15]. A bodily experience at a museum, for
instance, can be much more present and palpable when the user
can move around freely in the physical space of experience,
compared to exploring a system through an index finger [6].
Another reason for involving the body more directly in the design of interactive systems stems from an increased focus on health- and work-related problems or lifestyle problems, such as obesity (see http://www.msnbc.msn.com/id/7722888/) and back problems due to sedentary work [21]. These new
challenges call for a conceptual framework to identify unexplored
possibilities when designing interactive systems addressing the
body in motion.
The first section of the paper presents the diversity of work done
on bodily approaches in interaction design. We then introduce a
foundation for defining Kinesthetic Interaction (KI). KI functions
as a unifying concept for understanding the bodily potential in
interaction design comprising physiological aspects of
kinesthetics, kinesthetic experience and interactive technologies in
a kinesthetic perspective. The concept describes a physiologically
grounded new direction in the design of technologically mediated
bodily interaction. Founded on the notion of KI, we present three
design themes - kinesthetic development, kinesthetic means and
kinesthetic disorder - in order to develop a conceptual framework
revealing bodily potential in interaction design. In relation to the
three themes we derive a set of design parameters, which support
a practical exploration of the possibilities for addressing bodily
potential in the design of interactive systems. We conclude with a discussion of how to design for kinesthetic means, kinesthetic development and kinesthetic disorder in interaction design.
2. RELATED WORK ON BODILY
APPROACHES IN INTERACTION DESIGN
The design literature is rich with work addressing bodily potential
defined by a great diversity of approaches and concepts dealing
with the body in motion and its relation to technology interaction.
The directions within embodiment account for a large amount of
this work. Dourish [7] develops the concept of embodied
interaction as a foundation for a range of design principles which
aim at ensuring a design of interactive systems based on our
embodied actions in the world. Dourish ties together Tangible
Computing [20] and Social Computing taking phenomenology
and how we experience the world as a starting point for the
design. In line with Dourish [7], Klemmer et al. [22] further
explore how bodies matter in interaction design, while Fishkin et al. [9] also use the idea of embodiment for designing meaningful
interfaces that mimic our physical interaction with real-life
objects. Other approaches include Larssen et al. [25], who look at the role that the kinesthetic (kinesthesis and proprioception) and haptic senses, as well as motor skills, play when incorporating a tool into bodily space. Likewise, Hummels et al. [19] use the
concept of movement-based interaction focusing on movement as
central to all human skills, including emotional and perceptual
motor skills.
Another range of prominent directions in revealing the bodily
potential in interaction design is theoretically informed by
pragmatism. McCarthy and Wright [28] develop a theoretical
foundation for understanding technology as part of human felt
experience, where our sensory engagement in a situation is
important for the design of experience-oriented interactive
systems. Also focusing on experience, Forlizzi and Battarbee [10]
advocate that the design of interactive systems should address the
whole experience of a digital artefact by addressing physical,
sensual, cognitive and aesthetic modes of human experience.
Petersen et al. [35] develop the notion of Aesthetic Interaction as a
new ideal for interactive systems design, creating involvement,
experience and serendipity in the interaction through promoting
bodily, as well as complex symbolic representations, when
interacting with the system [35].
Other directions exploring the role of the body in interaction
design include Gesture Based Interaction [26], Exertion Interfaces
[31] and Full Body Interaction [37][18][34], which seek to exploit and explore the possibilities of designing novel interfaces and applications that experiment with new interaction techniques utilizing parts of, or our whole, body as an input device. Moen also
describes Kinesthetic (Movement) Interaction as an approach to
interaction design which explores free and expressive full-body
movement as an interaction modality [30].
Although the starting point for the approaches within embodiment
and pragmatism is our bodily presence in the world and the
intertwining of the body and the mind in our experience of the
world, in none of the presented approaches is the body in motion, and how it biologically functions, explicitly defined from a physiological perspective. As presented in the next section, we
will argue that the body can be described in a distinct
physiological manner opening new possibilities for creating
interactive experiences. In embodiment and pragmatism it remains
unclear how we can use our knowledge of the physiological body
to enhance or develop for instance motor skills on a perceptual
level when interacting with interactive systems. In Gesture Based
Interaction and Full Body Interaction, the authors seek to reveal
the bodily potential in interactive systems design, but fall short in
giving a coherent explanation of the theoretical foundations
leading to the design terminology. Except for Moen [30], none of
the presented approaches explicitly deal with describing the
physiological properties of the moving body, and how the
interaction with interactive systems might enhance or experiment
with bodily motor skills. As for Moen, we will argue that her
description needs to be further explicated in physiological terms
to provide a genuine resource for interaction design.
Following this overview, we would argue that there is a need to
establish a common ground for understanding the physiological
body in motion and its relation to human experience within the
interaction design community. This necessitates the development
of a unified terminology which embraces the body’s role in the
interaction with technology - spanning from minor bodily
involvement, as seen when interacting with a desktop computer
through a computer mouse, to interfaces that require utilization of
the whole body.
3. TOWARDS THE CONCEPT OF
KINESTHETIC INTERACTION
To create a unified understanding of what it means to address the
body in motion in interaction design, we now turn to a
physiological definition of the body in motion, and how it
conditions our experience of interactive technologies synthesized
in the concept of Kinesthetic Interaction as shown in Figure 1.
We do so by defining the physiological aspects of kinesthesis, and
by giving a definition of kinesthetic experience. From this follows
an overview of interactive technologies in a kinesthetic
perspective. Finally, the concept of Kinesthetic Interaction is
presented.
Figure 1: Three axioms of Kinesthetic Interaction
In defining KI, the focus is both on the human and the
technological side of the design work. The theoretical positioning
moves from a definition of the body in motion and how this
conditions our experience of the world (and vice versa), to how
we as humans experience the world and hence interactive
technologies through our bodies in motion. The physiological
understanding of kinesthetics is positioned in relation to an
overview of how existing and future technologies enable the
design of interactive systems that address the body in motion, and
the development and utilization of motor abilities.
3.1 The Physiological Aspect of Kinesthesis
Our bodies are the foundation for the manner in which we
experience and interact with our surroundings. We use various forms of sensory feedback to determine an adequate response to our surrounding environment [39]. The stimuli received act as
motivation for bodily action [12]. This is similar to the way in
which we use the five senses: smell, sight, touch, hearing and
taste. An example is how we use sight in order to know when to
stretch out our arm to catch a ball [36][39].
Kinesthesis is part of the sensory capacities dealing with bodily
perception. The physiological definition of the term is the
awareness of the position and movement of the body in space
[36]. If a person closes her eyes and places the index finger on the
nose, the kinesthetic sense is utilized. Kinesthesis is part of the somatosensory system, that is, conscious bodily perception distributed throughout the whole body, which also includes all skin sensation, proprioception, and the perception of the internal organs [1]. When defining kinesthesis, proprioception is often included, since both deal with the perception of bodily movement. The difference between the two is that kinesthesis concerns kinetic motion, while proprioception is the sensory faculty of being aware of the position of the limbs and the state of the internal organs. When introducing the term into interaction design, we choose to follow this approach and make one combined definition [36][38][39]. It is through the kinesthetic sense that the body
keeps track of its own movements and position of the limbs: “(…)
my body itself I move directly, I do not find it at one point of objective space and transfer it to another, I have no need to look for it, it is already with me…” [29]. The knowledge of whether a
movement has been performed accurately is not only determined
by how it looks, but more importantly, from how it feels [5].
Movements are produced by the motor system. How well a
movement is performed depends on a person’s ability to
coordinate and control muscular movement. Motor skills are
divided into two groups: fine and gross motor skills. Gross motor
skills involve movement of the whole body or large portions of
the body, as when crawling, walking, running, balancing, jumping
on one leg, catching or swinging. As opposed to the gross motor
skills, fine motor skills are defined by the coordination of small
muscle movement, which occur in the fingers, eyes, mouth, feet
and toes [23][17]. A skill is defined as the ability to use the
correct muscles to execute pre-determined actions with minimum
outlay of time and energy [5]. It is not only vital to know how to
execute a certain action, but also to know where and when to
apply it. This is the empathic part of our innate bodily intelligence
[4]. A person’s kinesthetic empathy is affected by his or her
relation to other people, and stimuli from the surrounding
environment.
When training motor skills one is learning where and when to
apply a certain action, how to adapt the action to the changing
environmental conditions, and how to practice the consistency of
the action from time to time. In interaction design it is relevant to
look at how we can design artefacts and installations that support
our motor learning, and through that utilize the bodily potential
inherently present within the human body [16]. There has been a
tendency to utilize the fine motor skills in interaction as seen with
the invention of the computer mouse, and in areas such as
Tangible Interaction. Interactive systems that incorporate the
gross motor skills and utilize the kinesthetic sense are a fairly new
and unexplored area. When designing for larger parts of the body
it becomes relevant to question the movability of the body. A
design can either help mediate existing movements or physically
change a person’s movement patterns.
3.2 Kinesthetic Experience
Based on the physiological definition of kinesthetics, kinesthetic
experience describes how the kinesthetic sense grounds our
everyday actions in the world as moving bodies [39]. As Merleau-Ponty states, our experience of the world is always grounded in
our bodily movement in it [29]. Our kinesthetic sense therefore
conditions the manner in which we experience the world in
framing our embodied actions, by providing a sense of spatiality
and bodily-motor potential in our relation to the physical and
socio-cultural world. Our motor abilities are developed into motor
skills when they meet the cultural world [32].
The kinesthetic sense and our motor abilities and skills are
constantly mediating other forms of sensation, e.g. vision, hearing
and the tactile sense, both in relation to exteroceptive and
interoceptive sensation. Since our experience of the world is
always rooted in the body, kinesthesis is the backbone of our
perception of the world; a perception which is always action-oriented and intentional in motor terms [32]. Our experience of
the world is thus always accompanied by a kinesthetic experience
of the world, and our bodily relation to it. This kinesthetic
experience can be more or less conscious, and our acting in the
world is constantly mediated by what Merleau-Ponty [29] refers
to as “praktagnosia”, a “motor memory” which includes motor
skills and the kinesthetic memory of performing them [32].
According to Merleau-Ponty human subjects are primarily
engaged in answering a motor question with a motor response, by
searching through a catalogue of movement memories or gestural
routines [29]. The motor memory is therefore composed of both
naturally and culturally appropriated motor skills, which guide our
actions in the world. As moving bodies in the world, we
constantly have to choose a motor response in relation to the
perceptual signals the body receives [32]. This is determined as a
more or less conscious reflection relating the given cultural
situation to our motor skills.
The concept of kinesthetic experience specifically points to the
possibility of understanding kinesthesis, and the manner in which
it mediates our experience of a cultural and social world, which in
turn manifests itself in our motor memory as motor skills and
potential for action. Our bodily experience of movement is not a
particular case of knowledge; rather it is the basis for our
experience of the world as we experience the world through our
motor memory.
3.3 Interactive Technologies in a Kinesthetic
Perspective
The number of interactive technologies that enable bodily movement in interaction design is steadily growing. Looking at the execution of interaction, the computer gets its input via one or more sensors. Sensors are the sense organs of the computer, through which the computer can sense its environment. Sensors are available for phenomena perceivable by humans, e.g. light, sound and temperature, as well as for phenomena we cannot perceive, such as electromagnetic fields and ultrasonic sound. Sensors can be
found on the body, in equipment or in the environment (as shown
in Figure 2). This distinction of three types of “real-world objects”
is derived from [27]. Actuators, on the other hand, are the
opposite of sensors, in the sense that they convert input signals
(mainly electrical signals) into outputs, for instance those
perceivable by human beings (which is our main focus), such as
loudspeakers or motors [2]. They thus contribute to creating a kinesthetic experience for the user when interacting. However, it is mainly the sensors that enable bodily movement in interaction, and we will therefore focus on sensors in the following.
Figure 2: The sensors and actuators can be placed on the body, in equipment and in the environment.
In recent years, several sensor technologies have emerged and
given rise to a wide range of new possibilities in interaction
design. One of the more widespread technologies is the camera. Here, a program receives input from the camera and decodes the position of the user. The program analyses every single picture using image analysis algorithms, which make it possible to identify colors, contrast, contours and movement. When the program has analyzed the full picture, coordinates and id numbers are sent to the application using the camera input (e.g. a game of table tennis in PlayStation 2®’s EyeToy Play™, http://www.eyetoy.com). In this manner, the application gets to know where the user is located from the input of the camera, and the position of an arm or the full body in camera-based interaction will, for instance, correspond to using a cursor in a traditional graphical user interface.
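To make the pipeline concrete, the following is a minimal sketch (not the actual EyeToy implementation) of camera-based position sensing in Python with OpenCV: it detects movement by frame differencing and reports the centroid coordinate of the largest moving region, the kind of coordinate an application could map onto a cursor or game object. The threshold values and overall structure are illustrative assumptions.

```python
import cv2

def track_motion_centroid(camera_index=0, min_area=500):
    """Report the centroid of the largest moving region in each camera frame.

    A minimal sketch of camera-based position sensing: frame differencing
    stands in for the color/contour/movement analysis described above.
    """
    cap = cv2.VideoCapture(camera_index)
    ok, previous = cap.read()
    if not ok:
        raise RuntimeError("Could not read from camera")
    previous = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Differencing against the previous frame reveals moving regions.
        diff = cv2.absdiff(gray, previous)
        _, thresh = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        found = cv2.findContours(thresh, cv2.RETR_EXTERNAL,
                                 cv2.CHAIN_APPROX_SIMPLE)
        contours = found[0] if len(found) == 2 else found[1]
        previous = gray

        moving = [c for c in contours if cv2.contourArea(c) > min_area]
        if moving:
            largest = max(moving, key=cv2.contourArea)
            m = cv2.moments(largest)
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            # This (x, y) coordinate is what would be handed to the
            # application, much like a cursor position.
            print("moving region centroid:", cx, cy)

    cap.release()

if __name__ == "__main__":
    track_motion_centroid()
```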
Another technology utilized in interaction design is pressure
sensors. One type of pressure sensor is the force-sensing resistor, which has a variable resistance as a function of applied pressure. When external force is applied to the sensor, the resistive element is deformed against the substrate. The sensor exhibits a "switch-like response", meaning that a certain amount of force is necessary to overcome the sensor's resistance at rest. The Magic Carpet [33] and
LiteFoot [13] are examples of interaction systems utilizing
pressure sensors.
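As a rough illustration of this switch-like behaviour, the sketch below maps raw readings from a force-sensing resistor (assumed to be wired as a voltage divider into an ADC) onto simple step-on/step-off events, the kind of events a floor interface such as the Magic Carpet or LiteFoot could react to. The read_adc() function and the threshold are hypothetical placeholders, not part of the cited systems.

```python
def read_adc(channel):
    """Hypothetical ADC read; replace with the actual driver for the hardware used."""
    raise NotImplementedError

def watch_floor_tile(channel, threshold=300):
    """Turn raw force-sensing-resistor readings into step events.

    Below the threshold the FSR behaves like an open switch (its resistance
    at rest is not yet overcome); above it the tile is treated as pressed.
    """
    pressed = False
    while True:
        value = read_adc(channel)           # e.g. 0..1023 on a 10-bit ADC
        if value > threshold and not pressed:
            pressed = True
            print("step on tile", channel)   # hand the event to the application
        elif value <= threshold and pressed:
            pressed = False
            print("step off tile", channel)
```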
Furthermore, accelerometers are sensors that measure the acceleration and gravity-induced reaction forces they experience. When looking at technologies that enable bodily movement, it is interesting to explore physical objects with built-in accelerometers, as the movements of these objects are results of the bodily movements during interaction. As an example, the Nintendo Wii™ controller (http://uk.wii.com/) can be used in a vast number of applications, such as a tennis racket in a tennis game or a golf club in a golf tournament.
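To illustrate how such object movement can be turned into input, here is a minimal, hypothetical sketch of detecting a racket-like swing from a three-axis accelerometer by thresholding the acceleration magnitude; read_accel() and the threshold are assumptions, not the Wii controller's actual protocol.

```python
import math

def read_accel():
    """Hypothetical driver call returning (x, y, z) acceleration in g."""
    raise NotImplementedError

def detect_swings(threshold_g=2.0):
    """Report a 'swing' whenever the total acceleration clearly exceeds gravity.

    At rest the magnitude is roughly 1 g; a vigorous swing of the object
    produces a much larger value, which an application could map onto e.g.
    hitting a tennis ball or driving a golf ball.
    """
    while True:
        x, y, z = read_accel()
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold_g:
            print("swing detected, |a| = %.1f g" % magnitude)
```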
The three above-mentioned interaction technologies make visible the diversity in possibilities for sensing bodily movement in interactive systems. Yet, they only cover a small area of the wide range of technologies which could be utilized. Each technology senses only a very specific part of bodily movement; the cameras sense the position of the body in two dimensions; the pressure sensors sense e.g. whether or not the body has stepped on a tile; and the accelerometers sense the acceleration of an object or the body. This means that the comprehension of the body when interacting is highly dependent on the technology used, and hence the benefits and drawbacks of that particular technology. Further, the choice of technology is closely related to how the system can motivate use in terms of facilitated interaction forms. When interacting with interactive systems there will always be constraints on how the body is involved in the interaction; however, these constraints do not necessarily dictate the design process. Instead, they can act as a substantiating factor in exploring the bodily possibilities.

3.4 Kinesthetic Interaction
In the previous sections, we have identified the three main axioms
for developing a unifying concept of Kinesthetic Interaction –
physiology, kinesthetic experience and interactive technologies.
Kinesthesis as a physiological term defines the kinesthetic sense
as the perception of the position and movements of one’s body
parts in space. The kinesthetic sense is made up of motor abilities
and motor skills, the latter being a result of kinesthetic
experiences in a social and cultural world. The kinesthetic
perspective on interactive technologies makes visible some of the
possibilities for more directly addressing the bodily potential in
interactive systems.
KI works as a theoretical foundation and vocabulary for
describing the body in motion and how it conditions our
experience of the world in the interactions with and through
interactive technologies. This leads to a broad definition of
Kinesthetic Interaction as when the body in motion experiences
the world through interactive technologies.
This definition of KI offers a view on interaction with interactive
environments where the focus is on the awareness of the body and
the perception of the body’s movements mediated by interactive
technologies.
4. REVEALING THE BODILY
POTENTIAL IN INTERACTION DESIGN
As shown earlier, much of the work done in interaction design
focuses on using only a limited part of the body - mostly designing for the eyes and the index finger as the physical means of interacting with a system. Although these approaches are also
covered by the definition of Kinesthetic Interaction, we would like
to open the field of interaction design to alternative directions,
exploring new bodily potential in designing for the body in
motion. By designing for the body in motion, it is possible to
motivate people to take an active part in the interaction and to
design interactive environments and artefacts that explore new
configurations of kinesthetic use and interactive technologies,
making it possible for the kinesthetic sensations to interact and
influence each other in the motor development. Instead of bodily
movement being dictated by the interaction, the interactions are
designed based on the potential inherently present within the
body. Kinesthetic Interaction thus makes it possible to enhance,
utilize or develop one’s motor skills through the interaction with
interactive technologies.
We therefore present three design themes, kinesthetic
development, kinesthetic means and kinesthetic disorder. Each
theme points to a specific area of interest in designing interactive systems, and is elaborated through seven parameters highlighting more specific design concerns. By overlaying the three themes and the
seven parameters in a conceptual framework, we make visible the
bodily potential in the analysis of existing interactive systems.
The framework can also be used to inform the design process by
highlighting which of the themes and parameters the concept
adheres to, and by providing new directions in which the design
could be developed. Further, the framework identifies new areas
of interest to be explored in future work of designing for bodily
potential in interaction design.
4.1 Kinesthetic Design Themes
The three themes are derived partly from the presented survey on
related work (section 2), and partly from our theoretical
development of Kinesthetic Interaction. They highlight three
different motivations to address the bodily potential. When
looking into previous work, the most dominant motivation for designing for KI focuses on bodily interaction as a means to reach a higher goal (e.g. the gameplay of the Nintendo Wii games). New tendencies within the field point in the direction of utilizing interaction to improve bodily skills [31][16]. Another inspiration comes from digital aesthetics, creating a more experimental
approach to bodily exploration. Though the three themes differ,
they are not mutually exclusive. It is possible for one design
concept to adhere to several themes at the same time.
Kinesthetic development deals with acquiring, developing or
improving bodily skills. This is possible on three levels; knowing
where and when to apply a certain action, knowing how to adapt
the action to the changing environmental conditions, or by
practicing the consistency of the action from time to time.
Kinesthetic means deals with KI as a means for reaching a goal other than kinesthetic development. While the interaction can be defined as kinesthetic, the goal of the interaction is something other than improving bodily skills, for example learning activities, playful experiences or gameplay.

Kinesthetic disorder deals with transforming the kinesthetic experience in a given situation by challenging the kinesthetic sense. This can be achieved by changing the possibility of kinesthetic experience, either by affecting how a person senses, which motor skills can be applied, or how the environment is perceived.

Following these design themes, we provide a conceptual framework that, when applied, reveals bodily potential in interaction design that otherwise would have remained hidden. The design themes point in the direction of actively using interactive technologies to explore new configurations in the manner in which we as moving bodies experience the world. It is thus possible to design interactive technologies that explore or experiment with how we can develop our kinesthetic sense in radically new ways.

4.2 Kinesthetic Design Parameters
The seven parameters are derived from the theoretical development of KI in relation to the three design themes to reveal how the bodily potential is addressed in the design process, and they provide more practical guidelines for designing KI. The parameters make it possible to analyze existing concepts by pointing out possible limitations and design possibilities to generate new ideas based on KI.
Engagement describes KIs that engage users in a kinesthetically
memorable manner, and facilitate interested exploration through
the body in motion.
Sociality relates to designing for a body among other bodies. When designing for Kinesthetic Interaction, the interaction often moves into a collaborative and social space, where others are invited to take part in the interaction, actively or as spectators.
Explicit motivation means that the system tells the users explicitly
how to interact with the system. The range of movements is
restricted, and there is a direct motivational invitation to react.
Implicit motivation is when the interaction with the system is
open, and there are no restrictions on the movements. Implicit
motivation denotes a more explorative motivational form.
Movability is central to understanding whether the body can move freely or is physically restricted while interacting with the system.
Expressive meaning occurs when the bodily engagement fits the
system output. The interactive possibilities are congruent with the
kinesthetic capabilities the design has been made for – and the
bodily interaction is meaningful for achieving the system goal.
Kinesthetic empathy describes how specific and controlled movement patterns are affected by the relation to other people and by stimuli from the surrounding environment. Kinesthetic empathy is achieved when the system opens up the possibility for users to read, decode and react to each other's movements.
4.3 Applying the Conceptual Framework
The themes and parameters are brought together in a conceptual
framework (Figure 3).
Figure 3: The conceptual framework showing how the four
design concepts relate to the three design themes and seven
design parameters.
The framework makes it possible to evaluate an existing
interactive system in relation to Kinesthetic Interaction, and
provides an overview of the system’s capacity to address bodily
potential. In the following, we analyze four different interactive
systems to illustrate how the framework works. Finally, we sum
up the section by discussing the relevance of the framework in
interaction design.
4.3.1 Octopus Trainer
The Octopus Trainer (http://www.octopustrainer.dk/) is a piece of interactive sports equipment that trains abilities such as reaction time, strength, concentration and speed. The equipment consists of a computer and eight lamps that light up randomly when the device is in use. The user must locate the lit lamp and turn it off by waving a body part 30 cm in front of it as quickly as possible. This extinguishes the light, and another target light ignites at random. Feedback is given to the participant on how well he/she has performed.
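The training loop described above can be sketched as follows; this is a hypothetical reconstruction for illustration only, not the Octopus Trainer's actual software, and light_on(), light_off() and wait_for_wave() are assumed hardware hooks.

```python
import random
import time

NUM_LAMPS = 8

def light_on(lamp):
    """Assumed hardware hook: switch the given lamp on."""
    raise NotImplementedError

def light_off(lamp):
    """Assumed hardware hook: switch the given lamp off."""
    raise NotImplementedError

def wait_for_wave(lamp):
    """Assumed sensor hook: block until a body part passes ~30 cm in front of the lamp."""
    raise NotImplementedError

def training_session(rounds=20):
    """Light a random lamp, time the user's reaction, repeat, then give feedback."""
    reaction_times = []
    for _ in range(rounds):
        lamp = random.randrange(NUM_LAMPS)
        light_on(lamp)
        start = time.monotonic()
        wait_for_wave(lamp)
        reaction_times.append(time.monotonic() - start)
        light_off(lamp)
    average = sum(reaction_times) / len(reaction_times)
    print("Average reaction time over %d rounds: %.2f s" % (rounds, average))
```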
The main purpose of the Octopus Trainer is to develop and train
the users’ bodily movement skills by engaging the users in a
kinesthetic experience (Figure 3). Therefore, the Octopus Trainer falls solely within the kinesthetic development theme. The motivation for executing an action is explicitly provided by the system through the lights telling the user where and when to apply an action. Although the system only detects minimal movement, the users are able to move freely, without constraints, while interacting with the system. The movements provoked by the equipment are the same as the ones executed when playing the actual sport, in this case team handball, which gives expressive meaning to the kinesthetic development. Since the Octopus Trainer is designed for one person only, the system does not invite others to join in, thereby preventing the user from exploiting his/her kinesthetic empathy or sociality.
4.3.2 BodyBug
The BodyBug is a small digital device, designed to generate new
and otherwise unexplored movements. It consists of a small box
that can move up and down a metal wire. It can be attached to different parts of the body of one or two users using Velcro strips. The device senses the users' movements, and responds by moving up and down the wire. To keep the device moving, the users have to continuously feed the BodyBug with movements. The movements needed are not explicitly defined, which encourages the users to explore their own repertoire of movements, thus enhancing their kinesthetic potential [30].
The BodyBug falls within kinesthetic development and
kinesthetic means at the same time, addressing the same design
parameters in each theme (Figure 3). Through interaction with the
BodyBug it is possible to acquire and develop new motor skills.
At the same time it can be used for simple play, leisure or as a
social activity as well as for personal exploration and expression.
By moving in relation to the BodyBug, the users are
kinesthetically engaged by exploring and broadening their
kinesthetic sense. The artefact calls for social engagement
between the users, as well as attracting spectators, who might
influence the interaction by commenting on the use of the
BodyBug. The users are implicitly motivated by their own
interpretation of the device, which ultimately provokes the motor
development in relation to both kinesthetic development and
kinesthetic means. Since the meaning of the interaction with the
device is created by the users themselves, expressive meaning is
missing. Though it is possible for two people to interact with the
BodyBug at the same time, kinesthetic empathy is absent because
it is not possible to predict the movements developed by the users in the course of the interaction.
4.3.3 Nintendo Wii Tennis
Nintendo Wii™ Tennis (http://www.nintendo.com/wii/) is a computer game, where one physically utilizes the Nintendo Wii™ controller as a tennis racket to control a tennis player in the game world. By moving your arm
back and forth and from side to side, the tennis player performs
accordingly in the virtual world. It is possible to play against the
computer or against another human player.
Wii Tennis activates the user of the system by providing a
kinesthetically engaging experience of interacting with the virtual
world in relation to the gameplay (Figure 3). Wii Tennis is a
social game, where users come together either by playing against
each other or by observing other players. The motivation for the
game is made explicit by the rules of the gameplay. The users are
physically free to move when interacting with the system. The
expressive meaning is clear in the mapping of the interaction in
the physical world, and the resulting gameplay in the virtual
world. The attention of the players is on the screen, and the
movements are related to the gameplay, not to the movements of
other people. Therefore there is no kinesthetic empathy in Wii
Tennis. Although the Wii engages the users kinesthetically, there
is no direct kinesthetic development of new motor skills that can
be directly transferred to the tennis court.
4.3.4 Shaking the World
Shaking the World (SW, http://www.siggraph.org/s2005/main.php?f=conference&p=etech&s=etech24) is a sensation interface device that uses galvanic vestibular stimulation (GVS) to control balance. The
device is placed on the neck of a person and can be remotely
controlled by another person or a computer system. This allows the person with the remote to alter the sense of balance of the
person wearing the device, thereby directly affecting the
kinesthetic sense. SW’s most direct application is in walking
guidance and postural support. Other possible applications include
automatic avoidance of collisions or falls, GPS-guided walking
navigation, and pedestrian flow control.
SW is mainly an example of kinesthetic disorder, since it directly
influences the kinesthetic experience of the environment by
altering the kinesthetic sensation (Figure 3). It is through the
exploration of the kinesthetic sense that the user, wearing the
sensors, engages in a memorable kinesthetic experience, thus
broadening his/her own kinesthetic awareness. SW is a social
device that due to its spatiality invites others to indirectly take part
in and affect the experience as spectators. The motivation for
interacting with the device is both explicit and implicit. The user
wearing the sensors is explicitly motivated, due to the fact that
another person or a computer is controlling his/her kinesthetic
interaction. The person in control of the remote is implicitly
motivated to explore the interaction possibilities provided by the
device. The movability in the interaction is present for the person
controlling the remote, whereas the person being controlled is
restricted to move as the other person sees fit. Expressive meaning
occurs because there is congruence between what the person with
the remote control is doing, and the kinesthetic effect on the
person wearing the device. The interaction becomes a means for reaching a higher goal such as walking guidance or postural control, which is why the expressive meaning is placed within kinesthetic means, and not in kinesthetic disorder. Since the person wearing the device has no control over his/her own body, the interaction achieved by the system does not open up the possibility for the user to read and react to other people's movements, the main goal of kinesthetic empathy.
4.4 Discussion
Through the analysis of the four interactive systems, we have
shown the way in which the conceptual framework works as a
tool for visualizing the manner in which the bodily potential is
addressed in relation to the concept of Kinesthetic Interaction. The
model provides a coherent framework for revealing what themes
and parameters the interactive systems adhere to. In doing so, it
becomes possible to identify similarities and differences in the
bodily potential. The framework facilitates an analysis of systems
from seemingly disparate directions, bridging them through
highlighting the bodily potential in the designs.
In addition, the framework can be used to redesign existing
interactive systems. This is achieved through exploring the way in
which the system can cut across themes, or incorporate new
parameters into the design. An example could be the Wii Tennis,
which engages the body kinesthetically as a means of being able
to play a computer game (kinesthetic means). In redesigning the
system, it would be possible to develop Kinesthetic Interaction
that enhances motor skills, thus relating it to tennis in the real
world (kinesthetic development). In a redesign of Wii Tennis,
kinesthetic empathy can be applied by moving the screen in between the players, so the players are able to read and react to each other's movements in addition to those simulated by the animated figures on the screen. This makes it possible to react to bodily movements before they are shown on the screen, in the same manner as you would react to a player's movements in real-life tennis. Another example would be to experiment with the
expressive meaning of Shaking the World in relation to
kinesthetic disorder, exploring the way in which altering the
kinesthetic sense might provide self-reflective experiences. Also,
the Octopus Trainer could be redesigned by incorporating
kinesthetic means through adding a strategic element to the
exercise, thus supplementing the kinesthetic development.
When starting a new design process, the three themes can help
identify possible areas of interest that can be explored through the
design. An initial draft or idea for a design can be applied to the
framework to spark new design ideas by addressing the parameters not yet used in the design idea. This will help address the full bodily potential, thus bringing the design concepts in directions that would otherwise have remained untouched.
Furthermore, the conceptual framework for KI makes it possible
to imagine interactive systems that explore new configurations of
themes and parameters.
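As a purely illustrative sketch (not part of the framework itself), the mapping between concepts, themes and parameters could be captured in a simple data structure, so that a draft concept can be checked against the framework and its unaddressed themes and parameters listed as candidates for new design ideas. The Wii Tennis entry below is an approximation read off the textual analysis in Section 4.3.3.

```python
THEMES = {"kinesthetic development", "kinesthetic means", "kinesthetic disorder"}
PARAMETERS = {"engagement", "sociality", "movability", "explicit motivation",
              "implicit motivation", "expressive meaning", "kinesthetic empathy"}

# A concept's analysis maps each theme it adheres to onto the parameters it
# addresses. The Wii Tennis entry is approximated from Section 4.3.3.
wii_tennis = {
    "kinesthetic means": {"engagement", "sociality", "explicit motivation",
                          "movability", "expressive meaning"},
}

def unused(analysis):
    """Return the themes and parameters a draft concept does not yet address."""
    used_parameters = set().union(*analysis.values()) if analysis else set()
    return THEMES - set(analysis.keys()), PARAMETERS - used_parameters

themes_left, parameters_left = unused(wii_tennis)
print("Unaddressed themes:", themes_left)          # candidates for redesign
print("Unaddressed parameters:", parameters_left)  # e.g. implicit motivation, kinesthetic empathy
```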
5. CONCLUSION AND FUTURE WORK
This paper contributes to the field of interaction design for the
whole body in two respects. First, it presents Kinesthetic
Interaction as a unified concept for describing the body in motion
and how it conditions our experience of the world in the
interactions with and through interactive technologies. Second, it
provides a conceptual framework consisting of three themes and
seven parameters that make it possible to analyze existing
interactive systems in relation to how they meet bodily potential.
It further opens new directions for a future exploration of
addressing bodily potential in interactive systems design. The
conceptual framework presented is not exhaustive in describing
all the aspects of designing for Kinesthetic Interaction. However,
it provides a way of articulating pertinent aspects when it comes
to addressing the bodily potential in the design of interactive
technologies.
In developing the concept of KI, we seek to inform the previous
approaches and directions presented in the related work section by
introducing a physiological description of the body. In doing so
we initiate a more direct exploration of the design themes
uncovering new potential in interaction design.
Future work focuses on empirically informing the conceptual
framework through developing working prototypes in real
contexts. This will provide feedback on how the framework
works, and make it possible to further develop the design themes
and parameters.
6. ACKNOWLEDGMENTS
This work has been supported by Aarhus School of Architecture,
Institute of Information and Media Studies (University of
Aarhus), Department of Computer Science (University of Aarhus)
and Center for Interactive Spaces. We would like to thank our
colleagues, especially Professor Kaj Grønbæk who has provided
useful recommendations for structuring and focusing our writing.
7. REFERENCES
[1] Bloom, F. et al. 2001. Brain, Mind, and Behavior, third ed.,
Educational Broadcasting Corporation, USA.
[2] Bongers, B. 2000. Physical Interfaces in the Electronic Arts:
Interaction Theory and Interfacing Techniques for Real-time
Performance. Trends in Gestural Control of Music. M.
Wanderley, M. Battier (eds.), IRCAM, Paris.
[3] Buxton, B. 1986. There's More to Interaction than Meets the
Eye: Some Issues in Manual Input. In Norman, D. A. and
Draper, S. W. (Eds.), User Centered System Design: New
Perspectives on Human-Computer Interaction. Lawrence
Erlbaum Associates, Hillsdale, New Jersey, 319-337.
[4] Czajkowski, Z. 2006. The essence and importance of timing
(sense of surprise) in fencing. Kinesiology. 16 (2006), 35-42.
[5] Davids, K. et al. 2003. Acquiring Skill in Sport: A
Constraints-Led Perspective. In International Journal of
Computer Science in Sport, 2, 2.
[6] Dindler, C., Krogh, P., Beck, S., Stenfeldt, L. Nielsen, K. &
Grønbæk, K. 2007. Peephole Experiences – Field
Experiments with Mixed Reality Hydroscopes in a Marine
Center. In Proceedings of DUX2007. ACM Press.
[7] Dourish, P. 2001. Where the Action Is: The Foundations of
Embodied Interaction, MIT Press.
[8] Eisenberg, M., Elumeze, N., Buechley, L., Blauvelt, G.,
Hendrix, S., and Eisenberg, A. 2005. The homespun
museum: computers, fabrication, and the design of
personalized exhibits. In Proc. of the 5th Conference on
Creativity & Cognition, London, UK, April 12-15, New York, NY, 13-21.
[9] Fishkin, K. P., Moran, T. P. & Harrison, B. L. 1998.
Embodied User Interfaces: Towards Invisible User
Interfaces. In Proc. of EHCI’98 (Heraklion, Greece), 1-18.
[10] Forlizzi, J., Battarbee, K. 2004. Understanding Experience in
Interactive Systems. In Proceedings of the 5th conference on
Designing interactive systems: processes, practices, methods,
and techniques (DIS2004). Massachusetts. USA.
[11] Forman, G. 2003. An extensive empirical study of feature
selection metrics for text classification. J. Mach. Learn. Res.
3 (Mar. 2003), 1289-1305.
[12] Fredens, K. 1987. Hvad er motorik? Motorikkens betydning for den kognitive udvikling [What is motor function? The significance of motor function for cognitive development]. Tidsskrift for idræt, 3 (in Danish).
[13] Griffith, N. & Fernström, M. 1998. LiteFoot: A Floor Space
for Recording Dance and Controlling Media. In Proceedings
of ICMC 1998.
[14] Grudin, J. 1990. The Computer Reaches Out: The Historical
Continuity of Interface Design Evolution and Practice in
User Interface Engineering. In Proc. of ACM CHI’90 Conf.
on Human Factors in Computing Systems, 261-268.
[15] Hall, T. and Bannon, L. 2005. Designing ubiquitous
computing to enhance children's interaction in museums. In
Proceedings of the 2005 Conference on Interaction Design
and Children (Boulder, Colorado, June 08 - 10, 2005). IDC
'05. ACM, New York, NY, 62-69.
[16] Hämäläinen, P. 2007. Novel applications of real-time
audiovisual signal processing technology for art and sports
education and entertainment. Doctoral thesis for Helsinki
University of Technology.
[17] Holm, K. 2002. Sæt fart på dit barns udvikling [Speed up your child's development]. Algarve, Rønne (in Danish).
[18] Höysniemi, J. and Perttu H. 2005. Children’s and Parents’
Perception of Full-Body Interaction and Violence in a
Martial Arts Game. In Proc. of the 2005 conference on
Designing for User experience, San Francisco, California.
[19] Hummels, C., Overbeeke, K. C. J. & Klooster, S. 2007.
Move to get moved: a search for methods, tools and
knowledge to design for expressive and rich movement-based interaction. In Personal and Ubiquitous Computing,
Volume 11, Springer-Verlag, 677-690.
[20] Ishii, H. & Ullmer, B. 1997. Tangible Bits: Towards Seamless
Interfaces Between People, Bits and Atoms. In Proceedings
of the SIGCHI conference on Human factors in computing
systems, 234-241, March 22-27, 1997, Atlanta, Georgia.
[21] Kjölberg, J. 2004. Designing Full Body Movement
Interaction Using Modern Dance as a Starting Point. In
Proceedings of the 2004 conference on Designing Interactive
Systems, Cambridge, MA, 353-356.
[22] Klemmer, S. R., Hartmann, B., Takayama, L. 2006. How
Bodies Matter: Five Themes for Interaction Design. In
Proceedings of the 6th conference on Designing Interactive
systems (DIS 2006). Pennsylvania. USA.
[23] Kolb, B. 2006. An Introduction to Brain and Behavior.
Second Ed. Worth Publishers, New York.
[24] Lakoff, G., Johnson, M. 1999. Philosophy in the Flesh: The
Embodied Mind and its Challenge to Western Thought.
Basic Books.
[25] Larssen, A. T., Robertson, T. & Edwards, J. 2006. How it
Feels, not Just How it Looks: When Bodies Interact with
Technology. In Proceedings of OZCHI 2006, November 20-24, Sydney, Australia.
[26] Long, A. C., Landay, J. A., Rowe, L. A. 1999. Implications
For a Gesture Design Tool. In CHI '99: Proceedings of the
SIGCHI conference on Human factors in computing systems,
Pittsburgh, PA, USA.
[27] MacKay, W. 1998. Augmented Reality: Linking real and
virtual worlds - A new paradigm for interacting with
computers. In Proceedings of the working conference on
Advanced visual interfaces (AVI ’98), 13-21.
[28] McCarthy, J., Wright, P. 2004. Technology as Experience,
MIT Press, Cambridge, Massachusetts.
[29] Merleau-Ponty, M. 1945 (publ. 2002). Phenomenology of
Perception, Routledge Classics.
[30] Moen, J. 2006. KinAesthetic Movement Interaction:
Designing for the pleasure of Motion, Ph.D. dissertation,
KTH, Numerical Analysis and Computer Science, Sweden.
[31] Mueller, F., Agamanolis, S. and Picard, R. 2003. Exertion
Interfaces: Sports over a Distance for Social Bonding and
Fun. In Proceedings of CHI 2003, April 5-10, Ft. Lauderdale,
Florida, USA, 561-568.
[32] Noland, C. 2007. Motor Intentionality: Gestural Meaning in
Bill Viola and Merleau-Ponty. Postmodern Culture, Volume
17, Number 3.
[33] Paradiso, J., Abler, C., Hsiao, K. and Reynolds, M. 1997. The Magic
Carpet: Physical Sensing for Immersive Environments. In
CHI ’97 extended abstracts on Human factors in computing
systems: looking to the future. ACM Press, 277-278.
[34] Parés, N., Carreras, A., & Soler, M. 2004. Non-invasive
attitude detection for full-body interaction in MEDIATE, a
multisensory interactive environment for children with
autism. In Proceedings of Vision, Modeling, and
Visualization 2004 (VMV'04). Stanford, California, USA.
[35] Petersen, M. G., Iversen, O. S., Krogh, P. G., Ludvigsen, M.
2004. Aesthetic Interaction: A Pragmatist's Aesthetics of
Interactive Systems. In Proc. of the 2004 Conference on
Designing interactive systems (DIS2004), August 1-4,
Cambridge, Massachusetts, USA.
[36] Rasch, P.J., and Burke, R.K. 1971. Kinesiology and Applied
Anatomy. Henry Kimpton, Great Britain.
[37] Samanci, Ö., Chen, Y. & Mazalek, A. 2007. Tangible
Comics: A performance space with full body interaction. In
Proceedings of the International Conference on Advances in
Computer Entertainment Technology. Salzburg, Austria.
[38] Seitz, J. A. 2003. I Move…Therefore I Am. Psychology Today, March 2003.
[39] Thompson, C. W. 2004. Manual of Structural Kinesiology.
McGraw-Hill, New York.
[40] Ullmer, B. & Ishii, H. 2001. Emerging Frameworks for Tangible User Interfaces. In Human-Computer Interaction in the New Millennium, John M. Carroll, ed., Addison-Wesley,
August 2001, 579-601.
[41] Zahavi, D. 2001. Husserl's Phenomenology: New, revised
edition. Gyldendal.