
Design of an Accessible and
Portable System for Soccer Players
with Visual Impairments
Alireza Zare
Clemson University
School of Computing
Human Centered Computing
Clemson, SC 29634
[email protected]
Kyla A. McMullen
Clemson University
School of Computing
Human Centered Computing
Clemson, SC 29634
[email protected]
Christina Gardner-McCune
Clemson University
School of Computing
Human Centered Computing
Clemson, SC 29634
[email protected]
Abstract
Many people with visual impairments actively play soccer;
however, the task of making the game accessible is met
with significant challenges. These challenges include the
need to constantly talk to signify location and the difficulty
of detecting the positions of silent objects on the field.
Our work aims
to discover methods to help persons with visual
impairments play soccer more efficiently and safely. The
proposed system uses headphone-rendered spatial audio,
an on-person computer, and sensors to create 3D sound
that represents the objects on the field in real-time. This
depiction of the field will help players to more accurately
detect the locations of objects and people on the field.
The present work describes the design of such a system
and discusses perceptual challenges. Broadly, our work
aims to discover ways to enable people with visual
impairments to detect the position of moving objects,
which will allow them to feel empowered in their personal
lives and give them the confidence to navigate more
independently.
Author Keywords
Spatial Audio, Soccer, Football, Visually Impaired
ACM Classification Keywords
K.4.2 [Social Issues]: Assistive technologies for persons
with disabilities; H.5.2 [Information Interfaces and
Presentation]: User Interfaces - Auditory (non-speech)
feedback; H.5.1 [Information Interfaces and Presentation]:
Multimedia Information Systems - Artificial, augmented, and
virtual realities
General Terms
Design, Documentation, Human Factors
Introduction
For a person with a visual disability, communication must
occur using alternate channels. An assumption, which has
almost attained mythical status, is that the senses are
naturally balanced. If a person is lacking one sense, then
the other senses become more acute [4]. Visually disabled
people are often assumed to have better hearing.
Although there is no evidence to support this idea, hearing
can be an obvious alternative means of communication.
The sense of hearing can help to perceive the environment
and detect unseen objects.
The use of hearing to detect location can augment various
tasks, such as playing soccer. Sounds play an important
role in visually impaired soccer as they are used to indicate
teammates’ locations and avoid possible accidents.
However, making many of these sounds while playing can
cause the player to incur frustration and fatigue.
This study aims to utilize spatial audio to help players
with visual impairments perceive the locations of moving
objects using their ears. Also, reducing and eventually
eliminating the need to talk makes the game more
accessible, safer, and less fatiguing. The broad goal of the
study is to create an interactive system that applies spatial
audio technology to real-world games. To achieve this, we
employ a variety of portable and affordable hardware and
software solutions to track and sonically render the people
and objects on the field in real-time. Specifically, our goal
is to allow players to detect the ball, other players, goal,
and walls, while receiving updated audio cues in real-time.
Background
Paralympic soccer
Paralympic football consists of adaptations of Association
football for athletes with a physical disability. These
sports are typically played using International Federation
of Association Football (FIFA) rules, with modifications to
the field of play, equipment, number of players, and other
rules as required to make the game suitable for the
athletes. It should be noted that in the United States the
word "football" usually refers to American football, whereas
in most other countries "football" refers to the sport that
Americans call "soccer". All references to "football" in this
document refer to soccer.
Rules
One version of Paralympic soccer is called 5-a-side. The
sport, governed by the International Blind Sports
Federation (IBSA), is played with modified FIFA rules [2].
• The field of play is smaller than the standard soccer
field (40 x 20 meters) and is surrounded by boards
that keep the ball in play.
• Teams are reduced to five players, in which four
players have similar levels of visual impairment.
These players are assigned the roles of attack and
defense. The fifth player is usually a fully-sighted
goalkeeper, who must remain within the penalty
area. Each team may also use one guide, positioned
behind the opponent’s goal, who assists in directing
players, by talking to them or striking the goalpost
to indicate its position. All players wear eye shields
to eliminate any possible advantages gained by
players with more visual ability.
• A special ball is used, which is equipped with a
noise-making device to help players detect its
location.
• The players talk to one another and pass the ball by
calling each other by name or by shouting "Yeah!".
When a player approaches another player to steal
the ball, they must shout "Voy!", which means "I am
here!". Shouting "Voy!" allows a player to determine
the proximity of other players, thus avoiding
collisions. The game requires players to recognize
and distinguish the voices of their teammates and to
localize the ball. As a result, effectively playing the
game heavily relies on vocal communication.
In short, the current challenges include:
• More dribbling and close control is required in
visually impaired soccer than in a sighted game.
This requirement may cause players to incur
injuries associated with repeating these specific
movements.
• Players are required to recognize their teammates’
voices. If they lack this ability, the ball may be
passed to the incorrect player. Additionally,
attention must be taken away from playing the
game and directed towards differentiating voices.
• The need to constantly talk can frustrate players.
• The goalkeeper position is usually played by a fully
sighted player, which unfairly prevents those with
more severe visual impairments from playing this
position.
Related Work
Pepsi’s ’Sound of Football’ project
To date, there is one notable solution to the challenges of
making soccer more accessible for persons with visual
impairments. Specifically, in 2011, Pepsi in Sweden
arranged a soccer match between a team of visually
impaired players and a team of blindfolded former
professional soccer players. Pepsi created a tracking
technology to enable players to receive 3D sound cues
corresponding to the locations of other players and the
ball. Their goal was to investigate how the two teams
would perform under seemingly equal conditions.
The system was divided into several interconnected parts:
the sound, the tracking data, the orientation, and location
feedback. Pepsi used the same tracking system as the
2010 World Cup. The system identified and tracked the
locations of all the objects on the field. It distinguished
players using shirt colors. The ball, however, had its own
distinctive shape, so it was recognized automatically. During the
game, a camera system followed the targets and extracted
data which was sent to the main computer. The
computer’s software broadcasted the updated audio
information to the correct player, as recognized by their
jersey number.
Each player wore an iPhone on their head, which was
equipped with an application that connected the players.
The tracking system contained 16 cameras to cover the
pitch and triangulate the position of each player in
real-time. The iPhone received the information from the
tracking system and converted it into 3D sound. The
spatial sound rotated as the player moved their head, by
using the orientation data gathered from the iPhone
sensors. The field was rendered over headphones allowing
each player to perceive the objects around them [5].
Motivation
Although Pepsi’s "Sound of Football" project is an
innovative solution to the accessibility challenges of blind
soccer, there still exist some areas that can be improved.
Although it was a great idea and a fun experience for the
players with visual impairments, the technology could be
developed into a more widely available and affordable
system that many more passionate players with visual
impairments could use.
Limitations
Although “The Sound of Football” is an amazing idea, it
still does not allow widespread usage due to the following
barriers:
• Cost: Installing 16 cameras to cover a field is an
expensive process that is not realistic for most
players. The cost of cameras and iPhones makes
this an extremely cost-prohibitive system.
• Mobility: A person with visual impairments can only
play soccer on this particular field with the
prescribed setup. This restriction limits the number
of players that can take advantage of the
technology. Furthermore, if a player with visual
impairments desires to play soccer in another
location, the system is not portable enough to
afford such an option.
• Calibration and Setup: The camera calibration is a
time consuming and costly process. Additionally,
setup and gameplay would require the assistance of
a sighted person.
Figure 1: Design of a portable system that senses player and
object location and renders a 3D sound representation of the
environment.
Design
Our proposed system addresses the aforementioned
limitations by creating an affordable and portable system
that requires minimal setup. Such a system requires a
mechanism to localize the field in real-time, the capability
to distinguish players, identify the goal, and track the ball.
This information must be rendered as spatial audio cues
that are sent to each player through stereo headphones.
As depicted in Figure 1, such a system contains:
• Portable Computer - A small, on-person computer
has the capability to receive location and orientation
information from the central processor, gyroscope,
and compass. This information is necessary in order
to perform the digital signal processing necessary to
convolve the digital sounds with pre-measured
filters, known as Head-Related Transfer Functions
(HRTFs) [3]. Convolving the sound with HRTFs
gives the illusion of 3D space. The use of a
credit-card-sized computer resolves the mobility
limitations. It also costs less than the iPhone, and
the lower cost allows more players to afford to play.
• Active RFID (Radio-Frequency IDentification) - RFID tags and RFID readers are necessary to track
and identify object locations. Although active RFID
is more expensive than passive RFID, active RFID
can communicate data at further distance with a
significantly faster update rate. Active RFID has the
capacity to disseminate data at distances of up to
300 feet. Each person on the field, the goals, and
the ball are equipped with one or more active RFID
tags. The active RFID reader detects the positions
of each tag and feeds the information to the main
computer that acts as a central processor.
• Central Processor - The central processor receives
the spatial information from the active RFID reader
and transmits field location information, represented
as global coordinates, over Bluetooth to the spatial
sound processor, located on the portable computer.
• Spatial Sound Processor - The on-board spatial
sound processor is needed to render the sounds in
virtual space. The processor uses the global
coordinates, orientation, and location data from the
central processor, gyroscope, and compass to create
the spatial sounds. After the sounds are created,
they are played over headphones, giving the
perception that sounds are being emitted from 3D
space.
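A minimal sketch of this rendering pipeline is given after this list.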
• Wireless headphones - Headphones are needed to
play sound over two channels so that the player can
perceive the range, angle, and movement of an
oncoming person or object. Headphone usage allows more control of
audio cues presented to the players, because signals
can be transferred to both ears directly and
independently, thus allowing precise control over the
conveyed spatial information [6].
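To make the interaction between these components concrete, the sketch below illustrates one possible rendering loop in Python: mocked global coordinates and a compass heading are converted into a head-relative azimuth and distance, and a mono cue is convolved with a left/right filter pair to produce the two-channel output. This is a minimal sketch, not the final implementation: the names (to_head_relative, render_source, ToyHRTFBank) are our own placeholders, the positions are hard-coded rather than read from real RFID hardware, and the toy filter bank only mimics interaural time and level differences, whereas a real implementation would convolve with measured HRTFs [3].

import numpy as np
from scipy.signal import fftconvolve

class ToyHRTFBank:
    """Placeholder for a measured HRTF set; it only mimics interaural
    time and level differences and is not a real filter database."""
    def __init__(self, fs=44100):
        self.fs = fs

    def nearest(self, azimuth_deg):
        # Crude approximation: the far ear gets a delayed, quieter copy.
        delay = int(abs(np.sin(np.radians(azimuth_deg))) * 0.0007 * self.fs)
        near, far = np.zeros(64), np.zeros(64)
        near[0], far[min(delay, 63)] = 1.0, 0.6
        # Negative azimuth = source on the player's left.
        return (near, far) if azimuth_deg <= 0 else (far, near)

def to_head_relative(obj_xy, player_xy, heading_deg):
    """Convert a global field coordinate into azimuth (degrees, clockwise
    from the player's nose) and distance (meters), using the compass."""
    dx, dy = obj_xy[0] - player_xy[0], obj_xy[1] - player_xy[1]
    bearing = np.degrees(np.arctan2(dx, dy))           # 0 deg = field "north"
    azimuth = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    return azimuth, float(np.hypot(dx, dy))

def render_source(mono_cue, azimuth, distance, hrtf_bank):
    """Convolve a mono cue with the nearest filter pair and apply a
    simple 1/distance attenuation (placeholder distance model)."""
    left_ir, right_ir = hrtf_bank.nearest(azimuth)
    gain = 1.0 / max(distance, 1.0)
    left = fftconvolve(mono_cue, left_ir) * gain
    right = fftconvolve(mono_cue, right_ir) * gain
    return np.stack([left, right])                     # 2-channel buffer

# One frame of the loop, with mocked tag positions and heading.
fs = 44100
ball_cue = np.random.randn(fs // 10)                   # 100 ms noise burst
az, dist = to_head_relative((12.0, 7.5), (10.0, 5.0), heading_deg=30.0)
stereo = render_source(ball_cue, az, dist, ToyHRTFBank(fs))

In the envisioned system, the mocked positions would instead arrive from the central processor over Bluetooth, and this loop would run continuously on the portable computer for every tracked object.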
Discussion & Future Work
To date, all of the necessary system components have
been identified. Next, we will implement the solution and
conduct user studies to determine how well a visually
impaired participant can track a sound in our
environment, move towards the sound, and make contact
with the object. This work will uncover many perceptual
issues when using such a device.
Perceptually, our work studies the capacity for stream
segregation while a person is in motion. We will study a
player’s ability to focus on one sound while extracting
meaning and location information in the midst of
competing sounds. So far, it is known that people can
perceive a message played in one ear while rejecting the
sounds in the other ear [1], however coupled with the
tasks of movement, recognition, and localization, this may
become a challenging undertaking.
Though the 3D sounds rendered by the spatial sound
processor may not possess the same acoustic
transformations as those given by the player’s body, the
human perceptual system is able to compensate.
Although the HRTFs used to produce the spatialized
sound will not be individualized for each user, we expect
that perceptual recalibration will occur, and users will
adjust to the audio cues [7]. Furthermore, we will
investigate the necessity of using high-fidelity HRTFs and
convolution algorithms as compared to lower-fidelity
alternatives, and examine the effect on recognition and
localization. If players can effectively use the system with
lower fidelity algorithms, this will significantly decrease
computation requirements and cost.
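As a rough illustration of what this fidelity comparison might look like in code, the sketch below renders the same cue once with a full-length head-related impulse response and once with a version truncated to far fewer taps. The impulse responses here are random placeholders rather than measured HRTFs, and truncate_hrir is a hypothetical helper of our own, not an existing library call.

import numpy as np
from scipy.signal import fftconvolve

def truncate_hrir(hrir, taps, fade_len=16):
    """Keep only the first `taps` coefficients, with a short fade-out
    to avoid an audible discontinuity at the cut point."""
    short = hrir[:taps].copy()
    fade = min(fade_len, taps)
    short[-fade:] *= np.linspace(1.0, 0.0, fade)
    return short

full_hrir = np.random.randn(512) * np.exp(-np.arange(512) / 64.0)  # placeholder filter
cue = np.random.randn(4410)                                        # 100 ms at 44.1 kHz
high_fidelity = fftconvolve(cue, full_hrir)                        # 512-tap rendering
low_fidelity = fftconvolve(cue, truncate_hrir(full_hrir, 64))      # 64-tap rendering

The planned user studies would then compare localization and recognition accuracy between the two renderings.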
Lastly, we must address the perceptual issue of assigning
the appropriate sounds to represent objects on the field. If
users cannot remember the sound associated with each
object, this may result in an inappropriate action being
taken toward a specific object (e.g., kicking a teammate after
confusing the ball sound for the player sound). Also, the
spectral content of the sound must be chosen in such a
way to aid localization (typically, containing broadband
sound) but also such that it does not mask other nearby
sounds. Additionally, the sounds must be selected in such
a manner that they are perceived as two segregable
sounds that do not encourage the user to collapse the
sound image into one large sound.
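One hedged starting point for this sound-design question is sketched below: a short broadband noise burst for the ball (broadband content aids localization) and a harmonic tone with a distinct pitch for teammates, so the two cues are more likely to be heard as separate streams. The specific durations, envelopes, and frequencies are placeholder choices to be validated in user studies, not recommendations.

import numpy as np

FS = 44100  # sample rate in Hz

def ball_cue(dur=0.08):
    """Broadband noise burst with a smooth envelope."""
    n = int(FS * dur)
    return np.random.randn(n) * np.hanning(n)

def player_cue(dur=0.12, f0=440.0):
    """Harmonic tone with a distinct pitch, intended to segregate
    from the noise-like ball cue rather than fuse with it."""
    t = np.arange(int(FS * dur)) / FS
    tone = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in (1, 2, 3))
    return tone * np.hanning(len(t))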
Upon the completion of the system, we will have the
capability to expand and make other sports and tasks
more accessible. Our goal is to help people with visual
disabilities that are passionate about sports to play with
fewer challenges. In all, our work seeks to discover ways
to render moving objects in real-time for those with visual
impairments, thus empowering them and increasing their
confidence to live independently.
Acknowledgments
This research is supported by Clemson University. We
would like to thank the Creative Inquiry program for
providing the resources to perform this work.
References
[1] Arons, B. A review of the cocktail party effect.
Journal of the American Voice I/O Society 12, 7
(1992), 35–50.
[2] Association, B. P. Football 5-a-side.
[3] Cheng, C. I., and Wakefield, G. H. Introduction to
head-related transfer functions (HRTFs):
Representations of HRTFs in time, frequency, and space.
In Audio Engineering Society Convention 107, Audio
Engineering Society (1999).
[4] Edwards, A. D. Soundtrack: An auditory interface for
blind users. Human-Computer Interaction 4, 1 (1989),
45–66.
[5] Pepsi. Pepsi’s sound of football, 2011.
[6] Shilling, R. D., and Shinn-Cunningham, B. Virtual
auditory displays. Tech. rep., DTIC Document, 2000.
[7] Zahorik, P., Bangayan, P., Sundareswaran, V., Wang,
K., and Tam, C. Perceptual recalibration in human
sound localization: Learning to remediate front-back
reversals. The Journal of the Acoustical Society of
America 120 (2006), 343.