
Artif Life Robotics (2009) 14:397–400
DOI 10.1007/s10015-009-0694-x
© ISAROB 2009
ORIGINAL ARTICLE
Djoko Purwanto · Ronny Mardiyanto · Kohei Arai
Electric wheelchair control with gaze direction and eye blinking
Received and accepted: May 18, 2009
Abstract We propose an electric wheelchair controlled by
gaze direction and eye blinking. A camera is set up in front
of a wheelchair user to capture image information. The
sequential captured image is interpreted to obtain the gaze
direction and eye blinking properties. The gaze direction is
expressed by the horizontal angle of the gaze, and this is
derived from the triangle formed by the centers of the eyes
and the nose. The gaze direction and eye blinking are used
to provide direction and timing commands, respectively.
The direction command relates to the direction of movement of the electric wheelchair, and the timing command
relates to the time when the wheelchair should move. The
timing command with an eye blinking mechanism is designed
to generate ready, backward movement, and stop commands for the electric wheelchair. Furthermore, to move at
a certain velocity, the electric wheelchair also receives a
velocity command as well as the direction and timing commands. The disturbance observer-based control system is
used to control the direction and velocity. For safety purposes, an emergency stop is generated when the electric
wheelchair user does not focus their gaze consistently in any
direction for a specified time. A number of simulations and
experiments were conducted with the electric wheelchair in
a laboratory environment.
Key words Electric wheelchair control · Gaze estimation ·
Blinking measurement
D. Purwanto · R. Mardiyanto
Department of Electrical Engineering, Institute of Technology
“Sepuluh Nopember” (ITS), Surabaya, Indonesia
K. Arai (✉)
Department of Information Science, Saga University, 1 Honjo, Saga
840-8502, Japan
e-mail: [email protected]
This work was presented in part at the 14th International Symposium
on Artificial Life and Robotics, Oita, Japan, February 5–7, 2009
1 Introduction
The ability to move freely is highly valued by all people. However, it can be difficult for a person with a physical disability. Nowadays, electric wheelchairs are commercially available for disabled people, but they generally require considerable skill to operate. Moreover, some disabled
people cannot drive an electric wheelchair manually, even
with a joystick, because they lack the physical ability to
control the movement. To enable a disabled person to drive
a wheelchair safely and easily so that they can enjoy a
higher quality of life, researchers have proposed several
electric wheelchair systems. The use of voice commands to control an electric wheelchair is one research result; a small number of command words and high-performance voice recognition are employed in such a system. Electric wheelchair control with electro-oculography (EOG) techniques has also been proposed; in this case, the different commands for the wheelchair are derived from the EOG potential signals of eye movements.
A system for electric wheelchair control using the eyes
was proposed in 2007. A commercially available web camera
on a head-mounted display (HMD) which the user wears is
used to capture moving pictures of the user’s face. A computer mounted on the electric chair processes the captured
image data, detecting and tracking movements of the user’s
eyes, estimating the line-of-sight vector, and actuating the
electric wheelchair in the desired direction indicated by the
user’s eyes. One of the key elements of that system is detecting and tracking the eye movements. The reported experiments examined the accuracy of estimating the rotation angle of the eyeball from the acquired images of the eyes and their surroundings. Throughout the experiments, a success rate of almost 99% was confirmed in matching the electric wheelchair’s movement to the direction in which the user intended to move.1
In this research, a method of electric wheelchair control
by gaze direction and eye blinking properties is proposed.
A camera is set up in front of the wheelchair user to capture
image information. The camera direction is focused onto
the face area of the wheelchair user. The camera is connected to a computer with vision processing and electric
wheelchair motion control capabilities. With vision processing, the sequential captured image is interpreted to obtain
the gaze direction and eye blinking properties. The gaze
direction is expressed as the horizontal angle of the user’s
gaze, and is derived from the triangle formed by the centers
of the eyes and the nose. The eye-blinking properties are
obtained from the timing of blinks. The gaze direction and
eye blinking are used to provide the direction and timing
commands, respectively, for the electric wheelchair. The
direction command relates to the direction of movement of
the electric wheelchair, and the timing command relates to
the time when the electric wheelchair should move. The
eye-blinking mechanism with a blink duration of at least
400 ms is designed to generate a timing command for when
to move. Furthermore, the electric wheelchair also receives
a velocity command to move at a certain velocity, as well
as the direction and timing commands. The disturbance
observer-based control system is used to control the direction and velocity. For safety purposes, an emergency stop
is generated when the electric wheelchair user does not
focus their gaze consistently in one direction for a specified
time.
2 Proposed system
2.1 System overview

The hardware of an electric wheelchair equipped with a camera is shown in Fig. 1. The electronic system for vision processing and control purposes is placed at the bottom of the wheelchair. The electric wheelchair uses a differential steering mechanism. The wheelchair design considers the following four important factors: safety, ease of operation, low price, and convenience.

Four commands are used to control the electric wheelchair, i.e., the safety, timing, direction, and velocity commands (Fig. 2). The safety command is for any situation in which the wheelchair must stop immediately. The timing command is obtained from the eye-blinking mechanism, and is used to move or stop the chair. The direction command is derived from the properties of the points of the triangle made by both eyes and the nose of the user. The velocity command is set independently and does not depend on the vision system. Both the direction and velocity commands refer to the wheelchair's motion in Cartesian space.

Fig. 1. Electric wheelchair hardware

Fig. 2. Commands for the electric wheelchair

2.2 Electric wheelchair model

Figure 3 illustrates the differential steering of the electric wheelchair model. The desired wheelchair motion is determined by the command values of direction θ and linear velocity v. The wheelchair motion depends on the electric signal applied to the motor of each wheel. The angular velocities ωl and ωr represent the direct input signals that move the left and right wheels of the electric wheelchair, respectively. The relation between the direction and velocity (θ, v) and the angular velocities of the wheels (ωl, ωr) is given by Eq. 1, where R_W is the radius of a wheel, η is the gear ratio, and l is the distance between the wheels.

$$\begin{bmatrix} v \\ \theta \end{bmatrix} = \begin{bmatrix} \dfrac{R_W}{2\eta} & \dfrac{R_W}{2\eta} \\[2mm] \dfrac{R_W}{l\eta} & -\dfrac{R_W}{l\eta} \end{bmatrix} \begin{bmatrix} \omega_l \\ \omega_r \end{bmatrix} \qquad (1)$$

Fig. 3. Electric wheelchair model

Fig. 4. Electric wheelchair system
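As a minimal sketch, Eq. 1 can be implemented directly. The wheel radius, gear ratio, and wheel separation below are assumed illustrative values, not the parameters of the actual wheelchair.

```python
# Eq. 1: map motor angular velocities (rad/s) of a differential-drive
# wheelchair to linear velocity v and direction rate theta.
# R_W, ETA, and L are assumed illustrative parameters.
R_W = 0.15   # wheel radius R_W (m)
ETA = 20.0   # gear ratio eta
L = 0.55     # distance l between the wheels (m)

def wheel_to_body(omega_l, omega_r):
    """Return (v, theta) for given left/right motor angular velocities."""
    v = (R_W / (2 * ETA)) * (omega_l + omega_r)
    theta = (R_W / (L * ETA)) * (omega_l - omega_r)
    return v, theta
```

With equal wheel speeds the direction term cancels and the wheelchair moves straight; opposite speeds rotate it in place.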
2.3 Electric wheelchair system
The overall block diagram of the electric wheelchair system
is shown in Fig. 4. The system consists of a vision interpretation system and an electric wheelchair control system. All
the visual information processing, which includes gaze
direction estimation, eye blinking interpretation, and safety
detection, is carried out by a vision interpretation system.
The gaze direction estimation generates smooth-running
direction commands. The eye blinking interpretation and
safety detection systems generate command signals which
are sent directly to the controller. In the wheelchair control system, a disturbance observer-based control strategy
is applied to solve the two inputs/two outputs control
problem.
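The disturbance observer of ref. 3 can be sketched for a single motor as follows. The nominal inertia, filter cutoff, and sampling period are assumed values, and the discrete first-order low-pass filter is one common realization rather than the paper's exact design.

```python
# Minimal discrete-time disturbance observer for one wheelchair motor.
# Estimates the disturbance as the applied torque minus the nominal
# acceleration torque, passed through a first-order low-pass filter.
class DisturbanceObserver:
    def __init__(self, J_n, g, dt):
        self.J_n = J_n        # nominal motor inertia (assumed)
        self.g = g            # observer cutoff frequency (rad/s, assumed)
        self.dt = dt          # sampling period (s)
        self.d_hat = 0.0      # current disturbance estimate
        self.omega_prev = 0.0

    def update(self, tau_ref, omega):
        """Update and return the disturbance estimate d_hat."""
        omega_dot = (omega - self.omega_prev) / self.dt
        raw = tau_ref - self.J_n * omega_dot
        alpha = self.g * self.dt / (1.0 + self.g * self.dt)
        self.d_hat += alpha * (raw - self.d_hat)
        self.omega_prev = omega
        return self.d_hat
```

Feeding the estimate back as a compensation torque makes the motor behave as the ideal integrator assumed by the acceleration controller in Sect. 2.5.1.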
Fig. 5. Relation between the triangular shape and the gaze
Fig. 6. Triangular shape from the eyes and the nose
2.4 Vision interpretation system
2.4.1 Gaze direction estimation
The gaze direction is interpreted from the positions of both
eyes and the nose of the wheelchair user. As shown in Fig.
5, these three positions form a triangular shape that relates
to the direction of the gaze. With the parameters of the triangle, as shown in Fig. 6, the estimation of the gaze direction is formulated as Eq. 2. The function f1(·) is determined during the calibration procedure.

$$\hat{\theta} = f_1\!\left(\frac{a-b}{L}\right) \qquad (2)$$
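A sketch of Eq. 2, assuming f1 is linear with a single calibration gain (the actual f1 is obtained during the calibration procedure) and taking a and b as the horizontal offsets of the nose tip from the two eye centers, following Fig. 6:

```python
K1 = 1.2  # assumed calibration gain (rad); the real f1 comes from calibration

def estimate_gaze(left_eye_x, right_eye_x, nose_x, eye_distance):
    """Eq. 2: estimate the horizontal gaze angle theta_hat (rad).

    a and b are the horizontal offsets of the nose tip from the left
    and right eye centers; eye_distance is L, the distance between
    the eyes (all in pixels).
    """
    a = nose_x - left_eye_x
    b = right_eye_x - nose_x
    return K1 * (a - b) / eye_distance
```

When the face is frontal and the nose is centered between the eyes, a = b and the estimated angle is zero.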
The direction command θcmd is derived from the gaze
direction estimation θ̂ with a smoothing function formulation to yield natural changes in commands and to suppress
measurement noise.
$$\theta_{cmd} = f_2(\hat{\theta}) \qquad (3)$$
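The paper does not specify the smoothing function f2; one simple sketch is an exponential moving average, with an assumed smoothing factor, which yields gradual command changes and suppresses measurement noise:

```python
# Sketch of f2 in Eq. 3 as an exponential moving average.
# The form of f2 and the smoothing factor alpha are assumptions.
class DirectionSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha     # smoothing factor in (0, 1], assumed
        self.theta_cmd = 0.0   # current direction command (rad)

    def update(self, theta_hat):
        """Return the smoothed direction command theta_cmd."""
        self.theta_cmd += self.alpha * (theta_hat - self.theta_cmd)
        return self.theta_cmd
```

A smaller alpha gives smoother but slower-responding direction commands.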
2.4.2 Eye-blinking interpretation

An interpretation of eye blinking is useful to generate a timing command which can be applied directly to the controller. Eye blinking with a duration of at least 400 ms is used as the timing command. The algorithm of the timing command is as follows:

1. If <1 blink> then <disable controller>
2. If <2 blinks> then <enable controller>
3. If <long blinking> then <set to move backward>

2.4.3 Safety detection

The safety detection control monitors the behavior of the electric wheelchair user and the performance of the vision system. "Bad" behavior is defined as when the user does not focus during wheelchair control. Failure of the vision system occurs when the system fails to detect the user's face. The safety command algorithm is defined as

If <bad user behavior> or <face detection failed> then <disable controller>

2.5 Electric wheelchair control system

2.5.1 Electric wheelchair motors and controller

The disturbance observer technique is applied to both of the wheelchair motors. The disturbance observer is a technique to estimate the disturbance existing in a plant, and to make the motion controller act as an acceleration controller.3 With an acceleration controller, the wheelchair motors can be modeled as an ideal integrator, and the block diagram of the controller is realized by a constant gain K.

$$\begin{bmatrix} v \\ \theta \end{bmatrix} = \begin{bmatrix} K & 0 \\ 0 & K \end{bmatrix} \begin{bmatrix} v_{cmd} - v \\ \theta_{cmd} - \theta \end{bmatrix} \qquad (4)$$

2.5.2 Angular to linear velocity conversion and acceleration reference generator

The conversion from angular to linear velocity is formulated in Eq. 1. Furthermore, the acceleration reference can be derived from Eq. 1 as follows:

$$\begin{bmatrix} \omega_l \\ \omega_r \end{bmatrix} = \begin{bmatrix} \dfrac{R_W}{2\eta} & \dfrac{R_W}{2\eta} \\[2mm] \dfrac{R_W}{l\eta} & -\dfrac{R_W}{l\eta} \end{bmatrix}^{-1} \begin{bmatrix} v \\ \theta \end{bmatrix} \qquad (5)$$

3 Results

To verify the effectiveness of the proposed method, several simulations were carried out. Figure 7 shows that the actual response follows the command within a specified time. Furthermore, the actual response of the motors shows that there is a difference in the angular velocity of the left and right motors when the command is "change direction."

Fig. 7. Simulation results (linear velocity response in m/sec, rate of direction response in rad/sec, and actual angular velocities of the left and right wheelchair motors; command vs. actual responses over 20 s)

4 Conclusion

The feature points in the triangle made by the positions of the eyes and the nose can be formulated to generate a direction command for the electric wheelchair. Combining the direction, velocity, timing, and safety commands yields an effective method of controlling an electric wheelchair. With the disturbance observer technique applied as a control strategy, the design criteria of the wheelchair control system can easily be satisfied.
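The blink-based timing commands of Sect. 2.4.2 and the safety rule of Sect. 2.4.3 can be sketched together as a single decision routine. The long-blink threshold and the per-window interface are assumptions; the paper fixes only the 400-ms minimum blink duration.

```python
BLINK_MIN_MS = 400      # minimum blink duration used as a timing command
LONG_BLINK_MS = 2000    # assumed threshold for a "long" blink

def timing_command(blink_durations_ms, face_detected, user_focused):
    """Map blink events in one observation window to a controller command.

    blink_durations_ms: durations (ms) of blinks detected in the window,
    assumed to come from the vision interpretation system.
    """
    # Safety rule: bad user behavior or face-detection failure
    # disables the controller immediately.
    if not face_detected or not user_focused:
        return "disable"
    blinks = [d for d in blink_durations_ms if d >= BLINK_MIN_MS]
    if any(d >= LONG_BLINK_MS for d in blinks):
        return "move_backward"
    if len(blinks) == 1:
        return "disable"
    if len(blinks) == 2:
        return "enable"
    return "no_command"
```

Blinks shorter than 400 ms are treated as natural blinking and ignored, so ordinary eye behavior does not trigger commands.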
References
1. Arai K, Purwanto D (2007) Electric wheelchair control with the human eye. 7th International Conference on Optimization: Techniques and Applications, December 2007, Kobe, Japan
2. Arai K, Uwataki H (2007) Computer input system based on viewing vector estimation with iris center detection from the image of users acquired with relatively cheap web camera allowing user movements. J Inst Elec Eng Jpn C-127(7):1107–1114
3. Ohnishi K, Shibata M, Murakami T (1996) Motion control for advanced mechatronics. IEEE/ASME Trans Mechatronics 1(1):56–67