DRAG: A Database for Recognition and Analysis of Gait

Prem Kuchiᵃ, Raghu Ram Hiremagalurᵃ, Helen Huangᵇ, Michael Carhartᵇ, Jiping Heᵇ, Sethuraman Panchanathanᵃ
ᵃ Research Center for Ubiquitous Computing, Arizona State University
ᵇ Motor Control and Rehabilitation Laboratory, Arizona State University
ABSTRACT
A novel approach is proposed for creating a standardized and comprehensive database for gait analysis. The
field of gait analysis is gaining increasing attention for applications such as visual surveillance, human-computer
interfaces, and gait recognition and rehabilitation. Numerous algorithms have been developed for analyzing and
processing gait data; however, a standard database for their systematic evaluation does not exist. Instead, existing
gait databases consist of subsets of kinematic, kinetic, and electromyographic activity recordings by different
investigators, at separate laboratories, and under varying conditions. Thus, the existing databases are neither
homogenous nor sufficiently populated to statistically validate the algorithms. In this paper, a methodology for
creating a database is presented, which can be used as a common ground to test the performance of algorithms
that rely upon external marker data, ground reaction loading data, and/or video images. The database consists
of: (i) synchronized motion-capture data (3D marker data) obtained using external markers, (ii) computed joint
angles, and (iii) ground reaction loading acquired with plantar pressure insoles. This database could be easily
expanded to include synchronized video, which will facilitate further development of video-based algorithms for
motion tracking. This eventually could lead to the realization of markerless gait tracking. Such a system would
have extensive applications in gait recognition, as well as gait rehabilitation. The entire database (marker, angle, and force data) will be placed in the public domain and made available for download over the World Wide Web.
1. INTRODUCTION
Gait recognition and analysis are becoming increasingly important in the contexts of surveillance, intelligent user
interfaces, etc. Gait analysis is also of tremendous importance in rehabilitation techniques. For example, partial
weight-bearing therapy8, 17 (which involves gait analysis) has been shown to be very effective in facilitating the recovery of ambulation in individuals with incomplete spinal cord injury. Numerous advances have been made in these fields.3, 5, 10, 12, 14 However, a comprehensive and standardized database that would allow a systematic study of gait does not exist. To a great extent, individual researchers have been creating databases to help answer specific questions about gait. However, these databases are not general enough to be applicable to a variety of research.
One of the important problems yet to be solved in gait recognition is how to account for variation in gait with different appendages (e.g., footwear, backpacks). Also, it is not yet clear how gait varies with time (steady state vs. transient). Research in these directions creates a need for a gait database accurate enough to support kinematic modelling. Presently, most databases are created to test the performance of already-developed algorithms and are not suitable for studying gait (see sections 2 and 3). Also, recent research has shown that incorporating multimodal information, such as ground reaction force data, into gait recognition algorithms improves performance.7 However, such information is not available in public domain databases.
In this paper, a methodology for creating a standardized and comprehensive database that includes (i)
synchronized motion-capture data (3D marker data) obtained using external markers, (ii) computed joint angles,
Prem Kuchi; E-mail: [email protected], Telephone: 1 480 727 3612
Address: 920, S. Terrace Road, 107, Tempe, AZ 85281. USA
and (iii) ground reaction loading acquired with plantar pressure insoles, is presented. Sections 2 and 3 describe the existing prominent gait databases and their limitations. The calculation of joint angles and the compensation for offset errors are described in section 4. The gait capture methodology is discussed in section 5; the experimental procedure for arriving at the preferred speed and walking at steady state on a treadmill is also described in that section. Sample results are shown in section 6. In section 7, some observations are made with respect to the procedure, and the paper concludes with section 8.
2. RELATED WORK
As mentioned above, several research groups have built gait databases to test their gait recognition algorithms or to analyze gait in terms of its biomechanics. This section provides a brief review of some prominent databases.
2.1. Georgia Tech Database
This database, created in the Computational Perception Laboratory at Georgia Tech, consists of gait samples of 20 participants.1, 4, 16 For each participant, gait data is available both as video and as 3D motion-capture trajectories. Gait was captured both indoors and outdoors; however, in the outdoor scenario, motion-capture data is not available. For some of the participants, gait data from different view angles is available.
2.2. MIT Database
In this database,2 the participants walk back and forth twice in each direction, fronto-parallel to the camera. One important feature of this database is that each participant may have gait data collected on multiple days. This allows testing of algorithms' robustness to clothing changes, and it also shows how much gait varies between trials on different days. Gait data was captured over a period of three months. The database consists of only video data, captured indoors. Two different background scenes were used during the capture. A total of 194 sequences are available as part of this database.
2.3. University of Maryland Database
The University of Maryland database contains walking sequences in four different poses.11 The database consists of only video sequences, captured using Philips G3 EnviroDome camera systems, and can be divided into 3 individual datasets. Dataset 1 has gait information for 25 participants captured in four views, viz., (i) frontal view/walking toward, (ii) frontal view/walking away, (iii) fronto-parallel view/toward left, and (iv) fronto-parallel view/toward right. The second dataset consists of videos of 55 participants walking along a T-shaped pathway; two cameras with their optical axes orthogonal to each other capture the gait of each participant. The third dataset contains only 12 participants, walking at angles of 0, 15, 30, 45, and 60 degrees to the camera.
2.4. University of Southampton Database
The Southampton HID database15 consists of two segments: (i) a large database of about 100 people and (ii) a small but detailed database of 10 people. In the large database, each participant's gait was recorded from two different view angles, viz., fronto-parallel and oblique. Gait was captured in three different scenarios – outdoor, indoor, and on a treadmill. The large size of the database helps answer the question of the applicability of gait as a biometric.
The smaller dataset is more extensive with respect to environmental conditions such as the type of footwear, clothes, and appendages (e.g., a backpack). Also, the gait sequences are long. However, for this dataset, gait was captured only indoors, walking on a platform (as opposed to a treadmill).
2.5. Arizona State University Database
This database was created primarily for analyzing the biomechanics of gait.6 The objective was to accurately measure the three-dimensional, whole-body kinematics of normal human gait. The database consists of gait patterns of 5 participants. A four-camera MacReflex (Qualisys, Inc., Glastonbury, CT) motion measurement system running at 120 Hz was used to collect raw 3D marker positions, and Bertec Model 4080H force platforms (Bertec Corp., Columbus, OH) were used for capturing 3D ground reaction forces. Kinematics are available in terms of joint-angle trajectories.
3. LIMITATIONS OF EXISTING DATABASES
The following sections provide a short analysis of each of the gait databases described above, discussing their
limitations.
3.1. Georgia Tech Database
This database has both 3D marker data and video data. However, joint angles are not calculated, and it does not provide multimodal information such as force data. Also, this database has the gait of a limited number of participants (20). Care has not been taken to ensure that the gait captured is steady state gait; this is especially important when it comes to walking on a treadmill.
3.2. MIT Database
Neither 3D marker data nor force data is available with this database. This database is useful for testing developed algorithms, but one cannot understand gait using it. Also, video from only a single viewpoint (fronto-parallel) is available.
3.3. University of Maryland Database
The University of Maryland database is very good for testing gait algorithms, but it does not provide 3D marker data or joint angles. Once again, multimodal information such as ground reaction loading is not available.
3.4. University of Southampton Database
The large subsection of this database has 100 participants, which provides a good test bed for gait recognition algorithms. However, environmental variability is covered for only 10 participants. It should be noted that even this smaller dataset consists of only conventional video and does not contain any ground reaction force data.
3.5. Arizona State University Database
This database has joint-angle trajectories as well as ground reaction force data. However, it has only 5 participants. In addition, participants walk with their hands folded, which could alter their natural gait, and care has not been taken to ensure that participants walk with their steady state gait.
4. THEORY
4.1. Calculation of Joint-Angle Trajectories
To calculate the joint-angle trajectories, a fixed set of mutually orthogonal unit vectors corresponding to the three anatomical axes (anterior-posterior, superior-inferior, and medial-lateral) is defined. Using these unit vectors, the instantaneous orientation of each segment in a global reference frame is calculated. Finally, this instantaneous orientation is used for calculating joint angles.
The global reference frame is defined based on the recommendation of the International Society of Biomechanics.18 A right-handed orthogonal triad of unit vectors is used to define this reference frame (N), in which ŷN denotes the vertical axis, x̂N denotes the direction of walking, and ẑN is the axis orthogonal to both ŷN and x̂N, defined by their cross product. Another set of right-handed orthogonal triads is used to define the segment anatomical axes. For any segment U, x̂U denotes the anterior-posterior axis, ŷU the superior-inferior axis, and ẑU the medial-lateral axis. When the subject stands in the neutral position∗, the segment anatomical axes align with the global reference frame.
Three markers, which must not be collinear, are used to characterize each segment. Given the 3D positions of these markers, a set of mutually orthogonal unit vectors can be computed; these vectors are called the segment marker axes. If, for a segment U, these unit vectors are x̂UM, ŷUM, and ẑUM,
∗ The position of the subtalar joint in which the foot is neither pronated nor supinated. The plantar surface of the calcaneus is perfectly parallel to the supporting surface (i.e., the ground). When the hindfoot is normal, the bisector of the calcaneus is at 90 degrees to the supporting surface.
Figure 1. Baseline pictures: (a) front view, (b) back view, (c) side view.
then we can define a rotation matrix ᴺR_UM from the segment marker axes to the global reference frame, such that

x̂N = ᴺR_UM · x̂UM    (1)
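As an illustration of this construction, the sketch below (Python with NumPy; the function names and the choice of which marker pair defines which axis are our own assumptions, since the paper does not specify them) builds a set of segment marker axes from three non-collinear markers and assembles the rotation matrix of eq. (1):

```python
import numpy as np

def marker_axes(p1, p2, p3):
    """Build mutually orthogonal unit vectors (segment marker axes)
    from three non-collinear 3D marker positions.

    The construction (first axis along p1->p2, third axis from the cross
    product with p1->p3) is one common choice, not necessarily the one
    used for the database.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x = x / np.linalg.norm(x)          # first marker axis
    z = np.cross(x, p3 - p1)
    z = z / np.linalg.norm(z)          # normal to the marker plane
    y = np.cross(z, x)                 # completes the right-handed triad
    return x, y, z

def rotation_to_global(x, y, z):
    """N_R_UM of eq. (1): its columns are the marker axes expressed in
    the global frame, so it maps marker-frame coordinates to global."""
    return np.column_stack((x, y, z))
```

Because the columns are orthonormal, the resulting matrix is a proper rotation (determinant +1), as required for the inversions used later.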
4.1.1. Baseline Recording
However, in order to calculate the joint angles, we need the rotation matrix from the global reference frame to the anatomical axes, ᴺR_U. This can be written as

ᴺR_UM · ᵁᴹR_U = ᴺR_U    (2)

As can be seen in the above equation, simultaneous knowledge of both the marker positions and the anatomical axes is required. A baseline is recorded to gain such knowledge. During this recording, the participant stands in a position such that the segment axes are aligned with the global reference frame as closely as possible. In this position ᴺR_U = I, and therefore

ᵁᴹR_U = [ᴺR_UM]⁻¹    (3)

Once ᵁᴹR_U is calculated, ᴺR_U can be used to define the unique orientation of each segment's anatomical axes with respect to the global reference frame. Sample baseline recordings are shown in figure 1.
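A minimal sketch of this step (Python/NumPy; function and argument names are ours): since ᴺR_U = I during the baseline, the static marker-to-anatomical rotation of eq. (3) is just the inverse of the baseline marker-to-global rotation, and for a rotation matrix the inverse is the transpose.

```python
import numpy as np

def marker_to_anatomical(R_baseline):
    """UM_R_U from eq. (3): with N_R_U = I during the baseline trial,
    UM_R_U = inverse of N_R_UM, i.e. its transpose."""
    return R_baseline.T

def anatomical_in_global(R_markers_now, R_marker_to_anatomical):
    """N_R_U at any later instant, via eq. (2):
    N_R_U = N_R_UM @ UM_R_U."""
    return R_markers_now @ R_marker_to_anatomical
```

Applied to the baseline frame itself, this recovers the identity, which is a quick sanity check on a recorded baseline.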
The next step is to calculate the joint rotation matrix, that is, the matrix that describes the relative motion between the segments. For example, ᵁR_V would be the relative motion between segments U and V, with U as the fixed segment. We can write

ᴺR_V = ᴺR_U · ᵁR_V    (4)

Therefore,

ᵁR_V = [ᴺR_U]⁻¹ · ᴺR_V    (5)

It should be noted that ᵁR_V ≠ ⱽR_U. Instead, ⱽR_U = [ᵁR_V]ᵀ.
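In code (a sketch with hypothetical names), eq. (5) and the transpose identity become:

```python
import numpy as np

def joint_rotation(N_R_U, N_R_V):
    """U_R_V from eq. (5): relative orientation of segment V with
    respect to segment U, both given in the global frame N.  The
    inverse of a rotation matrix is its transpose."""
    return N_R_U.T @ N_R_V
```

Swapping the arguments yields ⱽR_U, which is the transpose of ᵁR_V, matching the note above.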
4.1.2. Joint-angles
The relative motion between two rigid frames can be considered to have occurred as the result of three simple rotations. Let θ, φ, and ψ be the three rotations. If U is the fixed frame and V is the rotating frame, then (i) θ is the first simple rotation of frame V relative to U about the x-axis fixed in U (x̂U), (ii) φ is the second rotation about the y-axis defined by the first simple rotation (ŷ′V), and (iii) ψ is the final rotation about the z-axis defined by the first two simple rotations (ẑ″V). These angles can be calculated as follows:

ᵁR_V = ᵁR_V′ · ⱽ′R_V″ · ⱽ″R_V

      = | 1    0     0  |   |  cφ   0   sφ |   | cψ  −sψ   0 |
        | 0   cθ   −sθ  | · |   0   1    0 | · | sψ   cψ   0 |
        | 0   sθ    cθ  |   | −sφ   0   cφ |   |  0    0   1 |

      = | cφcψ              −cφsψ              sφ    |
        | cθsψ + sθsφcψ      cθcψ − sθsφsψ    −sθcφ  |
        | sθsψ − cθsφcψ      sθcψ + cθsφsψ     cθcφ  |

where cθ is cos(θ), sθ is sin(θ), and so on. The angles are determined by comparing the above matrix with the joint rotation matrix. Therefore,

φ = sin⁻¹(r₁₃)
θ = sin⁻¹(−r₂₃ / cos φ)
ψ = sin⁻¹(−r₁₂ / cos φ)

In the above equations, the r's are the components of the joint rotation matrix ᵁR_V.
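These formulas translate directly into code. The sketch below (Python/NumPy, names ours) builds a joint rotation matrix from the three elementary rotations and recovers the angles with the sin⁻¹ expressions; note they are valid only while cos(φ) ≠ 0.

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def euler_angles(R):
    """Recover (theta, phi, psi) from R = Rx(theta) @ Ry(phi) @ Rz(psi)
    using the sin^-1 formulas in the text (valid for cos(phi) != 0)."""
    phi = np.arcsin(R[0, 2])                    # r13 = sin(phi)
    theta = np.arcsin(-R[1, 2] / np.cos(phi))   # r23 = -sin(theta)cos(phi)
    psi = np.arcsin(-R[0, 1] / np.cos(phi))     # r12 = -cos(phi)sin(psi)
    return theta, phi, psi
```

For example, `euler_angles(Rx(0.3) @ Ry(0.2) @ Rz(0.1))` recovers the angles (0.3, 0.2, 0.1) within floating-point tolerance.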
4.2. Compensating Offset Errors
Even when the participant is standing in the neutral position, small differences exist between the segment
anatomical axes and the axes of the global reference frame. These differences, even though they are small, lead
to unrealistic body positions.6 To offset these errors, the deviation of the individual segment orientations from the idealized positions is estimated using the previously recorded baseline pictures. Using this estimate, the static marker-to-bone rotation matrix is recomputed for each segment, and further calculations are performed on these recomputed matrices.
5. AN OVERVIEW OF OUR METHODOLOGY
The VisualEyez† motion capture system was used for capturing the 3D motion, and Tekscan pressure insoles were used for obtaining ground reaction loading. For gait capture, the VisualEyez cameras are positioned to cover the largest possible volume. The workspace is calibrated using the calibration wand, and the calibration is repeated to ensure minimum variance in the calibration output. The motion capture system is set to capture at 80 Hz. This is followed by the calibration of the global coordinate reference frame. Fifteen LED markers were used to quantify the 3D kinematics; the marker configuration used for our capture is shown in figure 2. Note that measurements are made only for the right half of the participant's body, because of the general agreement among gait researchers that healthy gait can be considered symmetric.6 The Tekscan system was set to capture at 80 scans per second, and the Tekscan and VisualEyez systems were synchronized using an external trigger. The ultrathin insole of the Tekscan system was cut to fit the participant's shoe and fitted into it.
† Trademark of the PTI 3D motion tracking systems.
The protocol used for capturing gait is summarized as algorithm 1. This protocol was submitted for review to the Arizona State University Human Subjects Institutional Review Board and was approved. Gait capture was performed only on participants with no history of musculoskeletal problems. Participants walked on a treadmill at their preferred speed, and gait was captured in 3 intervals, for 1 minute during each interval. Along with the force and kinematic data, each participant's age, height, weight, and preferred walking speed are documented.
5.1. Treadmill Walking and Preferred Speed
Most people, when walking on a treadmill, walk with an unnatural gait and need some warm-up time before they walk with their natural gait. It has been recommended by several researchers13 that 30 minutes of walking on a treadmill is sufficient for a participant to arrive at his/her steady state natural gait. The up-velocity and down-velocity protocols13 were used to arrive at each participant's preferred speed, and gait capture was done at this speed. During the up-velocity protocol, the speed of the treadmill is increased in steps to a little beyond the speed at which the participant reports being comfortable, and the comfortable speed is noted. During the down-velocity protocol, the speed of the treadmill is decreased in steps from a value higher than the comfortable speed noted above, and the speed at which the participant reports being comfortable is noted again. Ideally, the comfortable speeds obtained during the up-velocity and down-velocity protocols should match. In this experiment, the average of the two speeds was used as the speed of the treadmill. It has been shown that an error of 0.1 to 0.3 miles per hour is possible.13
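The preferred-speed computation reduces to a simple average of the two protocol readings; a sketch (names ours):

```python
def preferred_speed(up_mph, down_mph):
    """Treadmill capture speed: the mean of the comfortable speeds
    reported during the up-velocity and down-velocity protocols.
    The two readings typically differ by about 0.1-0.3 mph."""
    return 0.5 * (up_mph + down_mph)
```

For instance, comfortable speeds of 3.0 and 2.8 mph give a capture speed of 2.9 mph.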
Figure 2. Marker locations used for data capture. (a) Front view. (b) Back view.
6. RESULTS
Some sample results are shown in figures 3, 4, and 5. The figures show group-average joint-angle trajectories for the foot, ankle, knee, hip, and torso of one participant, averaged over 157 gait cycles. The dark line denotes
Algorithm 1 Protocol for Gait Capture
1. Marker placement: The markers are placed on the participant according to figure 2. The participant's shoe is fitted with the pressure insole.
2. Warm-up: The participant is asked to walk on the treadmill for 30 mins. The visibility of all the markers
is checked during this warm up.
3. Up velocity protocol: Increase the speed of the treadmill in steps and note the speed at which the
participant feels comfortable. This is done for around 2-3 mins.
4. Down velocity protocol: Decrease the speed of the treadmill from the speed attained in the previous step. Again, note the speed at which the participant feels comfortable. Ideally, the two speeds, i.e., the speeds noted during the up velocity and down velocity protocols, should match; an error of around 0.1-0.3 miles per hour is possible.
5. Rest: The participant is rested for 5 minutes.
6. Capture: The participant is asked to walk again on the treadmill and a baseline frame is captured. After 5 minutes of warm-up, his/her gait is captured in 3 intervals, for 1 minute during each interval.
the group mean, while the gray shaded region encompasses the mean plus and minus 1 standard deviation. It should be noted that the variance is usually small. Some of the variance may be due to changes in the shape of the segments (a rigid-body assumption is made for each segment during the calculations). Figure 6(a) shows the segment diagram (with 5 segments and 12 degrees of freedom) as a screenshot of the VisualEyez software, and figure 6(b) shows sample ground reaction loading obtained during our gait capture.
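The mean and ±1 SD band plotted in figures 3, 4, and 5 can be computed as follows (a sketch in Python/NumPy; the array layout and names are our assumptions):

```python
import numpy as np

def trajectory_stats(cycles):
    """Group mean and +/- 1 standard deviation band of time-normalized
    joint-angle trajectories.

    `cycles` has shape (n_cycles, n_samples): each row is one gait cycle
    already resampled to a common length (the paper averages 157 cycles
    per participant).  Returns the curve for the dark mean line and the
    lower/upper edges of the gray band.
    """
    cycles = np.asarray(cycles, dtype=float)
    mean = cycles.mean(axis=0)           # dark line in the figures
    sd = cycles.std(axis=0, ddof=1)      # sample standard deviation
    return mean, mean - sd, mean + sd    # mean and the band edges
```

The time normalization itself (resampling each cycle from heel-strike to heel-strike onto a fixed number of samples) must be done before calling this function.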
7. DISCUSSION
There are numerous challenges in building a standardized and comprehensive gait database. Perhaps the most fundamental challenge is obtaining error-free joint-angle trajectories, which requires a great deal of manual processing after gait is captured. It would be interesting to develop algorithms that could automatically build the static marker-to-bone rotation matrix from the baseline pictures, as described in section 4.2. An often neglected, yet important, challenge is enrolling participants for gait capture. The capture process is very demanding for the participant, involving walking for about 45 minutes, and the process of having markers strapped on could at best be described as unpleasant. This creates a need for methods that make the process more pleasant for the participant.
8. CONCLUSION AND ONGOING WORK
While the attention of gait recognition researchers has largely been focussed on classifying between different motion patterns, robust gait recognition will not be achieved until the effect of various environmental conditions on gait (such as the effect of different shoes) is understood. To accomplish this, it is necessary to create an accurate gait database under carefully calibrated conditions. This paper describes such a database and the methodology to create it. The process of assembling a database is a continuous undertaking; participants are being enrolled to increase the size of the database to a point where algorithms developed for gait analysis can be statistically validated. Also, many applications (e.g., gait recognition and rehabilitation) would have much greater value if performed on conventional video. The addition of synchronized video to this database would therefore be very valuable: algorithms developed on 3D data could be converted easily to work with conventional video, and results obtained on conventional video could be validated. This forms a part of our future work. The database described here is made available for download from the web.9
Figure 3. Sample Results I
REFERENCES
1. The Georgia Tech gait recognition database. http://www.cc.gatech.edu/cpl/projects/hid/images.html.
2. The MIT gait recognition database. http://www.ai.mit.edu/people/llee/HID/data.htm.
3. T. P. Andriachi, J. A. Ogle, and J. O. Galante. Walking speed as a basis for normal and abnormal gait
measurements. Journal of Biomechanics, 10:261–268, 1977.
4. A. Bobick and A. Johnson. Gait recognition using static, activity-specific parameters. In Proceedings of the
IEEE Conference on Computer Vision and Pattern Recognition, volume 2, pages 423–430, 2001.
5. Jeffrey E. Boyd and James J. Little. Phase in model-free perception of gait. In Proceedings of the Workshop
on Human Motion, pages 3–10, 2000.
6. M. Carhart. Biomechanical analysis of compensatory stepping: implications for paraplegics standing via
functional neuromuscular stimulation. PhD thesis, Arizona State University, 2000.
7. Ph. Cattin, D. Zlatnik, and R. Borer. Biometric system using human gait. In Proceedings of Mechatronics
and Machine Vision in Practice (M2VIP), Hong Kong, 2001.
8. R. D. DeLeon and J. A. Hodgson. Full weight-bearing hindlimb standing following stand training in the
adult spinal cat. Journal of Neurophysiology, 80(1):83–91, 1998.
9. DRAG. Database for recognition and analysis of gait. http://cubic.asu.edu/projects/DRAG.html.
10. Qiang He and Chris Debrunner. Individual recognition from periodic activity using hidden markov models.
In Proceedings of the Workshop on Human Motion, pages 47–52, 2000.
11. HID-UMD. The UMD gait recognition database. http://degas.umiacs.umd.edu/Hid/data.html.
12. H. M. Lakany and G. M. Hayes. An algorithm for recognising walkers. In J. Bigun, G. Chollet, and
G. Borgefors, editors, Proceedings of 1st International Conference on Audio and Video-Based Biometric
Person Authentication, pages 112–118, 1997.
13. P. E. Martin, D. E. Rothstein, and D. D. Larish. Effects of age and physical activity status on the speed-aerobic demand relationship of walking. Journal of Applied Physiology, 73:200–206, 1992.
Figure 4. Sample Results II
14. M. Milner, J. V. Basmajian, and A. O. Quanbury. Multifactorial analysis of walking by electromyography
and computer. American Journal of Physical Medicine, 50:235–258, 1971.
15. J. D. Shutler, M. G. Grant, M. S. Nixon, and J. N. Carter. On a large sequence-based human gait database.
In Special Session on Biometrics, Proc. 4th International Conference on Recent Advances in Soft Computing,
Nottingham (UK), pages 66–71, 2002.
16. R. Tanawongsuwan and A. Bobick. Gait recognition from time-normalized joint-angle trajectories in the
walking plane. In Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, volume 2,
pages 726–731, 2001.
17. J. R. Wolpaw and A. M. Tennissen. Activity-dependent spinal cord plasticity in health and disease. Annual Review of Neuroscience, 24:807–843, 2001.
18. G. Wu and P. R. Cavanagh. ISB recommendations for standardization in the reporting of kinematic data.
Journal of Biomechanics, 28:1257–1261, 1995.
Figure 5. Sample Results III
Figure 6. Sample Results IV: (a) segment diagram of VisualEyez, (b) ground reaction loading.