General presentation of the project

PULSAR
Perception, Understanding, Learning Systems for Activity Recognition
Theme: Cognitive Systems (Cog C)
Multimedia data: interpretation and man-machine interaction
Multidisciplinary team:
Computer vision, artificial intelligence, software engineering
Team presentation
5 Research Scientists:
François Bremond (CR1 Inria, HDR)
Guillaume Charpiat (CR2 Inria, joining 15 December 2007)
Sabine Moisan (CR1 Inria, HDR)
Annie Ressouche (CR1 Inria)
Monique Thonnat (DR1 Inria, HDR) (team leader)
1 External Collaborator: Jean-Paul Rigault (Prof. UNSA)
1 Post-doc:
Sundaram Suresh (PhD Bangalore, ERCIM)
5 Temporary Engineers: B. Boulay (PhD), E. Corvee (PhD), R. Ma (PhD), L. Patino (PhD), V. Valentin
8 PhD Students:
B. Binh, N. Kayati, L. Le Thi, M.B. Kaaniche,
V. Martin, A.T. Nghiem, N. Zouba, M. Zuniga
1 External Visitor: Tomi Raty (VTT Finland)
PULSAR
Objective: Cognitive Systems for Activity Recognition
Activity recognition: Real-time Semantic Interpretation of Dynamic Scenes
Dynamic scenes:

Several interacting human beings, animals or vehicles

Long term activities (hours or days)

Large scale activities in the physical world (located in large space)

Observed by a network of video cameras and sensors
Real-time Semantic interpretation:

Real-time analysis of sensor output

Semantic interpretation with a priori knowledge of interesting behaviors
PULSAR Scientific objectives:
Objective: Cognitive Systems for Activity Recognition
Cognitive systems: perception, understanding and learning systems

Physical object recognition

Activity understanding and learning

System design and evaluation
Two complementary research directions:

Scene Understanding for Activity Recognition

Activity Recognition Systems
PULSAR target applications
Two application domains:

Safety/security (e.g. airport monitoring)

Healthcare (e.g. assistance to the elderly)
Cognitive Systems for Activity Recognition
Airport Apron Monitoring
Outdoor scenes with complex interactions between humans, ground vehicles, and aircraft
Aircraft preparation: optional tasks, independent tasks, temporal constraints
Cognitive Systems for Activity Recognition
Monitoring Daily Living Activities of Elderly
Goal: Increase independence and quality of life:

Enable people to live at home

Delay entry into a nursing home

Relieve family members and caregivers
Approach:

Detecting changes in behavior (missing activities, disorder, interruptions, repetitions, inactivity)

Calculating the degree of frailty of elderly people
Example of normal activity:
Meal preparation (in kitchen) (11h–12h)
Eating (in dining room) (12h–12h30)
Resting, TV watching (in living room) (13h–16h)
…
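To make the approach concrete, here is a minimal sketch (illustrative code, not the PULSAR implementation; the activity names, zones and the check_day helper are assumptions) that compares one day's recognized activities against a model of a normal day like the schedule above and flags missing activities:

from datetime import time

# Hypothetical model of a normal day, mirroring the example schedule above.
NORMAL_DAY = [
    ("meal_preparation", "kitchen",     time(11, 0), time(12, 0)),
    ("eating",           "dining_room", time(12, 0), time(12, 30)),
    ("resting",          "living_room", time(13, 0), time(16, 0)),
]

def check_day(observed):
    """observed: list of (activity, zone, start, end) recognized during one day.
    Returns alerts for expected activities that were not observed."""
    alerts = []
    seen = {(activity, zone) for activity, zone, _, _ in observed}
    for activity, zone, start, end in NORMAL_DAY:
        if (activity, zone) not in seen:
            alerts.append(f"missing activity: {activity} in {zone} (expected {start}-{end})")
    return alerts

# Example: the person skipped meal preparation on this day.
observed = [("eating", "dining_room", time(12, 10), time(12, 40)),
            ("resting", "living_room", time(13, 0), time(16, 0))]
print(check_day(observed))

Alerts of this kind could then contribute to the frailty indicator mentioned above.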
Gerhome laboratory (CSTB, PULSAR)
http://gerhome.cstb.fr
Sensors: presence sensor, contact sensors to detect “open/close”, water sensor
From ORION to PULSAR
Orion contributions

4D semantic approach to Video Understanding

Program supervision approach to Software Reuse

VSIP platform for real-time video understanding → Keeneo start-up

LAMA platform for knowledge-based system design
From ORION to PULSAR
1) New Research Axis: Software architecture for activity recognition
2) New Application Domain: Healthcare (e.g. assistance to the elderly)
3) New Research Axis: Machine learning for cognitive systems (mixing perception, understanding and learning)
4) New Data Types: Video enriched with other sensors (e.g. contact sensors, …)
PULSAR research directions
Perception for Activity Recognition (F Bremond, G Charpiat, M Thonnat)

Goal: to extract rich physical object descriptions
Difficulty: to obtain real-time performance and robust detection in dynamic and complex situations
Approach:
Perception methods for shape, gesture and trajectory description of multiple objects
Multimodal data fusion from large sensor networks sharing the same 3D referential
Formalization of the conditions of use of the perception methods
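As an illustration of multimodal fusion in a shared 3D referential, the following sketch (hypothetical calibration values, detections and helper names; not the PULSAR perception code) maps each sensor's 3D detections into a common world frame with a per-sensor rigid transform and merges detections that fall close together:

import numpy as np

# Hypothetical per-sensor calibration: rotation R and translation t into the world frame.
CALIBRATION = {
    "camera_1": (np.eye(3), np.array([0.0, 0.0, 0.0])),
    "camera_2": (np.array([[0.0, -1.0, 0.0],
                           [1.0,  0.0, 0.0],
                           [0.0,  0.0, 1.0]]), np.array([4.0, 0.0, 0.0])),
}

def to_world(sensor, point):
    """Map a 3D detection from a sensor frame into the shared world frame."""
    R, t = CALIBRATION[sensor]
    return R @ np.asarray(point, dtype=float) + t

def fuse(detections, radius=0.5):
    """Greedily merge (sensor, 3D point) detections lying within `radius` meters."""
    fused = []  # list of groups of world-frame points
    for sensor, point in detections:
        p = to_world(sensor, point)
        for group in fused:
            if np.linalg.norm(p - group[0]) < radius:
                group.append(p)
                break
        else:
            fused.append([p])
    return [np.mean(group, axis=0) for group in fused]

# Two cameras observing the same person from different viewpoints.
print(fuse([("camera_1", [2.0, 1.0, 0.0]), ("camera_2", [1.05, 2.0, 0.0])]))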
PULSAR research directions
Understanding for Activity Recognition (M Thonnat F Bremond S Moisan)

Goal: physical object activity recognition based on a priori models
Difficulty: vague end-user specifications and numerous observation conditions
Approach:
Perceptual event ontology interfacing the perception and the human operator levels
User-friendly activity model formalisms based on this ontology
Real-time activity recognition algorithms handling perceptual feature uncertainty and activity model complexity
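A minimal sketch of how such a perceptual event ontology can sit between the perception level and the operator level (assumed Python structures, not the actual LAMA/PULSAR formalism; it mirrors the Use_microwave scenario shown at the end of this presentation): activity models are written against ontology concepts such as primitive states and events, never against raw sensor output.

from dataclasses import dataclass

# Ontology concepts bridging perception output and operator-level models (simplified).
@dataclass
class PrimitiveState:
    name: str        # e.g. "inside_zone", computed from tracked physical objects
    actors: tuple    # expected physical object types, e.g. ("Person", "Zone")

@dataclass
class PrimitiveEvent:
    name: str        # e.g. "Open_Microwave", computed from a sensor state change
    actors: tuple

@dataclass
class CompositeEvent:
    name: str          # e.g. "Use_microwave"
    components: list   # primitive states / events it is built from
    constraints: list  # symbolic (temporal) constraints between components

# An operator-level activity model written only with ontology concepts:
use_microwave = CompositeEvent(
    name="Use_microwave",
    components=[PrimitiveState("inside_zone", ("Person", "Zone")),
                PrimitiveEvent("Open_Microwave", ("Equipment",)),
                PrimitiveEvent("Close_Microwave", ("Equipment",))],
    constraints=["Open_Microwave during inside_zone",
                 "Open_Microwave.start + 10s < Close_Microwave.start"],
)
print(use_microwave.name, [c.name for c in use_microwave.components])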
PULSAR research directions
Learning for Activity Recognition (F Bremond, G Charpiat, M Thonnat)



Goal: learning to decrease the effort needed for building activity models
Difficulty: to get meaningful positive and negative samples
Approach:
Automatic perception method selection by performance evaluation and ground truth
Dynamic parameter setting based on context clustering and parameter value optimization
Learning perceptual event concept detectors
Learning the mapping between basic event concepts and activity models
Learning complex activity models from frequent event patterns
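The last item, learning complex activity models from frequent event patterns, can be illustrated by this small sketch (the toy event logs and the frequent_bigrams helper are assumptions): consecutive event pairs that recur across many observed days are kept as candidate components of activity models.

from collections import Counter

# Hypothetical logs of recognized primitive events, one list per observed day.
days = [
    ["enter_kitchen", "open_fridge", "use_microwave", "enter_dining_room"],
    ["enter_kitchen", "open_fridge", "use_microwave", "enter_living_room"],
    ["enter_kitchen", "use_microwave", "enter_dining_room"],
]

def frequent_bigrams(logs, min_support=2):
    """Count consecutive event pairs, keeping those seen in at least `min_support` logs."""
    counts = Counter()
    for log in logs:
        counts.update(set(zip(log, log[1:])))  # each pair counted once per log
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Frequent pairs such as ("open_fridge", "use_microwave") become candidate
# components of a larger activity model (e.g. meal preparation) to be refined.
print(frequent_bigrams(days))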
PULSAR research directions
Activity Recognition Systems (S Moisan, A Ressouche, J-P Rigault)

Goal: provide new techniques for the easy design of effective and efficient activity recognition systems
Difficulty: reusability vs. efficiency
From the VSIP library and the LAMA platform to an AR platform
Approach:
Activity Models: models, languages and tools for all AR tasks
Platform Architecture: design a platform with real-time response, parallel and distributed capabilities
System Safeness: adapt state-of-the-art verification & validation techniques for AR system design
Objectives for the next period
PULSAR: Scene Understanding for Activity Recognition



Perception: multi-sensor fusion, interest points and mobile regions, shape statistics
Understanding: uncertainty, 4D coherence, ontology for activity recognition
Learning: parameter setting, event detectors, video mining
PULSAR: Activity Recognition Systems
From the LAMA platform to an AR platform:
Model extensions: modeling time and scenarios
Architecture: real-time response, parallelization, distribution
User-friendliness and safeness of use: theory and tools for a component framework, scalability of verification methods
Multimodal Fusion for Monitoring Daily Living Activities of the Elderly
[Demo snapshots: person recognition and 3D posture recognition during the “resting in living room” activity; person recognition and multimodal recognition during the “meal preparation” activity]
Understanding and Learning for Airport Apron Monitoring
European project AVITRACK (2004-2006): predefined activities
European project COFRIEND (2008-2010): activity learning, dynamic configurations
Activity Recognition Platform Architecture
Application level: airport monitoring, vandalism detection, elderly monitoring
Configuration and deployment tools
Task level: program supervision, object recognition and tracking, scenario recognition
Communication and interaction facilities
Component level: perception components, understanding components, learning components
Usage support tools: ontology management, parser generation, component assembly, simulation & testing, verification
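A minimal sketch of this three-level organization (assumed interfaces and stub components, not the actual platform code): reusable components expose a small processing interface, a task chains them, and an application is a deployed configuration of tasks.

from typing import Protocol

# Component level: reusable perception / understanding / learning components.
class Component(Protocol):
    def process(self, data): ...

class PersonDetector:            # perception component (stub)
    def process(self, frame):
        return [{"type": "Person", "position": (2.0, 1.0)}]

class ScenarioRecognizer:        # understanding component (stub)
    def process(self, objects):
        return ["inside_zone(Person, Kitchen)"] if objects else []

# Task level: a task chains components (e.g. object detection then scenario recognition).
class Task:
    def __init__(self, *components: Component):
        self.components = components
    def run(self, data):
        for component in self.components:
            data = component.process(data)
        return data

# Application level: a deployed configuration of tasks (e.g. elderly monitoring).
elderly_monitoring = Task(PersonDetector(), ScenarioRecognizer())
print(elderly_monitoring.run("frame_0001"))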
PULSAR Project-team
Any Questions?
Video Data Mining
Objective: knowledge extraction for video activity monitoring with unsupervised learning techniques.
Methods: trajectory characterization through clustering (SOM) and behaviour analysis of objects with relational analysis [1].
Self-Organizing Maps (SOM):
$t_i$: $i$-th trajectory, $i = 1 \ldots n$ ($n$: number of trajectories); $m_k$: prototype of neuron $k$, $k = 1 \ldots K$ ($K$: number of clusters)
Winning neuron $w$ for trajectory $t_i$: $\|t_i - m_w\| = \min_k \|t_i - m_k\|$
Neighborhood weight: $h_{wk} = e^{-\|t_i - m_k\|^2 / (2\sigma^2)}$
Update rule: $m_k \leftarrow m_k + h_{wk}\,(t_i - m_k)$
Relational Analysis:
Analysis of the similarity $c_{ii'}^{V_j}$ between two individuals $i, i'$ given a variable $V_j$, computed over the relational matrix of the objects $O_1, \ldots, O_n$.
[1] Benhadda H., Marcotorchino F., “Introduction à la similarité régularisée en analyse relationnelle”, Revue de statistique, Vol. 46, N°1, pp. 45-69, 1998.
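A minimal sketch of the SOM trajectory clustering described above (toy data; the trajectory encoding, sigma and the added learning rate are illustrative choices, not the published parameters): each encoded trajectory pulls all prototypes toward it with the Gaussian weight, and the closest prototype gives its cluster.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: n trajectories resampled to a fixed length and flattened into vectors t_i.
n, K, dim, sigma, lr = 200, 9, 10, 1.0, 0.1
trajectories = rng.normal(size=(n, dim))   # stand-ins for encoded trajectories
prototypes = rng.normal(size=(K, dim))     # cluster prototypes m_k

for _ in range(20):
    for t in trajectories:
        d2 = np.sum((prototypes - t) ** 2, axis=1)        # ||t_i - m_k||^2 for all k
        h = np.exp(-d2 / (2 * sigma ** 2))                # neighborhood weights h_wk
        prototypes += lr * h[:, None] * (t - prototypes)  # m_k <- m_k + h_wk (t_i - m_k)

# The winning neuron (closest prototype) gives each trajectory's cluster.
labels = [int(np.argmin(np.sum((prototypes - t) ** 2, axis=1))) for t in trajectories]
print("trajectories per cluster:", np.bincount(labels, minlength=K))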
Video Data Mining Results
Step 1: Trajectory clustering (SOM)
Trajectory Cluster 9: walk from north gates to south exit (2052 trajectories)
Step 2: Behaviour Relational Analysis
Trajectory Cluster 1: walk from north door to vending machines
Behaviour Cluster 19: individuals (not groups) buy a ticket at the entrance
Multimodal Fusion for Monitoring Daily Living Activities of the Elderly
Scenario for meal preparation
Composite Event (Use_microwave,
  Physical Objects ((p: Person), (Microwave: Equipment), (Kitchen: Zone))
  Components ((p_inz: PrimitiveState inside_zone (p, Kitchen))
              (open_mw: PrimitiveEvent Open_Microwave (Microwave))
              (close_mw: PrimitiveEvent Close_Microwave (Microwave)))
  Constraints ((open_mw during p_inz)
               (open_mw->StartTime + 10s < close_mw->StartTime)))
Open_Microwave / Close_Microwave: detected by contact sensors; inside_zone: detected by video cameras
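A minimal sketch (hypothetical timestamps, data classes and function names; not the PULSAR recognition engine) of checking this scenario's two constraints against timestamped detections: the Open_Microwave event must occur while the video-tracked person is inside the kitchen, and Close_Microwave must start more than 10 seconds after Open_Microwave.

from dataclasses import dataclass

@dataclass
class Interval:      # a recognized primitive state, e.g. inside_zone(p, Kitchen)
    start: float     # seconds since midnight
    end: float

@dataclass
class Event:         # an instantaneous primitive event, e.g. Open_Microwave
    time: float      # seconds since midnight

def during(event: Event, state: Interval) -> bool:
    """The event happens inside the state's time interval."""
    return state.start <= event.time <= state.end

def use_microwave(p_inz: Interval, open_mw: Event, close_mw: Event) -> bool:
    """Check the two constraints of the Use_microwave composite event."""
    return during(open_mw, p_inz) and (open_mw.time + 10 < close_mw.time)

# Hypothetical detections: person in the kitchen 11:02:00-11:20:00 (video),
# microwave opened at 11:05:00 and closed at 11:05:45 (contact sensors).
p_inz = Interval(start=11 * 3600 + 120, end=11 * 3600 + 1200)
open_mw, close_mw = Event(11 * 3600 + 300), Event(11 * 3600 + 345)
print("Use_microwave recognized:", use_microwave(p_inz, open_mw, close_mw))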