DocumentCode :
2247854
Title :
Human behavior interpretation system based on view and motion-based aspect models
Author :
Furukawa, Masayuki ; Kanbara, Yoshio ; Minato, Takashi ; Ishiguro, Hiroshi
Author_Institution :
Dept. of Adaptive Mech. Syst., Osaka Univ., Suita, Japan
Volume :
3
fYear :
2003
fDate :
14-19 Sept. 2003
Firstpage :
4160
Abstract :
This paper proposes an interpretation system for recognizing human motion behaviors and constructing behavior rules, called a Behavior Grammar. The system recognizes human motion behaviors from gestures, locations, directions, and distances using a distributed omnidirectional vision system (DOVS). The DOVS, which consists of multiple omnidirectional cameras, is a prototype of a perceptual information infrastructure for monitoring and recognizing the real world. Sequences of interpreted behaviors are represented as a behavior graph from which behavior rules are extracted. The paper shows how the system achieves robust, real-time visual recognition based on View and Motion Based Aspect Models (VAMBAM), and presents the resulting behavior graph.
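To illustrate the idea of turning a sequence of interpreted behaviors into a behavior graph for rule extraction, the following is a minimal Python sketch. It is not the authors' implementation: the function names, the example behavior labels, and the count-based rule threshold are all assumptions made for illustration only.

```python
from collections import defaultdict

def build_behavior_graph(behavior_sequence):
    # Directed graph: edge weights count observed transitions between
    # consecutively interpreted behaviors (hypothetical representation).
    graph = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(behavior_sequence, behavior_sequence[1:]):
        graph[prev][curr] += 1
    return graph

def frequent_rules(graph, min_count=2):
    # Keep a candidate rule "A -> B" if the transition A -> B was observed
    # at least min_count times (threshold is an illustrative assumption).
    return [(a, b, n) for a, edges in graph.items()
            for b, n in edges.items() if n >= min_count]

if __name__ == "__main__":
    # Hypothetical sequence of behaviors recognized by the vision system.
    seq = ["approach", "point", "approach", "point", "grasp", "leave"]
    graph = build_behavior_graph(seq)
    print(frequent_rules(graph, min_count=2))  # e.g. [('approach', 'point', 2)]
```

In this sketch the graph nodes are behavior labels and the edge weights are transition counts, so "rules" reduce to frequently observed transitions; the paper's Behavior Grammar is richer, but the data flow (recognized behaviors → graph → rules) is the same.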
Keywords :
computer vision; gesture recognition; pattern recognition; real-time systems; VAMBAM; behavior grammar; behavior graphs; distributed omnidirectional vision system; human behavior interpretation system; human gestures; human motion behavior recognition; motion-based models; multiple omnidirectional cameras; perceptual information infrastructure; real-time visual recognition; view and motion-based aspect models; view-based models; Adaptive systems; Cameras; Data mining; Humans; Machine vision; Monitoring; Prototypes; Real time systems; Robots; Robustness;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2003 IEEE International Conference on Robotics and Automation (ICRA '03)
ISSN :
1050-4729
Print_ISBN :
0-7803-7736-2
Type :
conf
DOI :
10.1109/ROBOT.2003.1242237
Filename :
1242237