Title :
Human behavior interpretation system based on view and motion-based aspect models
Author :
Furukawa, Masayuki ; Kanbara, Yoshio ; Minato, Takashi ; Ishiguro, Hiroshi
Author_Institution :
Dept. of Adaptive Mech. Syst., Osaka Univ., Suita, Japan
Abstract :
This paper proposes an interpretation system for recognizing human motion behaviors and constructing behavior rules, called a Behavior Grammar. The system recognizes human motion behaviors based on gestures, locations, directions, and distances by using a distributed omnidirectional vision system (DOVS). The DOVS, which consists of multiple omnidirectional cameras, is a prototype of a perceptual information infrastructure for monitoring and recognizing the real world. Sequences of interpreted behaviors are represented as a behavior graph from which behavior rules are extracted. This paper shows how the system achieves robust, real-time visual recognition based on View and Motion-Based Aspect Models (VAMBAM), and presents the resultant behavior graph.
Keywords :
computer vision; gesture recognition; pattern recognition; real-time systems; VAMBAM; behavior grammar; behavior graphs; distributed omnidirectional vision system; human behavior interpretation system; human gestures; human motion behavior recognition; motion-based models; multiple omnidirectional cameras; perceptual information infrastructure; real-time visual recognition; view and motion-based aspect models; view-based models; Adaptive systems; Cameras; Data mining; Humans; Machine vision; Monitoring; Prototypes; Real time systems; Robots; Robustness;
Conference_Title :
Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '03), 2003
Print_ISBN :
0-7803-7736-2
DOI :
10.1109/ROBOT.2003.1242237