Title :
Recognizing non-rigid human actions using joints tracking in space-time
Author :
Lu, Xin ; Liu, Qiong ; Oe, Shunichiro
Author_Institution :
Fac. of Eng., Tokushima Univ., Japan
Abstract :
This paper presents a new method to recognize and restructure human motions in space-time using the tracking information of human joints. A specific posture can be recognized in an actual frame by finding the correspondence between the actual frame and a key frame. Within each key frame, prototype information is extracted to represent a defined posture (one step of an action). By recognizing a person's postures in order, it is possible to map body locations from the key frames to the actual frames in a long video sequence and confirm the posture sequence (action). The proposed method is tolerant of substantial deformation between image and prototype, and recognizes qualitatively similar joint trajectories that compose a human action. The method uses an LKT (Lucas-Kanade-Tomasi) feature tracker to track joints in key frames and actual frames. A factored sampling method is then applied to replace a single tracker with a tracker set, enhancing the joint-tracking accuracy. The experimental results demonstrate the accuracy and efficiency of recognizing non-rigid human actions.
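The tracker-set idea in the abstract, replacing one point tracker with a set of sampled trackers whose outputs are fused, can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: the `fuse_tracker_set` helper, the Gaussian perturbation of start points, and the weighted-mean fusion rule are hypothetical choices standing in for the paper's factored sampling step, and a synthetic displacement replaces a real LKT optical-flow update.

```python
import numpy as np

def fuse_tracker_set(candidates, weights):
    """Fuse a set of tracker outputs into one joint estimate.

    candidates: (N, 2) array of (x, y) positions reported by the N
    sampled trackers; weights: per-tracker match scores. Returns the
    weighted-mean position (a hypothetical fusion rule standing in
    for the paper's factored sampling step).
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize scores into a probability distribution
    return (np.asarray(candidates, dtype=float) * w[:, None]).sum(axis=0)

def track_joint(prev_pos, flow, n_trackers=50, sigma=2.0, rng=None):
    """Track one joint with a tracker set instead of a single tracker.

    Each tracker starts from a Gaussian perturbation of the previous
    joint position, follows a (here, synthetic) displacement `flow`
    in place of one LKT update, and is scored by its distance to the
    tracker-set consensus so that outliers are down-weighted.
    """
    rng = np.random.default_rng(rng)
    starts = prev_pos + rng.normal(0.0, sigma, size=(n_trackers, 2))
    ends = starts + flow                 # stand-in for one LKT step per tracker
    consensus = ends.mean(axis=0)
    dists = np.linalg.norm(ends - consensus, axis=1)
    weights = np.exp(-0.5 * (dists / sigma) ** 2)  # Gaussian outlier damping
    return fuse_tracker_set(ends, weights)

# Synthetic check: a joint at (100, 50) moves by (+3, -1) between frames.
est = track_joint(np.array([100.0, 50.0]), np.array([3.0, -1.0]), rng=0)
```

With all sampled trackers agreeing on the displacement, the fused estimate lands near (103, 49); the point of the set is that a few diverging trackers would be suppressed by the weighting rather than corrupting the joint position.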
Keywords :
feature extraction; gesture recognition; image sequences; motion estimation; target tracking; Lucas-Kanade-Tomasi feature tracker; body locations mapping; human action recognition; human joints tracking; human motion restructuring; human motion synthesis; motion detection; motion segmentation; posture recognition; posture sequences; video sequences; Data mining; Humans; Image recognition; Image sequences; Joints; Motion detection; Prototypes; Sampling methods; Tracking; Video sequences;
Conference_Titel :
Information Technology: Coding and Computing, 2004. Proceedings. ITCC 2004. International Conference on
Print_ISBN :
0-7695-2108-8
DOI :
10.1109/ITCC.2004.1286534