DocumentCode :
248242
Title :
Automatic extraction of semantic features for real-time action recognition using depth architecture networks
Author :
Tran Thang Thanh ; Fan Chen ; Koji Kotani ; Le Bac
Author_Institution :
Sch. of Inf. Sci., Japan Adv. Inst. of Sci. & Technol., Ishikawa, Japan
fYear :
2014
fDate :
27-30 Oct. 2014
Firstpage :
1540
Lastpage :
1544
Abstract :
Motion analysis automatically captures, recognizes, and predicts ongoing human activities, and can be widely applied in domains such as security surveillance of public spaces, including shopping centers and airports. With the development of technologies such as 3D specialized markers, we can capture moving signals from marker joints and build large sets of 3D motion capture (MOCAP) data. In this work, we propose a method to automatically extract action features that can be used for action recognition. We create a depth architecture model by combining multilevel networks that can focus on recognizing objects in detail. These networks learn the extracted features and perform action recognition. The proposed model not only extracts semantic action features from 3D MOCAP data, but can also be applied to real-time action recognition.
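Illustrative sketch (not from the paper): the abstract mentions relational features computed from 3D marker joints and multilevel networks that classify actions. The record does not specify the actual features or architecture, so the following minimal Python sketch only illustrates the general idea under assumed choices: pairwise joint distances as relational features, and a small two-level network producing action scores. The names relational_features and TwoLayerNet, the joint count, and the layer sizes are all hypothetical.

    import numpy as np

    def relational_features(joints):
        # joints: (J, 3) array of 3D marker positions for one frame.
        # Returns the J*(J-1)/2 pairwise joint distances as a flat vector
        # (one plausible form of "relational feature"; assumed, not the paper's).
        diffs = joints[:, None, :] - joints[None, :, :]   # (J, J, 3)
        dists = np.linalg.norm(diffs, axis=-1)            # (J, J)
        iu = np.triu_indices(len(joints), k=1)
        return dists[iu]

    class TwoLayerNet:
        # Minimal two-level network: features -> hidden layer -> action scores.
        def __init__(self, n_in, n_hidden, n_actions, seed=0):
            rng = np.random.default_rng(seed)
            self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
            self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_actions))

        def predict(self, x):
            h = np.maximum(0.0, x @ self.W1)  # ReLU hidden activations
            return h @ self.W2                # raw per-action scores

    # Example: 20 joints -> 190 pairwise distances -> 5 action classes.
    frame = np.random.rand(20, 3)             # stand-in for one MOCAP frame
    net = TwoLayerNet(n_in=190, n_hidden=64, n_actions=5)
    scores = net.predict(relational_features(frame))
    print(scores.argmax())                    # index of the highest-scoring action

Because each frame is processed independently with cheap feature computation and small matrix products, a pipeline of this shape could plausibly run in real time, which matches the abstract's claim; the paper's actual method should be consulted for details.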
Keywords :
feature extraction; image motion analysis; object recognition; 3D MOCAP data; 3D motion capture data; 3D specialized markers; airports; automatic action feature extraction; automatic semantic feature extraction; depth architecture networks; motion analysis; public spaces; real-time action recognition; security surveillance; shopping centers; Computer architecture; Feature extraction; Joints; Radio frequency; Real-time systems; Semantics; Three-dimensional displays; 3D MOCAP Data; Depth architecture networks; Relational Features; Semantic Action Features; Semantic-based;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Image Processing (ICIP), 2014 IEEE International Conference on
Conference_Location :
Paris
Type :
conf
DOI :
10.1109/ICIP.2014.7025308
Filename :
7025308