DocumentCode :
2527467
Title :
Classification of hand postures based on 3D vision model for human-robot interaction
Author :
Takimoto, Hironori ; Yoshimori, Seiki ; Mitsukura, Yasue ; Fukumi, Minoru
Author_Institution :
Fac. of Comput. Sci. & Syst. Eng., Okayama Prefectural Univ., Okayama, Japan
fYear :
2010
fDate :
13-15 Sept. 2010
Firstpage :
292
Lastpage :
297
Abstract :
In this paper, a method for hand posture recognition that is robust to changes in hand posture in real environments is proposed. Conventionally, data glove devices and 3D scanners have been used to extract hand-shape features. However, the performance of each approach is affected by changes in hand posture. Therefore, this paper proposes a posture fluctuation model for efficient hand posture recognition, based on 3D hand shape and color features obtained from a stereo camera. A large dictionary for posture recognition is built from learned hand images that are automatically generated from a single scanned hand image using several of the proposed models. To show the effectiveness of the proposed method, recognition performance and processing times are compared with those of a conventional method. In addition, an evaluation experiment is performed using Japanese sign language.
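The abstract describes the approach only at a high level: a dictionary of posture variants is generated automatically from a single scanned hand image via posture fluctuation models, and recognition is performed against that dictionary using 3D shape and color features. The following Python sketch is purely illustrative and is not the authors' implementation; the augmentation (small random rotations and scalings), the radius-histogram shape feature, and the nearest-neighbour classifier are all assumptions chosen to make the dictionary-building idea concrete.

import numpy as np

def augment_templates(points, n_variants=50, seed=None):
    """Build synthetic posture variants from one 3D hand scan (illustrative
    stand-in for a posture fluctuation model). Each variant applies a small
    random rotation about the vertical axis and a small scale change."""
    rng = np.random.default_rng(seed)
    variants = []
    for _ in range(n_variants):
        angle = rng.normal(scale=0.15)          # radians, roughly 8.6 deg std
        scale = 1.0 + rng.normal(scale=0.05)    # about +/- 5% size change
        c, s = np.cos(angle), np.sin(angle)
        rot = np.array([[c, -s, 0.0],
                        [s,  c, 0.0],
                        [0.0, 0.0, 1.0]])
        variants.append(scale * points @ rot.T)
    return variants

def shape_feature(points, bins=16):
    """Crude shape descriptor: histogram of point distances from the centroid."""
    radii = np.linalg.norm(points - points.mean(axis=0), axis=1)
    hist, _ = np.histogram(radii, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def classify(query_points, dictionary):
    """Nearest-neighbour match of a query hand against the augmented dictionary."""
    q = shape_feature(query_points)
    dists = [np.linalg.norm(q - f) for _, f in dictionary]
    return dictionary[int(np.argmin(dists))][0]

# Toy usage: two hypothetical postures ("open" and "fist"), each represented by
# one random point cloud, expanded into a dictionary and used to label a query.
rng = np.random.default_rng(0)
open_hand = rng.random((200, 3))        # stand-in for a scanned open hand
fist = 0.5 * rng.random((200, 3))       # stand-in for a scanned fist
dictionary = [("open", shape_feature(v)) for v in augment_templates(open_hand, seed=1)]
dictionary += [("fist", shape_feature(v)) for v in augment_templates(fist, seed=2)]
print(classify(0.5 * rng.random((200, 3)), dictionary))   # expected output: fist

In the paper, the dictionary entries combine 3D shape and color cues from the stereo camera; the single-feature nearest-neighbour matcher above is only a placeholder for that classification stage.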
Keywords :
feature extraction; human-robot interaction; image classification; image colour analysis; robot vision; stereo image processing; 3D scanner; 3D vision model; Japanese sign language; data glove device; feature extraction; hand color features; hand posture recognition; hand postures classification; human-robot interaction; plural proposed models; posture fluctuation model; stereo camera; Cameras; Humans; Image color analysis; Shape; Solid modeling; Three dimensional displays; Wrist;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
RO-MAN, 2010 IEEE
Conference_Location :
Viareggio, Italy
ISSN :
1944-9445
Print_ISBN :
978-1-4244-7991-7
Type :
conf
DOI :
10.1109/ROMAN.2010.5598646
Filename :
5598646