Title :
Gesture analysis for human-robot interaction
Author :
Kim, Kye Kyung ; Kwak, Keun Chang ; Chi, Su Young
Author_Institution :
Electron. & Telecommun. Res. Inst.
Abstract :
This paper presents gesture analysis for human-robot interaction. The analysis consists of four processes: detecting the hands in bimanual movements, spotting the meaningful gesture segment in the image stream, extracting features, and recognizing the gesture. Skin color analysis, image motion detection, and shape information are used for bimanual hand detection and gesture spotting. The skin color model used to track the hand gestures is obtained from the detected face region. The velocity of the moving hand, computed over consecutive image frames, is used to locate the meaningful gesture segment. Combined structural and statistical gesture features are then extracted from the image stream. Bimanual hand detection and gesture recognition were evaluated using a pan/tilt camera and a single camera mounted on a mobile robot. Gesture recognition performance was evaluated on the ETRI database, and an encouraging recognition rate of 89% was obtained.
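The paper itself provides no implementation details beyond the abstract, but as a rough illustration of the spotting pipeline it describes (skin color sampled from the detected face region, hand localization by histogram back-projection, and hand velocity thresholded over consecutive frames), the following Python/OpenCV sketch shows one way such a stage could be set up. The function names, histogram bin counts, frame rate, and speed threshold are illustrative assumptions, not the authors' method.

# Minimal sketch (not the authors' code) of face-seeded skin detection
# and velocity-based gesture spotting; all parameters are assumptions.
import cv2
import numpy as np

def skin_histogram_from_face(frame_bgr, face_rect):
    """Build a hue/saturation histogram from the detected face box."""
    x, y, w, h = face_rect
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def hand_centroid(frame_bgr, skin_hist):
    """Back-project the skin histogram and return the centroid of skin pixels
    (in practice the face region would be masked out first)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0, 1], skin_hist, [0, 180, 0, 256], 1)
    _, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def spot_gesture(centroids, fps=15.0, speed_thresh=40.0):
    """Return (start, end) frame indices where hand speed (pixels/s) exceeds
    a threshold, i.e. a candidate meaningful-gesture segment."""
    start = end = None
    for i in range(1, len(centroids)):
        if centroids[i] is None or centroids[i - 1] is None:
            continue
        speed = np.linalg.norm(centroids[i] - centroids[i - 1]) * fps
        if speed > speed_thresh:
            start = i if start is None else start
            end = i
    return (start, end)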
Keywords :
feature extraction; gesture recognition; image colour analysis; image motion analysis; man-machine systems; object detection; robot vision; bimanual movements; gesture analysis; gesture spotting; hand detection; hand gesture; human-robot interaction; image motion detection; image stream; shape information; skin color analysis; statistical features; Cameras; Data mining; Image analysis; Image color analysis; Image recognition; Motion detection; Robot vision systems; Skin; Streaming media; Detecting moving object; bimanual hand detection
Conference_Title :
2006 8th International Conference on Advanced Communication Technology (ICACT 2006)
Conference_Location :
Phoenix Park, Korea
Print_ISBN :
89-5519-129-4
DOI :
10.1109/ICACT.2006.206345