DocumentCode :
2334409
Title :
Gesture interface: modeling and learning
Author :
Yang, Jie ; Xu, Yangsheng ; Chen, C.S.
Author_Institution :
Robotics Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
1994
fDate :
8-13 May 1994
Firstpage :
1747
Abstract :
This paper presents a method for developing a gesture-based system using a multidimensional hidden Markov model (HMM). Instead of using geometric features, gestures are converted into sequential symbols. HMMs are employed to represent the gestures, and their parameters are learned from the training data. Based on the "most likely performance" criterion, gestures are recognized by evaluating the trained HMMs. We have developed a prototype to demonstrate the feasibility of the proposed method. The system achieved 99.78% accuracy on a 9-gesture isolated recognition task. Encouraging results were also obtained in experiments on continuous gesture recognition. The proposed method is applicable to any gesture represented as a multidimensional signal and will be a valuable tool in telerobotics and human-computer interfacing.
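A minimal sketch of the recognition step summarized above: each gesture class is modeled by a trained discrete HMM, and an observed symbol sequence is assigned to the model with the highest likelihood, computed here with the scaled forward algorithm. The state counts, parameter values, and gesture names below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a symbol sequence under a discrete HMM.

    obs: list of symbol indices; pi: initial state probabilities (N,);
    A: state transition matrix (N, N); B: emission matrix (N, M).
    Uses per-step scaling to avoid numerical underflow.
    """
    alpha = pi * B[:, obs[0]]            # forward variable at t = 0
    scale = alpha.sum()
    log_lik = np.log(scale)
    alpha /= scale
    for symbol in obs[1:]:               # induction over t = 1..T-1
        alpha = (alpha @ A) * B[:, symbol]
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha /= scale
    return log_lik

def recognize(obs, models):
    """Return the gesture whose trained HMM assigns the highest likelihood."""
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))

# Toy 2-state, 3-symbol models (hypothetical parameters for illustration only).
models = {
    "wave":  (np.array([0.8, 0.2]),
              np.array([[0.7, 0.3], [0.4, 0.6]]),
              np.array([[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]])),
    "point": (np.array([0.5, 0.5]),
              np.array([[0.9, 0.1], [0.2, 0.8]]),
              np.array([[0.1, 0.2, 0.7], [0.5, 0.4, 0.1]])),
}
print(recognize([0, 0, 1, 2, 2], models))
```

In the paper's setting the symbol sequences would come from quantizing the multidimensional input signal, and the per-gesture HMM parameters would be estimated from training data rather than fixed by hand as in this toy example.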
Keywords :
hidden Markov models; learning systems; man-machine systems; pattern recognition; robots; user interfaces; Gesture interface; continuous gesture recognition; geometric features; gesture-based system; human computer interfacing; learning; modeling; multidimensional hidden Markov model; multidimensional signal representation gesture; sequential symbols; telerobotics; Dictionaries; Hidden Markov models; Humans; Neural networks; Pattern recognition; Performance evaluation; Prototypes; Robot sensing systems; Statistical distributions; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1994 IEEE International Conference on Robotics and Automation
Conference_Location :
San Diego, CA, USA
Print_ISBN :
0-8186-5330-2
Type :
conf
DOI :
10.1109/ROBOT.1994.351340
Filename :
351340