Title :
Novel Machine Learning for Hand Gesture Recognition Using Multiple View
Author_Institution :
Inst. of Commun. & Inf. Technol., Zhejiang Gongshang Univ., Hangzhou, China
Abstract :
Different from conventional communication methods between users and machines, we use hand gestures to control equipment. This paper presents a hand gesture recognition method applied to a human-computer interaction (HCI) system. It introduces a new method for automatic gesture area segmentation and orientation normalization of the gesture: the user is not required to keep gestures upright in a regular position, as the system segments and normalizes them automatically. The method is an unsupervised nonlinear dimensionality reduction approach that exploits local linearity to discover the low-dimensional manifold embedded in the high-dimensional space. This suggests that the method may preserve the neighborhood configuration of the nonlinear structure of the multi-view hand shape data distribution. Experiments show that the method is highly accurate: the gesture pointing accuracy of our system was measured over 80 pointing recognition trials, with a success rate above 90%.
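The dimensionality reduction described above corresponds to locally linear embedding (LLE), which reconstructs each sample from its nearest neighbors and seeks a low-dimensional embedding that preserves those local reconstruction weights. The following is a minimal sketch of how such an embedding could be computed with scikit-learn's LocallyLinearEmbedding; it is not the paper's implementation, and the sample count, feature dimensionality, neighbor count, and embedding dimension are placeholder assumptions.

# Minimal LLE sketch (assumptions: 500 gesture samples, 1024-dimensional
# multi-view hand shape feature vectors, 12 neighbors, 3 output dimensions).
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical multi-view hand shape features: one row per gesture sample.
X = np.random.rand(500, 1024)

# LLE reconstructs each sample from its k nearest neighbors and finds a
# low-dimensional embedding that preserves those local linear weights.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=3)
X_embedded = lle.fit_transform(X)

print(X_embedded.shape)  # (500, 3): low-dimensional manifold coordinates

In practice the embedded coordinates would feed a classifier or pointing estimator; the choice of neighbor count controls how locally the manifold's linear structure is modeled.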
Keywords :
gesture recognition; human computer interaction; image segmentation; learning (artificial intelligence); statistical analysis; HCI system; automatic gesture area segmentation; conventional communication method; hand gesture recognition; human-computer interaction system; locally linear embedding; machine learning; multiview hand shape data distribution; orientation normalization; statistical LLE algorithm; unsupervised nonlinear dimensionality reduction approach; Automatic control; Cameras; Communication system control; Control systems; Human computer interaction; Machine learning; Microphones; Shape; Target tracking; Video sequences; gesture recognition; machine learning; manifold;
Conference_Title :
2009 IITA International Conference on Control, Automation and Systems Engineering (CASE 2009)
Conference_Location :
Zhangjiajie
Print_ISBN :
978-0-7695-3728-3
DOI :
10.1109/CASE.2009.169