Title :
Attentive gesture recognition
Author :
Dodge, S.F. ; Karam, Lina J.
Author_Institution :
Sch. of Electr., Arizona State Univ., Tempe, AZ, USA
fDate :
Sept. 30 2012-Oct. 3 2012
Abstract :
This paper presents a novel method for static gesture recognition based on visual attention. Our proposed method uses a visual attention model to automatically select points that correspond to fixation points of the human eye. Gesture recognition is then performed on the determined fixation points: shape context descriptors are used to compare the sparse fixation points of gestures for classification. Simulation results are presented to illustrate the performance of the proposed perceptual-based attentive gesture recognition method. The proposed method not only supports the development of more natural, user-centric interactive interfaces but also achieves a 96.42% classification accuracy on the Triesch database of hand postures, outperforming other methods reported in the literature.
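The abstract's pipeline compares sparse fixation points with shape context descriptors. As a rough illustration of what such a descriptor looks like, the sketch below builds a log-polar histogram of the other points' positions relative to one point and compares two histograms with a chi-square cost. The bin counts, radial range, and cost function here are illustrative assumptions; the paper's actual attention model and descriptor parameters are not given in this record.

```python
import math

def shape_context(points, index, n_r=5, n_theta=12):
    """Simplified shape-context sketch: log-polar histogram of the
    positions of all other points relative to points[index].
    Bin counts and radial range are illustrative, not the paper's."""
    px, py = points[index]
    others = [(x - px, y - py) for i, (x, y) in enumerate(points) if i != index]
    dists = [math.hypot(dx, dy) for dx, dy in others]
    mean_d = sum(dists) / len(dists)  # normalize by mean distance for scale invariance
    hist = [[0] * n_theta for _ in range(n_r)]
    for (dx, dy), d in zip(others, dists):
        r = max(d / mean_d, 1e-6)  # guard against log(0) for coincident points
        # log-spaced radial bins covering roughly 0.125..2x the mean distance
        r_bin = min(n_r - 1, max(0, int(math.log2(r / 0.125) / math.log2(2.0 / 0.125) * n_r)))
        theta = math.atan2(dy, dx) % (2 * math.pi)
        t_bin = min(n_theta - 1, int(theta / (2 * math.pi) * n_theta))
        hist[r_bin][t_bin] += 1
    return hist

def chi2_cost(h1, h2):
    """Chi-square distance between two shape-context histograms."""
    cost = 0.0
    for row1, row2 in zip(h1, h2):
        for a, b in zip(row1, row2):
            if a + b > 0:
                cost += (a - b) ** 2 / (a + b)
    return 0.5 * cost
```

A full shape-context matcher would additionally solve a point-correspondence problem (e.g. a Hungarian assignment over the pairwise chi-square costs) before scoring two gestures; that step is omitted in this sketch.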
Keywords :
gesture recognition; image classification; visual databases; Triesch database; classification accuracy; eye fixation point; gesture classification; hand posture; perceptual-based attentive gesture recognition method; shape context descriptor; sparse fixation point; static gesture recognition; user-centric interactive interface; visual attention fixation point; visual attention model; Context; Databases; Gesture recognition; Humans; Image segmentation; Shape; Visualization; Static gesture recognition; human computer interaction; visual attention;
Conference_Titel :
Image Processing (ICIP), 2012 19th IEEE International Conference on
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4673-2534-9
ISSN :
1522-4880
DOI :
10.1109/ICIP.2012.6466824