Title :
A Robust and Fast Monocular-Vision-Based Hand Tracking Method for Virtual Touch Screen
Author :
Zhao Yong-jia; Dai Shu-ling
Author_Institution :
Sch. of Autom. Sci. & Electr. Eng., Beihang Univ., Beijing, China
Abstract :
An articulated hand tracking method using a monocular camera for a virtual touch screen system (VTSS) is proposed. The VTSS is a novel vision-based interactive surface that requires no physical touch screen. The articulated hand model provides richer information about contact with the interactive surface than fingertips alone. Using a simplified kinematic model of the hand, a rough estimate for automatic initialization is obtained from the fingertips' orientation and position under the hypothesis of an open hand. The palm pose is then refined and updated from line features by the Gauss-Newton method, followed by re-estimation of the fingers' poses with a particle filter. Experiments show the potential of the proposed method for applications embedded in the VTSS.
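The palm-pose refinement in the abstract is a Gauss-Newton least-squares iteration. The paper refines pose from line features; as a hedged illustration only, the sketch below runs the same Gauss-Newton scheme on a simpler toy problem (estimating a 2D rigid transform, parameters theta/tx/ty, from point correspondences). All function names here are hypothetical and not from the paper.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def gauss_newton_pose(points_model, points_obs, iters=20):
    """Gauss-Newton refinement of a 2D pose (theta, tx, ty) that aligns
    model points to observed points: a toy stand-in for the paper's
    palm-pose update, which instead uses line features on the hand."""
    theta, tx, ty = 0.0, 0.0, 0.0
    for _ in range(iters):
        # Accumulate the normal equations (J^T J) dx = -J^T r.
        JTJ = [[0.0] * 3 for _ in range(3)]
        JTr = [0.0] * 3
        c, s = math.cos(theta), math.sin(theta)
        for (mx, my), (ox, oy) in zip(points_model, points_obs):
            # Predicted point: p = R(theta) * m + t
            px = c * mx - s * my + tx
            py = s * mx + c * my + ty
            r = [px - ox, py - oy]                   # residual
            J = [[-s * mx - c * my, 1.0, 0.0],       # d r / d(theta, tx, ty)
                 [ c * mx - s * my, 0.0, 1.0]]
            for i in range(3):
                JTr[i] += J[0][i] * r[0] + J[1][i] * r[1]
                for j in range(3):
                    JTJ[i][j] += J[0][i] * J[0][j] + J[1][i] * J[1][j]
        dx = solve3(JTJ, [-v for v in JTr])          # Gauss-Newton step
        theta += dx[0]; tx += dx[1]; ty += dx[2]
    return theta, tx, ty
```

The paper's full pipeline would alternate this continuous refinement of the palm pose with a particle filter over the fingers' joint angles; the sketch shows only the Gauss-Newton half of that loop.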
Keywords :
cameras; virtual reality; Gauss-Newton method; articulated hand tracking method; hand kinematic model; monocular camera; monocular-vision-based hand tracking; particle filter; virtual touch screen system; vision-based interactive surface; Cameras; Fingers; Kinematics; Least squares methods; Newton method; Particle filters; Recursive estimation; Robustness; Rough surfaces; Surface roughness;
Conference_Title :
2nd International Congress on Image and Signal Processing (CISP '09), 2009
Conference_Location :
Tianjin
Print_ISBN :
978-1-4244-4129-7
Electronic_ISBN :
978-1-4244-4131-0
DOI :
10.1109/CISP.2009.5304724