Title :
VTouch: Vision-enhanced interaction for large touch displays
Author :
Yinpeng Chen; Zicheng Liu; Phil Chou; Zhengyou Zhang
Author_Institution :
Microsoft Research, Redmond, WA, USA
Date :
June 29 - July 3, 2015
Abstract :
We propose a system that augments touch input with visual understanding of the user to improve interaction with a large touch-sensitive display. A commodity color-plus-depth sensor such as the Microsoft Kinect adds the visual modality and enables new interactions beyond touch. Through visual analysis, the system understands where the user is, who the user is, and what the user is doing even before the user touches the display. This information is used to enhance interaction in multiple ways. For example, a user can use simple gestures to bring up menu items such as a color palette or a soft keyboard; menu items can appear where the user is standing and follow the user; hovering can reveal information before the user commits to a touch; the user can perform different functions (for example, writing and erasing) with different hands; and each user's preference profile can be maintained separately from those of other users. User studies were conducted, and participants greatly appreciated the value of these and other enhanced interactions.
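The abstract describes attributing touches to a specific user and hand (e.g., right hand writes, left hand erases) using the RGBD sensor. The following minimal sketch is not the authors' implementation; it only illustrates, under assumptions, how a touch point might be matched to the nearest tracked hand once hand joints from the depth sensor have been projected into display coordinates. All class names, functions, and thresholds here are hypothetical.

```python
# Hypothetical sketch: map a touch point to (user, hand) using tracked hand
# positions projected onto the display plane. Names and thresholds are assumed.
from dataclasses import dataclass
from math import dist
from typing import List, Optional, Tuple

@dataclass
class Hand:
    user_id: int                    # identity assigned by the vision pipeline
    side: str                       # "left" or "right"
    position: Tuple[float, float]   # hand joint projected to display pixels

def attribute_touch(touch_xy: Tuple[float, float],
                    hands: List[Hand],
                    max_distance: float = 150.0) -> Optional[Hand]:
    """Return the tracked hand closest to the touch, if within max_distance px."""
    if not hands:
        return None
    nearest = min(hands, key=lambda h: dist(h.position, touch_xy))
    return nearest if dist(nearest.position, touch_xy) <= max_distance else None

# Example: two users near the display; a touch arrives at (410, 300).
hands = [Hand(user_id=1, side="right", position=(400, 310)),
         Hand(user_id=2, side="left", position=(1500, 620))]
hand = attribute_touch((410, 300), hands)
if hand:
    tool = "pen" if hand.side == "right" else "eraser"
    print(f"user {hand.user_id} touched with {hand.side} hand -> {tool}")
```

In this kind of scheme, the per-hand attribution is what would let the system assign different functions to different hands and keep per-user preference profiles, as described in the abstract.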
Keywords :
colour displays; gesture recognition; image sensors; touch sensitive screens; Microsoft Kinect; VTouch; color palette; large touch-sensitive display; soft keyboard; touch input; vision-enhanced interaction; visual analysis; visual modality; visual understanding; Boards; Cameras; Color; Indexes; Keyboards; Three-dimensional displays; Visualization; RGBD sensor; gesture recognition; touch display; vision-enhanced
Conference_Title :
2015 IEEE International Conference on Multimedia and Expo (ICME)
Conference_Location :
Turin, Italy
DOI :
10.1109/ICME.2015.7177390