DocumentCode :
2228070
Title :
Using eye contact and contextual speech recognition for hands-free surgical charting
Author :
Lepinski, G.Julian ; Vertegaal, Roel
Author_Institution :
Human Media Lab., Queen's Univ., Kingston, ON
fYear :
2008
fDate :
Jan. 30 2008-Feb. 1 2008
Firstpage :
119
Lastpage :
120
Abstract :
In this paper we discuss ongoing research into applications for multimodal Attentive User Interfaces in hands-free charting during surgical procedures. Although speech recognition has matured enough that it can now be used for some software and hardware control, speech recognition solutions still have trouble filtering "command speech" from "ambient speech." Our research builds on previous work that couples eye contact sensing with speech recognition to gauge intent. Users enable a voice activation system used for surgical time charting by fixing their gaze on a small camera before speaking command words.
Keywords :
eye; medical computing; speech recognition; speech-based user interfaces; surgery; ambient speech; command speech; contextual speech recognition; eye contact; hands-free surgical charting; multimodal attentive user interfaces; surgical time charting; Application software; Cameras; Face detection; Hardware; Information filtering; Information filters; Light emitting diodes; Microphones; Speech recognition; Surgery; Attentive UI; charting; gaze; voice recognition;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Pervasive Computing Technologies for Healthcare, 2008. PervasiveHealth 2008. Second International Conference on
Conference_Location :
Tampere, Finland
Print_ISBN :
978-963-9799-15-8
Type :
conf
DOI :
10.1109/PCTHEALTH.2008.4571047
Filename :
4571047