DocumentCode :
3593018
Title :
Visual attention in spoken human-robot interaction
Author :
Staudte, Maria ; Crocker, Matthew W.
Author_Institution :
Dept. of Comput. Linguistics, Saarland Univ., Saarbrücken, Germany
fYear :
2009
Firstpage :
77
Lastpage :
84
Abstract :
Psycholinguistic studies of situated language processing have revealed that gaze in the visual environment is tightly coupled with both spoken language comprehension and production. It has also been established that interlocutors monitor the gaze of their partners, a phenomenon called “joint attention”, as a further means of facilitating mutual understanding. We hypothesise that human-robot interaction will benefit when the robot's language-related gaze behaviour is similar to that of people, potentially providing the user with valuable non-verbal information concerning the robot's intended message or the robot's successful understanding. We report findings from two eye-tracking experiments demonstrating (1) that human gaze is modulated by both the robot's speech and gaze, and (2) that human comprehension of robot speech is improved when the robot's real-time gaze behaviour is similar to that of humans.
Keywords :
human factors; human-robot interaction; robot vision; speech-based user interfaces; eye-tracking experiments; joint attention; language-related gaze behaviour; psycholinguistic studies; robot gaze; robot speech; situated language processing; spoken human-robot interaction; spoken language comprehension; spoken language production; valuable nonverbal information; visual attention; Humans; Monitoring; Robots; Speech; Time factors; Videos; Visualization; experimental methods; gaze; user study; visual attention;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
ISSN :
2167-2121
Print_ISBN :
978-1-60558-404-1
Type :
conf
Filename :
6256097