Title :
Multi-modal integration for personalized conversation: Towards a humanoid in daily life
Author :
Fujie, Shinya ; Watanabe, Daichi ; Ichikawa, Yuhi ; Taniyama, Hikaru ; Hosoya, Kosuke ; Matsuyama, Yoichi ; Kobayashi, Tetsunori
Author_Institution :
Waseda Inst. for Adv. Study, Waseda Univ., Tokyo
Abstract :
A humanoid robot with spoken language communication ability is proposed and developed. For a humanoid to live with people, spoken language communication is fundamental because we use it every day. However, due to the difficulty of speech recognition itself and of implementing it on a robot, a robot with such an ability has not been developed. In this study, we propose a robot that integrates techniques to overcome these problems. The proposed system includes three key features: image processing, sound source separation, and turn-taking timing control. Processing images captured by cameras mounted in the robot's eyes enables the robot to find and identify the person it should talk to. Sound source separation enables distant speech recognition, so people need no special device such as a head-set microphone. Turn-taking timing control is lacking in many conventional spoken dialogue systems, yet it is fundamental because conversation proceeds in real time. Experiments demonstrate the effectiveness of these elements and show an example conversation.
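Illustration (not part of the original record): the abstract describes three modules combined into one conversational loop. The following is a minimal, hypothetical Python sketch of such an integration loop; the class names (FaceIdentifier, SourceSeparator, TurnTakingController), the silence threshold, and all data structures are illustrative assumptions and do not reflect the authors' actual implementation.

# Hypothetical sketch of the multi-modal conversation loop described in the
# abstract. All names, thresholds, and data structures are illustrative
# assumptions, not the authors' implementation.

import random


class FaceIdentifier:
    """Stand-in for image processing: decide whom the robot should talk to."""

    def find_addressee(self, frame):
        # A real system would run face detection/identification on the camera
        # frame; here we simply return a fixed dummy person.
        return {"id": "person_A", "facing_robot": True}


class SourceSeparator:
    """Stand-in for sound source separation enabling distant speech recognition."""

    def separate(self, mic_array_signal, addressee):
        # A real system would enhance the addressee's voice from the microphone
        # array; here we pass the signal through unchanged.
        return mic_array_signal


class TurnTakingController:
    """Stand-in for turn-taking timing control."""

    def __init__(self, silence_threshold=0.6):
        self.silence_threshold = silence_threshold  # assumed tunable parameter (seconds)

    def robot_should_speak(self, silence_duration, addressee):
        # Take the turn only after enough silence and while the user faces the robot.
        return silence_duration >= self.silence_threshold and addressee["facing_robot"]


def conversation_step(frame, mic_array_signal, silence_duration,
                      face_id, separator, turn_taker):
    """One cycle of the (hypothetical) multi-modal integration loop."""
    addressee = face_id.find_addressee(frame)
    speech = separator.separate(mic_array_signal, addressee)
    if turn_taker.robot_should_speak(silence_duration, addressee):
        return f"Robot replies to {addressee['id']} (input length {len(speech)})"
    return "Robot keeps listening"


if __name__ == "__main__":
    face_id, separator, turn_taker = FaceIdentifier(), SourceSeparator(), TurnTakingController()
    dummy_frame = None
    dummy_audio = [random.random() for _ in range(160)]
    for silence in (0.2, 0.8):  # seconds of silence since the user stopped speaking
        print(conversation_step(dummy_frame, dummy_audio, silence,
                                face_id, separator, turn_taker))

The point of the sketch is only the control flow: visual addressee identification gates both the audio enhancement and the decision of when the robot may take its turn.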
Keywords :
human-robot interaction; humanoid robots; robot vision; source separation; speech recognition; camera; distant speech recognition; humanoid robot; image processing; multimodal integration; personalized conversation; robot eyes; sound source separation; spoken dialogue system; spoken language communication ability; turn-taking timing control; Cameras; Control systems; Eyes; Image processing; Microphones; Natural languages; Robot vision systems; Source separation; Speech recognition; Timing;
Conference_Title :
2008 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids 2008)
Conference_Location :
Daejeon
Print_ISBN :
978-1-4244-2821-2
Electronic_ISBN :
978-1-4244-2822-9
DOI :
10.1109/ICHR.2008.4756014