DocumentCode :
138106
Title :
Guiding computational perception through a shared auditory space
Author :
Martinson, E. ; Yalla, V.
Author_Institution :
Toyota InfoTechnology Center, Mountain View, CA, USA
fYear :
2014
fDate :
14-18 Sept. 2014
Firstpage :
3156
Lastpage :
3161
Abstract :
Blind or visually impaired people want to know more about the things they hear in the world; they want to know what other people can “see”. With its cameras, a robot can fill that role. But how can an individual make requests about arbitrary objects they can only hear? How can people make requests about objects whose exact location they do not know, and for which they cannot name any uniquely identifying trait? This work describes a solution for querying the robot about unknown, audible objects in the surrounding environment through a combination of computational sound source localization and human input, including pointing gestures and spoken descriptors.
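The abstract describes fusing sound source localization estimates with a user's pointing gesture to disambiguate which audible object is meant. A minimal sketch of one plausible fusion step, assuming localized 2-D source positions and a pointing ray are already available (the function name and geometry here are illustrative, not the paper's actual method):

```python
import math

def select_source(sources, origin, direction):
    """Pick the localized sound source best aligned with a pointing ray.

    sources:   list of (x, y) positions from sound source localization
    origin:    (x, y) position of the pointing hand
    direction: (dx, dy) unit vector of the pointing direction
    Returns the source whose bearing from the hand has the smallest
    angular distance to the pointing direction.
    """
    def angle_to(src):
        vx, vy = src[0] - origin[0], src[1] - origin[1]
        norm = math.hypot(vx, vy)
        if norm == 0.0:
            return 0.0  # source coincides with the hand; trivially aligned
        cos_a = (vx * direction[0] + vy * direction[1]) / norm
        cos_a = max(-1.0, min(1.0, cos_a))  # clamp against rounding error
        return math.acos(cos_a)

    return min(sources, key=angle_to)

# A source directly along the pointing ray wins over one off to the side.
picked = select_source([(5.0, 0.0), (0.0, 5.0)], (0.0, 0.0), (1.0, 0.0))
```

In practice the spoken descriptors mentioned in the abstract would further filter the candidate set before this geometric selection.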
Keywords :
acoustic generators; cameras; handicapped aids; service robots; vision defects; audible objects; blind people; cameras; computational perception guidance; computational sound source localization; human input; pointing gestures; shared auditory space; spoken descriptors; visually impaired people; Arrays; Computers; Microphones; Object recognition; Position measurement; Robots; Speech;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Intelligent Robots and Systems (IROS 2014), 2014 IEEE/RSJ International Conference on
Conference_Location :
Chicago, IL
Type :
conf
DOI :
10.1109/IROS.2014.6942999
Filename :
6942999