Title :
Visual attention by saliency leads cross-modal body representation
Author :
Hikita, Masayuki ; Fuke, S. ; Ogino, Masaki ; Minato, Tsuneaki ; Asada, Minoru
Author_Institution :
Grad. Sch. of Eng., Osaka Univ., Suita
Abstract :
One of the most fundamental issues for physical agents (humans, primates, and robots) performing various kinds of tasks is body representation. Neurophysiological evidence shows that, especially during tool use by monkeys, this representation is dynamically reconstructed through spatio-temporal integration of different sensory modalities so that it adapts to environmental changes. To construct such a representation, however, the agent must resolve which pieces of information from the various sensory streams should be associated with each other. This paper presents a method that constructs a cross-modal body representation from vision, touch, and proprioception. When the robot touches something, the tactile sensation triggers the construction of a visual receptive field for the body part that is detected by saliency-based visual attention and is consequently regarded as the end effector. Simultaneously, proprioceptive information is associated with this visual receptive field to complete the cross-modal body representation. The proposed model is applied to a real robot, and the results are comparable to the activity of parietal neurons observed in monkeys.
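The abstract describes the pipeline only at a conceptual level. The following minimal Python sketch is not the authors' implementation; it illustrates one way the three ingredients could be wired together under simplifying assumptions: a plain intensity-contrast map stands in for the full saliency model, and a Hebbian outer-product update stands in for the learned association between proprioception and the visual receptive field. All names (saliency_map, CrossModalBodyMap, on_touch) are hypothetical.

import numpy as np

def saliency_map(image, surround=9):
    # Very simple center-surround saliency: absolute difference between each
    # pixel and its local mean intensity (box filter of size `surround`).
    pad = surround // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    surround_mean = np.zeros((h, w))
    for dy in range(surround):
        for dx in range(surround):
            surround_mean += padded[dy:dy + h, dx:dx + w]
    surround_mean /= surround * surround
    return np.abs(image - surround_mean)

class CrossModalBodyMap:
    # Hebbian-style associative memory linking proprioception (a population
    # code over joint angles) to the visually attended location at the moment
    # of a touch event.
    def __init__(self, n_joint_bins, n_visual_bins, lr=0.1):
        self.w = np.zeros((n_joint_bins, n_visual_bins))
        self.lr = lr

    def update(self, joint_code, visual_code):
        # Outer-product (Hebbian) update between the two population codes.
        self.w += self.lr * np.outer(joint_code, visual_code)

    def predict_visual(self, joint_code):
        # Recall the visual location expected for a given proprioceptive state.
        return self.w.T @ joint_code

def on_touch(image, joint_code, body_map):
    # When touch is sensed, attend to the most salient image location
    # (assumed to be the end effector) and bind it to the current
    # proprioceptive state.
    sal = saliency_map(image)
    y, x = np.unravel_index(np.argmax(sal), sal.shape)
    visual_code = np.zeros(image.size)
    visual_code[y * image.shape[1] + x] = 1.0   # one-hot retinotopic code
    body_map.update(joint_code, visual_code)
    return (y, x)

# Toy usage: a 16x16 camera image and a 4-bin joint-angle population code.
img = np.random.rand(16, 16)
joints = np.array([0.2, 0.8, 0.0, 0.0])
bm = CrossModalBodyMap(n_joint_bins=4, n_visual_bins=img.size)
print(on_touch(img, joints, bm))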
Keywords :
cognition; neurophysiology; spatiotemporal phenomena; touch (physiological); visual perception; cross-modal body representation; environmental changes; neurophysiological evidence; parietal neurons; proprioception; spatio-temporal integration; tactile sensation; visual attention; visual receptive field; Automatic control; Biological system modeling; Biological systems; Cameras; Cognitive robotics; Humans; Intelligent robots; Neurons; Robot sensing systems; Robot vision systems;
Conference_Title :
7th IEEE International Conference on Development and Learning (ICDL 2008)
Conference_Location :
Monterey, CA
Print_ISBN :
978-1-4244-2661-4
Electronic_ISBN :
978-1-4244-2662-1
DOI :
10.1109/DEVLRN.2008.4640822