DocumentCode :
2668608
Title :
Vision-based multisensor integration in remote-brained robots
Author :
Inaba, Masayuki ; Kagami, Satoshi ; Sakaki, Kazuhiko ; Kanehiro, Fumio ; Inoue, Hirochika
Author_Institution :
Dept. of Mechano-Informatics, Univ. of Tokyo, Japan
fYear :
1994
fDate :
2-5 Oct 1994
Firstpage :
747
Lastpage :
754
Abstract :
This paper presents a method for integrating multiple sensors in a “remote-brained robot”, that is, a robot whose processing capability is accessed over wireless links. The remote-brained approach allowed the authors to build a robot with a free body and a heavy brain. To integrate multiple sensors in a remote-brained robot, it is crucial to consider how to multiplex the multisensor information. The authors examined several configurations and developed a method that uses a video screen to integrate touch, force, and visual information into a single sensor image. The paper describes the possible configurations for multisensor integration in remote-brained robots and presents real examples of vision-based sensor integration for a dynamic biped robot and a multi-limbed ape-like robot.
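The following is a minimal sketch, not the authors' implementation, of the general idea the abstract describes: multiplexing non-visual sensor data (touch, force) into the same image that carries the robot's camera view back to the remote brain. The frame size, the reserved strip at the top of the image, the block-per-sensor grey-level encoding, and all function names are illustrative assumptions.

```python
# Sketch: pack touch/force readings into a border strip of the video frame,
# then decode them on the brain side. All sizes and encodings are assumed.
import numpy as np

FRAME_H, FRAME_W = 240, 320   # assumed video frame size
STRIP_H = 16                  # rows reserved for non-visual sensor data
BLOCK_W = 16                  # pixel width of one sensor "block"

def encode_sensor_image(camera_gray, touch_bits, force_values, force_max=50.0):
    """Compose camera image plus touch/force data into one 8-bit sensor image."""
    frame = np.zeros((FRAME_H, FRAME_W), dtype=np.uint8)
    frame[STRIP_H:, :] = camera_gray[:FRAME_H - STRIP_H, :FRAME_W]
    blocks = [255 if t else 0 for t in touch_bits]              # touch: on/off
    blocks += [int(np.clip(f / force_max, 0.0, 1.0) * 255)      # force: scaled
               for f in force_values]
    for i, level in enumerate(blocks):
        frame[:STRIP_H, i * BLOCK_W:(i + 1) * BLOCK_W] = level
    return frame

def decode_sensor_strip(frame, n_touch, n_force, force_max=50.0):
    """Brain-side decoding: read the reserved strip back into sensor values."""
    levels = [frame[:STRIP_H, i * BLOCK_W:(i + 1) * BLOCK_W].mean()
              for i in range(n_touch + n_force)]
    touch = [lvl > 128 for lvl in levels[:n_touch]]
    force = [lvl / 255.0 * force_max for lvl in levels[n_touch:]]
    return touch, force

if __name__ == "__main__":
    cam = np.random.randint(0, 256, (FRAME_H, FRAME_W), dtype=np.uint8)
    img = encode_sensor_image(cam, touch_bits=[1, 0, 1, 1],
                              force_values=[12.5, 30.0])
    print(decode_sensor_strip(img, n_touch=4, n_force=2))
```

Encoding the sensor readings as image regions lets a single video link (and the same vision hardware on the brain side) carry all modalities at once, which is the multiplexing idea the abstract refers to.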
Keywords :
mobile robots; robot vision; sensor fusion; telerobotics; dynamic biped robot; force; multi-limbed apelike robot; remote-brained robots; touch; video screen; vision-based multisensor integration; visual information; wireless links; Animals; Cameras; Computer vision; Hardware; Intelligent robots; Machine vision; Robot sensing systems; Robot vision systems; Sensor systems; TV;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1994 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI '94)
Conference_Location :
Las Vegas, NV
Print_ISBN :
0-7803-2072-7
Type :
conf
DOI :
10.1109/MFI.1994.398380
Filename :
398380