DocumentCode :
3163873
Title :
Self-organizing visual perception for mobile robot navigation
Author :
Von Wichert, Georg
Author_Institution :
Control Syst. Theory & Robotics Dept., Tech. Univ. Darmstadt, Germany
fYear :
1996
fDate :
9-11 Oct 1996
Firstpage :
194
Lastpage :
200
Abstract :
Visual navigation for mobile robots is almost always performed on the basis of complex CAD models which have to be given to the system in advance. There are some approaches that map visual information more or less directly to robot actions in order to implement basic capabilities such as road following or docking. Complex navigation tasks, e.g. in buildings, however, are usually solved using three-dimensional models of the environment or the objects within it. This paper presents a method for self-organizing the visual perception of a mobile robot in order to adapt it to the surroundings without the need to define and model the relevant aspects of the environment. The system uses a self-organization process to transform the continuous stream of images into a limited number of discrete perceptions which can be used for navigation purposes. Results obtained in an indoor office environment are presented. Nevertheless, the approach is also suited for outdoor navigation.
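Note: the abstract describes quantizing a continuous image stream into a small set of discrete perceptions via self-organization, but this record does not specify the exact algorithm. The following is a minimal illustrative sketch only, assuming a Kohonen-style self-organizing map over precomputed image feature vectors; the class name, parameters, and feature dimensionality are hypothetical and not taken from the paper.

```python
import numpy as np

class PerceptionSOM:
    """Hypothetical 1-D self-organizing map that maps image feature
    vectors to a limited number of discrete perception indices."""

    def __init__(self, n_units=16, feature_dim=64, seed=0):
        rng = np.random.default_rng(seed)
        # One prototype vector per discrete perception.
        self.weights = rng.normal(size=(n_units, feature_dim))

    def train(self, features, epochs=20, lr0=0.5, sigma0=3.0):
        """features: array of shape (n_samples, feature_dim)."""
        n_units = self.weights.shape[0]
        idx = np.arange(n_units)
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)                   # decaying learning rate
            sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)   # shrinking neighborhood
            for x in features:
                bmu = np.argmin(np.linalg.norm(self.weights - x, axis=1))
                # Gaussian neighborhood around the best-matching unit.
                h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
                self.weights += lr * h[:, None] * (x - self.weights)

    def perceive(self, x):
        """Map one feature vector to a discrete perception index."""
        return int(np.argmin(np.linalg.norm(self.weights - x, axis=1)))

if __name__ == "__main__":
    # Synthetic stand-in for a stream of image feature vectors.
    stream = np.random.default_rng(1).normal(size=(500, 64))
    som = PerceptionSOM()
    som.train(stream)
    print([som.perceive(x) for x in stream[:10]])
```

The sketch only illustrates the general idea of clustering an image stream into discrete perceptions for navigation; how features are extracted from camera images and how the perceptions feed into the navigation behavior are described in the paper itself.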
Keywords :
feature extraction; image segmentation; mobile robots; navigation; path planning; robot vision; self-adjusting systems; indoor office environment; mobile robot navigation; outdoor navigation; robot actions; self-organizing visual perception; visual information; Cameras; Control system synthesis; Discrete transforms; Image sensors; Mobile robots; Navigation; Robot sensing systems; Robot vision systems; Sensor systems; Visual perception;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the First Euromicro Workshop on Advanced Mobile Robots, 1996
Conference_Location :
Kaiserslautern
Print_ISBN :
0-8186-7695-7
Type :
conf
DOI :
10.1109/EURBOT.1996.552020
Filename :
552020