Title :
Fusing vision and LIDAR - Synchronization, correction and occlusion reasoning
Author :
Schneider, Sebastian ; Himmelsbach, Michael ; Luettel, Thorsten ; Wuensche, Hans-Joachim
Author_Institution :
Inst. for Autonomous Syst. Technol. (TAS), Univ. of the Bundeswehr Munich, Neubiberg, Germany
Abstract :
Autonomous navigation in unstructured environments such as forest or country roads with dynamic objects remains a challenging task, particularly with respect to perceiving the environment with multiple different sensors. The problem has been addressed both by the computer vision community and by researchers working with laser range finding technology such as the Velodyne HDL-64. Since cameras and LIDAR sensors complement one another in terms of color and depth perception, fusing both sensors is a natural way to provide color images with depth and reflectance information as well as 3D LIDAR point clouds with color information. In this paper we propose a sensor synchronization method especially designed for dynamic scenes, present a low-level fusion of the data of both sensors, and provide a solution for the occlusion problem that arises from the different viewpoints of the fused sensors.
Keywords :
colour graphics; colour vision; computer vision; navigation; optical radar; road traffic; sensor fusion; traffic engineering computing; 3D LIDAR point cloud; LIDAR sensor; Velodyne HDL-64; autonomous navigation; camera; color images; color perception; computer vision; country road; depth perception; forest; laser range finding; multiple sensors; occlusion reasoning; sensor synchronization; sensor fusion; Cameras; Color; Computer vision; Image sensors; Laser fusion; Laser radar; Navigation; Reflectivity; Roads; Sensor fusion;
Conference_Titel :
Intelligent Vehicles Symposium (IV), 2010 IEEE
Conference_Location :
San Diego, CA, USA
Print_ISBN :
978-1-4244-7866-8
DOI :
10.1109/IVS.2010.5548079