DocumentCode :
2944724
Title :
Visual-lidar odometry and mapping: low-drift, robust, and fast
Author :
Zhang, Ji; Singh, Sanjiv
Author_Institution :
Robot. Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
2015
fDate :
26-30 May 2015
Firstpage :
2174
Lastpage :
2181
Abstract :
Here, we present a general framework for combining visual odometry and lidar odometry, derived from first principles. The method improves on the state of the art, particularly in robustness to aggressive motion and to temporary lack of visual features. The proposed online method starts with visual odometry to estimate ego-motion and to register point clouds from a scanning lidar at high frequency but low fidelity. Scan-matching-based lidar odometry then simultaneously refines the motion estimate and the point cloud registration. We show results with datasets collected in our own experiments as well as on the KITTI odometry benchmark, where the proposed method ranks #1 in average translation and rotation errors, with a relative position drift of 0.75%. In addition to comparing motion estimation accuracy, we evaluate the robustness of the method when the sensor suite moves at high speed and is subject to significant ambient lighting changes.
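The coarse-to-fine structure described in the abstract (high-rate, low-fidelity visual odometry followed by lower-rate lidar scan-matching refinement) can be sketched as a simple loop. This is an illustrative sketch only, not the authors' implementation; the function names, 2-D pose representation, rates, and the stubbed-in scan-match correction are all invented for illustration.

```python
import math

def visual_odometry_step(pose, delta):
    """High-frequency, low-fidelity update: compose an incremental
    visual-odometry motion estimate (dx, dy, dtheta) onto the pose."""
    x, y, theta = pose
    dx, dy, dtheta = delta
    # Rotate the body-frame increment into the world frame, then accumulate.
    x += dx * math.cos(theta) - dy * math.sin(theta)
    y += dx * math.sin(theta) + dy * math.cos(theta)
    return (x, y, theta + dtheta)

def lidar_refine(pose, correction):
    """Low-frequency refinement: in V-LOAM this is scan matching against
    the map; here the scan-match result is stubbed as a known correction."""
    x, y, theta = pose
    cx, cy, ctheta = correction
    return (x + cx, y + cy, theta + ctheta)

def run(deltas, corrections, refine_every=5):
    """Interleave many visual steps (e.g. per camera frame) with a lidar
    refinement once per completed sweep (every `refine_every` steps)."""
    pose = (0.0, 0.0, 0.0)
    pending = list(corrections)
    for i, d in enumerate(deltas, start=1):
        pose = visual_odometry_step(pose, d)   # fast, drifts
        if pending and i % refine_every == 0:
            pose = lidar_refine(pose, pending.pop(0))  # slow, corrects drift
    return pose
```

For example, five unit forward steps with one lidar correction of -0.5 m yield a final x of 4.5 m: the visual estimate accumulates, and the refinement pulls it back toward the scan-matched pose.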
Keywords :
distance measurement; image matching; image registration; motion estimation; optical radar; KITTI odometry benchmark; aggressive motion; ambient lighting changes; ego-motion estimation; first principle method; point cloud registration; scan matching based lidar odometry; temporary lack of visual features; visual-lidar mapping; visual-lidar odometry; Cameras; Distortion; Feature extraction; Laser radar; Motion estimation; Three-dimensional displays; Visualization;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2015 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location :
Seattle, WA
Type :
conf
DOI :
10.1109/ICRA.2015.7139486
Filename :
7139486