Title :
Visual-inertial simultaneous localization, mapping and sensor-to-sensor self-calibration
Author :
Kelly, Jonathan ; Sukhatme, Gaurav S.
Author_Institution :
Robotic Embedded Syst. Lab., Univ. of Southern California, Los Angeles, CA, USA
Abstract :
Visual and inertial sensors, in combination, are well-suited for many robot navigation and mapping tasks. However, correct data fusion, and hence overall system performance, depends on accurate calibration of the 6-DOF transform between the sensors (one or more camera(s) and an inertial measurement unit). Obtaining this calibration information is typically difficult and time-consuming. In this paper, we describe an algorithm, based on the unscented Kalman filter (UKF), for camera-IMU simultaneous localization, mapping and sensor relative pose self-calibration. We show that the sensor-to-sensor transform, the IMU gyroscope and accelerometer biases, the local gravity vector, and the metric scene structure can all be recovered from camera and IMU measurements alone. This is possible without any prior knowledge about the environment in which the robot is operating. We present results from experiments with a monocular camera and a low-cost solid-state IMU, which demonstrate accurate estimation of the calibration parameters and the local scene structure.
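The abstract describes jointly estimating motion states and constant calibration parameters in a single unscented Kalman filter. The sketch below is a minimal, hypothetical toy illustration of that idea (not the authors' implementation): a scalar position and an unknown constant sensor offset (a stand-in for the camera-IMU calibration parameter) are stacked into one state and estimated with a generic UKF. The dynamics, measurement model, and all numeric values are invented for illustration; the offset becomes observable because two measurement channels (one "IMU-like" direct observation, one "camera-like" offset observation) are fused, mirroring how fusing two sensor modalities renders the relative calibration observable.

```python
import numpy as np

def sigma_points(x, P, kappa=0.0):
    """Generate 2n+1 symmetric sigma points and their weights."""
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF predict/update cycle for process f and measurement h."""
    # Predict: push sigma points through the process model.
    pts, w = sigma_points(x, P)
    pts_f = np.array([f(p) for p in pts])
    x_pred = w @ pts_f
    P_pred = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, pts_f - x_pred))
    # Update: redraw sigma points around the prediction, map to measurement space.
    pts2, w2 = sigma_points(x_pred, P_pred)
    zs = np.array([h(p) for p in pts2])
    z_pred = w2 @ zs
    Pzz = R + sum(wi * np.outer(d, d) for wi, d in zip(w2, zs - z_pred))
    Pxz = sum(wi * np.outer(dx, dz)
              for wi, dx, dz in zip(w2, pts2 - x_pred, zs - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

# Hypothetical toy system: position advances one unit per step; the
# offset (0.5) is constant. Channel 0 observes position directly,
# channel 1 observes position plus offset, so the offset is observable.
f = lambda p: np.array([p[0] + 1.0, p[1]])
h = lambda p: np.array([p[0], p[0] + p[1]])
Q = np.diag([1e-4, 1e-6])   # process noise (offset nearly constant)
R = np.diag([1e-2, 1e-2])   # measurement noise

x, P = np.array([0.0, 0.0]), np.eye(2)
true_pos, true_offset = 0.0, 0.5
for _ in range(30):
    true_pos += 1.0
    z = np.array([true_pos, true_pos + true_offset])  # noiseless for clarity
    x, P = ukf_step(x, P, z, f, h, Q, R)
```

After a few steps `x[1]` converges toward the unknown offset, illustrating how a filter over an augmented state self-calibrates while tracking motion; the paper's filter does the analogous thing with a 6-DOF transform, IMU biases, gravity, and landmark positions in the state.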
Keywords :
Kalman filters; gyroscopes; robots; sensor fusion; IMU gyroscope; data fusion; inertial measurement unit; mapping tasks; monocular camera; pose self-calibration; robot navigation; sensor-to-sensor self-calibration; unscented Kalman filter; visual-inertial simultaneous localization; Calibration; Cameras; Layout; Navigation; Robot sensing systems; Robot vision systems; Sensor fusion; Sensor systems; Simultaneous localization and mapping; System performance;
Conference_Title :
2009 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA)
Conference_Location :
Daejeon
Print_ISBN :
978-1-4244-4808-1
Electronic_ISBN :
978-1-4244-4809-8
DOI :
10.1109/CIRA.2009.5423178