• DocumentCode
    3528552
  • Title
    Gaussian Process Gauss-Newton for 3D laser-based Visual Odometry
  • Author
    Tong, Chi Hay; Barfoot, Timothy D.
  • Author_Institution
    Autonomous Space Robotics Laboratory, University of Toronto, Toronto, ON, Canada
  • fYear
    2013
  • fDate
    6-10 May 2013
  • Firstpage
    5204
  • Lastpage
    5211
  • Abstract
    In this paper, we present a method for obtaining Visual Odometry (VO) estimates using a scanning laser rangefinder. Though common VO implementations utilize stereo camera imagery, cameras are dependent on ambient light. In contrast, actively-illuminated sensors such as laser rangefinders work in a variety of lighting conditions, including full darkness. We leverage previous successes by applying sparse appearance-based methods to laser intensity images, and address the issue of motion distortion by considering the estimation problem in continuous time. This is facilitated by Gaussian Process Gauss-Newton (GPGN), an algorithm for non-parametric, continuous-time, nonlinear, batch state estimation. We include a concise derivation of GPGN, along with details on the extension to three dimensions (3D). Validation of the 3D laser-based VO framework is provided using 1.1 km of experimental data, which was gathered by a field robot equipped with a two-axis scanning lidar.
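    The continuous-time idea underlying GPGN is that the trajectory is treated as a Gaussian process over time, so the state can be queried at any measurement timestamp rather than only at discrete poses. The sketch below illustrates only that interpolation step on a toy 1D trajectory; it is not the authors' implementation, and the squared-exponential kernel, lengthscale, and sample trajectory are assumptions chosen for illustration.

    ```python
    import numpy as np

    def se_kernel(t1, t2, ell=1.0, sigma2=1.0):
        """Squared-exponential covariance between two vectors of times."""
        d = t1[:, None] - t2[None, :]
        return sigma2 * np.exp(-0.5 * (d / ell) ** 2)

    def gp_interpolate(t_train, y_train, t_query, noise=1e-6):
        """Posterior mean of a zero-mean GP conditioned on (t_train, y_train)."""
        K = se_kernel(t_train, t_train) + noise * np.eye(len(t_train))
        K_star = se_kernel(t_query, t_train)
        alpha = np.linalg.solve(K, y_train)  # K^{-1} y
        return K_star @ alpha

    # Toy 1D "trajectory": position sampled at a handful of estimation times.
    t_meas = np.linspace(0.0, 2.0, 9)
    x_meas = np.sin(t_meas)

    # Query the continuous-time estimate between the sample times, as one
    # would when compensating for motion distortion in a scanning sensor.
    t_query = np.array([0.5, 1.25])
    x_hat = gp_interpolate(t_meas, x_meas, t_query)
    ```

    In the full algorithm this GP prior is combined with nonlinear measurement terms and the joint problem is solved by Gauss-Newton iterations over the batch of states; the interpolation above is the piece that lets every lidar return contribute at its own timestamp.
    
    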
  • Keywords
    Gaussian processes; Newton method; image sensors; laser ranging; mobile robots; optical radar; robot vision; stereo image processing; 3D laser-based VO framework; 3D laser-based visual odometry; GPGN; Gaussian Process Gauss-Newton algorithm; VO implementations; actively-illuminated sensors; ambient light; field robot; laser intensity images; motion distortion; scanning laser rangefinder; sparse appearance-based methods; stereo camera imagery; two-axis scanning lidar; Laser radar; Lasers; Vectors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 IEEE International Conference on Robotics and Automation (ICRA)
  • Conference_Location
    Karlsruhe, Germany
  • ISSN
    1050-4729
  • Print_ISBN
    978-1-4673-5641-1
  • Type
    conf
  • DOI
    10.1109/ICRA.2013.6631321
  • Filename
    6631321