  • DocumentCode
    3580239
  • Title
    Towards dense moving object segmentation based robust dense RGB-D SLAM in dynamic scenarios

  • Author
    Youbing Wang; Shoudong Huang

  • Author_Institution
    Fac. of Eng. & Inf. Technol., Univ. of Technol., Sydney, NSW, Australia
  • fYear
    2014
  • Firstpage
    1841
  • Lastpage
    1846
  • Abstract
    Building on recent achievements in computer vision and RGB-D SLAM, we put forward a practical method for dense moving object segmentation and, based on it, a new framework for robust dense RGB-D SLAM in challenging dynamic scenarios. As the state-of-the-art approach to RGB-D SLAM, dense SLAM is very robust to motion blur and featureless regions, which most sparse feature-based methods cannot handle; however, it is highly susceptible to dynamic elements in the scene. To enhance its robustness in dynamic scenarios, we propose combining dense moving object segmentation with dense SLAM. Since the segmentation results produced by the latest available computer vision algorithm are not satisfactory, we introduce several effective measures to improve them. After the dynamic objects are densely segmented, dense SLAM can be employed to estimate the camera poses. Quantitative results on a challenging benchmark dataset demonstrate the effectiveness of our method.
  • Keywords
    SLAM (robots); computer vision; image restoration; image segmentation; image sensors; motion estimation; object detection; benchmark dataset; camera poses; dynamic elements; dynamic scenarios; motion blur; object segmentation; robust dense RGB-D SLAM; sparse feature-based methods; cameras; dynamics; motion segmentation; moving object segmentation; robustness; simultaneous localization and mapping; RGB-D SLAM
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2014 13th International Conference on Control Automation Robotics & Vision (ICARCV)
  • Type
    conf
  • DOI
    10.1109/ICARCV.2014.7064596
  • Filename
    7064596