DocumentCode :
3688467
Title :
Vision-based Markov localization across large perceptual changes
Author :
Tayyab Naseer;Benjamin Suger;Michael Ruhnke;Wolfram Burgard
Author_Institution :
Autonomous Intelligent Systems Group, University of Freiburg, Germany
Year :
2015
Firstpage :
1
Lastpage :
6
Abstract :
Recently, there has been significant progress towards lifelong, autonomous operation of mobile robots, especially in the field of localization and mapping. One important challenge in this context is visual localization under substantial perceptual changes, for example, those caused by different seasons. In this paper, we present an approach to localize a mobile robot equipped with a low-frequency camera with respect to an image sequence previously recorded in a different season. Our approach uses a discrete Bayes filter and a sensor model based on whole-image descriptors, and it exploits sequential information to model the dynamics of the system. Since we compute a probability distribution over the whole state space, our approach can handle more complex trajectories that may include same-season loop closures as well as fragmented sub-sequences. In an extensive experimental evaluation on challenging datasets, we demonstrate that our approach outperforms state-of-the-art techniques.
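To illustrate the kind of method the abstract describes, the following is a minimal sketch of sequence-based Markov localization with a discrete Bayes filter, assuming whole-image descriptors are available as fixed-length vectors. The transition model (forward steps along the reference sequence), the Gaussian descriptor-distance likelihood, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predict(belief, step_probs=(0.2, 0.5, 0.3)):
    """Motion update: assume the robot advances 0..len(step_probs)-1 frames
    along the reference sequence (hypothetical transition model; np.roll
    wraps around, which crudely allows revisits/loop closures)."""
    new_belief = np.zeros_like(belief)
    for step, p in enumerate(step_probs):
        new_belief += p * np.roll(belief, step)
    return new_belief / new_belief.sum()

def sensor_likelihood(query_desc, ref_descs, sigma=0.5):
    """Sensor model: likelihood of each reference frame given the current
    query image, from whole-image descriptor distances (assumed Gaussian)."""
    dists = np.linalg.norm(ref_descs - query_desc, axis=1)
    return np.exp(-0.5 * (dists / sigma) ** 2) + 1e-12

def bayes_update(belief, query_desc, ref_descs):
    """One Markov localization step: predict, then weight by the likelihood
    of the observed image and renormalize."""
    belief = predict(belief)
    belief *= sensor_likelihood(query_desc, ref_descs)
    return belief / belief.sum()

# Usage sketch: uniform prior over N reference frames, then one update per
# incoming query descriptor; the belief peak gives the localization estimate.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref_descs = rng.normal(size=(100, 64))      # N reference descriptors
    belief = np.full(100, 1.0 / 100)            # uniform initial belief
    query = ref_descs[42] + 0.1 * rng.normal(size=64)
    belief = bayes_update(belief, query, ref_descs)
    print("most likely reference frame:", int(np.argmax(belief)))
```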
Keywords :
"Databases","Robot sensing systems","Context","Lead","Cameras","Matched filters"
Publisher :
ieee
Conference_Titel :
2015 European Conference on Mobile Robots (ECMR)
Type :
conf
DOI :
10.1109/ECMR.2015.7324181
Filename :
7324181