DocumentCode :
31641
Title :
A Robust Vision-Based Sensor Fusion Approach for Real-Time Pose Estimation
Author :
Assa, Akbar ; Janabi-Sharifi, F.
Author_Institution :
Dept. of Mech. & Ind. Eng., Ryerson Univ., Toronto, ON, Canada
Volume :
44
Issue :
2
fYear :
2014
fDate :
Feb. 2014
Firstpage :
217
Lastpage :
227
Abstract :
Object pose estimation is of great importance to many applications, such as augmented reality, localization and mapping, motion capture, and visual servoing. Although many approaches based on a monocular camera have been proposed, only a few works have applied multicamera sensor fusion techniques to pose estimation. Among the advantages of such schemes are higher accuracy and enhanced robustness to sensor defects or failures. This paper presents a new Kalman-based sensor fusion approach for pose estimation that offers higher accuracy and precision than its predecessors and is robust to camera motion and image occlusion. Extensive experiments are conducted to validate the superiority of this fusion method over currently employed vision-based pose estimation algorithms.
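The abstract's core idea, fusing measurements from several cameras inside a Kalman filter, can be illustrated with a minimal sketch. This is not the authors' algorithm (which uses an adaptive iterative extended Kalman filter on 3-D pose); it is a generic linear Kalman step that folds in each camera's measurement sequentially, with all state dimensions, noise values, and the `kalman_fuse_step` helper chosen purely for illustration.

```python
import numpy as np

def kalman_fuse_step(x, P, z_list, R_list, F, Q, H):
    """One predict/update cycle of a linear Kalman filter that fuses
    measurements from several cameras via sequential updates.

    x, P      : state estimate and its covariance
    z_list    : one measurement vector per camera
    R_list    : one measurement-noise covariance per camera
    F, Q, H   : motion model, process noise, and measurement model
    """
    # Predict the state forward through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Fold in each camera's measurement one at a time; a failed or
    # occluded camera can simply be skipped, which is one source of
    # the robustness that multicamera fusion provides.
    for z, R in zip(z_list, R_list):
        S = H @ P @ H.T + R            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S) # Kalman gain
        x = x + K @ (z - H @ x)        # correct state with residual
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy usage: two cameras both observe a static 2-D position (1, 2).
x, P = np.zeros(2), np.eye(2)
F, Q, H = np.eye(2), 1e-4 * np.eye(2), np.eye(2)
target = np.array([1.0, 2.0])
for _ in range(10):
    x, P = kalman_fuse_step(x, P,
                            [target, target],          # two cameras
                            [0.05 * np.eye(2)] * 2,    # their noise
                            F, Q, H)
```

After a few cycles the estimate converges to the measured position, and its covariance shrinks faster than with a single camera because each update contracts `P`.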
Keywords :
Kalman filters; cameras; computer vision; image fusion; nonlinear filters; pose estimation; Kalman-based sensor fusion approach; augmented reality; camera motion; extended Kalman filter; image occlusion; monocular camera; motion capture; multicamera sensor fusion techniques; object pose estimation; real-time pose estimation; robust vision-based sensor fusion approach; sensor defects; vision-based pose estimation algorithms; visual servoing; 3-D object pose estimation; adaptive; iterative; robust estimation; sensor fusion
fLanguage :
English
Journal_Title :
IEEE Transactions on Cybernetics
Publisher :
IEEE
ISSN :
2168-2267
Type :
jour
DOI :
10.1109/TCYB.2013.2252339
Filename :
6506990