Title :
[POSTER] Fusion of Vision and Inertial Sensing for Accurate and Efficient Pose Tracking on Smartphones
Author :
Xin Yang; Xun Si; Tangli Xue; Kwang-Ting Tim Cheng
Abstract :
This paper aims at accurate and efficient pose tracking of planar targets on modern smartphones. Existing methods, which rely either on visual features or on motion sensing with built-in inertial sensors, are either too computationally expensive to achieve real-time performance on a smartphone or too noisy to achieve sufficient tracking accuracy. In this paper we present a hybrid tracking method that achieves real-time performance with high accuracy. Built on the framework of a state-of-the-art visual feature tracking algorithm [5], which ensures accurate and reliable pose tracking, the proposed hybrid method significantly reduces the computational cost with the assistance of a phone's built-in inertial sensors. However, noise in the inertial sensors and abrupt errors in feature tracking caused by severe motion blur can destabilize the hybrid tracking system. To address this problem, we employ an adaptive Kalman filter with abrupt error detection to robustly fuse the inertial and feature tracking results. We evaluated the proposed method on a dataset of 16 video clips with synchronized inertial sensing data. Experimental results demonstrate our method's superior performance and accuracy on smartphones compared to a state-of-the-art vision tracking method [5]. The dataset will be made publicly available with the publication of this paper.
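The record does not include the paper's filter equations, so as a rough illustration only: the fusion idea in the abstract (inertial prediction, visual correction, and gating of abrupt feature-tracking errors) can be sketched as a minimal one-dimensional adaptive Kalman filter. The class name, parameter values, and the noise-inflation rule below are all hypothetical choices for the sketch, not the authors' method.

```python
import math

class AdaptiveKalman1D:
    """Minimal 1D adaptive Kalman filter sketch (illustrative only):
    an inertial rate drives the predict step, visual pose measurements
    drive the update step, and a measurement whose innovation exceeds a
    gate is treated as an abrupt tracking error, so its noise is
    inflated rather than trusted fully."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.05, gate=3.0):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process / measurement noise (hypothetical values)
        self.gate = gate          # innovation gate, in standard deviations

    def predict(self, rate, dt):
        # Propagate the pose with the inertial rate (e.g. gyro angular velocity).
        self.x += rate * dt
        self.p += self.q * dt

    def update(self, z):
        y = z - self.x            # innovation
        s = self.p + self.r       # innovation variance
        # Abrupt-error detection: gate the normalized innovation.
        if abs(y) > self.gate * math.sqrt(s):
            # Inflate measurement noise so the outlier barely moves the state.
            s = self.p + self.r * 100.0
        k = self.p / s            # Kalman gain
        self.x += k * y
        self.p *= (1.0 - k)
        return self.x
```

A typical loop would call `predict` at the inertial sampling rate and `update` whenever a visual pose estimate arrives; a real tracker would use a multi-dimensional state (full 6-DoF pose) rather than this scalar toy.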
Keywords :
Tracking, Visualization, Sensors, Feature extraction, Cameras, Accuracy, Smart phones
Conference_Titel :
2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
DOI :
10.1109/ISMAR.2015.23