DocumentCode :
3206555
Title :
Positioning, tracking and mapping for outdoor augmentation
Author :
Karlekar, Jayashree ; Zhou, Steven ZhiYing ; Lu, Weiquan ; Loh, Zhi Chang ; Nakayama, Yuta ; Hii, Daniel
Author_Institution :
Dept. ECE, Nat. Univ. of Singapore, Singapore, Singapore
fYear :
2010
fDate :
13-16 Oct. 2010
Firstpage :
175
Lastpage :
184
Abstract :
This paper presents a novel approach for user positioning, robust tracking and online 3D mapping for outdoor augmented reality applications. As the coarse user pose obtained from GPS and orientation sensors is not sufficient for augmented reality applications, a sub-meter-accurate user pose is estimated with a one-step silhouette matching approach. Silhouette matching between the rendered 3D model and the camera data is carried out with shape context descriptors, which are invariant to translation, scale and rotation errors, giving rise to a non-iterative registration approach. Once the user is correctly positioned, further tracking is carried out with camera data alone. Drift associated with vision-based approaches is minimized by combining different feature modalities. Robust visual tracking is maintained by fusing frame-to-frame and model-to-frame feature matches. Frame-to-frame tracking is accomplished with corner matching, while edges are used for model-to-frame registration. Results from the individual feature trackers are fused using a pose estimate obtained from an extended Kalman filter (EKF) and a weighted M-estimator. In scenarios where dense 3D models of the environment are not available, online incremental 3D mapping and tracking is proposed so that the user can be tracked in unprepared environments. Incremental mapping builds the 3D point cloud of the outdoor environment for tracking.
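The fusion step described in the abstract, combining measurements from separate feature trackers through an EKF update moderated by an M-estimator, can be sketched roughly as follows. This is a minimal illustration only: the 2D pose state, the Huber weighting scheme, and all numeric values are assumptions for the sketch, not the paper's actual formulation.

```python
import numpy as np

def huber_weight(residual, k=1.345):
    """Huber M-estimator weight: 1 for small residuals, k/|r| for large ones."""
    a = abs(residual)
    return 1.0 if a <= k else k / a

def ekf_update(x, P, z, R):
    """Kalman measurement update for a direct pose observation (H = I)."""
    y = z - x                          # innovation
    S = P + R                          # innovation covariance
    K = P @ np.linalg.inv(S)           # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K) @ P

# Toy 2D pose (x, y): predicted state and covariance from the motion model
x = np.array([0.0, 0.0])
P = np.eye(2)

# Two tracker measurements: corner-based (frame-to-frame) and
# edge-based (model-to-frame), each with its own noise covariance;
# the edge measurement carries a deliberate outlier in x.
z_corner, R_corner = np.array([0.1, -0.05]), np.eye(2) * 0.04
z_edge, R_edge = np.array([1.5, 0.2]), np.eye(2) * 0.09

# Robust fusion: inflate each measurement covariance by the inverse
# Huber weight of its worst innovation component, then apply
# sequential EKF updates.
for z, R in [(z_corner, R_corner), (z_edge, R_edge)]:
    w = min(huber_weight(r) for r in (z - x))
    x, P = ekf_update(x, P, z, R / max(w, 1e-6))

print(x)  # fused pose; the outlying edge component is down-weighted
```

Down-weighting via the M-estimator keeps a single bad modality (here, a drifting edge match) from dominating the fused pose, which is the point of combining the two trackers.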
Keywords :
Global Positioning System; Kalman filters; augmented reality; cameras; image matching; image sequences; pose estimation; rendering (computer graphics); sensor fusion; shape recognition; solid modelling; tracking; 3D point cloud; GPS; camera data; coarse user pose estimation; corner matching; extended Kalman filter; feature tracker; frame to frame tracking; model to frame feature match; noniterative registration approach; one step silhouette matching approach; online 3D incremental mapping; online 3D mapping; orientation sensor; outdoor augmented reality application; rendered 3D model; robust tracking; robust visual tracking; rotational error; shape context descriptor; user positioning; vision based approach; weighted M-estimator; Cameras; Context; Robustness; Shape; Target tracking; Three dimensional displays; 3D mapping; Augmented reality; robust tracking; sensor fusion; shape matching; user positioning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2010 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Conference_Location :
Seoul
Print_ISBN :
978-1-4244-9343-2
Electronic_ISBN :
978-1-4244-9345-6
Type :
conf
DOI :
10.1109/ISMAR.2010.5643567
Filename :
5643567