DocumentCode
2448367
Title
Robust pose estimation in untextured environments for augmented reality applications
Author
Guan, Wei ; Wang, Lu ; Mooser, Jonathan ; You, Suya ; Neumann, Ulrich
Author_Institution
Comput. Graphics & Immersive Technol. Lab., Univ. of Southern California, Los Angeles, CA, USA
fYear
2009
fDate
19-22 Oct. 2009
Firstpage
191
Lastpage
192
Abstract
We present a robust camera pose estimation approach for stereo images captured in untextured environments. Unlike most existing registration algorithms, which are point-based and rely on the intensities of neighboring pixels, our approach incorporates line segments into the registration process. With line segments as primitives, the proposed algorithm can handle untextured images, such as scenes captured in man-made environments, as well as cases with large viewpoint or illumination changes. Furthermore, since the proposed algorithm is robust to wide-baseline stereo pairs, the accuracy of 3D point reconstruction improves. With accurately computed camera poses and object positions in 3D space, we can embed virtual objects into an existing scene with higher accuracy for realistic effects. In our experiments, 2D labels are embedded in the 3D scene space to achieve annotation effects as in AR.
Keywords
augmented reality; cameras; image registration; image segmentation; pose estimation; stereo image processing; 3D points reconstruction; annotation effect; augmented reality; image registration; line segments; man-made environments; point-based algorithm; robust camera pose estimation; stereo images; untextured environments; Application software; Augmented reality; Cameras; Computer graphics; Image registration; Image segmentation; Layout; Lighting; Robustness; Stereo vision; augmented reality; image registration; pose estimation
fLanguage
English
Publisher
ieee
Conference_Titel
Mixed and Augmented Reality, 2009. ISMAR 2009. 8th IEEE International Symposium on
Conference_Location
Orlando, FL
Print_ISBN
978-1-4244-5390-0
Electronic_ISBN
978-1-4244-5389-4
Type
conf
DOI
10.1109/ISMAR.2009.5336470
Filename
5336470
Link To Document