Title :
Multi-scale panoramic Augmented Reality
Author :
Arican, E. ; Ozden, Kemal Egemen ; Oguz, S.
Author_Institution :
Dept. of Computer Engineering, Bahcesehir Univ., Istanbul, Turkey
Abstract :
We describe an Augmented Reality (AR) system that uses a panoramic representation and aims at easy interaction with the environment. The final goal is to render metadata about the environment in the right place on the images coming from a mobile device's camera. Local feature points extracted from the images of a rotating camera with varying zoom levels are organized into a hierarchical tree structure that represents a panoramic image. Panoramic images used as anchors are receiving increasing attention in the AR field, and the novel aspects of our work are as follows: (1) the panoramic representation is multi-scale, tree-structured, and consists of local image features; (2) we propose multi-scale "Augmented Knowledge". Our system achieves lower error rates for close-up scenes than classical non-vision techniques, and image lookup is efficient thanks to the tree structure. Since only local image features are kept, many complicated steps of actual panorama creation are avoided, while the need for global optimization of the computed geometry is minimized.
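The multi-scale tree described above could be organized along the following lines. This is a minimal illustrative sketch, not the authors' implementation: the node layout, toy integer "descriptors", and the `match_score`/`lookup` helpers are all assumptions made for the example. Each node holds the local features of one zoom level, children refine a region at higher zoom, and lookup descends only into children that improve the match, which is what keeps retrieval efficient.

```python
# Hypothetical sketch of a multi-scale feature tree for panorama lookup.
# Each node stores local feature descriptors (toy: integers) for one zoom
# level; children cover sub-regions at a higher zoom level.
from dataclasses import dataclass, field

@dataclass
class FeatureNode:
    zoom: int                        # zoom level represented by this node
    descriptors: list                # local image feature descriptors
    children: list = field(default_factory=list)

def match_score(query, descriptors):
    # Toy similarity: number of shared descriptors. A real system would
    # match SIFT/ORB-style descriptors by distance instead.
    return len(set(query) & set(descriptors))

def lookup(node, query):
    """Return the deepest node whose features match the query best,
    descending only into children that improve the current score."""
    best, best_score = node, match_score(query, node.descriptors)
    for child in node.children:
        if match_score(query, child.descriptors) > best_score:
            cand = lookup(child, query)
            cand_score = match_score(query, cand.descriptors)
            if cand_score > best_score:
                best, best_score = cand, cand_score
    return best

# Usage: a coarse root view with two zoomed-in child regions.
root = FeatureNode(1, [1, 2, 3], [
    FeatureNode(2, [1, 2, 3, 4, 5]),
    FeatureNode(2, [6, 7, 8]),
])
print(lookup(root, [4, 5]).descriptors)  # → [1, 2, 3, 4, 5]
```

Because a query only descends subtrees whose match score improves, most branches are pruned early; metadata attached at different zoom levels ("Augmented Knowledge") would then be read from the returned node.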
Keywords :
augmented reality; feature extraction; image representation; metadata; AR system; classical non-vision techniques; close-up scenes; computed geometry; global optimization; hierarchical tree structure; image lookup; local feature points; local image features; mobile device camera; multi-scale augmented knowledge; multi-scale panoramic augmented reality; panorama creation; panoramic image; panoramic representation; rotating camera; varying zoom levels; Abstracts; Augmented reality; Cameras; Computer vision; Geometry; Google; Mobile communication; computer vision; multi-scale augmented reality; panoramic images
Conference_Titel :
2013 21st Signal Processing and Communications Applications Conference (SIU)
Conference_Location :
Haspolat
Print_ISBN :
978-1-4673-5562-9
Electronic_ISBN :
978-1-4673-5561-2
DOI :
10.1109/SIU.2013.6531539