DocumentCode :
2456981
Title :
Layer-Based Media Integration for Mobile Mixed-Reality Applications
Author :
Lee, Ryong ; Kwon, Yong-jin ; Sumiya, Kazutoshi
Author_Institution :
Sch. of Human Sci. & Environ., Univ. of Hyogo, Himeji, Japan
fYear :
2009
fDate :
15-18 Sept. 2009
Firstpage :
58
Lastpage :
63
Abstract :
Rapidly evolving and widely used smart phones provide many novel applications and services, making it possible to gather information from any location. Recent advances in technology have introduced many useful functions to assist users of outdoor mobile devices by sensing nearby conditions, such as the user's current location and even the user's slightest motions. Among these growing capabilities and their potential applications, the development of new mobile mixed-reality applications will need to consider various integration forms to be beneficial in both mobile device applications and services. In this paper, a layer-based media integration model for mobile mixed-reality applications is proposed to help developers gather diverse resources in a unified form, i.e., media, sensing, and even internal processing controls. In particular, an object-identification layer is introduced to make it possible to identify and look up relevant information about real-world geospatial objects of interest in a very intuitive and direct manner, simply by pointing the device toward an object. For each identified object, various types of services can be naturally integrated with the layer-based model, such as taking a snapshot, jumping to a related Web page, or combining with the popular touch controls. The authors address an integration model that significantly simplifies mixed-reality application development by establishing a relationship between a geospatial object, its graphic image on the projected screen, and user interaction. The improved simplicity, advantages, and new capabilities of the proposed model are validated with two applications implemented on the layer model: an "automatic tagging camera" and a "touch-based mixed-reality Web search".
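To make the layer-based idea in the abstract concrete, the following is a minimal sketch of how an object-identification layer and a service layer might fit together. The paper does not publish code, so all class names, method names, and the simple bearing-based matching used here are hypothetical illustrations rather than the authors' actual model.

```python
# Minimal sketch of a layer-based integration model (hypothetical, not the authors' API).
from dataclasses import dataclass
import math


@dataclass
class GeoObject:
    name: str
    lat: float
    lon: float
    url: str


@dataclass
class SensorState:
    lat: float           # device latitude
    lon: float           # device longitude
    heading_deg: float   # compass heading the camera points toward


class ObjectIdentificationLayer:
    """Maps the device's pose to the geospatial object it is pointing at."""

    def __init__(self, objects, tolerance_deg=10.0):
        self.objects = objects
        self.tolerance_deg = tolerance_deg

    def identify(self, state: SensorState):
        best, best_diff = None, self.tolerance_deg
        for obj in self.objects:
            # Approximate bearing from the device to the candidate object.
            bearing = math.degrees(math.atan2(obj.lon - state.lon,
                                              obj.lat - state.lat)) % 360
            diff = abs((bearing - state.heading_deg + 180) % 360 - 180)
            if diff < best_diff:
                best, best_diff = obj, diff
        return best


class ServiceLayer:
    """Attaches services (tagging, Web search) to an identified object."""

    def tag_snapshot(self, obj: GeoObject):
        return f"photo tagged with '{obj.name}'"

    def web_search(self, obj: GeoObject):
        return f"open {obj.url}"


if __name__ == "__main__":
    objects = [GeoObject("Himeji Castle", 34.839, 134.694,
                         "https://example.org/himeji-castle")]
    ident = ObjectIdentificationLayer(objects)
    services = ServiceLayer()

    state = SensorState(lat=34.836, lon=134.693, heading_deg=12.0)
    target = ident.identify(state)
    if target is not None:
        print(services.tag_snapshot(target))   # cf. "automatic tagging camera"
        print(services.web_search(target))     # cf. "touch-based Web search"
```

In this reading, the identification layer isolates sensing and object lookup from the services stacked on top of it, which is the separation the abstract credits with simplifying application development.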
Keywords :
graphical user interfaces; human computer interaction; mobile computing; virtual reality; automatic tagging camera; graphic image; graphical user interface; layer-based media integration; mobile mixed-reality application; object-identification layer; project screen; real-world geospatial object; smart phone; touch-based mixed-reality Web search; user interaction; Cameras; Databases; Displays; Engines; Humans; Information retrieval; Mobile handsets; Object detection; Telecommunications; Virtual reality; Layer-based Integration; Mobile Mixed-Reality;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 Third International Conference on Next Generation Mobile Applications, Services and Technologies (NGMAST '09)
Conference_Location :
Cardiff, Wales
Print_ISBN :
978-0-7695-3786-3
Type :
conf
DOI :
10.1109/NGMAST.2009.90
Filename :
5337106