DocumentCode :
629805
Title :
The what, why and how of achieving urban telepresence
Author :
Balfour, Robert E. ; Donnelly, Brian P.
Author_Institution :
Balfour Technol. LLC, Bethpage, NY, USA
fYear :
2013
fDate :
3 May 2013
Firstpage :
1
Lastpage :
6
Abstract :
21st century advancements in information technology are enabling a powerful emerging capability known as Urban Telepresence (UT). This capability allows users to experience an operational environment (e.g. an urban cityscape) via an immersive, remote browser interface. UT operators can interact in real-time with personnel and sensor assets in that environment, and can derive comprehensive shared situational awareness (SA) from a mixed reality (i.e. live-over-virtual-over-time) augmentation of the environment with supporting intelligence, including past/present/forecast information. The deployment of UT capability becomes a force multiplier for military operations as well as civilian safety, security and emergency response. By providing on-scene sensor data registered into the virtual scene of the real environment, UT facilitates effective supervisory control of the sensors to follow situation dynamics, and creates powerful enhanced “overwatch” abilities. With super-human ability to change perspective and evaluate the scene both spatially and temporally, the “virtual operator” becomes a valued team member to personnel in the environment in real-time. In the future, UT will play a substantial role for missions in both non-cooperative (e.g. military) and cooperative (e.g. natural disaster response) environments. fourDscape® four-dimensional (4D) browser/server technology is already deployed and providing enhanced situational awareness in homeland security applications, and is currently being leveraged by the U.S. Air Force Research Laboratory (AFRL) as a baseline technology for developing and evaluating a comprehensive Urban Telepresence integrated system and human-machine interface (HMI). Some of the important technical components of this UT capability include: an immersive, augmented virtual reality (AVR) environment providing an untethered perspective into the real-world operational environment; a naturalistic user interface for temporal-spatial navigation and information management; an effective HMI to perform supervisory control of manned/unmanned on-scene assets and sharing of vital information; and disparate multi-systems integration to develop a complete temporal-spatial 4D context for comprehensive shared situational awareness. In addition, a Testing/Training Methodology and simulation environment is being developed, combining both performance-based and knowledge-based measures of effectiveness (MoE) techniques, which will evaluate the value UT adds to SA in an overwatch capacity.
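Illustrative note: the abstract's idea of registering on-scene sensor data into a virtual scene within a temporal-spatial 4D context can be pictured as mapping a geolocated, timestamped observation into a local (x, y, z, t) scene frame. The following is a minimal sketch of that step only; the class names, the flat-earth local projection, and the example coordinates are assumptions for illustration and are not the fourDscape browser/server API described in the paper.

from dataclasses import dataclass
from datetime import datetime, timezone
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius for a small-area projection

@dataclass
class SceneOrigin:
    lat_deg: float   # latitude of the virtual scene's reference point
    lon_deg: float   # longitude of the virtual scene's reference point
    alt_m: float     # altitude of the reference point
    epoch: datetime  # time reference for the scene's temporal (4th) axis

@dataclass
class Observation4D:
    x_m: float  # metres east of the scene origin
    y_m: float  # metres north of the scene origin
    z_m: float  # metres above the scene origin
    t_s: float  # seconds since the scene epoch

def register(origin: SceneOrigin, lat_deg: float, lon_deg: float,
             alt_m: float, timestamp: datetime) -> Observation4D:
    """Project a lat/lon/alt/time sensor report into the local 4D scene frame
    using an equirectangular (small-area) approximation."""
    lat0 = math.radians(origin.lat_deg)
    d_lat = math.radians(lat_deg - origin.lat_deg)
    d_lon = math.radians(lon_deg - origin.lon_deg)
    x = EARTH_RADIUS_M * d_lon * math.cos(lat0)    # east offset
    y = EARTH_RADIUS_M * d_lat                     # north offset
    z = alt_m - origin.alt_m                       # vertical offset
    t = (timestamp - origin.epoch).total_seconds() # temporal offset
    return Observation4D(x, y, z, t)

if __name__ == "__main__":
    # Hypothetical camera report near Farmingdale, NY, placed in a scene whose
    # origin and epoch are arbitrary illustrative values.
    origin = SceneOrigin(40.7326, -73.4454, 0.0,
                         datetime(2013, 5, 3, tzinfo=timezone.utc))
    obs = register(origin, 40.7350, -73.4400, 15.0,
                   datetime(2013, 5, 3, 0, 10, tzinfo=timezone.utc))
    print(f"scene position: x={obs.x_m:.1f} m E, y={obs.y_m:.1f} m N, "
          f"z={obs.z_m:.1f} m, t={obs.t_s:.0f} s")

Once observations carry such 4D scene coordinates, they can be overlaid live-over-virtual-over-time and replayed or scrubbed along the temporal axis, which is the kind of past/present/forecast augmentation the abstract attributes to UT.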
Keywords :
aircraft navigation; augmented reality; graphical user interfaces; human computer interaction; information management; military computing; national security; natural scenes; personnel; real-time systems; spatiotemporal phenomena; 4D browser-server technology; AVR environment; HMI; MoE techniques; US AFRL; US Air Force Research Laboratory; UT capability; UT operators; augmented virtual reality; civilian safety; civilian security; complete temporal-spatial 4D context development; comprehensive shared SA; comprehensive shared situational awareness; cooperative environments; emergency response; enhanced overwatch abilities; force multiplier; four-dimensional browser-server technology; fourDscape technology; homeland security applications; human-machine interface; immersive remote browser interface; information management; information technology; knowledge-based measures of effectiveness; manned on-scene assets; military operations; mixed reality augmentation; multi-systems integration; naturalistic user interface; noncooperative environments; on-scene sensor data registration; operational environment; overwatch capacity; performance-based measures of effectiveness; personnel; real environment; real-time environment; real-time interaction; scene evaluation; sensor assets; simulation environment; situation dynamics; spatial evaluation; super-human ability; supervisory control; team member; temporal evaluation; temporal-spatial navigation; testing methodology; training methodology; unmanned on-scene assets; urban telepresence; virtual operator; virtual scene; Browsers; Buildings; Cameras; Navigation; Real-time systems; Supervisory control; Virtual reality; augmented virtual reality; fourDscape; overwatch; situational awareness; urban telepresence;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Systems, Applications and Technology Conference (LISAT), 2013 IEEE Long Island
Conference_Location :
Farmingdale, NY
Print_ISBN :
978-1-4673-6244-3
Type :
conf
DOI :
10.1109/LISAT.2013.6578234
Filename :
6578234