Title :
Getting from Here to There: Locomotion in Virtual Environments
Author_Institution :
Univ. of North Carolina, Chapel Hill, NC, USA
Abstract :
Summary form only given. Walking is perhaps the most fundamental example of user interaction with a virtual environment (VE). Making walking, a sensory-motor action, seem natural to the VE user is a problem with many facets: locomotion interfaces should properly stimulate the senses, they should enable movement in any direction (e.g., walking backwards), and they should not impair the user's ability to build a mental model of the scene. Evaluating common locomotion techniques using path analysis and task performance has revealed their relative strengths and weaknesses. I will discuss three threads of recent research in locomotion: increasing the realism of eye-position movement (and the resulting optical flow), techniques that enable VE users wearing head-worn displays to really walk around scenes larger than the active area of their head tracker, and investigations of the effects of various locomotion interfaces on the cognitive aspects of locomotion.
Keywords :
image motion analysis; image sensors; image sequences; user interfaces; virtual reality; eye position movement; head tracker; head-worn-display; locomotion interface; optical flow; path analysis; sensory-motor action; task performance; user interaction; virtual environment; walking; Cognitive science; Image motion analysis; Legged locomotion; Message systems; Optical sensors; Tracking; Virtual environment;
Conference_Titel :
2010 IEEE/ACM 14th International Symposium on Distributed Simulation and Real Time Applications (DS-RT)
Conference_Location :
Fairfax, VA
Print_ISBN :
978-1-4244-8651-9
DOI :
10.1109/DS-RT.2010.38