DocumentCode :
2687263
Title :
Finding location using omnidirectional video on a wearable computing platform
Author :
Rungsarityotin, Wasinee; Starner, Thad E.
Author_Institution :
Coll. of Comput., Georgia Inst. of Technol., Atlanta, GA, USA
fYear :
2000
fDate :
16-17 Oct. 2000
Firstpage :
61
Lastpage :
68
Abstract :
In this paper, we present a framework for an indoor navigation system that uses only omnidirectional video. Within a Bayesian framework, we seek the place and training image that best explain the current view and thereby infer a location. The posterior distribution over the state space, conditioned on image similarity, is typically non-Gaussian. We represent the distribution with samples and predict and verify the location over time using the condensation algorithm. The system requires no complicated feature detection; instead, it uses a simple metric between two images. Even with low-resolution input, the system can achieve accurate results with respect to the training data when given favorable initial conditions.
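As a rough illustration of the condensation-based localization the abstract describes, the following Python sketch maintains a sample-based posterior over a 1-D position along a known path and re-weights samples by image similarity. All names (train_images, train_locations), the Gaussian likelihood, and the random-walk motion model are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def image_distance(a, b):
    # Simple pixel-wise metric between two images; the paper stresses
    # that no complicated feature detection is required.
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def likelihood(observation, location, train_images, train_locations,
               sigma=10.0):
    # Hypothetical likelihood: weight a candidate location by similarity
    # between the current image and the nearest training image.
    idx = np.argmin(np.abs(train_locations - location))
    d = image_distance(observation, train_images[idx])
    return np.exp(-d / (2.0 * sigma ** 2))

def condensation_step(particles, weights, observation,
                      train_images, train_locations, motion_noise=0.5):
    n = len(particles)
    # 1. Select: resample particles in proportion to their weights.
    idx = rng.choice(n, size=n, p=weights / weights.sum())
    particles = particles[idx]
    # 2. Predict: diffuse particles with an assumed random-walk motion model.
    particles = particles + rng.normal(0.0, motion_noise, size=n)
    # 3. Measure: re-weight by image similarity; the resulting sample set
    #    represents the (generally non-Gaussian) posterior over location.
    weights = np.array([
        likelihood(observation, p, train_images, train_locations)
        for p in particles
    ]) + 1e-12
    return particles, weights

def estimate(particles, weights):
    # Report the weighted mean of the sample set as the inferred location.
    return np.average(particles, weights=weights)
```

In use, one would initialize particles near a known starting location (the "favorable initial conditions" noted above) and call condensation_step once per incoming omnidirectional frame.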
Keywords :
Bayes methods; feature extraction; navigation; notebook computers; Bayesian framework; condensation algorithm; image similarity; indoor environment; navigation system; omnidirectional video; posterior distribution; wearable computing platform; Cameras; Computer vision; Feedback; Mobile robots; Robot sensing systems; Robot vision systems; Sensor systems; Training data; Wearable computers; Wearable sensors
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
The Fourth International Symposium on Wearable Computers (ISWC 2000)
Conference_Location :
Atlanta, GA, USA
Print_ISBN :
0-7695-0795-6
Type :
conf
DOI :
10.1109/ISWC.2000.888466
Filename :
888466