DocumentCode :
1629471
Title :
An adaptive fusion architecture for target tracking
Author :
Loy, Gareth ; Fletcher, Luke ; Apostoloff, Nicholas ; Zelinsky, Alexander
Author_Institution :
Dept. of Syst. Eng., Australian Nat. Univ., Canberra, ACT, Australia
fYear :
2002
Firstpage :
261
Lastpage :
266
Abstract :
A vision system is demonstrated that adaptively allocates computational resources over multiple cues to robustly track a target in 3D. The system uses a particle filter to maintain multiple hypotheses of the target location. Bayesian probability theory provides the framework for sensor fusion, and resource scheduling is used to intelligently allocate the limited computational resources available across the suite of cues. The system is demonstrated tracking a person moving through a cluttered 3D environment.
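The abstract's core idea (a particle filter whose hypothesis weights come from Bayesian fusion of several visual cues) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cue functions, noise levels, and motion model here are hypothetical stand-ins, and fusion assumes conditionally independent cues so the joint likelihood is the product of per-cue likelihoods.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_cue_likelihoods(particles, cues):
    # Bayesian fusion under a conditional-independence assumption:
    # joint likelihood = product of the individual cue likelihoods.
    w = np.ones(len(particles))
    for cue in cues:
        w *= cue(particles)
    return w / w.sum()  # normalised particle weights

def particle_filter_step(particles, cues, motion_std=0.05):
    # Predict: diffuse hypotheses with a simple random-walk motion model
    # (a placeholder for whatever dynamics model the tracker uses).
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: fuse all active cues into one weight per hypothesis.
    weights = fuse_cue_likelihoods(particles, cues)
    # Resample so computation concentrates on likely target locations.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], weights

# Hypothetical cues: Gaussian-shaped likelihoods around a true 3D position
# (real cues would come from colour, motion, stereo, etc.).
target = np.array([0.5, -0.2, 1.0])
colour_cue = lambda p: np.exp(-np.sum((p - target) ** 2, axis=1) / 0.1)
motion_cue = lambda p: np.exp(-np.sum((p - target) ** 2, axis=1) / 0.2)

particles = rng.uniform(-1.0, 1.0, size=(500, 3)) + target
for _ in range(10):
    particles, weights = particle_filter_step(particles, [colour_cue, motion_cue])

estimate = particles.mean(axis=0)  # posterior mean target location
```

After a few predict/update/resample cycles the particle cloud concentrates near the target; the adaptive resource-scheduling aspect of the paper (deciding how much computation each cue receives) is not modelled here.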
Keywords :
Bayes methods; computer vision; image motion analysis; probability; resource allocation; scheduling; sensor fusion; tracking; 3D target tracking; Bayesian probability theory; adaptive fusion architecture; cluttered environment; computational resource allocation; computer vision system; multiple cues; particle filter; person tracking; resource scheduling; sensor fusion; Bayesian methods; Computer architecture; Computer vision; Machine vision; Particle filters; Processor scheduling; Resource management; Robustness; Sensor fusion; Target tracking;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Automatic Face and Gesture Recognition, 2002. Proceedings. Fifth IEEE International Conference on
Conference_Location :
Washington, DC, USA
Print_ISBN :
0-7695-1602-5
Type :
conf
DOI :
10.1109/AFGR.2002.1004164
Filename :
1004164