DocumentCode
3325577
Title
Saliency selection for robust visual tracking
Author
Wang, Qing ; Chen, Feng ; Xu, Wenli
Author_Institution
Dept. of Autom., Tsinghua Univ., Beijing, China
fYear
2010
fDate
26-29 Sept. 2010
Firstpage
2785
Lastpage
2788
Abstract
This paper proposes a robust visual tracking approach based on saliency selection. In this method, salient patches and their spatial context inside the object region are exploited for object representation and appearance modeling. Tracking is then implemented by a hybrid stochastic and deterministic mechanism, which requires only a small number of samples for particle filtering and escapes the local minima that trap conventional deterministic tracking. As time progresses, the selected salient patches and their spatial context are updated online to adapt the appearance model to both object and environmental changes. We carry out experiments on several challenging sequences and compare our method with a state-of-the-art algorithm to show its improved tracking performance.
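A minimal sketch of the hybrid stochastic-then-deterministic tracking loop the abstract describes: a small particle set gives a stochastic state estimate, which is then refined by a greedy local search. The appearance_score function, all parameter values, and the toy usage are illustrative assumptions standing in for the paper's salient-patch appearance model, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def appearance_score(pos, target):
    # Stand-in likelihood: higher when a candidate position is close to the
    # best appearance match. A real tracker would compare salient patches and
    # their spatial context here (hypothetical placeholder).
    return np.exp(-np.sum((pos - target) ** 2) / (2 * 15.0 ** 2))

def hybrid_track(prev_pos, target, n_particles=50, refine_steps=10, step=2.0):
    # Stochastic stage: propagate a small particle set around the previous state
    # and form a weighted-mean estimate (basic particle filtering).
    particles = prev_pos + rng.normal(0.0, 10.0, size=(n_particles, 2))
    weights = np.array([appearance_score(p, target) for p in particles])
    weights /= weights.sum()
    estimate = weights @ particles

    # Deterministic stage: greedy local refinement around the particle estimate,
    # playing the role of the deterministic search in the hybrid scheme.
    best, best_score = estimate, appearance_score(estimate, target)
    for _ in range(refine_steps):
        for d in np.array([[step, 0.0], [-step, 0.0], [0.0, step], [0.0, -step]]):
            cand = best + d
            s = appearance_score(cand, target)
            if s > best_score:
                best, best_score = cand, s
    return best

# Toy usage: track a point whose true position drifts over a few frames.
pos = np.array([100.0, 100.0])
for t in range(5):
    true_pos = np.array([100.0 + 3 * t, 100.0 + 2 * t])
    pos = hybrid_track(pos, true_pos)
    print(f"frame {t}: estimate {pos.round(1)}, ground truth {true_pos}")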
Keywords
image representation; object tracking; particle filtering (numerical methods); appearance modeling; conventional deterministic tracking; object region; object representation; particle filtering; robust visual tracking; saliency selection; salient patches; Adaptation model; Computational modeling; Histograms; Stochastic processes; Target tracking; Visualization; Saliency selection; adaptive appearance modeling; hybrid of stochastic and deterministic tracking
fLanguage
English
Publisher
ieee
Conference_Title
2010 17th IEEE International Conference on Image Processing (ICIP)
Conference_Location
Hong Kong
ISSN
1522-4880
Print_ISBN
978-1-4244-7992-4
Electronic_ISBN
1522-4880
Type
conf
DOI
10.1109/ICIP.2010.5651016
Filename
5651016