DocumentCode :
651112
Title :
Patch-based robust L1 tracker to dynamic appearance change
Author :
Won Jin Kim ; Tae-Hyun Oh ; Kyungdon Joo ; In So Kweon
Author_Institution :
IBM Korea, Seoul, South Korea
fYear :
2013
fDate :
Oct. 30 2013-Nov. 2 2013
Firstpage :
268
Lastpage :
273
Abstract :
In this paper, we propose a robust l1 tracking method based on a two-phase sparse representation, which consists of a patch-based tracker and a global appearance tracker. While recently proposed l1 trackers have shown impressive tracking accuracy, they struggle to follow dynamic appearance changes. To overcome dynamic appearance change and achieve robust visual tracking, we model the dynamic appearance of the object with a set of local rigid patches and enhance the distinctiveness of the global appearance tracker through positive/negative learning. The integration of the two approaches makes visual tracking robust to occlusion and illumination variation. We evaluate the proposed method on five challenging video sequences and compare it with state-of-the-art trackers. We show that the proposed method successfully handles occlusion, noise, scale, illumination, and appearance changes of the object.
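Illustrative note: the core operation shared by l1 trackers of this kind is scoring each candidate region by how well it can be reconstructed as a sparse combination of object templates under an l1 penalty. The sketch below shows that scoring step only, not the authors' full two-phase patch/global pipeline; the dictionary contents, regularization weight, and patch size are assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's implementation): l1-regularized
# reconstruction score for one candidate patch against a template dictionary.
import numpy as np
from sklearn.linear_model import Lasso

def l1_score(y, D, lam=0.01):
    """Score candidate y by its sparse reconstruction error over templates D.

    y : (d,)   vectorized candidate patch
    D : (d, k) columns are vectorized object/background templates
    """
    # Solves min_x 0.5/d * ||y - D x||_2^2 + lam * ||x||_1 with x >= 0
    lasso = Lasso(alpha=lam, positive=True, max_iter=1000)
    lasso.fit(D, y)
    x = lasso.coef_
    residual = np.linalg.norm(y - D @ x)
    return np.exp(-residual)  # higher score: candidate is better explained by the templates

# Toy usage with random data (purely illustrative)
rng = np.random.default_rng(0)
D = rng.random((32 * 32, 10))                       # 10 templates of 32x32 patches
y = D[:, 0] + 0.01 * rng.standard_normal(32 * 32)   # candidate close to template 0
print(l1_score(y, D))
```

In a tracker, this score would be computed for every sampled candidate in the current frame, and the candidate with the highest score (lowest reconstruction error) would be taken as the new object state.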
Keywords :
image representation; image sequences; learning (artificial intelligence); lighting; minimisation; object tracking; sparse matrices; dynamic appearance change; global appearance trackers; illumination variation robustness; local rigid patches; negative learning; noise handling; occlusion handling; occlusion variation robustness; patch-based robust L1 tracker; positive learning; robust visual tracking; scale handling; two-phase sparse representation; video sequences; Dynamic appearance; Patch-based tracking; Sparse representation; l1 minimization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 10th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)
Conference_Location :
Jeju
Print_ISBN :
978-1-4799-1195-0
Type :
conf
DOI :
10.1109/URAI.2013.6677365
Filename :
6677365