DocumentCode
2288158
Title
Efficient discriminative local learning for object recognition
Author
Lin, Yen-Yu ; Tsai, Jyun-Fan ; Liu, Tyng-Luh
fYear
2009
fDate
Sept. 29 - Oct. 2, 2009
Firstpage
598
Lastpage
605
Abstract
Although object recognition methods based on local learning can reasonably resolve the difficulties caused by the large variations in images from the same category, the high risk of overfitting and the heavy computational cost of training numerous local models (classifiers or distance functions) often limit their applicability. To address these two issues, we cast the multiple, independent training processes of local models as a correlative multi-task learning problem, and design a new boosting algorithm to accomplish it. Specifically, we establish a parametric space in which these local models lie and spread as a manifold-like structure, and use boosting to perform local model training by completing the manifold embedding. By sharing the common embedding space, the learning of each local model is properly regularized by the extra knowledge from the other models, while the training time is also significantly reduced. Experimental results on two benchmark datasets, Caltech-101 and VOC 2007, show that our approach not only achieves promising recognition rates but also yields a two-order-of-magnitude speed-up in realizing local learning.
Keywords
image classification; learning (artificial intelligence); object recognition; boosting algorithm; correlative multitask learning problem; discriminative local learning; image variations; manifold-like structure; object recognition methods; Airplanes; Boosting; Face detection; Image resolution; Information science; Linear discriminant analysis; Machine learning; Object recognition; Training data; Uncertainty
fLanguage
English
Publisher
ieee
Conference_Titel
2009 IEEE 12th International Conference on Computer Vision (ICCV)
Conference_Location
Kyoto
ISSN
1550-5499
Print_ISBN
978-1-4244-4420-5
Electronic_ISBN
1550-5499
Type
conf
DOI
10.1109/ICCV.2009.5459182
Filename
5459182
Link To Document