DocumentCode :
3672184
Title :
Learning coarse-to-fine sparselets for efficient object detection and scene classification
Author :
Gong Cheng;Junwei Han;Lei Guo;Tianming Liu
Author_Institution :
School of Automation, Northwestern Polytechnical University, Xi'an, China
fYear :
2015
fDate :
6/1/2015
Firstpage :
1173
Lastpage :
1181
Abstract :
Part model-based methods have been successfully applied to object detection and scene classification and have achieved state-of-the-art results. More recently, the "sparselets" framework [1-3] was introduced to serve as a universal set of shared basis vectors learned from a large number of part detectors, resulting in notable speedups. Inspired by this framework, in this paper we propose a novel scheme for training more effective sparselets in a coarse-to-fine manner. Specifically, we first train coarse sparselets that exploit the redundancy among part detectors by using an unsupervised single-hidden-layer auto-encoder. Then, we simultaneously train fine sparselets and activation vectors using a supervised single-hidden-layer neural network, so that sparselet training and discriminative activation-vector learning are jointly embedded in a unified framework. To adequately exploit the discriminative information hidden in the part detectors and to achieve sparsity, we optimize a new discriminative objective function that imposes an L0-norm sparsity constraint on the activation vectors. Using the proposed framework, we obtain promising results for multi-class object detection and scene classification on the PASCAL VOC 2007, MIT Scene-67, and UC Merced Land Use datasets, compared with existing sparselets baseline methods.
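Illustrative sketch (not part of the original record): the efficiency idea behind sparselets is that many part detectors can be reconstructed as sparse linear combinations of a small shared dictionary, so per-location sparselet responses are computed once and reused across all detectors. The minimal NumPy sketch below shows that reconstruction with a greedy top-k approximation of the L0 constraint; the paper itself learns the sparselets and activation vectors via auto-encoder and supervised neural-network training, and all names and the greedy refit step here are hypothetical.

```python
import numpy as np

def sparse_activations(filters, sparselets, k):
    """Approximate each part filter as an L0-sparse combination
    of shared sparselets (greedy top-k sketch, not the paper's
    supervised training).

    filters:    (m, d) array, m flattened part filters
    sparselets: (n, d) array, n shared dictionary elements
    k:          nonzero activation coefficients kept per filter
    """
    # Dense least-squares fit: A = F S^+ minimizes ||A S - F||_F
    A = filters @ np.linalg.pinv(sparselets)
    A_sparse = np.zeros_like(A)
    for i, a in enumerate(A):
        # Keep the k largest coefficients, then refit on that support
        support = np.argsort(np.abs(a))[-k:]
        coef, *_ = np.linalg.lstsq(sparselets[support].T,
                                   filters[i], rcond=None)
        A_sparse[i, support] = coef
    return A_sparse

def detector_scores(feature_vecs, sparselets, A_sparse):
    """Shared step: sparselet responses are computed once per
    location, then every detector's score is a cheap combination."""
    R = feature_vecs @ sparselets.T   # (num_locations, n)
    return R @ A_sparse.T             # (num_locations, m)

# Usage on random data (shapes only; real inputs would be HOG/CNN
# features and trained part filters):
rng = np.random.default_rng(0)
F = rng.standard_normal((200, 64))    # 200 part filters, 64-dim
S = rng.standard_normal((32, 64))     # 32 shared sparselets
A = sparse_activations(F, S, k=8)
X = rng.standard_normal((1000, 64))   # features at 1000 locations
scores = detector_scores(X, S, A)     # (1000, 200) detector scores
```

The speedup comes from the second function: the expensive dictionary pass scales with the number of sparselets (32 here) rather than the number of detectors (200), and each detector adds only k = 8 multiply-adds per location.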
Keywords :
"Detectors","Training","Object detection","Linear programming","Visualization","Artificial neural networks","Computational modeling"
Publisher :
ieee
Conference_Title :
Computer Vision and Pattern Recognition (CVPR), 2015 IEEE Conference on
ISSN :
1063-6919
Type :
conf
DOI :
10.1109/CVPR.2015.7298721
Filename :
7298721