DocumentCode :
250479
Title :
Hard negative classes for multiple object detection
Author :
Kanezaki, Asako ; Inaba, Shogo ; Ushiku, Yoshitaka ; Yamashita, Yukihiko ; Muraoka, Hiroaki ; Kuniyoshi, Yasuo ; Harada, Tatsuya
Author_Institution :
Grad. Sch. of Inf. Sci. & Technol., Univ. of Tokyo, Tokyo, Japan
fYear :
2014
fDate :
May 31 2014-June 7 2014
Firstpage :
3066
Lastpage :
3073
Abstract :
We propose an efficient method to train multiple object detectors simultaneously using a large-scale image dataset. The one-vs-all approach, which optimizes the boundary between positive samples from a target class and negative samples from all other classes, has been the standard approach for object detection. However, because this approach trains each object detector independently, the scores are not balanced between object classes. The proposed method combines ideas from both detection and classification in order to balance the scores across all object classes. We optimize the boundary between target classes and their "hard negative" samples, as in detection, while simultaneously balancing the detector scores across object classes, as in multi-class classification. We evaluated performance on multi-class object detection using a subset of the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2011 dataset and showed that our method outperforms the de facto standard method.
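The one-vs-all baseline that the abstract contrasts against can be sketched as follows. This is a hypothetical, minimal illustration of hard-negative mining for linear detectors (hinge loss, gradient descent, NumPy features standing in for real image descriptors) and is not the authors' implementation; all function names and hyperparameters here are assumptions for the sketch.

```python
import numpy as np

def train_detector(pos, neg, lr=0.05, epochs=200, reg=1e-3):
    """Train one linear scorer w.x + b with hinge loss via gradient descent.

    pos/neg: arrays of shape (n_samples, n_features). Hypothetical helper,
    standing in for the SVM solver a real detection pipeline would use.
    """
    X = np.vstack([pos, neg])
    t = np.concatenate([np.ones(len(pos)), -np.ones(len(neg))])
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = t * (X @ w + b)
        viol = margins < 1  # samples violating the hinge margin
        if viol.any():
            gw = -(t[viol, None] * X[viol]).mean(axis=0) + reg * w
            gb = -t[viol].mean()
        else:
            gw, gb = reg * w, 0.0
        w -= lr * gw
        b -= lr * gb
    return w, b

def one_vs_all_hard_negatives(X, y, n_classes, n_rounds=3,
                              init_negs=10, add_negs=10, seed=0):
    """One-vs-all training with a simple hard-negative mining loop.

    For each class: start from a small random negative set, train, then add
    the highest-scoring negatives (the "hard" ones) and retrain. Each
    detector is trained independently, so scores are not balanced across
    classes -- the imbalance the paper's method addresses.
    """
    rng = np.random.default_rng(seed)
    models = []
    for c in range(n_classes):
        pos = X[y == c]
        neg_pool = X[y != c]
        idx = rng.choice(len(neg_pool),
                         size=min(init_negs, len(neg_pool)), replace=False)
        negs = neg_pool[idx]
        for _ in range(n_rounds):
            w, b = train_detector(pos, negs)
            scores = neg_pool @ w + b
            hard = neg_pool[np.argsort(scores)[-add_negs:]]
            negs = np.vstack([negs, hard])
        models.append((w, b))
    return models
```

A toy usage: with two well-separated Gaussian clusters as "classes", `np.argmax` over the per-class scores recovers the class label, but the raw scores of the independently trained detectors need not be comparable in magnitude.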
Keywords :
image classification; object detection; ILSVRC 2011 dataset; ImageNet Large Scale Visual Recognition Challenge; detector scores; large scale image dataset; multiclass classification; multiclass object detection; multiple object detection; negative samples; object classification; one-versus-all approach; positive samples; Detectors; Object detection; Optimization; Standards; Support vector machines; Training; Vectors
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Robotics and Automation (ICRA), 2014 IEEE International Conference on
Conference_Location :
Hong Kong
Type :
conf
DOI :
10.1109/ICRA.2014.6907300
Filename :
6907300