Title :
Parallel incremental SVM for classifying million images with very high-dimensional signatures into thousand classes
Author :
Thanh-Nghi Doan; Thanh-Nghi Do; Francois Poulet
Author_Institution :
IRISA, Rennes, France
Abstract :
The ImageNet dataset [1], with more than 14M images and 21K classes, makes visual classification especially difficult. One of the hardest tasks is training a fast and accurate classifier on computers with limited memory resources. In this paper, we address this challenge by extending the state-of-the-art large-scale classifier Power Mean SVM (PmSVM), proposed by Jianxin Wu [2], in three ways: (1) an incremental learning scheme for PmSVM, (2) a balanced bagging algorithm for training the binary classifiers, and (3) a parallelization of the classifier training process across several multi-core computers. Our approach is evaluated on the 1K classes of ImageNet (ILSVRC 1000 [3]). The evaluation shows that our approach saves up to 84.34% of memory usage, and that training is 297 times faster than the original implementation and 1508 times faster than the state-of-the-art linear classifier LIBLINEAR [4].
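The abstract names two of the key ideas, balanced bagging of the negative examples and parallel training of the per-class binary classifiers, without giving details. Below is a minimal illustrative sketch (not the authors' code) of how such a scheme can look in Python: LinearSVC stands in for PmSVM, which is not publicly packaged, and the function names, bag count, and synthetic data are assumptions for illustration only.

```python
# Hedged sketch: balanced bagging for one-vs-rest training, parallelized
# across CPU cores. LinearSVC is a stand-in for PmSVM; names and parameters
# are illustrative assumptions, not the paper's implementation.
import numpy as np
from joblib import Parallel, delayed
from sklearn.svm import LinearSVC


def train_one_class(X, y, cls, n_bags=3, rng=None):
    """Train binary classifiers for `cls` vs. rest on balanced bootstrap bags."""
    rng = rng or np.random.default_rng(int(cls))
    pos = np.flatnonzero(y == cls)
    neg = np.flatnonzero(y != cls)
    bags = []
    for _ in range(n_bags):
        # Under-sample the negatives so each bag is balanced with the positives.
        neg_sample = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, neg_sample])
        clf = LinearSVC(C=1.0).fit(X[idx], (y[idx] == cls).astype(int))
        bags.append(clf)
    return cls, bags


if __name__ == "__main__":
    # Tiny synthetic stand-in for the high-dimensional image signatures.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 50))
    y = rng.integers(0, 10, size=600)

    # One-vs-rest classifiers trained in parallel, one class per worker.
    models = dict(
        Parallel(n_jobs=-1)(
            delayed(train_one_class)(X, y, c) for c in np.unique(y)
        )
    )
    print(f"Trained balanced bags for {len(models)} classes")
```

In this sketch the balancing keeps each binary subproblem small (a few positives plus an equal number of sampled negatives), which is what makes per-class training cheap enough to distribute over many cores or machines.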
Keywords :
image classification; support vector machines; visual databases; ImageNet dataset; PmSVM; balanced bagging algorithm; binary classifiers; high-dimensional signatures; incremental learning; linear classifier; memory resource; memory usage; multicore computers; parallel incremental SVM; power mean SVM; visual classification; Accuracy; Bagging; Computers; Kernel; Support vector machines; Training; Visualization
Conference_Titel :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6707121