Title :
Sample Selection Based on K-L Divergence for Effectively Training SVM
Author :
Junhai Zhai ; Chang Li ; Ta Li
Author_Institution :
Key Lab. of Machine Learning & Comput. Intell. of Hebei Province, Hebei Univ., Baoding, China
Abstract :
The time and space complexities of training a support vector machine (SVM) are O(n³) and O(n²) respectively, where n is the number of training samples, so training an SVM on relatively large datasets is inefficient or even impracticable. However, removing training samples that are not support vectors (SVs) has no effect on the construction of the optimal hyperplane. Based on this idea, this paper proposes a sample selection method that efficiently chooses candidate SVs from the original dataset; only the selected samples are then used to train the SVM. Experimental results show that the proposed method is both effective and efficient: it substantially reduces the time and space cost of training, especially on relatively large datasets.
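The abstract does not detail the selection algorithm itself, but the underlying idea (keep only samples likely to be SVs, i.e. samples in the region where the class-conditional distributions overlap) can be sketched as follows. This is an illustrative stand-in, not the paper's actual K-L-divergence procedure: it fits one Gaussian per class and retains the samples whose class-conditional log-density ratio is smallest, since those lie nearest the decision boundary. The function names and the Gaussian assumption are mine.

```python
import numpy as np

def gaussian_logpdf(X, mean, cov):
    """Log-density of each row of X under a multivariate Gaussian."""
    d = X.shape[1]
    diff = X - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    # Mahalanobis distance of each sample: diff_i^T inv diff_i
    maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + maha)

def select_candidate_svs(X, y, keep_ratio=0.2):
    """Return indices of samples near the class-overlap region.

    Hypothetical sketch of boundary-oriented sample selection:
    a small |log p0(x) - log p1(x)| means both class densities
    explain x comparably well, so x is a plausible candidate SV.
    """
    X0, X1 = X[y == 0], X[y == 1]
    lp0 = gaussian_logpdf(X, X0.mean(axis=0), np.cov(X0.T))
    lp1 = gaussian_logpdf(X, X1.mean(axis=0), np.cov(X1.T))
    score = np.abs(lp0 - lp1)              # small score -> near boundary
    k = max(1, int(keep_ratio * len(X)))
    return np.argsort(score)[:k]

# Usage: two well-separated Gaussian blobs; the selected subset
# concentrates near the midplane between the class means.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-2, 0], 1.0, size=(200, 2)),
               rng.normal([2, 0], 1.0, size=(200, 2))])
y = np.array([0] * 200 + [1] * 200)
idx = select_candidate_svs(X, y, keep_ratio=0.2)
```

Training an SVM on `X[idx], y[idx]` instead of the full set is then an O(k³) problem with k ≪ n, which is the complexity saving the paper targets.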
Keywords :
computational complexity; data mining; learning (artificial intelligence); support vector machines; K-L divergence; SVM training; O(n²) space complexity; O(n³) time complexity; optimal hyperplane; sample selection method; support vector machine; training samples; Accuracy; Classification algorithms; Neural networks; Probabilistic logic; Support vector machines; Training; K-L divergence; PNN; SVM; sample selection
Conference_Titel :
2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
Conference_Location :
Manchester
DOI :
10.1109/SMC.2013.823