Title :
A bottom-up method for simplifying support vector solutions
Author :
Nguyen, D. ; TuBao Ho
Author_Institution :
Japan Adv. Inst. of Sci. & Technol., Ishikawa
Date :
5/1/2006
Abstract :
The high generalization ability of support vector machines (SVMs) has been demonstrated in many practical applications; however, they are considerably slower in the test phase than other learning approaches, owing to the possibly large number of support vectors comprised in their solution. In this letter, we describe a method to reduce the number of support vectors. The reduction process iteratively selects the two nearest support vectors belonging to the same class and replaces them with a newly constructed vector. Through an analysis of the relation between vectors in the input and feature spaces, we show that constructing the new vector requires only finding the unique maximum point of a one-variable function on (0,1), rather than minimizing a function of many variables with local minima as in previous reduced-set methods. Experimental results on real-life datasets show that the proposed method is effective in reducing the number of support vectors while preserving the machine's generalization performance.
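The bottom-up reduction described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' exact construction: it assumes a Gaussian kernel, assumes the merged vector lies on the segment z = k*x_i + (1-k)*x_j between the selected pair, and finds the maximizer of the resulting one-variable function on (0,1) by a simple grid search (the function names `merge_pair` and `reduce_svs` are hypothetical).

```python
import numpy as np

def merge_pair(x_i, x_j, a_i, a_j, gamma=1.0):
    """Merge two same-class support vectors into one (illustrative sketch).

    The merged vector is z = k*x_i + (1-k)*x_j, with k chosen to maximize
    f(k) = a_i*K(z, x_i) + a_j*K(z, x_j) for a Gaussian kernel K.  On the
    segment this is a one-variable function of k on (0,1), searched here
    by a dense grid for simplicity.
    """
    d2 = float(np.sum((x_i - x_j) ** 2))
    ks = np.linspace(1e-3, 1.0 - 1e-3, 999)
    f = (a_i * np.exp(-gamma * (1.0 - ks) ** 2 * d2)
         + a_j * np.exp(-gamma * ks ** 2 * d2))
    k = float(ks[np.argmax(f)])
    z = k * x_i + (1.0 - k) * x_j
    # Least-squares coefficient so that beta*Phi(z) approximates
    # a_i*Phi(x_i) + a_j*Phi(x_j); note K(z, z) = 1 for a Gaussian kernel.
    beta = (a_i * np.exp(-gamma * np.sum((z - x_i) ** 2))
            + a_j * np.exp(-gamma * np.sum((z - x_j) ** 2)))
    return z, beta

def reduce_svs(X, alpha, y, n_target, gamma=1.0):
    """Iteratively merge the two nearest same-class support vectors
    until only n_target vectors remain."""
    X = [np.asarray(x, dtype=float) for x in X]
    alpha, y = list(alpha), list(y)
    while len(X) > n_target:
        best, pair = np.inf, None
        for i in range(len(X)):
            for j in range(i + 1, len(X)):
                if y[i] != y[j]:
                    continue  # only same-class pairs are merged
                d = np.sum((X[i] - X[j]) ** 2)
                if d < best:
                    best, pair = d, (i, j)
        if pair is None:  # no same-class pair left to merge
            break
        i, j = pair
        z, beta = merge_pair(X[i], X[j], alpha[i], alpha[j], gamma)
        cls = y[i]
        for idx in sorted(pair, reverse=True):
            del X[idx]; del alpha[idx]; del y[idx]
        X.append(z); alpha.append(beta); y.append(cls)
    return X, alpha, y
```

For two support vectors with equal coefficients, symmetry places the merged vector at the segment midpoint; in general the maximizer shifts toward the vector with the larger coefficient.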
Keywords :
support vector machines; bottom-up method; high generalization ability; one-variable function; reduction process; unique maximum point; Eigenvalues and eigenfunctions; Equations; Matrix decomposition; Object detection; Principal component analysis; Robustness; Singular value decomposition; Statistical analysis; Support vector machines; Testing; Feature space; input space; kernel methods; reduced set method; support vector machines (SVMs); Algorithms; Artificial Intelligence; Information Storage and Retrieval; Neural Networks (Computer); Pattern Recognition, Automated; Systems Theory;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2006.873287