DocumentCode :
1196995
Title :
Comments on "Pruning Error Minimization in Least Squares Support Vector Machines"
Author :
Kuh, Anthony ; De Wilde, Philippe
Author_Institution :
Univ. of Hawaii, Honolulu
Volume :
18
Issue :
2
fYear :
2007
fDate :
3/1/2007
Firstpage :
606
Lastpage :
609
Abstract :
In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem because the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient.
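For context, a minimal sketch of why the regularization parameter matters, assuming the standard LS-SVM dual formulation (the notation below is the conventional one and is not reproduced from this record): training reduces to solving a bordered linear system whose kernel block carries the regularization term.

\[
\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix}
=
\begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix},
\qquad \Omega_{ij} = K(x_i, x_j).
\]

With γ = ∞ the term γ^{-1} I vanishes and the kernel block can be singular (for example, when training points coincide or nearly coincide), which is the difficulty noted in the abstract; for finite, nonzero γ the block Ω + γ^{-1} I is symmetric positive definite, so the system is invertible, and this is the setting in which the modified pruning procedure operates.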
Keywords :
least squares approximations; support vector machines; error minimization; least squares support vector machines; training examples; Computational complexity; Cost function; Councils; Equations; Kernel; Least squares methods; Optimization methods; Quadratic programming; Sparse matrices; Support vector machines; Least squares kernel methods; online updating; pruning; regularization; Algorithms; Computer Simulation; Feedback; Information Storage and Retrieval; Least-Squares Analysis; Models, Theoretical; Neural Networks (Computer); Pattern Recognition, Automated;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2007.891590
Filename :
4118265