Title :
Comments on "Pruning Error Minimization in Least Squares Support Vector Machines"
Author :
Kuh, Anthony ; De Wilde, Philippe
Author_Institution :
Univ. of Hawaii, Honolulu
Date :
3/1/2007 12:00:00 AM
Abstract :
In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVM) using no regularization (γ = ∞). This causes a problem, as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (γ finite and nonzero) and is also computationally more efficient.
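To illustrate the setting the abstract describes, the following is a minimal sketch of LS-SVM training with finite regularization γ, followed by a simple pruning loop. The smallest-|α| pruning criterion used here is the classic one from the LS-SVM sparseness literature, chosen only for illustration; it is not claimed to be the exact criterion of either the original paper or this comment. All function names and parameters are hypothetical. Note that the (n+1)×(n+1) KKT matrix is nonsingular for finite γ > 0, which is precisely why finite regularization avoids the singularity problem raised in the abstract.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """Solve the LS-SVM dual KKT system with finite regularization gamma.

    System:  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    For finite gamma > 0 the lower-right block is positive definite,
    so the system has a unique solution.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def prune_smallest_alpha(X, y, gamma=100.0, sigma=1.0, keep=0.8):
    """Iteratively drop the sample with the smallest |alpha_i| and refit.

    Illustrative only: this is the simple magnitude-based criterion,
    not the error-minimization criterion of the commented paper.
    """
    idx = np.arange(len(y))
    target = max(2, int(keep * len(y)))
    while len(idx) > target:
        alpha, _ = lssvm_fit(X[idx], y[idx], gamma, sigma)
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
    return idx

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(40, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
    alpha, b = lssvm_fit(X, y, gamma=100.0)
    pred = rbf_kernel(X, X) @ alpha + b
    kept = prune_smallest_alpha(X, y, keep=0.8)
    print(len(kept), float(np.abs(pred - y).max()))
```

Refitting after every single removal, as above, is expensive; the efficiency point made in the abstract concerns cheaper incremental updates of the inverse when one training example is removed.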
Keywords :
least squares approximations; support vector machines; error minimization; least squares support vector machines; training examples; Computational complexity; Cost function; Councils; Equations; Kernel; Least squares methods; Optimization methods; Quadratic programming; Sparse matrices; Support vector machines; Least squares kernel methods; online updating; pruning; regularization; Algorithms; Computer Simulation; Feedback; Information Storage and Retrieval; Least-Squares Analysis; Models, Theoretical; Neural Networks (Computer); Pattern Recognition, Automated;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2007.891590