Title :
Regularized Least Squares Potential SVRs
Author :
Jayadeva ; Deb, Alok Kanti ; Khemchandani, Reshma ; Chandra, Suresh
Author_Institution :
Dept. of Electr. Eng., Indian Inst. of Technol., New Delhi
Abstract :
In a recent publication, it was highlighted that the margin in support vector machines (SVMs) is not scale invariant; an appropriate scaling can therefore affect the generalization performance of an SVM-based regressor. Potential SVMs address this issue and suggest a new approach to regression. In this paper, we propose a regularized least squares approach to potential SVRs. The proposed solution involves inverting a single matrix of small dimension; in the case of linear SVRs, the size of this matrix is independent of the number of data samples. Results on benchmark data sets demonstrate the computational advantages of the proposal.
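As an illustration of the abstract's key computational claim, the sketch below solves a regularized linear least squares regression in closed form by inverting a single (d+1) x (d+1) system, where d is the feature dimension, so the inverted matrix's size is independent of the number of samples. This is a minimal ridge-style sketch for the linear case only; the function name `rls_linear_fit` and the plain L2 regularizer are assumptions, not the paper's exact potential-SVR formulation.

```python
import numpy as np

def rls_linear_fit(X, y, lam=1e-2):
    """Fit weights w and bias b minimizing ||Xw + b - y||^2 + lam*||w||^2.

    Only a (d+1) x (d+1) matrix is inverted, where d is the number of
    features -- its size does not grow with the sample count n.
    (Hypothetical sketch; not the paper's exact potential-SVR solver.)
    """
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])   # augment with a bias column
    reg = lam * np.eye(d + 1)
    reg[-1, -1] = 0.0                      # leave the bias unregularized
    # Normal equations: (Xa^T Xa + reg) theta = Xa^T y
    theta = np.linalg.solve(Xa.T @ Xa + reg, Xa.T @ y)
    return theta[:-1], theta[-1]           # weights, bias
```

With a small lam, the fit recovers the generating line on clean linear data, e.g. `w, b = rls_linear_fit(X, X @ w_true + 3.0, lam=1e-6)`.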
Keywords :
learning (artificial intelligence); least squares approximations; matrix inversion; regression analysis; support vector machines; SVM-based regressor; benchmark data sets; potential SVR; regularized least squares approach; function approximation; kernel methods; Lagrangian functions; machine learning; quadratic programming; pattern classification; support vector machine classification;
Conference_Titel :
2006 Annual IEEE India Conference
Conference_Location :
New Delhi
Print_ISBN :
1-4244-0369-3
Electronic_ISBN :
1-4244-0370-7
DOI :
10.1109/INDCON.2006.302859