Title :
Regularized Least Squares Twin SVR for the Simultaneous Learning of a Function and its Derivative
Author :
Jayadeva ; Khemchandani, Reshma ; Chandra, Suresh
Author_Institution :
Indian Institute of Technology, New Delhi
Abstract :
In a recent publication, Lázaro et al. addressed the problem of simultaneously approximating a function and its derivative using support vector machines. In this paper, we propose a new approach, termed regularized least squares twin support vector regression, for the simultaneous learning of a function and its derivatives. The regressor is obtained by solving two related support vector machine-type problems, each of which is smaller than the single problem that arises in Lázaro's approach. The proposed algorithm is simple and fast, as no quadratic programming problem needs to be solved; effectively, only the solution of a pair of linear systems of equations is required.
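Illustrative_Sketch :
The following is a minimal sketch, not the authors' twin formulation: it only illustrates, under assumed names (rbf_kernel, fit_ls_svr, gamma_reg), how a regularized least squares SVR-style regressor can be obtained by solving a single linear system rather than a quadratic program, which is the computational point made in the abstract. The twin approach would instead set up and solve two such smaller systems, one per bound function.

# Illustrative sketch only: a regularized least-squares SVR-style regressor
# fitted by solving one linear system (in the spirit of LS-SVM regression).
# This is NOT the paper's exact twin formulation; names and parameters
# (rbf_kernel, fit_ls_svr, gamma_reg, sigma) are assumptions for the example.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def fit_ls_svr(X, y, gamma_reg=10.0, sigma=1.0):
    # Fit f(x) = sum_i alpha_i k(x, x_i) + b by solving the linear system
    #   [K + I/gamma  1] [alpha]   [y]
    #   [   1^T       0] [  b  ] = [0]
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = K + np.eye(n) / gamma_reg   # regularization term on the diagonal
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    rhs = np.concatenate([y, [0.0]])
    sol = np.linalg.solve(A, rhs)           # one linear solve, no quadratic program
    return sol[:n], sol[n]                  # alpha (dual weights), b (bias)

def predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(-3.0, 3.0, 60)[:, None]
    y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)
    alpha, b = fit_ls_svr(X, y)
    print(predict(X, alpha, b, np.array([[0.0]])))  # should be close to sinc(0) = 1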
Keywords :
function approximation; simultaneous learning of a function and its derivatives; learning (artificial intelligence); least squares approximations; least squares methods; quadratic programming; regression analysis; regularized least squares twin support vector regression; support vector machines; support vector regression; support vector machine classification; linear systems; linear equations; pattern classification; statistical learning
Conference_Titel :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
DOI :
10.1109/IJCNN.2006.246826