Title :
Incremental / decremental SVM for function approximation
Author :
Galmeanu, H. ; Andonie, R.
Author_Institution :
Electron. & Comput. Dept., Transilvania Univ. of Brasov, Brasov
Abstract :
Training a support vector regression (SVR) machine amounts to migrating vectors in and out of the support set while modifying the associated thresholds. This paper gives a complete overview of the boundary conditions implied by vector migration during this process. Training is similar to that of an SVM classifier, although adding or removing a vector from the solution does not coincide with increasing or decreasing its associated threshold. The analysis details the incremental and decremental procedures used to train the SVR. Vectors with duplicate contributions are also considered. Particular attention is given to the migration of vectors among sets when the regularization parameter C is decreased. Finally, experimental data show that this parameter can be varied over a wide range, from complete training (overfitting) to a calibrated value, to tune the approximation performance of the regression.
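The sketch below is not the paper's incremental/decremental algorithm; it is only a minimal illustration, using scikit-learn's standard SVR with full retraining for each value, of the effect the abstract describes: sweeping the regularization parameter C from an overfitting regime to a calibrated value and observing the change in support-set size and approximation error. The target function, noise level, and C values are illustrative assumptions.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, size=(80, 1)), axis=0)
y = np.sinc(X).ravel() + rng.normal(scale=0.05, size=80)   # noisy samples of a smooth target

X_test = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y_test = np.sinc(X_test).ravel()

# Large C lets the regressor fit the training noise (overfitting);
# a smaller, calibrated C yields a smoother approximation.
# (The paper reaches the same sweep incrementally/decrementally,
# updating the solution instead of retraining from scratch.)
for C in (1000.0, 10.0, 1.0, 0.1):
    model = SVR(kernel="rbf", C=C, epsilon=0.01, gamma=1.0).fit(X, y)
    mse = np.mean((model.predict(X_test) - y_test) ** 2)
    print(f"C={C:7.1f}  support vectors={len(model.support_):3d}  test MSE={mse:.5f}")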
Keywords :
function approximation; learning (artificial intelligence); regression analysis; support vector machines; vectors; associated threshold; boundary conditions; decremental SVM; function approximation; incremental SVM; support vector regression; vector migration; Boundary conditions; Computer science; Equations; Function approximation; Lagrangian functions; Large-scale systems; Machine learning; Resumes; Support vector machines; Writing; SVR; incremental learning; regularization parameter; vector migration;
Conference_Titel :
Optimization of Electrical and Electronic Equipment, 2008. OPTIM 2008. 11th International Conference on
Conference_Location :
Brasov
Print_ISBN :
978-1-4244-1544-1
Electronic_ISBN :
978-1-4244-1545-8
DOI :
10.1109/OPTIM.2008.4602473