Title :
Momentum Sequential Minimal Optimization: An accelerated method for Support Vector Machine training
Author :
Barbero, Álvaro ; Dorronsoro, José R.
Author_Institution :
Dept. of Computer Engineering, Universidad Autónoma de Madrid, Madrid, Spain
Date :
July 31 - Aug. 5, 2011
Abstract :
Sequential Minimal Optimization (SMO) can be regarded as the state-of-the-art approach to non-linear Support Vector Machine training, being the method of choice in the successful LIBSVM software. Its optimization procedure updates only a pair of the problem coefficients per iteration, until convergence. In this paper we observe that this strategy can be interpreted as finding, at each iteration, the sparsest yet most useful updating direction. We present a modification of SMO that adds a new approximate momentum term to the updating direction, capturing information from previous updates, and show that this term embodies a trade-off between the sparsity and the suitability of the chosen direction. We show how this technique yields substantial savings in SMO's number of iterations to convergence in practice, without noticeably increasing its cost per iteration. We study when this saving in iterations can result in reduced SVM training times, and the behavior of the new technique when combined with caching and shrinking strategies.
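To illustrate the idea, the following is a minimal Python sketch of an SMO-style solver with a momentum-like acceleration: after each standard two-coefficient update, one extra exact line search is taken along the previous iterate difference. Since every SMO step satisfies y^T d = 0, that accumulated direction remains feasible with respect to the equality constraint, and only the box constraints need clipping. This stand-in momentum step, the `smo_momentum` function, and the toy data are illustrative assumptions for exposition, not the paper's exact approximate momentum term.

```python
import numpy as np

def _max_step(alpha, d, C):
    """Largest t >= 0 keeping 0 <= alpha + t*d <= C (box feasibility)."""
    t = np.inf
    pos, neg = d > 1e-12, d < -1e-12
    if pos.any():
        t = min(t, ((C - alpha[pos]) / d[pos]).min())
    if neg.any():
        t = min(t, (-alpha[neg] / d[neg]).min())
    return max(t, 0.0)

def smo_momentum(K, y, C=1.0, tol=1e-5, max_iter=20000, momentum=True):
    """Dual SVM:  min 0.5 a^T Q a - e^T a,  0 <= a <= C,  y^T a = 0,
    with Q = (y y^T) * K.  Plain SMO pair updates, plus an optional extra
    line-search step along the previous iterate difference (a simplified
    stand-in for the paper's approximate momentum term)."""
    n = len(y)
    Q = np.outer(y, y) * K
    alpha, grad = np.zeros(n), -np.ones(n)   # grad = Q a - e at a = 0
    prev_dir = np.zeros(n)
    for it in range(max_iter):
        a_old = alpha.copy()
        # Working-set selection: maximal-violating pair (as in LIBSVM).
        up = ((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0))
        lo = ((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0))
        score = -y * grad
        i = int(np.argmax(np.where(up, score, -np.inf)))
        j = int(np.argmin(np.where(lo, score, np.inf)))
        if score[i] - score[j] < tol:
            return alpha, it                 # KKT conditions met
        # Sparse feasible direction: only alpha_i and alpha_j move.
        d = np.zeros(n)
        d[i], d[j] = y[i], -y[j]
        dQd = Q[i, i] + Q[j, j] - 2 * y[i] * y[j] * Q[i, j]
        t = (score[i] - score[j]) / max(dQd, 1e-12)
        t = min(t, _max_step(alpha, d, C))
        alpha += t * d
        grad += t * (y[i] * Q[:, i] - y[j] * Q[:, j])
        # Momentum-style extra step along the previous direction, which
        # still satisfies y^T d = 0 by construction.
        if momentum and prev_dir.any():
            gd = grad @ prev_dir
            if gd < 0:                       # only if it is a descent direction
                s = -gd / max(prev_dir @ Q @ prev_dir, 1e-12)
                s = min(s, _max_step(alpha, prev_dir, C))
                alpha += s * prev_dir
                grad += s * (Q @ prev_dir)
        prev_dir = alpha - a_old
    return alpha, max_iter

# Toy usage: linear kernel on a small separable problem.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
alpha, iters = smo_momentum(X @ X.T, y, C=1.0)
print(f"converged in {iters} iterations, support vectors: {(alpha > 1e-8).sum()}")
```

Because the momentum direction is dense, the extra step trades the sparsity of the plain SMO update for a potentially better-suited direction, which is precisely the trade-off the abstract describes.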
Keywords :
approximation theory; iterative methods; learning (artificial intelligence); optimisation; support vector machines; LIBSVM software; SMO modification; accelerated method; approximate momentum term; iteration method; momentum sequential minimal optimization; nonlinear support vector machine training; Convergence; Equations; Kernel; Optimization; Signal processing algorithms; Support vector machines; Training;
Conference_Title :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033245