DocumentCode :
1709173
Title :
Sublinear Optimization for Machine Learning
Author :
Clarkson, Kenneth L. ; Hazan, Elad ; Woodruff, David P.
Author_Institution :
IBM Almaden Res. Center, San Jose, CA, USA
fYear :
2010
Firstpage :
449
Lastpage :
457
Abstract :
We give sublinear-time approximation algorithms for some optimization problems arising in machine learning, such as training linear classifiers and finding minimum enclosing balls. Our algorithms can be extended to some kernelized versions of these problems, such as SVDD, hard-margin SVM, and L2-SVM, for which sublinear-time algorithms were not known before. These new algorithms use a combination of novel sampling techniques and a new multiplicative update algorithm. We give lower bounds which show the running times of many of our algorithms to be nearly best possible in the unit-cost RAM model. We also give implementations of our algorithms in the semi-streaming setting, obtaining the first low-pass, polylogarithmic-space, sublinear-time algorithms achieving an arbitrary approximation factor.
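The abstract mentions a multiplicative update algorithm as one ingredient. As a minimal, hedged sketch of the generic multiplicative-weights update rule (not the paper's specific sublinear variant, which interleaves it with sampling), the following illustrates the basic idea: weights are scaled down exponentially in proportion to observed losses, so mass concentrates on low-loss coordinates. The function name and parameters here are illustrative, not from the paper.

```python
import math

def multiplicative_weights(losses, eta=0.5):
    """Generic multiplicative-weights update (illustrative sketch).

    losses: a sequence of loss vectors, one per round, each of length n.
    eta:    learning rate controlling how aggressively weights shrink.
    Returns the final weight vector, normalized to a distribution.
    """
    n = len(losses[0])
    w = [1.0] * n  # start with uniform weights
    for loss in losses:
        # Multiplicatively penalize each coordinate by its loss this round.
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, loss)]
    total = sum(w)
    return [wi / total for wi in w]

# After several rounds where coordinate 0 incurs loss and coordinate 1
# does not, the distribution concentrates on coordinate 1.
dist = multiplicative_weights([[1.0, 0.0]] * 5, eta=0.5)
```

The paper's contribution is running such updates in sublinear time by sampling entries rather than reading whole loss vectors; the sketch above reads every entry and is linear-time per round.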
Keywords :
computational complexity; learning (artificial intelligence); optimisation; pattern classification; polynomial approximation; support vector machines; SVM; arbitrary approximation factor; linear classifier; machine learning; multiplicative update algorithm; polylogarithmic space; sampling technique; sublinear optimization; sublinear time approximation; support vector machine; Approximation algorithms; Approximation methods; Classification algorithms; Machine learning algorithms; Optimization; Support vector machines; Vectors; classification; machine learning; optimization; sublinear algorithms;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2010 51st Annual IEEE Symposium on Foundations of Computer Science (FOCS)
Conference_Location :
Las Vegas, NV
ISSN :
0272-5428
Print_ISBN :
978-1-4244-8525-3
Type :
conf
DOI :
10.1109/FOCS.2010.50
Filename :
5671238