Title :
An Efficient Method for Large-Scale ℓ1-Regularized Convex Loss Minimization
Author :
Koh, Kwangmoo ; Kim, Seung-Jean ; Boyd, Stephen
Author_Institution :
Stanford University, Stanford, CA
Date :
Jan. 29 - Feb. 2, 2007
Abstract :
Convex loss minimization with ℓ1 regularization has been proposed as a promising method for feature selection in classification (e.g., ℓ1-regularized logistic regression) and regression (e.g., ℓ1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale ℓ1-regularized convex loss minimization problems; it uses a preconditioned conjugate gradient method to compute each search step. The method scales to very large problems: for example, it can solve an ℓ1-regularized logistic regression problem with a million features and examples (e.g., the 20 Newsgroups data set) in a few minutes on a PC.
Keywords :
conjugate gradient methods; minimisation; regression analysis; conjugate gradient method; convex loss minimization method; interior-point method; large-scale ℓ1-regularization; ℓ1-regularized least squares; ℓ1-regularized logistic regression; Compressed sensing; Gradient methods; Large-scale systems; Least squares methods; Logistics; Minimization methods; Optimization methods; Predictive models; Signal processing; Vectors;
Conference_Title :
Information Theory and Applications Workshop, 2007
Conference_Location :
La Jolla, CA
Print_ISBN :
978-0-615-15314-8
DOI :
10.1109/ITA.2007.4357584