DocumentCode :
254483
Title :
Newton Greedy Pursuit: A Quadratic Approximation Method for Sparsity-Constrained Optimization
Author :
Xiao-Tong Yuan ; Qingshan Liu
Author_Institution :
S-mart Lab., Nanjing Univ. of Inf. Sci. & Technol., Nanjing, China
fYear :
2014
fDate :
23-28 June 2014
Firstpage :
4122
Lastpage :
4129
Abstract :
First-order greedy selection algorithms have been widely applied to sparsity-constrained optimization. The main theme of these methods is to use the function gradient at the previous iterate to update the non-zero entries and their values at the next iterate. In contrast, relatively little effort has been made to study second-order greedy selection methods that additionally utilize Hessian information. Inspired by the classic constrained Newton method, we propose in this paper the NewTon Greedy Pursuit (NTGP) method to approximately minimize a twice-differentiable function under a sparsity constraint. At each iteration, NTGP constructs a second-order Taylor expansion to approximate the cost function and takes as the next iterate the solution of the constructed quadratic model under the sparsity constraint. The parameter estimation error and convergence properties of NTGP are analyzed. The superiority of NTGP over several representative first-order greedy selection methods is demonstrated on synthetic and real sparse logistic regression tasks.
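Illustrative sketch (not from the paper): the abstract describes an outer loop that builds a second-order Taylor model of the loss and approximately minimizes it under a sparsity constraint. The Python sketch below follows that outline for sparse logistic regression, but the inner solver (hard-thresholded gradient steps on the quadratic model) and all function names are assumptions standing in for the paper's actual greedy inner procedure.

```python
# Hypothetical sketch of a Newton-type greedy pursuit iteration for
# sparsity-constrained logistic regression; the inner solver is a
# stand-in, not the paper's exact NTGP subroutine.
import numpy as np

def logistic_loss_grad_hess(w, X, y):
    """Loss, gradient, and Hessian of L(w) = mean(log(1 + exp(-y * Xw))), y in {-1, +1}."""
    z = y * (X @ w)
    p = 1.0 / (1.0 + np.exp(z))            # sigmoid(-z)
    loss = np.mean(np.logaddexp(0.0, -z))
    grad = -(X.T @ (y * p)) / len(y)
    D = p * (1.0 - p)                       # per-sample curvature weights
    hess = (X.T * D) @ X / len(y)
    return loss, grad, hess

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out

def ntgp_sketch(X, y, k, outer_iters=20, inner_iters=50):
    """Outer loop: quadratic model at the current iterate; inner loop:
    approximate sparsity-constrained minimization of that model."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer_iters):
        _, g, H = logistic_loss_grad_hess(w, X, y)
        # Quadratic model around w: q(u) = f(w) + g^T(u - w) + 0.5 (u - w)^T H (u - w).
        eta = 1.0 / (np.linalg.norm(H, 2) + 1e-12)   # step size from the spectral norm
        u = w.copy()
        for _ in range(inner_iters):
            grad_q = g + H @ (u - w)
            u = hard_threshold(u - eta * grad_q, k)  # project onto ||u||_0 <= k
        w = u
    return w
```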
Keywords :
Newton method; approximation theory; greedy algorithms; parameter estimation; regression analysis; Hessian information; NTGP method; Newton greedy pursuit; classic constrained Newton method; constructed quadratic model; convergence property; cost function; differentiable function; first-order greedy selection algorithm; first-order greedy selection method; function gradient; newton greedy pursuit method; nonzero entry; parameter estimation error; quadratic approximation method; real sparse logistic regression task; second-order Taylor expansion; second-order greedy selection method; sparsity constraint; sparsity-constrained optimization; synthetic sparse logistic regression task; Approximation algorithms; Approximation methods; Convergence; Logistics; Newton method; Optimization; Vectors; Greedy Pursuit; Newton Method; Optimization; Sparsity Model;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Computer Vision and Pattern Recognition (CVPR), 2014 IEEE Conference on
Conference_Location :
Columbus, OH
Type :
conf
DOI :
10.1109/CVPR.2014.525
Filename :
6909921