Title :
Efficient and Accelerated Online Learning for Sparse Group Lasso
Author :
Li Zhi-Jie ; Li Yuan-Xiang ; Wang Feng ; Yu Fei ; Xiang Zheng-Long
Author_Institution :
State Key Lab. of Software Eng., Wuhan Univ., Wuhan, China
Abstract :
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, and online learning algorithms for group lasso are a promising tool for attacking large-scale problems. However, the low time complexity of current online algorithms is often accompanied by a slow convergence rate, and a faster convergence rate is a key requirement for online learning algorithms. We develop a novel accelerated online learning algorithm for the sparse group lasso model, which achieves sparsity at both the group level and the individual feature level. By adopting the dual averaging method, the worst-case time complexity and memory cost of each iteration are both O(d), where d is the number of dimensions. Moreover, our online algorithm is accelerated, with a theoretical convergence rate of O(1/T^2) up to the T-th step. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
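Note :
The abstract describes an O(d)-per-iteration online update based on the dual averaging method with a sparse group lasso penalty. The following is a minimal sketch of that kind of update, assuming a standard regularized dual averaging (RDA) step combined with the usual sparse group lasso proximal operator (elementwise soft-thresholding followed by group-wise shrinkage); it is the basic, non-accelerated variant, and all names and parameters (sgl_prox, rda_sparse_group_lasso, lam1, lam2, gamma) are illustrative assumptions, not taken from the paper.

import numpy as np

def sgl_prox(z, groups, t1, t2):
    """Proximal operator of t1*||w||_1 + t2*sum_g ||w_g||_2:
    elementwise soft-thresholding, then group-wise shrinkage."""
    # L1 part: elementwise soft threshold
    s = np.sign(z) * np.maximum(np.abs(z) - t1, 0.0)
    w = np.zeros_like(s)
    for g in groups:                      # g is an index array for one group
        norm_g = np.linalg.norm(s[g])
        if norm_g > t2:                   # group survives; shrink its norm
            w[g] = (1.0 - t2 / norm_g) * s[g]
    return w

def rda_sparse_group_lasso(grad_stream, d, groups,
                           lam1=0.01, lam2=0.01, gamma=1.0, T=1000):
    """Online dual-averaging sketch: each iteration costs O(d) time and memory."""
    g_bar = np.zeros(d)                   # running average of subgradients
    w = np.zeros(d)
    for t, g_t in enumerate(grad_stream, start=1):
        g_bar += (g_t - g_bar) / t        # update the averaged gradient
        eta = np.sqrt(t) / gamma          # RDA scaling of the prox step
        w = sgl_prox(-eta * g_bar, groups, eta * lam1, eta * lam2)
        if t >= T:
            break
    return w

The accelerated O(1/T^2) rate claimed in the abstract would additionally require a Nesterov-style momentum sequence on top of this update, which the sketch omits.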
Keywords :
computational complexity; learning (artificial intelligence); accelerated online learning algorithm; batch-mode group lasso algorithms; large-scale problem; memory cost; online algorithm; online learning algorithms; sparse group lasso model; theoretical convergence rate; time complexity; worst-case time complexity; Acceleration; Algorithm design and analysis; Convergence; Optimization; Software algorithms; Time complexity; Vectors; accelerated convergence; dual averaging method; group lasso; online learning; sparsity;
Conference_Title :
Data Mining Workshop (ICDMW), 2014 IEEE International Conference on
Conference_Location :
Shenzhen
Print_ISBN :
978-1-4799-4275-6
DOI :
10.1109/ICDMW.2014.94