DocumentCode :
2982443
Title :
Healing Truncation Bias: Self-Weighted Truncation Framework for Dual Averaging
Author :
Oiwa, Hidekazu ; Matsushima, Shin ; Nakagawa, Hiroshi
Author_Institution :
Grad. Sch. of Inf. Sci. & Technol., Univ. of Tokyo, Tokyo, Japan
fYear :
2012
fDate :
10-13 Dec. 2012
Firstpage :
575
Lastpage :
584
Abstract :
We propose a new truncation framework for online supervised learning. Learning a compact predictive model in an online setting has recently attracted a great deal of attention. Combining online learning with sparsity-inducing regularization enables faster learning with a smaller memory footprint than a conventional learning framework. However, a naive combination of the two truncates the weights of features that rarely appear, even when those features are crucial for prediction, and it is difficult to emphasize such features in advance while preserving the advantages of online learning. We develop an extended truncation framework for Dual Averaging that retains rarely occurring but informative features. The proposed framework integrates information from all previous subgradients of the loss functions into the regularization term. This enhancement of conventional L1-regularization automatically adjusts each feature's truncation, which enables rare but informative features to be identified and retained without preprocessing. In addition, our framework achieves the same computational complexity and regret bound as standard Dual Averaging. Experiments demonstrated that our framework outperforms other sparse online learning algorithms.
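The abstract's idea — replacing the uniform L1 truncation threshold of Dual Averaging with a per-feature threshold built from each feature's own accumulated subgradients, so that rare features face a smaller threshold and survive — can be sketched as follows. This is an illustrative reconstruction from the abstract only, not the authors' algorithm or code; the hinge loss, the `gamma` step-size constant, and the exact form of the per-feature weight `lam_f` are assumptions.

```python
from collections import defaultdict
import math

def self_weighted_rda(examples, lam=0.1, gamma=1.0):
    """Sketch of an RDA-style update with a self-weighted L1 penalty.

    Standard Dual Averaging with L1 truncates coordinate f when the
    running mean subgradient |g_bar_f| falls below a uniform lam.
    Here the threshold is scaled by the feature's own accumulated
    subgradient mass, so rarely occurring features get a smaller
    threshold and are retained.  (Reconstruction from the abstract,
    not the paper's exact update.)
    """
    g_sum = defaultdict(float)   # running sum of loss subgradients
    g_abs = defaultdict(float)   # accumulated |subgradient| per feature
    w = defaultdict(float)
    t = 0
    for x, y in examples:        # x: {feature: value}, y in {-1, +1}
        t += 1
        margin = y * sum(w[f] * v for f, v in x.items())
        if margin < 1.0:         # hinge-loss subgradient is -y * x
            for f, v in x.items():
                g = -y * v
                g_sum[f] += g
                g_abs[f] += abs(g)
        # Closed-form soft-threshold update with a per-feature L1
        # weight: small accumulated mass -> small truncation threshold.
        w = defaultdict(float)
        for f, s in g_sum.items():
            g_bar = s / t
            lam_f = lam * g_abs[f] / t          # self-weighted threshold
            shrunk = max(abs(g_bar) - lam_f, 0.0)
            w[f] = -(math.sqrt(t) / gamma) * math.copysign(shrunk, g_bar)
    return dict(w)
```

A feature seen only once keeps a near-zero `lam_f`, so its (small) mean subgradient is not wiped out by the threshold, which is the bias the title refers to as "healing".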
Keywords :
computational complexity; learning (artificial intelligence); compact predictive model; conventional L1-regularization; dual averaging; extensional truncation framework; healing truncation bias; informative features; loss functions; online supervised learning; regularization term; self-weighted truncation framework; sparsity-inducing regularization; Educational institutions; Equations; Indexes; Optimization; Prediction algorithms; Predictive models; Vectors; Feature Selection; Online Learning; Sentiment Analysis; Sparsity-inducing Regularization; Supervised Learning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 IEEE 12th International Conference on Data Mining (ICDM)
Conference_Location :
Brussels
ISSN :
1550-4786
Print_ISBN :
978-1-4673-4649-8
Type :
conf
DOI :
10.1109/ICDM.2012.33
Filename :
6413743