DocumentCode :
2210291
Title :
Averaged Stochastic Gradient Descent with Feedback: An Accurate, Robust, and Fast Training Method
Author :
Sun, Xu ; Kashima, Hisashi ; Matsuzaki, Takuya ; Ueda, Naonori
Author_Institution :
Dept. of Math. Inf., Univ. of Tokyo, Tokyo, Japan
fYear :
2010
fDate :
13-17 Dec. 2010
Firstpage :
1067
Lastpage :
1072
Abstract :
On large datasets, stochastic gradient descent (SGD) has become the popular training approach. This paper proposes a modification of SGD, called averaged SGD with feedback (ASF), that significantly improves robustness, accuracy, and training speed over traditional SGD. The proposal is based on three simple ideas: averaging the weight vectors across SGD iterations, feeding the averaged weights back into the SGD update process, and deciding when to perform the feedback (linearly slowing down the feedback). Theoretically, we demonstrate reasonable convergence properties of ASF. Empirically, ASF outperforms several strong baselines in accuracy, robustness to noise, and training speed. To our knowledge, this is the first study of "feedback" in stochastic gradient learning. Although we choose latent conditional models to verify ASF in this paper, ASF is a general-purpose technique just like SGD and can be applied directly to other models.
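The abstract's three ideas can be sketched in a few lines. The following is a minimal illustration, not the paper's exact algorithm: the feedback schedule, constants, and the toy quadratic objective are all assumptions made for demonstration.

```python
import numpy as np

def asf_sgd(grad, w0, lr=0.1, steps=200, k0=4):
    """Sketch of averaged SGD with feedback (ASF): average the weight
    vectors across SGD iterations, periodically feed the average back
    into the SGD iterate, and space the feedbacks out at linearly
    growing intervals ("linearly slowing down the feedback").
    The schedule and constants here are illustrative, not the paper's."""
    w = w0.copy()
    w_avg = w0.copy()
    n_avg = 1
    interval = k0            # gap between feedbacks, grows by k0 each time
    next_feedback = k0       # step at which the next feedback occurs
    for t in range(1, steps + 1):
        w -= lr * grad(w)                  # plain SGD update
        n_avg += 1
        w_avg += (w - w_avg) / n_avg       # running average of iterates
        if t == next_feedback:
            w = w_avg.copy()               # feedback: restart SGD from the average
            interval += k0                 # linearly slowing-down schedule
            next_feedback += interval
    return w_avg

# Toy usage: minimize ||w - 1||^2 with noisy gradient estimates.
rng = np.random.default_rng(0)
grad = lambda w: 2.0 * (w - 1.0) + 0.1 * rng.standard_normal(w.shape)
w = asf_sgd(grad, np.zeros(3))
```

The averaging damps the noise of individual SGD steps, while the feedback keeps the average from lagging too far behind the current iterate; spacing the feedbacks further apart over time lets the average stabilize as training converges.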
Keywords :
classification; data mining; feedback; gradient methods; learning (artificial intelligence); stochastic processes; very large databases; averaged stochastic gradient descent; large datasets; weight vectors;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2010 IEEE 10th International Conference on Data Mining (ICDM)
Conference_Location :
Sydney, NSW
ISSN :
1550-4786
Print_ISBN :
978-1-4244-9131-5
Electronic_ISBN :
1550-4786
Type :
conf
DOI :
10.1109/ICDM.2010.26
Filename :
5694086