Title :
Averaged Stochastic Gradient Descent with Feedback: An Accurate, Robust, and Fast Training Method
Author :
Sun, Xu ; Kashima, Hisashi ; Matsuzaki, Takuya ; Ueda, Naonori
Author_Institution :
Dept. of Math. Inf., Univ. of Tokyo, Tokyo, Japan
Abstract :
On large datasets, stochastic gradient descent (SGD) has been the popular training approach. This paper proposes a modification of SGD, called averaged SGD with feedback (ASF), that significantly improves performance (robustness, accuracy, and training speed) over traditional SGD. The proposal is based on three simple ideas: averaging the weight vectors across SGD iterations, feeding the averaged weights back into the SGD update process, and deciding when to perform the feedback (linearly slowing down the feedback). Theoretically, we demonstrate reasonable convergence properties of the ASF. Empirically, the ASF outperforms several strong baselines in terms of accuracy, robustness to noise, and training speed. To our knowledge, this is the first study of "feedback" in stochastic gradient learning. Although we choose latent conditional models for verifying the ASF in this paper, the ASF is a general-purpose technique just like SGD, and can be directly applied to other models.
Keywords :
classification; data mining; feedback; gradient methods; learning (artificial intelligence); stochastic processes; very large databases; averaged stochastic gradient descent; large datasets; weight vectors
Conference_Title :
Data Mining (ICDM), 2010 IEEE 10th International Conference on
Conference_Location :
Sydney, NSW
Print_ISBN :
978-1-4244-9131-5
Electronic_ISSN :
1550-4786
DOI :
10.1109/ICDM.2010.26