Title :
Improving the learning rate of back-propagation with the gradient reuse algorithm
Author :
Hush, D.R. ; Salas, J.M.
Author_Institution :
Dept. of Electr. Eng. & Comput. Eng., New Mexico Univ., Albuquerque, NM, USA
Abstract :
A simple method for improving the learning rate of the backpropagation algorithm is described and analyzed. The method is referred to as the gradient reuse algorithm (GRA). The basic idea is that gradients which are computed using backpropagation are reused several times until the resulting weight updates no longer lead to a reduction in error. It is shown that convergence speedup is a function of the reuse rate, and that the reuse rate can be controlled by using a dynamic convergence parameter.
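The reuse idea described in the abstract can be sketched as follows. This is a minimal illustration on a toy quadratic error surface, not the authors' neural-network implementation: the gradient (which would come from backpropagation in a network) is computed once and reused for repeated weight updates until an update no longer reduces the error, at which point a fresh gradient is computed. The function names, the toy objective, and the fixed convergence parameter `lr` are all assumptions for illustration; the paper's dynamic convergence parameter is not modeled here.

```python
import numpy as np

def error(w):
    # Toy quadratic error surface standing in for the network error.
    return 0.5 * np.sum(w ** 2)

def gradient(w):
    # Analytic gradient of the toy error
    # (backpropagation would supply this in a network).
    return w

def gra(w, lr=0.1, max_iters=100):
    """Sketch of the gradient reuse idea: reuse each computed
    gradient until it stops reducing the error."""
    grads_computed = 0
    updates = 0
    for _ in range(max_iters):
        g = gradient(w)
        grads_computed += 1
        # Reuse this gradient for as many updates as it keeps helping.
        while updates < max_iters:
            w_new = w - lr * g
            if error(w_new) < error(w):
                w = w_new
                updates += 1
            else:
                break  # reuse exhausted; recompute the gradient
        if error(w) < 1e-12:
            break
    return w, updates, grads_computed

w, n_updates, n_grads = gra(np.array([1.0, -2.0]))
```

On this toy problem a single gradient supports many updates, so far fewer gradient (i.e., backpropagation) passes are needed than weight updates, which is the source of the speedup the abstract refers to.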
Keywords :
artificial intelligence; learning systems; neural nets; backpropagation; dynamic convergence parameter; gradient reuse algorithm; learning rate; reuse rate;
Conference_Titel :
IEEE International Conference on Neural Networks, 1988
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/ICNN.1988.23877