Title :
Stable dynamic backpropagation learning in recurrent neural networks
Author :
Jin, Liang ; Gupta, Madan M.
Author_Institution :
Microelectron. Group, Lucent Technol. Inc., Allentown, PA, USA
Date :
11/1/1999 12:00:00 AM
Abstract :
To avoid unstable phenomena during the learning process, two new learning schemes, called the multiplier and constrained learning rate algorithms, are proposed in this paper to provide stable adaptive updating processes for both the synaptic and somatic parameters of the network. In the multiplier method, explicit stability conditions are introduced into the iterative error index, so that the new updating formulations contain a set of inequality constraints. In the constrained learning rate algorithm, the learning rate is updated at each iteration by an equation derived from the stability conditions. With these stable dynamic backpropagation algorithms, any analog target pattern may be implemented by a steady output vector, which is a nonlinear vector function of the stable equilibrium point. The applicability of the proposed approaches is illustrated through both analog and binary pattern storage examples.
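The constrained learning rate idea described in the abstract — capping the step size each iteration so that a stability condition always holds — can be illustrated with a minimal sketch. The sketch below is not the paper's algorithm: it substitutes the classical gradient-descent stability bound eta < 2/L for an L-smooth quadratic in place of the paper's Lyapunov-based conditions for recurrent networks, and all names (`constrained_lr_descent`, `eta0`, `lipschitz`) are hypothetical.

```python
import numpy as np

def constrained_lr_descent(grad, w0, eta0, lipschitz, steps=100):
    """Gradient descent whose learning rate is re-checked each iteration
    against a stability bound (here the classical eta < 2/L condition for
    an L-smooth objective, standing in for the paper's stability conditions).
    """
    w = np.asarray(w0, dtype=float)
    eta_max = 2.0 / lipschitz  # stability boundary for this simple setting
    for _ in range(steps):
        # constrain the requested rate to the stable region (with margin)
        eta = min(eta0, 0.9 * eta_max)
        w = w - eta * grad(w)
    return w

# quadratic f(w) = 0.5 * 4 * w**2, so grad(w) = 4*w and L = 4;
# the requested eta0 = 1.0 would diverge, but the constrained rate converges
w_star = constrained_lr_descent(lambda w: 4.0 * w, [1.0], eta0=1.0, lipschitz=4.0)
```

Without the cap, the step factor would be 1 - 1.0*4 = -3 and the iterates would diverge; with the cap the factor is -0.8 and the iterates contract toward the equilibrium, which is the qualitative behavior the constrained learning rate scheme is designed to guarantee.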
Keywords :
Lyapunov methods; backpropagation; iterative methods; recurrent neural nets; stability; Lyapunov stability; adaptive algorithm; constrained learning rate algorithms; dynamic backpropagation; iterative error index; learning; multiplier algorithms; nonlinear vector function; recurrent neural networks; stability conditions; Associative memory; Backpropagation algorithms; Equations; Heuristic algorithms; Intelligent networks; Iterative algorithms; Iterative methods; Neural networks; Recurrent neural networks; Stability;
Journal_Title :
Neural Networks, IEEE Transactions on