Title :
NLq theory: checking and imposing stability of recurrent neural networks for nonlinear modeling
Author :
Suykens, Johan A.K. ; Vandewalle, Joos ; De Moor, B.L.R.
Author_Institution :
ESAT-SISTA, Katholieke Univ., Leuven, Belgium
Date :
11/1/1997
Abstract :
It is known that many discrete-time recurrent neural networks, such as neural state space models, multilayer Hopfield networks, and locally recurrent globally feedforward neural networks, can be represented as NLq systems. Sufficient conditions for global asymptotic stability and input/output stability of NLq systems are available, in the form of three types of criteria: (1) diagonal scaling; (2) criteria depending on diagonal dominance; and (3) condition number factors of certain matrices. The paper discusses how Narendra's (1990, 1991) dynamic backpropagation procedure, used for identifying recurrent neural networks from I/O measurements, can be modified with an NLq stability constraint in order to ensure globally asymptotically stable identified models. An example illustrates how system identification of an internally stable model corrupted by process noise may lead to unwanted limit cycle behavior, and how this problem can be avoided by adding the stability constraint.
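The diagonal-scaling criterion mentioned in the abstract can be illustrated numerically. The sketch below is not the paper's algorithm: it assumes a single-layer recurrent map x_{k+1} = tanh(V x_k), where tanh is a sector-[0,1] nonlinearity, and tests the sufficient condition that some diagonal scaling matrix D gives ||D V D^{-1}||_2 < 1 (here simply D = I), under which the state decays to the origin from any initial condition.

```python
import numpy as np

# Illustrative weight matrix (hypothetical values, chosen so that the
# diagonal-scaling condition already holds with D = I).
V = np.array([[0.5, 0.2, 0.0],
              [0.0, 0.4, 0.1],
              [0.1, 0.0, 0.3]])

# Sufficient condition for global asymptotic stability (D = I case):
# spectral norm of V strictly below 1.
norm = np.linalg.norm(V, 2)
print(f"||V||_2 = {norm:.3f}  (< 1 implies global asymptotic stability)")

# Simulate the recurrent map x_{k+1} = tanh(V x_k); since tanh is
# 1-Lipschitz, ||x_k|| <= ||V||_2**k * ||x_0||, so the state contracts.
x = np.array([2.0, -1.0, 0.5])
for _ in range(200):
    x = np.tanh(V @ x)
print("||x_200|| =", np.linalg.norm(x))
```

With a general diagonal D one would instead search for a D minimizing ||D V np.linalg.inv(D)||_2, which is the diagonal-scaling relaxation referred to in the abstract.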
Keywords :
Hopfield neural nets; asymptotic stability; backpropagation; discrete time systems; feedforward neural nets; input-output stability; multilayer perceptrons; noise; nonlinear systems; state-space methods; I/O measurements; NLq stability constraint; NLq systems; NLq theory; condition number factors; diagonal dominance; diagonal scaling; discrete-time recurrent neural networks; dynamic backpropagation; global asymptotic stability; input/output stability; internally stable model; limit cycle behavior; locally recurrent globally feedforward neural networks; matrices; multilayer Hopfield networks; neural state space models; nonlinear modeling; process noise; sufficient conditions; system identification; Asymptotic stability; Backpropagation; Feedforward neural networks; Hopfield neural networks; Multi-layer neural network; Neural networks; Recurrent neural networks; Stability criteria; State-space methods; Sufficient conditions;
Journal_Title :
Signal Processing, IEEE Transactions on