DocumentCode
2863497
Title
Global asymptotic stability for RNNs with a bipolar activation function
Author
Krcmar, Igor R.; Bozic, Milorad M.; Mandic, Danilo P.
Author_Institution
Fac. of Electr. Eng., Banjaluka Univ., Bosnia-Herzegovina
fYear
2000
fDate
2000
Firstpage
33
Lastpage
36
Abstract
Conditions for the global asymptotic stability of a nonlinear relaxation process realized by a recurrent neural network with a hyperbolic tangent activation function are provided. The analysis is based upon the contraction mapping theorem and the corresponding fixed point iteration. The derived results find application in the wide area of neural networks for optimization and signal processing.
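The core idea in the abstract can be illustrated numerically: because the hyperbolic tangent has slope bounded by one, the relaxation y(k+1) = tanh(w·y(k) + b) is a contraction whenever |w| < 1, and the contraction mapping (Banach fixed point) theorem then guarantees convergence to a unique equilibrium from any initial state. The sketch below is a minimal illustration under that assumption, using a single recurrent neuron with arbitrarily chosen weight and bias; it is not the paper's exact model or derivation.

```python
# Illustrative sketch (not the paper's model): fixed point iteration for a
# single recurrent neuron y_{k+1} = tanh(w*y_k + b). Since |tanh'(x)| <= 1,
# the map is a contraction when |w| < 1, so the relaxation converges to the
# same fixed point from any initial state, consistent with global asymptotic
# stability. Weight and bias values are arbitrary assumptions for the demo.
import numpy as np

def relax(w, b, y0, tol=1e-12, max_iter=10_000):
    """Iterate y <- tanh(w*y + b) until successive states differ by < tol."""
    y = y0
    for k in range(max_iter):
        y_next = np.tanh(w * y + b)
        if abs(y_next - y) < tol:
            return y_next, k + 1
        y = y_next
    return y, max_iter

if __name__ == "__main__":
    w, b = 0.8, 0.3                  # |w| < 1  =>  contraction mapping
    for y0 in (-5.0, 0.0, 5.0):      # widely separated initial states
        y_star, iters = relax(w, b, y0)
        print(f"y0 = {y0:+.1f} -> fixed point {y_star:.10f} in {iters} iterations")
```

Running the sketch shows all three trajectories settling on the same fixed point, which is the qualitative behaviour the stability conditions in the paper are meant to guarantee.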
Keywords
asymptotic stability; recurrent neural nets; transfer functions; bipolar activation function; contraction mapping theorem; fixed point iteration; global asymptotic stability; hyperbolic tangent activation function; nonlinear relaxation process; optimization; signal processing; Contracts; Convergence; Neural networks; Neurofeedback; Neurons; Recurrent neural networks; Signal design; Signal processing; Stability; State-space methods;
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 5th Seminar on Neural Network Applications in Electrical Engineering (NEUREL 2000)
Conference_Location
Belgrade
Print_ISBN
0-7803-5512-1
Type
conf
DOI
10.1109/NEUREL.2000.902379
Filename
902379