DocumentCode
324536
Title
Behavior stabilization of complex-valued recurrent neural networks using relative-minimization learning
Author
Hirose, Akira; Onishi, Hirofumi
Author_Institution
Res. Center for Adv. Sci. & Technol., Tokyo Univ., Japan
Volume
2
fYear
1998
fDate
4-9 May 1998
Firstpage
1078
Abstract
Relative-minimization learning using additional random teacher signals is proposed for recurrent-behavior stabilization. Although recurrent neural networks can deal with time-sequential data, they tend to show unstable behavior (a positive Lyapunov exponent). The proposed method superimposes a type of basin upon a dynamics-determining hypersurface in an information vector field. This process is equivalent to the relative minimization of the error function in the input-signal partial space. Experiments demonstrate that relative-minimization learning suppresses positive Lyapunov exponents to zero or negative values, resulting in successful behavior stabilization.
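A minimal sketch of the idea described in the abstract, not the authors' implementation: a complex-valued RNN is trained on a target sequence while extra training pairs with randomly perturbed (random-teacher) inputs mapped to the same desired output are mixed in, approximating error minimization in the input-signal subspace around the trajectory; the largest Lyapunov exponent is then estimated from the divergence of nearby free-running trajectories. The network size, learning rule, and activation below are illustrative assumptions.

```python
# Sketch only: stabilizing a complex-valued RNN with additional random
# teacher signals, then estimating the largest Lyapunov exponent.
import numpy as np

rng = np.random.default_rng(0)
N, eta = 8, 0.05  # assumed network size and learning rate
W = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(N)

def act(u):
    # Amplitude-phase activation common in complex-valued networks:
    # squash the amplitude, keep the phase.
    return np.tanh(np.abs(u)) * np.exp(1j * np.angle(u))

def step(W, x):
    return act(W @ x)

# Target behavior: a simple rotating complex sequence (limit-cycle-like).
T = 40
teacher = [np.exp(2j * np.pi * t / T) * np.ones(N) / np.sqrt(N)
           for t in range(T + 1)]

for epoch in range(2000):
    for t in range(T):
        x_in, x_out = teacher[t], teacher[t + 1]
        # Additional random teacher pair: a perturbed input mapped to the
        # same desired output, carving a "basin" around the trajectory.
        noise = 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        for xi in (x_in, x_in + noise):
            err = step(W, xi) - x_out
            # Illustrative complex LMS-style update (not the paper's rule).
            W -= eta * np.outer(err, np.conj(xi))

# Largest-Lyapunov-exponent estimate from two nearby free-running runs.
eps, steps, lyap = 1e-8, 400, 0.0
x, xp = teacher[0].copy(), teacher[0] + eps
for _ in range(steps):
    x, xp = step(W, x), step(W, xp)
    d = max(np.linalg.norm(xp - x), 1e-16)  # guard against exact collapse
    lyap += np.log(d / eps)
    xp = x + (xp - x) * (eps / d)  # renormalize the separation
print("largest Lyapunov exponent estimate:", lyap / steps)
```

A non-positive printed value would correspond to the stabilized behavior the abstract reports; the exact update rule and the Lyapunov-exponent estimator used in the paper are not specified here.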
Keywords
Lyapunov methods; learning (artificial intelligence); minimisation; recurrent neural nets; stability; Lyapunov exponents; additional random teacher signals; behavior stabilization; complex-valued recurrent neural networks; dynamics-determining hypersurface; information vector field; input-signal partial space; positive Lyapunov exponent; recurrent-behavior stabilization; relative error function minimization; relative-minimization learning; time-sequential data; unstable behavior; Adaptive control; Adaptive filters; Filtering; Neural networks; Neurons; Programmable control; Recurrent neural networks; Signal generators; Signal processing; Statistics
fLanguage
English
Publisher
ieee
Conference_Titel
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location
Anchorage, AK
ISSN
1098-7576
Print_ISBN
0-7803-4859-1
Type
conf
DOI
10.1109/IJCNN.1998.685922
Filename
685922