Title :
Convergence analysis of discrete time recurrent neural networks for linear variational inequality problem
Author :
Tang, H.J. ; Tan, K.C. ; Zhang, Y.
Author_Institution :
Dept. of Electr. & Comput. Eng., Nat. Univ. of Singapore, Singapore
Date :
2002
Abstract :
We study the convergence of a class of discrete-time recurrent neural networks for solving the linear variational inequality problem (LVIP), which has important applications in engineering and economics. The network's exponential convergence is proved when the matrix is positive definite, and its global convergence is proved when the matrix is positive semidefinite. Conditions are derived to guarantee convergence of the network, and comprehensive examples are discussed and simulated to illustrate the results.
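Note :
The record does not reproduce the paper's network model. As a minimal sketch only, the code below implements a generic discrete-time projection-type iteration x_{k+1} = P_Omega(x_k - alpha*(M x_k + q)), a standard scheme for LVIPs over a box constraint Omega; the matrix M, vector q, bounds, and step size alpha are hypothetical placeholders and not taken from the paper.

import numpy as np

def solve_lvip(M, q, lo, hi, alpha=0.1, tol=1e-8, max_iter=10000):
    """Generic discrete-time projection iteration for the LVIP:
    find x* in Omega = [lo, hi] such that (x - x*)^T (M x* + q) >= 0 for all x in Omega.
    """
    x = np.clip(np.zeros_like(q), lo, hi)                   # start from the projected origin
    for _ in range(max_iter):
        x_next = np.clip(x - alpha * (M @ x + q), lo, hi)   # projected gradient-like update
        if np.linalg.norm(x_next - x) < tol:                 # approximate fixed point reached
            return x_next
        x = x_next
    return x

# Hypothetical positive definite example (illustration only, not from the paper)
M = np.array([[3.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -2.0])
lo, hi = np.array([0.0, 0.0]), np.array([5.0, 5.0])
print(solve_lvip(M, q, lo, hi))

For a positive definite M and a sufficiently small step size alpha, this iteration converges to the unique solution; the paper's contribution is a convergence analysis of this kind for discrete-time recurrent networks, covering the positive semidefinite case as well.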
Keywords :
convergence of numerical methods; eigenvalues and eigenfunctions; matrix algebra; optimisation; recurrent neural nets; convergence; discrete time neural networks; eigenvalues; linear variational inequality; optimization; positive definite matrix; positive semidefinite matrix; recurrent neural networks; Application software; Computer simulation; Continuous time systems; Convergence; Equations; Linear matrix inequalities; Neural networks; Recurrent neural networks; Symmetric matrices; Vectors;
Conference_Title :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1007530