• DocumentCode
    1679294
  • Title
    Convergence analysis of discrete time recurrent neural networks for linear variational inequality problem
  • Author
    Tang, H.J.; Tan, K.C.; Zhang, Y.

  • Author_Institution
    Dept. of Electr. & Comput. Eng., Nat. Univ. of Singapore, Singapore
  • Volume
    3
  • fYear
    2002
  • Firstpage
    2470
  • Lastpage
    2475
  • Abstract
    We study the convergence of a class of discrete-time recurrent neural networks for solving the linear variational inequality problem (LVIP), which has important applications in engineering and economics. The network's exponential convergence is proved for the case of a positive definite matrix, and its global convergence is proved for a positive semidefinite matrix. Conditions that guarantee the convergence of the network are derived. Comprehensive examples are discussed and simulated to illustrate the results.
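
    For context, the LVIP is to find a point u* in a convex set Ω such that (u − u*)ᵀ(Mu* + q) ≥ 0 for all u in Ω. The sketch below is a generic discrete-time projection-type recurrence for this problem, assuming a box-shaped Ω for illustration; it is not the specific network, step size, or convergence conditions analyzed in the paper, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def solve_lvip(M, q, lower, upper, alpha=0.05, tol=1e-8, max_iter=10000):
    """Generic projection-type iteration for the LVIP: find u* in the box
    Omega = {u : lower <= u <= upper} with (u - u*)^T (M u* + q) >= 0 for all
    u in Omega. Illustrative sketch only, not the paper's specific network."""
    u = np.clip(np.zeros_like(q), lower, upper)  # start from the projected origin
    for _ in range(max_iter):
        # One discrete-time update: step against M u + q, then project onto the box.
        u_next = np.clip(u - alpha * (M @ u + q), lower, upper)
        if np.linalg.norm(u_next - u) < tol:
            return u_next
        u = u_next
    return u

if __name__ == "__main__":
    # Small example with a positive definite M (the exponentially convergent case).
    M = np.array([[3.0, 1.0], [1.0, 2.0]])
    q = np.array([-1.0, -2.0])
    print(solve_lvip(M, q, lower=np.zeros(2), upper=np.ones(2)))
```

    For this example the iteration settles at u* = (0, 1), where Mu* + q = 0, so the variational inequality holds trivially; a sufficiently small step size alpha keeps the recurrence convergent for positive definite M.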
  • Keywords
    convergence of numerical methods; eigenvalues and eigenfunctions; matrix algebra; optimisation; recurrent neural nets; convergence; discrete time neural networks; eigenvalues; linear variational inequality; optimization; positive definite matrix; positive semidefinite matrix; recurrent neural networks; Application software; Computer simulation; Continuous time systems; Convergence; Equations; Linear matrix inequalities; Neural networks; Recurrent neural networks; Symmetric matrices; Vectors;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
  • Conference_Location
    Honolulu, HI
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7278-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.2002.1007530
  • Filename
    1007530