• DocumentCode
    1817704
  • Title
    A simplex optimization approach for recurrent neural network training and for learning time-dependent trajectory patterns
  • Author
    Wong, Yee Chin ; Sundareshan, Malur K.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Arizona Univ., Tucson, AZ, USA
  • Volume
    1
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    353
  • Abstract
    A major obstacle to the successful deployment of recurrent neural networks in practice is the complexity of training caused by the presence of recurrent and feedback connections. The problem is further exacerbated when gradient descent learning algorithms, which require the computation of error gradients for the weight updates, are used, often forcing one to resort to approximations that in turn reduce training efficiency. We describe a learning procedure that requires no gradient evaluations and hence offers significant implementation advantages. This procedure exploits the inherent properties of nonlinear simplex optimization to realize these advantages.
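    The abstract does not give the paper's exact procedure, but the general idea it names, replacing gradient-based weight updates with derivative-free nonlinear simplex (Nelder-Mead) optimization of the training error, can be sketched as follows. This is a minimal illustration under stated assumptions: the tiny network size, the toy sine trajectory, and the helper names (`unpack`, `loss`) are all hypothetical, and SciPy's Nelder-Mead routine stands in for whatever simplex variant the authors used.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Sketch (not the paper's exact method): fit a small recurrent network to a
    # time-dependent trajectory by minimizing its output error with gradient-free
    # Nelder-Mead simplex search instead of backpropagation through time.

    rng = np.random.default_rng(0)
    t = np.arange(30)
    target = np.sin(0.3 * t)                  # toy trajectory to learn

    H = 3                                     # hidden units (arbitrary choice)

    def unpack(theta):
        """Split the flat parameter vector into the RNN's weight arrays."""
        Wh = theta[:H * H].reshape(H, H)      # hidden-to-hidden (recurrent)
        Wx = theta[H * H:H * H + H]           # input-to-hidden
        Wo = theta[H * H + H:]                # hidden-to-output
        return Wh, Wx, Wo

    def loss(theta):
        """Mean squared error of the network's output along the trajectory."""
        Wh, Wx, Wo = unpack(theta)
        h = np.zeros(H)
        err = 0.0
        for k in range(len(t)):
            h = np.tanh(Wh @ h + Wx * 1.0)    # constant input drives the dynamics
            err += (Wo @ h - target[k]) ** 2
        return err / len(t)

    # Nelder-Mead needs only loss values, never gradients of the recurrent net.
    theta0 = 0.5 * rng.standard_normal(H * H + 2 * H)
    res = minimize(loss, theta0, method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-10})

    print(f"initial loss: {loss(theta0):.4f}  final loss: {res.fun:.4f}")
    ```

    The appeal matches the abstract's claim: because the simplex search queries only the training error itself, no error gradients need to be propagated through the recurrent and feedback connections.
    
    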
  • Keywords
    learning (artificial intelligence); optimisation; recurrent neural nets; recurrent neural network training; simplex optimization approach; time-dependent trajectory patterns; Computer errors; Computer networks; Management training; Multi-layer neural network; Neural networks; Neurofeedback; Neurons; Nonlinear dynamical systems; Performance gain; Recurrent neural networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN '99), 1999
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-5529-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.1999.831518
  • Filename
    831518