• DocumentCode
    465486
  • Title
    TRTRL: A Localized Resource-Efficient Learning Algorithm for Recurrent Neural Networks
  • Author
    Budik, Danny; Elhanany, Itamar

  • Author_Institution
    Univ. of Tennessee, Knoxville
  • Volume
    1
  • fYear
    2006
  • fDate
    6-9 Aug. 2006
  • Firstpage
    371
  • Lastpage
    374
  • Abstract
    This paper introduces an efficient, low-complexity online learning algorithm for recurrent neural networks. The approach is based on the real-time recurrent learning (RTRL) algorithm, whereby the sensitivity set of each neuron is reduced to weights associated either with its input or output links. As a consequence, storage requirements are reduced from O(N³) to O(N²) and the computational complexity is reduced to O(N²). Despite the radical reduction in resource requirements, it is shown through simulation results that the overall performance degradation is rather minor. Moreover, the scheme lends itself to parallel hardware realization by virtue of the localized property that is inherent to the approach.
  • Keywords
    computational complexity; learning (artificial intelligence); recurrent neural nets; localized resource-efficient learning algorithm; real-time recurrent learning algorithm; recurrent neural networks; Backpropagation; Computational modeling; Constraint optimization; Error correction; Neurons; Nonlinear dynamical systems; Nonlinear systems; System identification; real-time recurrent learning (RTRL)
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2006 49th IEEE International Midwest Symposium on Circuits and Systems (MWSCAS '06)
  • Conference_Location
    San Juan
  • ISSN
    1548-3746
  • Print_ISBN
    1-4244-0172-0
  • Electronic_ISBN
    1548-3746
  • Type
    conf
  • DOI
    10.1109/MWSCAS.2006.382075
  • Filename
    4267152
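
The storage-reduction claim in the abstract (O(N³) for full RTRL versus O(N²) for TRTRL) can be illustrated with a minimal counting sketch. The function names and the roughly-2N-weights-per-neuron figure below are illustrative assumptions drawn from the abstract's description, not the authors' exact formulation:

```python
def rtrl_sensitivity_storage(n: int) -> int:
    # Full RTRL keeps a sensitivity dp[k][i][j] = d y_k / d w_ij for every
    # neuron k and every weight (i, j): N * N^2 = N^3 scalars.
    return n * n * n


def trtrl_sensitivity_storage(n: int) -> int:
    # TRTRL (as described in the abstract) restricts each neuron's sensitivity
    # set to weights on its own input and output links -- roughly 2N weights
    # per neuron instead of N^2, giving about 2N^2 scalars overall.
    return n * (2 * n)


# For a 100-neuron network: 1,000,000 vs 20,000 stored sensitivities.
full = rtrl_sensitivity_storage(100)
truncated = trtrl_sensitivity_storage(100)
```

Computational cost per time step scales the same way, which is what makes the localized variant attractive for parallel hardware realization.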