  • DocumentCode
    1161526
  • Title
    On-line training of recurrent neural networks with continuous topology adaptation
  • Author
    Obradovic, Dragan
  • Author_Institution
    Corp. Res. & Dev., Siemens AG, Munich, Germany
  • Volume
    7
  • Issue
    1
  • fYear
    1996
  • fDate
    1/1/1996 12:00:00 AM
  • Firstpage
    222
  • Lastpage
    228
  • Abstract
    This paper presents an online procedure for training dynamic neural networks with input-output recurrences whose topology is continuously adjusted to the complexity of the target system dynamics. This is accomplished by changing the number of elements in the network hidden layer whenever the existing topology cannot capture the dynamics presented by the new data. The training mechanism is based on a suitably modified extended Kalman filter (EKF) algorithm, which is used simultaneously for network parameter adjustment and for state estimation. The network consists of a single hidden layer of Gaussian radial basis functions (GRBFs) and a linear output layer. The choice of GRBFs is dictated by the requirements of online learning: the architecture must restrict a new data point to a local influence so that previously learned dynamics are not forgotten. Continuous topology adaptation is implemented in our algorithm to avoid the memory and computational problems of covering the network input space with a regular grid of GRBFs. Furthermore, we show that the resulting increase in the number of parameters can be handled “smoothly” without interfering with the already acquired information. If the target system dynamics change over time, we show that a suitable forgetting factor can be used to “unlearn” the no-longer-relevant dynamics. The quality of the recurrent network training algorithm is demonstrated on the identification of nonlinear dynamic systems. (An illustrative code sketch of this type of update appears at the end of this record.)
  • Keywords
    Kalman filters; feedforward neural nets; filtering theory; learning (artificial intelligence); recurrent neural nets; topology; Gaussian radial basis functions; continuous topology adaptation; dynamic neural networks; extended Kalman filter; forgetting factor; input-output recurrences; linear output layer; nonlinear dynamic systems identification; online training; recurrent neural networks; Backpropagation algorithms; Computer architecture; Computer networks; Feedback loop; Grid computing; Network topology; Neural networks; Nonlinear dynamical systems; Recurrent neural networks; State estimation;
  • fLanguage
    English
  • Journal_Title
    Neural Networks, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.478408
  • Filename
    478408
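  • Illustrative_Sketch
    The abstract above describes online EKF-based training of a GRBF network whose hidden layer grows whenever new data cannot be explained by the current topology. The Python sketch below is a hypothetical, much-simplified illustration of that idea, not the paper's algorithm: it updates only the linear output weights of a static GRBF layer with an EKF/RLS-style step, grows the hidden layer when the prediction error exceeds a threshold, and uses a forgetting factor to down-weight old data. All names and parameter values (GrowingGRBFNet, width, grow_threshold, q, r, p0) are assumptions introduced for illustration; the paper's recurrent network additionally estimates its internal states with the same EKF.

    # Hypothetical sketch: online training of a growing Gaussian-RBF network with an
    # EKF/RLS-style update on the linear output weights. This is NOT the paper's
    # algorithm (which also estimates the states of a recurrent network); all names
    # and parameter values below are illustrative assumptions.
    import numpy as np

    class GrowingGRBFNet:
        def __init__(self, dim_in, width=0.5, q=1e-4, r=1e-2, p0=1.0, grow_threshold=0.3):
            self.width = width            # fixed GRBF width (assumed, not from the paper)
            self.q, self.r, self.p0 = q, r, p0
            self.grow_threshold = grow_threshold
            self.centers = np.empty((0, dim_in))   # GRBF centers
            self.weights = np.empty(0)             # linear output-layer weights
            self.P = np.empty((0, 0))              # error covariance of the weights

        def _phi(self, x):
            # Hidden-layer activations: Gaussian bumps around each stored center.
            if self.weights.size == 0:
                return np.empty(0)
            d2 = np.sum((self.centers - x) ** 2, axis=1)
            return np.exp(-d2 / (2.0 * self.width ** 2))

        def predict(self, x):
            phi = self._phi(x)
            return float(phi @ self.weights) if phi.size else 0.0

        def _grow(self, x, residual):
            # Add one GRBF centered at the new sample; its covariance block is
            # initialized independently, so existing weights are left untouched.
            self.centers = np.vstack([self.centers, x])
            self.weights = np.append(self.weights, residual)
            n = self.weights.size
            P_new = np.zeros((n, n))
            P_new[: n - 1, : n - 1] = self.P
            P_new[n - 1, n - 1] = self.p0
            self.P = P_new

        def update(self, x, y, forgetting=1.0):
            # One online step: grow if the current topology cannot explain (x, y),
            # otherwise correct the output weights with a Kalman-gain step.
            x = np.asarray(x, dtype=float)
            err = y - self.predict(x)
            if self.weights.size == 0 or abs(err) > self.grow_threshold:
                self._grow(x, err)
                return
            phi = self._phi(x)                       # Jacobian = phi for a linear output layer
            self.P = self.P / forgetting + self.q * np.eye(self.weights.size)
            s = phi @ self.P @ phi + self.r          # innovation variance
            k = self.P @ phi / s                     # Kalman gain
            self.weights = self.weights + k * err
            self.P = self.P - np.outer(k, phi @ self.P)

    # Usage: identify a simple nonlinear map online; forgetting < 1 lets the
    # network "unlearn" data that is no longer relevant.
    rng = np.random.default_rng(0)
    net = GrowingGRBFNet(dim_in=1)
    for _ in range(500):
        x = rng.uniform(-1.0, 1.0, size=1)
        y = np.sin(3.0 * x[0]) + 0.01 * rng.normal()
        net.update(x, y, forgetting=0.995)
    print(net.weights.size, net.predict(np.array([0.2])))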