  • DocumentCode
    2199047
  • Title
    Parallel and separable recursive Levenberg-Marquardt training algorithm
  • Author
    Asirvadam, V.S.; McLoone, S.F.; Irwin, G.W.

  • Author_Institution
    Sch. of Electr. & Electron. Eng., Queen's Univ., Belfast, UK
  • fYear
    2002
  • fDate
    2002
  • Firstpage
    129
  • Lastpage
    138
  • Abstract
    A novel decomposed recursive Levenberg-Marquardt (RLM) algorithm is derived for training feedforward neural networks. By neglecting inter-neuron weight correlations, the recently proposed RLM training algorithm can be decomposed at the neuron level, enabling weights to be updated in an efficient, parallel manner. A separable least squares implementation of decomposed RLM is also introduced. Experimental results for two nonlinear time series problems demonstrate the superiority of the new training algorithms.
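    The abstract's neuron-level decomposition can be illustrated with a minimal sketch: each neuron keeps its own small Gauss-Newton Hessian estimate, built only from the gradient of the network output with respect to that neuron's weights, so cross-neuron correlations are dropped and every neuron's update is independent. The function name neuron_rlm_step, the damping term mu, and the forgetting factor beta below are illustrative assumptions; the abstract does not spell out the paper's exact RLM recursion.

        import numpy as np

        def neuron_rlm_step(w, H, phi, err, mu=1e-2, beta=0.99):
            # w   : (n,)   this neuron's local weight vector
            # H   : (n, n) local Gauss-Newton Hessian estimate
            # phi : (n,)   gradient of the network output w.r.t. w
            # err : scalar prediction error for the current sample
            H = beta * H + np.outer(phi, phi)   # recursive rank-1 Hessian update
            A = H + mu * np.eye(len(w))         # LM damping keeps the solve well-posed
            return w + np.linalg.solve(A, phi * err), H

        # Per-neuron state; the loop body is independent across neurons,
        # which is what permits the parallel update described in the abstract.
        n, num_neurons = 4, 8
        ws = [np.zeros(n) for _ in range(num_neurons)]
        Hs = [1e-3 * np.eye(n) for _ in range(num_neurons)]
        phis = [np.random.randn(n) for _ in range(num_neurons)]  # stand-in gradients
        for i in range(num_neurons):
            ws[i], Hs[i] = neuron_rlm_step(ws[i], Hs[i], phis[i], err=0.1)

    The separable variant mentioned in the abstract would, in addition, solve for the network's linear output-layer weights by ordinary least squares at each step, leaving only the nonlinear hidden-layer weights to the recursive LM updates.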
  • Keywords
    feedforward neural nets; learning (artificial intelligence); least squares approximations; nonlinear estimation; parallel algorithms; recursive estimation; time series; Levenberg-Marquardt training algorithm; decomposed RLM; feedforward neural networks; neuron level; nonlinear time series problems; parallel algorithm; separable least squares implementation; separable recursive training algorithm; weight updating; Backpropagation algorithms; Convergence; Cost function; Feedforward neural networks; Least squares methods; Neural networks; Neurons; Partitioning algorithms; Resonance light scattering; Training data
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing (NNSP 2002)
  • Print_ISBN
    0-7803-7616-1
  • Type
    conf
  • DOI
    10.1109/NNSP.2002.1030024
  • Filename
    1030024