• DocumentCode
    3264100
  • Title
    Self-adaptive learning rates in backpropagation algorithm improve its function approximation performance

  • Author
    Bhattacharya, U. ; Parui, S.K.

  • Author_Institution
    Comput. Vision & Pattern Recognition Unit, Indian Stat. Inst., Calcutta, India
  • Volume
    5
  • fYear
    1995
  • fDate
    Nov/Dec 1995
  • Firstpage
    2784
  • Abstract
    The backpropagation algorithm enables a multilayer perceptron to learn to map a set of inputs to a set of outputs, but its function approximation performance is often unimpressive. In this paper the authors demonstrate that self-adaptation of the learning rate of the backpropagation algorithm helps in improving the approximation of a function. The modified backpropagation algorithm with self-adaptive learning rates is based on a combination of two updating rules: one for updating the connection weights and the other for updating the learning rate. The method for updating the learning rate implements the gradient descent principle on the error surface. Simulation results with astrophysical data are presented.
  • Keywords
    backpropagation; function approximation; multilayer perceptrons; self-adjusting systems; backpropagation algorithm; connection weights; error surface; function approximation performance; gradient descent principle; multilayer perceptron; self-adaptive learning rates; Approximation algorithms; Backpropagation algorithms; Computational modeling; Computer errors; Computer vision; Function approximation; Joining processes; Multilayer perceptrons; Neural networks; Pattern recognition;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the IEEE International Conference on Neural Networks, 1995
  • Conference_Location
    Perth, WA
  • Print_ISBN
    0-7803-2768-3
  • Type
    conf

  • DOI
    10.1109/ICNN.1995.488172
  • Filename
    488172
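
The abstract describes a scheme with two coupled update rules: standard gradient descent on the connection weights, plus gradient descent on the error surface with respect to the learning rate itself. A minimal sketch of that general idea is below; it does not reproduce the paper's exact rules. It uses the classical identity dE/dη = -g_new · g_old (since w_new = w_old - η·g_old), and the network size, toy data, meta-rate `gamma`, and clipping bounds are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D function approximation task: fit y = sin(x) with a one-hidden-layer MLP.
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

H = 16  # hidden units (illustrative choice)
W1 = rng.normal(0.0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)

def gradients():
    """Full-batch forward pass and backpropagated gradients for E = 0.5 * MSE."""
    A = np.tanh(X @ W1 + b1)           # hidden activations
    out = A @ W2 + b2                  # linear output layer
    err = out - Y                      # dE/d(out)
    n = len(X)
    gW2 = A.T @ err / n
    gb2 = err.mean(axis=0)
    dA = (err @ W2.T) * (1.0 - A**2)   # backprop through tanh
    gW1 = X.T @ dA / n
    gb1 = dA.mean(axis=0)
    loss = 0.5 * np.mean(err**2)
    return loss, [gW1, gb1, gW2, gb2]

eta, gamma = 0.05, 0.01                # learning rate and its meta-rate (assumed values)
loss0, grads = gradients()
g_old = np.concatenate([g.ravel() for g in grads])

for _ in range(500):
    # Rule 1: weight update with the current learning rate.
    for p, g in zip((W1, b1, W2, b2), grads):
        p -= eta * g
    loss, grads = gradients()
    g_new = np.concatenate([g.ravel() for g in grads])
    # Rule 2: gradient descent on E with respect to eta,
    # using dE/d(eta) = -(g_new . g_old); clipped to stay positive and bounded.
    eta = min(max(eta + gamma * float(g_new @ g_old), 1e-6), 0.5)
    g_old = g_new

print(loss0, loss, eta)
```

Intuitively, the rate grows while successive gradients agree in direction (the dot product is positive) and shrinks after an overshoot, which is what gradient descent on the error as a function of η delivers.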