• DocumentCode
    288337
  • Title
    A first order adaptive learning rate algorithm for backpropagation networks
  • Author
    Nachtsheim, Philip R.

  • Author_Institution
    Inf. Sci. Div., NASA Ames Res. Center, Moffett Field, CA, USA
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    257
  • Abstract
    A simple method for determining the learning rate parameter of the backpropagation algorithm is described and analyzed. The learning rate parameter is determined at each step of the iteration by attempting to find a double root of the quadratic cost function, in contrast to the traditional approach of viewing learning as an optimization problem. It is shown that this method of determining the learning rate parameter leads to accelerated convergence for several benchmark cases. (An illustrative sketch of the double-root step-size idea follows the record below.)
  • Keywords
    backpropagation; convergence; neural nets; accelerated convergence; backpropagation networks; double root; first order adaptive learning rate algorithm; learning rate parameter; quadratic cost function; Acceleration; Algorithm design and analysis; Backpropagation algorithms; Convergence; Cost function; Information analysis; NASA
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374171
  • Filename
    374171
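
The abstract's idea can be pictured as follows: along the chosen update direction, the sum-of-squares cost is, to first order in the residuals, a quadratic in the learning rate, and the step is taken at the value where that quadratic would have a double root, i.e. where it would just touch zero. The sketch below only illustrates that idea; it is not a reproduction of the paper's algorithm. The function name double_root_step_size, the toy linear least-squares problem, and all variable names are assumptions made for the example.

```python
import numpy as np

def double_root_step_size(residuals, dir_derivs, eps=1e-12):
    """Return a step size eta from the double-root view of a sum-of-squares cost.

    To first order, moving a distance eta along a direction d changes the
    residuals e -> e - eta * (J @ d), so the cost
        E(eta) = 0.5 * ||e - eta * J d||^2
               = 0.5 * a * eta**2 - b * eta + 0.5 * ||e||^2
    with a = ||J d||^2 and b = e . (J d).  Requiring this quadratic to have a
    double root (to just touch zero) places that root at eta = b / a.
    """
    a = float(np.dot(dir_derivs, dir_derivs))
    b = float(np.dot(residuals, dir_derivs))
    return b / (a + eps)

# Toy usage on a linear least-squares problem (data and names are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
t = X @ np.array([1.0, -2.0, 0.5])   # targets generated from a known weight vector
w = np.zeros(3)

for _ in range(50):
    e = t - X @ w            # current residuals
    d = X.T @ e              # descent direction: negative gradient of 0.5*||e||^2
    Jd = X @ d               # first-order change of the residuals per unit step
    eta = double_root_step_size(e, Jd)
    w = w + eta * d          # adaptive-learning-rate update

print(w)                     # approaches [1.0, -2.0, 0.5]
```

In this linear toy case the double-root step coincides with an exact line minimization along the gradient, which is why the loop converges quickly; the paper's setting is nonlinear backpropagation, where the same first-order quadratic model is only an approximation.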