• DocumentCode
    315243
  • Title

    Comparing parameterless learning rate adaptation methods

  • Author

    Fiesler, E.; Moreira, M.

  • Author_Institution
    P.O.C., Torrance, CA, USA
  • Volume
    2
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    1082
  • Abstract
    Since the popularization of the backpropagation learning rule for training multilayer neural networks, many improvements and extensions of it have been proposed. Adaptive learning rate techniques are certainly among the best known of these improvements, promising a significant increase in learning speed and, when no new tunable parameters are introduced, eliminating the trial-and-error process of finding a suitable learning rate. Hence, in order to compare the most promising of these techniques, five methods without tunable parameters have been selected. Both the online and batch versions of standard backpropagation are also included in the study as points of reference. However, comparing the convergence speed of different learning rules requires a better complexity measure than the commonly used 'number of training iterations'. Hence, a refined complexity measure is introduced here and used in the comparison of the seven chosen methods.
  • Keywords
    adaptive systems; backpropagation; computational complexity; convergence; feedforward neural nets; adaptive learning; backpropagation; complexity measure; convergence; multilayer neural networks; parameterless learning rate; Adaptive scheduling; Backpropagation algorithms; Convergence; Multi-layer neural network; Multidimensional systems; Neural networks; Neurons; Proposals; Shape; Velocity measurement;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Conference on Neural Networks, 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type

    conf

  • DOI
    10.1109/ICNN.1997.616179
  • Filename
    616179