  • DocumentCode
    3246455
  • Title
    A convergent neural network learning algorithm
  • Author
    Tang, Zaiyong; Koehler, Gary J.
  • Author_Institution
    Dept. of Decision & Inf. Sci., Univ. of Florida, Gainesville, FL, USA
  • Volume
    2
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    127
  • Abstract
    A globally guided backpropagation (GGBP) training algorithm is presented. The algorithm is a modification of standard backpropagation: instead of changing a weight w_ij according to the partial derivative of the error E with respect to w_ij, GGBP attempts to minimize E in the output space, computing the change in the weights W from the desired change in the output O. The new algorithm is analogous to backpropagation with a dynamically adjusted learning rate η, and this adjustment scheme avoids the problems associated with heuristic learning-rate tuning. The two main advantages of GGBP are fast learning and convergence to a globally optimal solution. (An illustrative sketch of the output-space update follows the record below.)
  • Keywords
    backpropagation; neural nets; GGBP; convergent learning algorithm; dynamically adjusted learning rate; globally guided backpropagation; neural network; training algorithm; Acceleration; Backpropagation algorithms; Computer networks; Costs; Feedforward neural networks; Feedforward systems; Gradient methods; Jacobian matrices; Multi-layer neural network; Neural networks
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.226973
  • Filename
    226973
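
    The abstract describes GGBP only at a high level, so the following is a minimal Python sketch of the output-space idea, not the paper's actual derivation: a single sigmoid unit is trained along the standard backpropagation direction, but the learning rate eta is chosen at each step so that the first-order predicted change in the output matches the desired change (target - output), rather than being set by a fixed heuristic. The single-unit model, the toy data, the tolerance, and the clipping constant are all assumptions made for illustration.

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # Toy binary-target data (assumed for the example; not from the paper).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 3))
        t = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

        w = np.zeros(3)  # weights of a single sigmoid unit
        b = 0.0
        for epoch in range(50):
            for x, target in zip(X, t):
                o = sigmoid(w @ x + b)
                err = target - o        # desired change in output space
                slope = o * (1.0 - o)   # sigmoid derivative at the current net input
                # Standard backpropagation direction for squared error:
                d_w = err * slope * x
                d_b = err * slope
                # First-order predicted output change for a step eta along (d_w, d_b):
                #   delta_o ~= slope * eta * (d_w @ x + d_b)
                pred = slope * (d_w @ x + d_b)
                if abs(pred) < 1e-12:
                    continue            # nearly flat region; skip the update
                # Choose eta so that delta_o matches err; the clip (an assumption)
                # guards against huge steps when the unit saturates and slope is tiny.
                eta = min(err / pred, 10.0)
                w += eta * d_w
                b += eta * d_b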