• DocumentCode
    2391986
  • Title
    An integrated approach to improving back-propagation neural networks
  • Author
    Goh, Yue-Seng; Tan, Eng-Chong
  • Author_Institution
    Sch. of Appl. Sci., Nanyang Technol. Inst., Singapore
  • fYear
    1994
  • fDate
    22-26 Aug 1994
  • Firstpage
    801
  • Abstract
    Back-propagation is the most popular training method for multi-layer feed-forward neural networks. To date, most research on improving back-propagation has addressed only one or two aspects of the algorithm at a time, although some researchers have tackled several aspects simultaneously. This paper explores various ways of improving back-propagation and integrates them into a new, improved back-propagation algorithm. The aspects investigated are: network pruning during training, adaptive learning rates for individual weights and biases, adaptive momentum, and an extended role for the neuron in learning.
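    Two of the aspects named above, per-weight adaptive learning rates and momentum, can be sketched in a minimal form. This is an illustrative delta-bar-delta-style update on a single linear neuron, not the paper's actual algorithm; all variable names and constants below are assumptions.

```python
import numpy as np

# Toy regression data for a single linear neuron (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)
rates = np.full(3, 0.01)        # one adaptive learning rate per weight
velocity = np.zeros(3)          # momentum accumulator
prev_grad = np.zeros(3)
mu, up, down = 0.9, 1.05, 0.5   # momentum factor, rate grow/shrink factors

def loss(w):
    return float(np.mean((X @ w - y) ** 2))

losses = [loss(w)]
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    # Grow a weight's rate when consecutive gradients agree in sign
    # (smooth descent); shrink it when they flip (oscillation).
    agree = grad * prev_grad > 0
    rates = np.clip(np.where(agree, rates * up, rates * down), 1e-5, 0.1)
    # Momentum: blend the previous step into the current one.
    velocity = mu * velocity - rates * grad
    w += velocity
    prev_grad = grad
    losses.append(loss(w))

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.6f}")
```

    The sign-agreement rule lets each weight find its own step size, while momentum smooths the trajectory; these are generic versions of the "adaptive learning rates for individual weights" and "adaptive momentum" ideas the abstract lists.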
  • Keywords
    backpropagation; feedforward neural nets; learning (artificial intelligence); adaptive learning rates; adaptive momentum; backpropagation neural networks; biases; learning; multilayer feedforward neural networks; net pruning; neuron; training method; weights; Backpropagation; Biology computing; Computer networks; Convergence; Feedforward neural networks; Feedforward systems; Multi-layer neural network; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    TENCON '94. IEEE Region 10's Ninth Annual International Conference. Theme: Frontiers of Computer Technology. Proceedings of 1994
  • Print_ISBN
    0-7803-1862-5
  • Type
    conf
  • DOI
    10.1109/TENCON.1994.369201
  • Filename
    369201