• DocumentCode
    1817240
  • Title
    An adaptive back-propagation learning method: A preliminary study for incremental neural networks
  • Author
    Chen, Hown-Wen; Soo, Von-Wun
  • Author_Institution
    Inst. of Comput. Sci., Nat. Tsing Hua Univ., Hsinchu, Taiwan
  • Volume
    1
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    713
  • Abstract
    The authors apply the concept of minimizing a weight-sensitivity cost together with the training square-error function using gradient-descent optimization techniques, and thereby obtain a novel supervised backpropagation learning algorithm for a biased two-layered perceptron. In addition to illustrating the conflict locality of an inserted training instance with respect to previous training data, they point out that this adaptive learning method can yield a network with measurable generalization ability. The work can also be extended to an incremental network in which no training instances need to be remembered.
  • Keywords
    backpropagation; feedforward neural nets; learning (artificial intelligence); adaptive backpropagation learning method; biased two-layered perceptron; conflict locality; gradient descent optimization; incremental neural networks; measurable generalization; minimizing weight sensitivity cost; training square-error functions; Application software; Artificial neural networks; Computer science; Cost function; Humans; Learning systems; Load forecasting; Multilayer perceptrons; Neural networks; Training data;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IJCNN International Joint Conference on Neural Networks, 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.287103
  • Filename
    287103
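
The abstract describes training a biased two-layered perceptron by gradient descent on a combined objective: the training square-error plus a weight-sensitivity cost. The sketch below is a minimal illustration of that idea only, not the paper's algorithm: the exact sensitivity cost is not specified in the abstract, so a simple sum-of-squared-weights penalty (with an assumed weight `lam`) stands in for it, and the XOR data, network size, and learning rate are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic task for a two-layer (one hidden layer) perceptron.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Biased two-layered perceptron: input -> hidden -> output, each layer with a bias.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

lr = 0.5      # learning rate (illustrative)
lam = 1e-3    # assumed weight for the sensitivity-cost stand-in

def forward(X):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return h, out

def total_cost(out):
    # Combined objective: training square-error + (assumed) sensitivity penalty.
    sq_err = np.mean((out - y) ** 2)
    sens = lam * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    return sq_err + sens

_, out = forward(X)
cost_before = total_cost(out)

for _ in range(5000):
    h, out = forward(X)
    # Backpropagation: gradient of the mean squared error at the output layer...
    d_out = 2.0 * (out - y) / len(X) * out * (1 - out)
    gW2 = h.T @ d_out + 2 * lam * W2   # ...plus the penalty-term gradient
    gb2 = d_out.sum(axis=0)
    # ...propagated back through the hidden layer.
    d_h = d_out @ W2.T * h * (1 - h)
    gW1 = X.T @ d_h + 2 * lam * W1
    gb1 = d_h.sum(axis=0)
    # Plain gradient-descent updates on the combined cost.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out = forward(X)
cost_after = total_cost(out)
print(f"combined cost: {cost_before:.4f} -> {cost_after:.4f}")
```

Running the loop drives the combined cost down; the penalty term merely biases the descent toward smaller weights, in the spirit of (but not identical to) the sensitivity cost named in the abstract.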