• DocumentCode
    2391956
  • Title
    Pruning neural networks during training by backpropagation
  • Author
    Goh, Yue-Seng; Tan, Eng-Chong
  • Author_Institution
    Sch. of Appl. Sci., Nanyang Technol. Inst., Singapore
  • fYear
    1994
  • fDate
    22-26 Aug 1994
  • Firstpage
    805
  • Abstract
    For any neural network design, the network size is often chosen arbitrarily. Too large a network tends to memorise the training patterns and thus generalises poorly; a smaller network is more efficient in computation and learning, but too small a network may never solve the problem. In general, one would rather overestimate the network size than underestimate it. Pruning, or net pruning, is the reduction of the network size. Karnin (1990) proposed a simple procedure for pruning backpropagation-trained neural networks. This paper extends Karnin's work and proposes a simple method of pruning during training by backpropagation.
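    The record gives no implementation details, but the approach it extends can be illustrated. Below is a minimal sketch, assuming Karnin's (1990) sensitivity estimate S ≈ Σ[Δw(n)]² · w_final / (η(w_final − w_initial)) accumulated from the weight updates made during backpropagation, with pruning applied part-way through training rather than after it; the network, task, and pruning threshold here are all illustrative choices, not the paper's.

    ```python
    import numpy as np

    # Hypothetical sketch: Karnin-style sensitivity pruning folded into
    # backpropagation training of a one-hidden-layer network on XOR.
    rng = np.random.default_rng(0)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    eta = 0.5
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
    W1_init = W1.copy()

    S1 = np.zeros_like(W1)      # accumulated squared updates per weight
    mask1 = np.ones_like(W1)    # pruning mask: 0 marks a removed connection
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for epoch in range(4000):
        # Forward and backward pass (standard backpropagation).
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        upd1 = eta * (X.T @ d_h) * mask1   # pruned weights stay zero
        upd2 = eta * (h.T @ d_out)
        S1 += upd1 ** 2                    # accumulate [Δw]^2 for sensitivity
        W1 -= upd1; W2 -= upd2
        b1 -= eta * d_h.sum(0); b2 -= eta * d_out.sum(0)

        # Prune during training: estimate each weight's sensitivity and
        # remove the two least sensitive input->hidden connections.
        if epoch == 2000:
            denom = eta * (W1 - W1_init)
            safe = np.where(np.abs(denom) > 1e-12, denom, 1.0)
            sens = np.where(np.abs(denom) > 1e-12, S1 * W1 / safe, np.inf)
            lowest = np.argsort(np.abs(sens), axis=None)[:2]
            mask1.flat[lowest] = 0.0
            W1 *= mask1

    print("pruned connections:", int((mask1 == 0).sum()))
    ```

    Weights whose accumulated updates barely moved them from their initial values are treated as low-sensitivity and masked out, so training continues on the smaller network.
    
    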
  • Keywords
    backpropagation; generalisation (artificial intelligence); neural nets; backpropagation trained neural networks; net pruning; network size; neural network pruning; pruning; training; training patterns; Artificial neural networks; Backpropagation; Computational efficiency; Computer networks; Frequency; Neural networks; Neurons
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    TENCON '94. IEEE Region 10's Ninth Annual International Conference. Theme: Frontiers of Computer Technology. Proceedings of 1994
  • Print_ISBN
    0-7803-1862-5
  • Type
    conf
  • DOI
    10.1109/TENCON.1994.369200
  • Filename
    369200