• DocumentCode
    445972
  • Title
    Finding a succinct multi-layer perceptron having shared weights
  • Author
    Tanahashi, Yusuke; Chin, Xiang-Fang; Saito, Kazumi; Nakano, Ryohei
  • Author_Institution
    Nagoya Inst. of Technol., Japan
  • Volume
    3
  • fYear
    2005
  • fDate
    31 July-4 Aug. 2005
  • Firstpage
    1418
  • Abstract
    We present a method for finding a succinct neural network having shared weights. We focus on weight sharing, which constrains the freedom of weight values: each weight is restricted to take one of a small set of common weights. A common weight close to zero can be eliminated, which is called weight pruning. Recently, a weight sharing method called BCW was proposed. BCW employs merge and split operations based on second-order optimality criteria and can escape local optima through bidirectional clustering. However, BCW assumes that a vital network parameter J, the number of hidden units, is given. This paper modifies BCW to make the procedure faster, so that J can be selected by cross-validation within reasonable CPU time. Our experiments showed that the proposed method restores the original model for an artificial data set and, for a real data set, finds a small number of common weights and reveals an interesting tendency.
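    A minimal, hypothetical Python sketch of the weight-sharing and weight-pruning idea the abstract describes (this is not the BCW merge/split procedure itself; the function name, its parameters, and the simple k-means-style clustering used here are all assumptions for illustration):

    import numpy as np

    def share_weights(weights, n_common=3, n_iter=20, prune_tol=1e-2):
        # Cluster a flat weight vector into n_common shared values
        # (a simple k-means-style loop), then zero out any common value
        # that ends up near zero -- the "weight pruning" step.
        w = np.asarray(weights, dtype=float)
        # Initialize the common weights from quantiles of the weights.
        centers = np.quantile(w, np.linspace(0.0, 1.0, n_common))
        for _ in range(n_iter):
            # Assign every weight to its nearest common value.
            labels = np.argmin(np.abs(w[:, None] - centers[None, :]), axis=1)
            # Move each common value to the mean of its assigned weights.
            for k in range(n_common):
                if np.any(labels == k):
                    centers[k] = w[labels == k].mean()
        # Weight pruning: a near-zero common weight is eliminated.
        centers[np.abs(centers) < prune_tol] = 0.0
        return centers[labels]  # each weight now equals a common weight

    # Usage: 200 raw weights collapse to at most 3 distinct shared values.
    rng = np.random.default_rng(0)
    raw = np.concatenate([rng.normal(-1.5, 0.1, 80),
                          rng.normal(0.0, 0.05, 60),
                          rng.normal(2.0, 0.1, 60)])
    print(sorted(set(np.round(share_weights(raw), 2))))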
  • Keywords
    multilayer perceptrons; bidirectional clustering; weight pruning; weight sharing; Artificial neural networks; Convergence; Data engineering; Data mining; Electronic mail; Laboratories; Multi-layer neural network; Neural networks; Polynomials
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
  • Print_ISBN
    0-7803-9048-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.2005.1556082
  • Filename
    1556082