• DocumentCode
    396744
  • Title
    On the transformation mechanisms of multilayer perceptrons with sigmoid activation functions for classifications
  • Author
    Gao Daqi; Zhu Haijun; Nie Guping

  • Author_Institution
    Dept. of Comput., East China Univ. of Sci. & Technol., Shanghai, China
  • Volume
    2
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    1173
  • Abstract
    This paper studies the transformation mechanisms of multilayer perceptrons with sigmoid activation functions for classification. The viewpoint is presented that, in the input space, the hyperplanes determined by the hidden basis functions taking the value 0 do not act as separating hyperplanes, and moreover such "hyperplanes" do not necessarily pass through the marginal regions between different classes. The number of hidden units is related only to the number of categories and to the shapes of the sample distributions. The rank of the output matrix of the hidden units should be taken as the basis for pruning or growing hidden nodes. On this basis, an empirical formula for determining the optimal number of hidden neurons is proposed. Finally, two examples are given to verify it.
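    A minimal sketch of the rank criterion described in the abstract, assuming a single sigmoid hidden layer: form the hidden-unit output matrix and compare its numerical rank with the number of hidden units. The layer sizes, weights, and data below are hypothetical placeholders, and the paper's own empirical formula for the optimal number of hidden neurons is not reproduced here.

```python
# Sketch of the rank-based pruning/growing criterion: rank-deficiency of the
# hidden-unit output matrix H indicates redundant hidden units. Placeholder
# sizes and random weights only; not the authors' trained networks or formula.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

n_samples, n_inputs, n_hidden = 200, 4, 10      # hypothetical sizes
X = rng.normal(size=(n_samples, n_inputs))      # placeholder input samples
W = rng.normal(size=(n_inputs, n_hidden))       # hidden-layer weights
b = rng.normal(size=n_hidden)                   # hidden-layer biases

H = sigmoid(X @ W + b)                          # hidden-unit output matrix (N x h)
rank_H = np.linalg.matrix_rank(H)               # numerical rank of that matrix

print(f"hidden units: {n_hidden}, rank of hidden output matrix: {rank_H}")
if rank_H < n_hidden:
    # Rank deficiency: some hidden units are (numerically) redundant,
    # so they are candidates for pruning.
    print(f"rank-deficient: consider pruning toward {rank_H} hidden units")
else:
    # Full column rank: each hidden unit contributes an independent direction,
    # so this criterion gives no indication to prune.
    print("full rank: no pruning indicated by this criterion")
```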
  • Keywords
    multilayer perceptrons; transfer function matrices; classifications; hidden basis functions; hidden neurons; hidden node roles; hyperplanes; output matrix rank; sigmoid activation functions; single-hidden-layer perceptrons; threshold activation functions; transformation mechanisms; Bioreactors; Equations; Laboratories; Multilayer perceptrons; Neurons; Paper technology; Shape; Space technology
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223858
  • Filename
    1223858