  • DocumentCode
    445910
  • Title
    An analysis of underfitting in MLP networks
  • Author
    Nara, Sridhar; Tagliarini, Gene

  • Author_Institution
    Dept. of Comput. Sci., North Carolina Univ., Wilmington, NC, USA
  • Volume
    2
  • fYear
    2005
  • fDate
    31 July-4 Aug. 2005
  • Firstpage
    984
  • Abstract
    The generalization ability of an MLP network has been shown to be related to both the number and magnitudes of the network weights. Thus, there exists a tension between employing networks with few weights that have relatively large magnitudes, and networks with a greater number of weights with relatively small magnitudes. The analysis presented in this paper indicates that large magnitudes for network weights potentially increase the propensity of a network to interpolate poorly. Experimental results indicate that when bounds are imposed on network weights, the backpropagation algorithm is capable of discovering networks with small weight magnitudes that retain their expressive power and exhibit good generalization.
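    The weight-bounding scheme the abstract describes — training an MLP with backpropagation while capping weight magnitudes — can be sketched as follows. This is a minimal illustration, not the authors' experimental setup: the bound value, network size, task (XOR as a stand-in), and hyperparameters are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    BOUND = 4.0  # assumed cap on |w|; the paper's actual bounds may differ

    # Stand-in task: XOR
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer of 4 sigmoid units (illustrative architecture)
    W1 = rng.uniform(-0.5, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.uniform(-0.5, 0.5, (4, 1)); b2 = np.zeros(1)

    lr = 1.0
    losses = []
    for epoch in range(5000):
        # Forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        losses.append(float(np.mean((out - y) ** 2)))

        # Backward pass (mean-squared-error backpropagation)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X);  b1 -= lr * d_h.mean(axis=0)

        # Impose the magnitude bound on the weights after each update
        W1 = np.clip(W1, -BOUND, BOUND)
        W2 = np.clip(W2, -BOUND, BOUND)
    ```

    Clipping after each update keeps every weight inside `[-BOUND, BOUND]` throughout training, so backpropagation is forced to search among networks with small weight magnitudes, which is the regime the abstract argues generalizes well.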
  • Keywords
    backpropagation; multilayer perceptrons; MLP networks; backpropagation algorithm; multilayer perceptron networks; Backpropagation algorithms; Computer science; Intelligent networks; Interpolation; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons; Signal detection; Training data
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
  • Print_ISBN
    0-7803-9048-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.2005.1555986
  • Filename
    1555986