  • DocumentCode
    2747167
  • Title
    Heuristic configuration of hidden units in backpropagation neural networks
  • Author
    Indurkhya, Nitin; Weiss, Sholom M.
  • Author_Institution
    Dept. of Comput. Sci., Rutgers Univ., New Brunswick, NJ, USA
  • fYear
    1991
  • fDate
    8-14 Jul 1991
  • Abstract
    Summary form only given, as follows. For optimum statistical classification and generalization with single hidden-layer backpropagation neural network models, two tasks must be performed: (a) learning the best set of weights for a network of k hidden units and (b) determining k, the best complexity fit. Two approaches to learning have been contrasted: (a) standard backpropagation as applied to a series of networks with different numbers of hidden units; and (b) a heuristic cascade-correlation approach that quickly and dynamically learns and configures a network. Four real-world statistical applications were considered. On these examples, the backpropagation approach yielded somewhat better results, but with far greater computation times. The best k's for the two approaches were quite similar, suggesting a hybrid approach that chooses k by cascade-correlation and optimizes the weights by backpropagation.
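    The sketch below is not from the paper; it is a minimal illustration, under assumed synthetic data and hyperparameters, of approach (a) described in the abstract: train single hidden-layer networks by backpropagation for a series of candidate hidden-unit counts k and keep the k with the best held-out accuracy. The data generator, learning rate, epoch count, and candidate k values are all illustrative assumptions, and the cascade-correlation step of the suggested hybrid is not shown.

    # Illustrative sketch (not the authors' code): sweep candidate hidden-unit
    # counts k, train each single-hidden-layer net by backpropagation, and
    # select k by validation accuracy.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_mlp(X, y, k, epochs=500, lr=0.5):
        """Full-batch backpropagation for a 1-hidden-layer net with k sigmoid units."""
        n, d = X.shape
        W1 = rng.normal(scale=0.5, size=(d, k)); b1 = np.zeros(k)
        W2 = rng.normal(scale=0.5, size=(k, 1)); b2 = np.zeros(1)
        for _ in range(epochs):
            # forward pass
            H = sigmoid(X @ W1 + b1)            # hidden activations, shape (n, k)
            p = sigmoid(H @ W2 + b2).ravel()    # predicted probability of class 1
            # backward pass: gradients of mean cross-entropy loss
            dz2 = (p - y)[:, None] / n          # (n, 1)
            dW2 = H.T @ dz2; db2 = dz2.sum(0)
            dz1 = (dz2 @ W2.T) * H * (1 - H)    # (n, k)
            dW1 = X.T @ dz1; db1 = dz1.sum(0)
            # gradient-descent update
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
        return W1, b1, W2, b2

    def accuracy(params, X, y):
        W1, b1, W2, b2 = params
        p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
        return np.mean((p > 0.5) == y)

    # Illustrative two-class data; the paper used four real-world applications.
    X = rng.normal(size=(400, 4))
    y = (X[:, 0] * X[:, 1] + 0.3 * rng.normal(size=400) > 0).astype(float)
    X_tr, y_tr, X_va, y_va = X[:300], y[:300], X[300:], y[300:]

    best_k, best_acc = None, -1.0
    for k in (1, 2, 4, 8, 16):                  # candidate hidden-unit counts
        params = train_mlp(X_tr, y_tr, k)
        acc = accuracy(params, X_va, y_va)
        print(f"k={k:2d}  validation accuracy={acc:.3f}")
        if acc > best_acc:
            best_k, best_acc = k, acc
    print("selected k =", best_k)

    In the hybrid suggested by the abstract, the loop over k would be replaced by a single k chosen quickly by cascade-correlation, followed by one backpropagation run at that k.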
  • Keywords
    neural nets; pattern recognition; statistics; generalization; heuristic cascade-correlation approach; heuristic configuration; hidden units; optimum statistical classification; single hidden-layer backpropagation neural network models; Adaptive control; Algorithm design and analysis; Backpropagation algorithms; Computer science; Convergence; Cost function; Intelligent networks; Matrices; Neural networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
  • Conference_Location
    Seattle, WA
  • Print_ISBN
    0-7803-0164-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.1991.155604
  • Filename
    155604