DocumentCode :
1190475
Title :
Network information criterion-determining the number of hidden units for an artificial neural network model
Author :
Murata, Noboru ; Yoshizawa, Shuji ; Amari, Shun-Ichi
Author_Institution :
Dept. of Math. Eng. & Inf. Phys., Tokyo Univ., Japan
Volume :
5
Issue :
6
fYear :
1994
fDate :
11/1/1994
Firstpage :
865
Lastpage :
872
Abstract :
The problem of model selection, or determination of the number of hidden units, can be approached statistically by generalizing Akaike's information criterion (AIC) to be applicable to unfaithful (i.e., unrealizable) models with general loss criteria, including regularization terms. The relation between the training error and the generalization error is studied in terms of the number of training examples and the complexity of the network, which reduces to the number of parameters in the ordinary statistical theory of the AIC. This relation leads to a new network information criterion that is useful for selecting the optimal network model based on a given training set.
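The criterion summarized in the abstract can be sketched as follows (a paraphrase in common statistical notation; the symbols and normalization here are illustrative and need not match the paper's exact definitions). Given n training examples z_1, …, z_n, a loss d(z; w), and the empirical-loss minimizer ŵ, the penalized criterion takes the form:

```latex
% Hedged sketch of a network-information-criterion-style penalty
% (illustrative notation, not necessarily the paper's exact symbols):
\mathrm{NIC}
  = \frac{1}{n}\sum_{i=1}^{n} d(z_i;\hat{w})
  \;+\; \frac{1}{n}\,\operatorname{tr}\!\bigl(\hat{G}\,\hat{Q}^{-1}\bigr),
\qquad
\hat{Q} = \widehat{\mathbb{E}}\bigl[\nabla_w^2\, d\bigr],
\quad
\hat{G} = \widehat{\mathbb{E}}\bigl[\nabla_w d \,\nabla_w d^{\top}\bigr].
```

When d is the negative log-likelihood of a faithful (realizable) model, G and Q coincide (the Fisher information), the trace term reduces to the number of parameters m, and the criterion recovers the AIC up to scaling; the generalization lets G ≠ Q handle unfaithful models and general losses with regularization.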
Keywords :
generalisation (artificial intelligence); information theory; learning (artificial intelligence); neural nets; statistical analysis; Akaike information criterion; general loss criteria; generalization error; hidden units; network information criterion; neural network model; optimal network model; training error; Artificial neural networks; Backpropagation; Feedforward neural networks; Multilayer perceptrons; Neural networks; Parameter estimation; Physics education; Recurrent neural networks; Stochastic processes; Training data;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.329683
Filename :
329683