Title :
Network information criterion-determining the number of hidden units for an artificial neural network model
Author :
Murata, Noboru ; Yoshizawa, Shuji ; Amari, Shun-Ichi
Author_Institution :
Dept. of Math. Eng. & Inf. Phys., Univ. of Tokyo, Tokyo, Japan
fDate :
11/1/1994
Abstract :
The problem of model selection, or determination of the number of hidden units, can be approached statistically by generalizing Akaike's information criterion (AIC) to be applicable to unfaithful (i.e., unrealizable) models with general loss criteria, including regularization terms. The relation between the training error and the generalization error is studied in terms of the number of training examples and the complexity of the network, which reduces to the number of parameters in the ordinary statistical theory of AIC. This relation leads to a new network information criterion that is useful for selecting the optimal network model based on a given training set.
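As a concrete illustration (not the authors' implementation), the minimal Python sketch below scores candidate networks by a NIC-style criterion: training loss plus a complexity penalty tr(G^{-1}Q)/n, where G is approximated by the Gauss-Newton curvature matrix and Q by the empirical covariance of per-example loss gradients; for a faithful model trained with a log-likelihood loss this penalty reduces to (number of parameters)/n, recovering AIC. The one-hidden-layer architecture, the gradient-descent training loop, and all function names are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def init_params(h):
    # Random parameters for a 1-input, 1-output tanh network with h hidden units (illustrative).
    return {
        "w": rng.normal(scale=1.0, size=h),   # input-to-hidden weights
        "b": rng.normal(scale=1.0, size=h),   # hidden biases
        "v": rng.normal(scale=0.5, size=h),   # hidden-to-output weights
        "c": 0.0,                             # output bias
    }

def forward(p, x):
    # Network output and hidden activations for inputs x of shape (n,).
    a = np.tanh(np.outer(x, p["w"]) + p["b"])        # (n, h)
    return a @ p["v"] + p["c"], a

def per_example_jacobian(p, x, a):
    # Jacobian of the network output w.r.t. all parameters, one row per example.
    da = 1.0 - a ** 2                                # tanh'(.) at each hidden unit
    Jw = da * p["v"] * x[:, None]                    # d f / d w_j
    Jb = da * p["v"]                                 # d f / d b_j
    Jv = a                                           # d f / d v_j
    Jc = np.ones((x.size, 1))                        # d f / d c
    return np.hstack([Jw, Jb, Jv, Jc])               # (n, 3h+1)

def train(p, x, y, lr=0.02, steps=5000):
    # Plain full-batch gradient descent on the mean squared-error loss (illustrative).
    for _ in range(steps):
        f, a = forward(p, x)
        r = f - y                                    # residuals
        J = per_example_jacobian(p, x, a)
        g = (r[:, None] * J).mean(axis=0)            # mean loss gradient over examples
        h = p["w"].size
        p["w"] -= lr * g[:h]
        p["b"] -= lr * g[h:2 * h]
        p["v"] -= lr * g[2 * h:3 * h]
        p["c"] -= lr * g[-1]
    return p

def nic(p, x, y, ridge=1e-6):
    # Training loss plus the tr(G^{-1} Q)/n complexity penalty.
    f, a = forward(p, x)
    r = f - y
    J = per_example_jacobian(p, x, a)
    n = x.size
    G = (J.T @ J) / n                                # Gauss-Newton approximation of the curvature
    Q = (J.T * (r ** 2)) @ J / n                     # empirical covariance of per-example gradients
    penalty = np.trace(np.linalg.solve(G + ridge * np.eye(G.shape[0]), Q))
    train_loss = 0.5 * np.mean(r ** 2)
    return train_loss + penalty / n

# Toy data: noisy samples of a smooth target; the candidate with the smallest score is selected.
x = rng.uniform(-3, 3, size=200)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)

for h in (1, 2, 4, 8, 16):
    params = train(init_params(h), x, y)
    print(f"hidden units = {h:2d}   NIC-style score = {nic(params, x, y):.4f}")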
Keywords :
generalisation (artificial intelligence); information theory; learning (artificial intelligence); neural nets; statistical analysis; Akaike information criterion; general loss criteria; generalization error; hidden units; network information criterion; neural network model; optimal network model; training error; Artificial neural networks; Backpropagation; Feedforward neural networks; Multilayer perceptrons; Neural networks; Parameter estimation; Physics education; Recurrent neural networks; Stochastic processes; Training data;
Journal_Title :
Neural Networks, IEEE Transactions on