DocumentCode :
1816526
Title :
Probability of error, maximum mutual information, and size minimization of neural networks
Author :
Fakhr, Waleed ; Kamel, M. ; Elmasry, M.I.
Author_Institution :
Waterloo Univ., Ont., Canada
Volume :
1
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
901
Abstract :
An upper bound on the Bayes error probability is used as a generalization performance criterion for supervised neural network classifiers. It is shown that maximizing the mutual information is equivalent to minimizing this bound and leads to a direct implementation of the Bayes framework for classification. The criterion is used both in training neural networks and in minimizing their size by adaptive pruning; a top-down heuristic for adaptively pruning nodes (weights) is proposed. The approach is applied to both probabilistic neural networks and the multilayer perceptron. Results on two benchmark problems are given, verifying the validity of the approach.
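The abstract refers to an upper bound on the Bayes error without reproducing it; one classical bound of this form is the Hellman-Raviv inequality, sketched below purely as an illustration (it is an assumption that this is the specific bound used in the paper). Because the class-prior entropy H(C) is fixed by the problem, maximizing the mutual information I(X;C) minimizes the conditional entropy H(C|X) and hence the bound, which is the equivalence the abstract states.
% Illustrative only: Hellman-Raviv upper bound on the Bayes error P_e
% (assumed form; the paper may use a different bound of the same type).
\[
  P_e \;\le\; \tfrac{1}{2}\, H(C \mid X)
      \;=\; \tfrac{1}{2}\,\bigl( H(C) - I(X; C) \bigr)
\]
% H(C) is fixed by the class priors, so maximizing I(X;C)
% is equivalent to minimizing this upper bound on P_e.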
Keywords :
Bayes methods; feedforward neural nets; learning (artificial intelligence); pattern recognition; probability; Bayes error probability; adaptive pruning; generalization performance criterion; maximization; maximum mutual information; multilayer perceptron; mutual information; neural networks; probabilistic neural networks; probability of error; size minimization; supervised neural network classifiers; top-down heuristic; training; upper bound; Adaptive systems; Complex networks; Error probability; Multi-layer neural network; Mutual information; Neural networks; Parameter estimation; Training data; Upper bound; Very large scale integration;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.287072
Filename :
287072