Title :
MMI training of minimum complexity adaptive nearest neighbor classifiers
Author :
Fakhr, Waleed ; Kamel, M. ; Elmasry, M.I.
Author_Institution :
Dept. of Electr. & Comput. Eng., Waterloo Univ., Ont., Canada
Date :
27 Jun-2 Jul 1994
Abstract :
In this paper a minimum complexity adaptive nearest neighbor classifier (ANNC) is proposed, with maximum mutual information (MMI) training. The ANNC employs a winner-Gaussian approximation for each class PDF, using radially symmetric, equal-width Gaussians to produce piecewise-linear decision boundaries between classes. The MMI training minimizes an upper bound on the classification error probability and is therefore used to estimate the ANNC parameters. A discrete stochastic complexity criterion for classification (DSCC) is derived from the Bayesian model selection framework to estimate the minimum number of Gaussians the ANNC requires for optimal classification. Results of three experiments show the advantages of the ANNC framework.
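The abstract's winner-Gaussian rule can be illustrated with a minimal sketch (not the authors' code): because the Gaussians are radially symmetric and share one width, picking the winning Gaussian per class reduces to a nearest-prototype comparison, which yields the piecewise-linear boundaries mentioned above. The prototype arrays and shared width below are hypothetical placeholders; in the paper they would come from MMI training and the DSCC model-size selection, neither of which is reproduced here.

```python
import numpy as np

def winner_gaussian_score(x, centers, sigma):
    """Winner-Gaussian approximation of one class PDF at point x.

    With equal-width, radially symmetric Gaussians only the squared
    distance to the nearest prototype matters (assumption: unnormalized
    kernel is sufficient for comparing classes).
    """
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2.min() / (2.0 * sigma ** 2))

def classify(x, class_centers, sigma):
    """Assign x to the class whose winner-Gaussian score is largest."""
    scores = [winner_gaussian_score(x, c, sigma) for c in class_centers]
    return int(np.argmax(scores))

# Toy usage with hypothetical prototypes for two classes.
class_centers = [
    np.array([[0.0, 0.0], [1.0, 0.0]]),  # class 0 prototypes
    np.array([[3.0, 3.0]]),              # class 1 prototype
]
print(classify(np.array([0.4, 0.1]), class_centers, sigma=1.0))  # -> 0
print(classify(np.array([2.8, 2.9]), class_centers, sigma=1.0))  # -> 1
```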
Keywords :
Gaussian distribution; adaptive systems; classification; learning (artificial intelligence); neural nets; parameter estimation; probability; ANNC parameter estimation; Bayesian model selection framework; MMI training; adaptive probabilistic neural network; classification; classification error probability; discrete stochastic complexity criterion; maximum mutual information; minimum complexity adaptive nearest neighbor classifiers; optimal classification; piece-wise linear decision boundaries; training; upper bound; winner-Gaussian approximation; Bayesian methods; Error probability; Gaussian approximation; Gaussian processes; Mutual information; Nearest neighbor searches; Parameter estimation; Piecewise linear techniques; Stochastic processes; Upper bound;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL, USA
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374196