Title :
A maximum entropy approach for optimal statistical classification
Author :
Miller, David ; Rao, Ajit ; Rose, Kenneth ; Gersho, Allen
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., Santa Barbara, CA, USA
Date :
31 Aug-2 Sep 1995
Abstract :
A global optimization technique is introduced for statistical classifier design to minimize the probability of classification error. The method, which is based on ideas from information theory and analogies to statistical physics, is inherently probabilistic. During the design phase, data are assigned to classes in probability, with the probability distributions chosen to maximize entropy subject to a constraint on the expected classification error. This entropy maximization problem is seen to be equivalent to a free energy minimization, motivating a deterministic annealing approach to minimize the misclassification cost. Our method is applicable to a variety of classifier structures, including nearest prototype, radial basis function, and multilayer perceptron-based classifiers. On standard benchmark examples, the method applied to nearest prototype classifier design achieves performance improvements over both the learning vector quantizer and multilayer perceptron classifiers designed by the standard backpropagation algorithm.
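The annealing procedure the abstract describes can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal NumPy illustration, assuming Gibbs (softmax) association probabilities over squared distances at temperature T, a gradient step on the expected misclassification cost, and a geometric cooling schedule. All function names, parameters, and the learning-rate scaling are illustrative choices, not taken from the paper.

```python
import numpy as np

def da_nearest_prototype(X, y, n_protos_per_class=1, T0=1.0, Tmin=1e-3,
                         cooling=0.8, lr=0.1, steps=50, seed=0):
    """Deterministic-annealing sketch for nearest-prototype classifier design.

    At temperature T each sample is assigned to prototypes in probability
    (a Gibbs distribution over squared distances). Prototypes are moved by
    gradient descent on the expected misclassification cost, and T is then
    lowered, so the soft assignments gradually harden toward nearest-prototype
    classification.
    """
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    protos, labels = [], []
    for c in classes:
        Xc = X[y == c]
        for _ in range(n_protos_per_class):
            # initialize near the class mean, with a small jitter
            protos.append(Xc.mean(axis=0) + 0.01 * rng.standard_normal(X.shape[1]))
            labels.append(c)
    M = np.array(protos)       # (P, d) prototype locations
    plab = np.array(labels)    # (P,)  prototype class labels

    T = T0
    while T > Tmin:
        for _ in range(steps):
            # squared distances from every sample to every prototype: (N, P)
            d2 = ((X[:, None, :] - M[None, :, :]) ** 2).sum(axis=2)
            # Gibbs association probabilities at temperature T
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)
            P = np.exp(logits)
            P /= P.sum(axis=1, keepdims=True)
            # per-sample expected misclassification cost under soft assignment
            wrong = (plab[None, :] != y[:, None]).astype(float)   # (N, P)
            c_i = (P * wrong).sum(axis=1, keepdims=True)          # (N, 1)
            # gradient of the expected cost w.r.t. prototypes; the 2/T
            # factor from the chain rule is folded into lr for stability
            coef = P * (wrong - c_i)                              # (N, P)
            grad = np.einsum('np,npd->pd', coef,
                             X[:, None, :] - M[None, :, :]) / len(X)
            M -= lr * grad
        T *= cooling
    return M, plab

def predict(X, M, plab):
    """Hard nearest-prototype decision (the zero-temperature limit)."""
    d2 = ((X[:, None, :] - M[None, :, :]) ** 2).sum(axis=2)
    return plab[d2.argmin(axis=1)]

# toy usage on two well-separated Gaussian clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((20, 2)),
               rng.standard_normal((20, 2)) + 5.0])
y = np.array([0] * 20 + [1] * 20)
M, plab = da_nearest_prototype(X, y)
acc = (predict(X, M, plab) == y).mean()
```

The key contrast with hard clustering: at high T every prototype attracts every sample weakly, giving a smooth cost surface, while cooling recovers the hard nearest-prototype rule the deployed classifier actually uses.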
Keywords :
error statistics; information theory; learning (artificial intelligence); maximum entropy methods; neural nets; pattern classification; probability; classification error; deterministic annealing; deterministic learning algorithm; error probability; optimal statistical classification; Algorithm design and analysis; Annealing; Costs; Design optimization; Entropy; Nonhomogeneous media; Physics; Probability distribution; Prototypes;
Conference_Titel :
Neural Networks for Signal Processing V. Proceedings of the 1995 IEEE Workshop
Conference_Location :
Cambridge, MA
Print_ISBN :
0-7803-2739-X
DOI :
10.1109/NNSP.1995.514879