Title :
A general formulation for learning multi-class posterior probabilities
Author :
Ni, Hongmei; Adali, Tülay; Wang, Bo
Author_Institution :
Dept. of Comput. Sci. & Electr. Eng., Univ. of Maryland Baltimore County, Baltimore, MD, USA
Abstract :
We use partial likelihood (PL) theory to introduce a general formulation for learning multi-class posterior probabilities. The formulation establishes a fundamental information-theoretic connection, the equivalence of partial likelihood maximization and relative entropy minimization, without making the common assumption of independent data samples. We further show that this relationship is satisfied for the exponential family, the basic class of probability models that includes many important neural network probability models. The formulation thus opens the prospect of learning multi-class probabilities with the PL cost using different models. We note the inefficiency of training a Softmax network and propose a modified multi-level classifier structure based on binary coding of the classes. Simulation results demonstrate the efficiency of the reduced-complexity multi-level classifier.
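A rough sketch of the stated equivalence, in illustrative notation that is assumed here rather than quoted from the paper: with sequentially observed samples, the log partial likelihood conditions each label on the history of the process, and its expected per-sample value splits into a relative entropy term and a model-independent entropy term.

```latex
% Illustrative sketch of the PL / relative-entropy link. The notation
% (\mathcal{F}_n for the history, p for the true conditional law) is an
% assumption for this sketch, not the paper's own notation.
\[
  \log L_{\mathrm{PL}}(\theta)
    = \sum_{n=1}^{N} \log p_{\theta}\!\left(y_n \mid \mathcal{F}_n\right),
\]
\[
  \tfrac{1}{N}\,\mathbb{E}\!\left[\log L_{\mathrm{PL}}(\theta)\right]
    = -\,\bar{D}\!\left(p \,\Vert\, p_{\theta}\right) \;-\; \bar{H}(p),
\]
% Since the averaged conditional entropy \bar{H}(p) does not depend on
% \theta, maximizing the partial likelihood over \theta is equivalent to
% minimizing the averaged relative entropy \bar{D}(p || p_\theta).
```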
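The proposed reduced-complexity structure replaces the M outputs of a Softmax network with binary-coded class labels. Below is a minimal sketch of that idea, assuming one logistic unit per code bit and nearest-codeword decoding; the function names, per-bit logistic model, and training details are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): reduce an M-class
# problem to ceil(log2 M) binary outputs via binary coding of the class
# labels, training one logistic unit per code bit.
import numpy as np

def class_codes(num_classes):
    """Assign each class its index written in binary: shape (M, B)."""
    bits = max(1, int(np.ceil(np.log2(num_classes))))
    return np.array([[(c >> b) & 1 for b in range(bits)]
                     for c in range(num_classes)], dtype=float)

def train_bit_classifiers(X, y, num_classes, lr=0.1, epochs=200):
    """Fit one logistic regression per code bit by gradient descent."""
    codes = class_codes(num_classes)               # (M, B) codewords
    T = codes[y]                                   # (N, B) bit targets
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    W = np.zeros((Xb.shape[1], T.shape[1]))
    for _ in range(epochs):
        P = 1.0 / (1.0 + np.exp(-Xb @ W))          # per-bit sigmoids
        W -= lr * Xb.T @ (P - T) / Xb.shape[0]     # cross-entropy grad
    return W, codes

def predict(X, W, codes):
    """Decode by nearest codeword in per-bit probability space."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    P = 1.0 / (1.0 + np.exp(-Xb @ W))              # (N, B) bit probs
    d = ((P[:, None, :] - codes[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)                        # closest class code

# Toy usage: 4 Gaussian blobs -> 2 code bits instead of 4 Softmax outputs.
rng = np.random.default_rng(0)
means = np.array([[0, 0], [4, 0], [0, 4], [4, 4]])
X = np.vstack([m + rng.normal(size=(50, 2)) for m in means])
y = np.repeat(np.arange(4), 50)
W, codes = train_bit_classifiers(X, y, num_classes=4)
print("train accuracy:", (predict(X, W, codes) == y).mean())
```

With M classes this structure uses only ceil(log2 M) outputs instead of M, which is the source of the complexity reduction the abstract refers to.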
Keywords :
learning (artificial intelligence); maximum likelihood estimation; minimum entropy methods; neural nets; probability; Softmax network; binary coding; information theory; learning; minimum entropy; multi-class posterior probability; neural network; partial likelihood; probability models; Computer science; Costs; Engineering profession; Entropy; History; Maximum likelihood estimation; Neural networks; Probability distribution
Conference_Title :
1999 International Joint Conference on Neural Networks (IJCNN '99)
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.831175