DocumentCode :
1810667
Title :
Approximate maximum entropy joint feature inference for discrete space classification
Author :
Miller, David J. ; Yan, Lian
Author_Institution :
Dept. of Electr. Eng., Pennsylvania State Univ., University Park, PA, USA
Volume :
2
fYear :
1999
fDate :
10-16 July 1999
Firstpage :
1419
Abstract :
We propose a new method for learning discrete space statistical classifiers. We cast classification/inference within the more general framework of estimating the joint probability mass function (PMF) for the (feature vector, class label) pair. The proposal of Cheeseman (1983), to construct the maximum entropy (ME) joint PMF consistent with general lower-order probability constraints, has been severely limited by its huge learning complexity. Alternatives such as Bayesian networks require explicit specification of conditional independencies. Here we reconsider the ME problem and propose an approximate method that encodes arbitrary low-order constraints while retaining quite tractable learning. The new method approximates the joint feature PMF during learning on a sub-grid of the full feature space. Extensions to more general inference problems are indicated.
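The ME construction the abstract refers to can be illustrated with a small sketch. This is not the paper's approximate algorithm; it is a minimal demonstration, via classical iterative proportional fitting (IPF), that starting from a uniform joint PMF and repeatedly rescaling to match given low-order (here pairwise) marginals converges to the maximum-entropy joint distribution consistent with those constraints. The variable sizes and target marginals are invented for illustration.

```python
import numpy as np

def ipf_max_entropy(shape, constraints, n_iters=200):
    """Fit the maximum-entropy joint PMF matching low-order marginals.

    constraints: list of (axes, target) pairs, where `axes` is a sorted
    tuple of variable indices and `target` is the desired marginal PMF
    over those variables. Starting from uniform, IPF converges to the
    ME distribution satisfying the constraints (when they are consistent).
    """
    p = np.full(shape, 1.0 / np.prod(shape))  # uniform start = max entropy
    for _ in range(n_iters):
        for axes, target in constraints:
            other = tuple(i for i in range(len(shape)) if i not in axes)
            current = p.sum(axis=other)            # current marginal
            ratio = np.where(current > 0, target / current, 0.0)
            # broadcast the scaling factor back over the full joint
            idx = [np.newaxis] * len(shape)
            for a in axes:
                idx[a] = slice(None)
            p = p * ratio[tuple(idx)]
    return p

# Illustrative pairwise constraints on three binary variables (x0, x1, x2);
# both targets imply the same marginal for the shared variable x1.
t01 = np.array([[0.3, 0.2], [0.1, 0.4]])   # target P(x0, x1)
t12 = np.array([[0.25, 0.15], [0.2, 0.4]])  # target P(x1, x2)
p = ipf_max_entropy((2, 2, 2), [((0, 1), t01), ((1, 2), t12)])
```

After fitting, the joint `p` reproduces both pairwise marginals while remaining as close to uniform (maximum entropy) as the constraints allow. The exponential cost of storing the full joint for many variables is exactly the learning-complexity barrier the paper's sub-grid approximation is designed to avoid.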
Keywords :
inference mechanisms; learning (artificial intelligence); maximum entropy methods; pattern classification; probability; statistical analysis; discrete space classification; feature inference; learning; maximum entropy; probability mass function; Bayesian methods; Diseases; Engineering profession; Entropy; Fault diagnosis; Information retrieval; Optimization methods; Proposals; Spatial databases; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.831172
Filename :
831172