DocumentCode :
1346635
Title :
General statistical inference for discrete and mixed spaces by an approximate application of the maximum entropy principle
Author :
Yan, Lian ; Miller, David J.
Author_Institution :
Dept. of Electr. Eng., Pennsylvania State Univ., University Park, PA, USA
Volume :
11
Issue :
3
fYear :
2000
fDate :
1 May 2000
Firstpage :
558
Lastpage :
573
Abstract :
We propose a method for learning a general statistical inference engine, operating on discrete and mixed discrete/continuous feature spaces. Such a model allows inference on any of the discrete features, given values for the remaining features. Example applications include medical diagnosis with multiple possible diseases, fault diagnosis, information retrieval, and imputation in databases. Bayesian networks (BNs) are versatile tools that possess this inference capability. However, BNs require explicit specification of conditional independencies, which may be difficult to assess given limited data. Alternatively, Cheeseman (1983) proposed finding the maximum entropy (ME) joint probability mass function (pmf) consistent with arbitrary lower-order probability constraints. This approach is in principle powerful and does not require explicit expression of conditional independence. However, until now its huge learning complexity has severely limited its use. Here we propose an approximate ME method, which also encodes arbitrary low-order constraints while retaining tractable learning. Our method restricts the joint pmf support (during learning) to a subset of the feature space. Results on data sets from the University of California-Irvine repository reveal performance gains over several BN approaches and over multilayer perceptrons.
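The abstract describes the approach only at a high level. As a rough, non-authoritative illustration of the general idea (a maximum-entropy joint pmf fitted to low-order constraints, with its support restricted to a subset of the feature space), the following minimal Python sketch fits an ME pmf over only the feature configurations observed in a toy binary data set, subject to pairwise moment constraints. The toy data, the constraint choice (P(x_i=1, x_j=1) per feature pair), and the use of an off-the-shelf SLSQP solver are all illustrative assumptions, not the authors' learning procedure.

# Illustrative sketch (not the paper's algorithm): maximum-entropy joint pmf
# over a restricted support, subject to low-order (pairwise) constraints.
import itertools
import numpy as np
from scipy.optimize import minimize

# Toy data: each row is a joint configuration of three binary features.
data = np.array([
    [0, 0, 0],
    [0, 0, 1],
    [0, 1, 1],
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
])

# Restricted support: only the configurations actually observed in the data.
support = np.unique(data, axis=0)
n_points, n_feat = support.shape

# Low-order constraints: empirical P(x_i = 1, x_j = 1) for every feature pair.
pairs = list(itertools.combinations(range(n_feat), 2))
targets = {(i, j): np.mean((data[:, i] == 1) & (data[:, j] == 1)) for i, j in pairs}

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)          # avoid log(0)
    return np.sum(p * np.log(p))

# Equality constraints: the pmf is normalized and reproduces each pairwise statistic.
cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
for (i, j), value in targets.items():
    mask = (support[:, i] == 1) & (support[:, j] == 1)
    cons.append({"type": "eq",
                 "fun": lambda p, mask=mask, value=value: p[mask].sum() - value})

p0 = np.full(n_points, 1.0 / n_points)   # start from the uniform pmf on the support
res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * n_points,
               constraints=cons, method="SLSQP")
pmf = res.x

# The fitted joint pmf supports inference on any discrete feature given the rest,
# e.g. P(x2 | x0 = 0, x1 = 0), by conditioning on the matching support points.
cond = (support[:, 0] == 0) & (support[:, 1] == 0)
posterior = pmf[cond] / pmf[cond].sum()
for x2_value, prob in zip(support[cond, 2], posterior):
    print("P(x2 = %d | x0 = 0, x1 = 0) = %.4f" % (x2_value, prob))

Restricting the support to observed configurations keeps the number of optimization variables equal to the number of distinct configurations rather than exponential in the number of features, which is in the spirit of the tractability gain the abstract describes.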
Keywords :
inference mechanisms; learning (artificial intelligence); maximum entropy methods; pattern classification; probability; University of California-Irvine repository; discrete spaces; general statistical inference engine; imputation; joint probability mass function; lower order probability constraints; maximum entropy principle; medical diagnosis; mixed spaces; tractable learning; Bayesian methods; Diseases; Engines; Entropy; Fault diagnosis; Information retrieval; Medical diagnosis; Multilayer perceptrons; Performance gain; Spatial databases;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.846727
Filename :
846727