Title :
Learning mixture models with the regularized latent maximum entropy principle
Author :
Wang, Shaojun ; Schuurmans, Dale ; Peng, Fuchun ; Zhao, Yunxin
Author_Institution :
Dept. of Computing Science, Univ. of Alberta, Edmonton, AB, Canada
Date :
1 July 2004
Abstract :
This paper presents a new approach to estimating mixture models based on a recent inference principle we have proposed: the latent maximum entropy principle (LME). LME differs from Jaynes' maximum entropy principle, standard maximum likelihood estimation, and maximum a posteriori probability (MAP) estimation. We demonstrate the LME principle by deriving new algorithms for mixture model estimation, and show how robust new variants of the expectation-maximization (EM) algorithm can be developed. We show that a regularized version of LME (RLME) is effective at estimating mixture models. It generally yields better results than plain LME, which in turn is often better than maximum likelihood and MAP estimation, particularly when inferring latent-variable models from small amounts of data.
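Code_Sketch :
The EM variants referenced in the abstract are derived in the full paper. For orientation only, the following is a minimal sketch of standard expectation-maximization for a one-dimensional Gaussian mixture in Python; it is not the authors' LME/RLME procedure, and the function name em_gaussian_mixture, its parameters, and the toy data are hypothetical illustrations.

import numpy as np

def em_gaussian_mixture(x, k, n_iter=100, seed=0):
    """Standard EM for a 1-D Gaussian mixture with k components.

    Illustrative baseline only -- the paper's LME/RLME estimators
    modify these updates; see the full text for those rules.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixing weights, means, and variances.
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))

    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from the soft assignments.
        nk = resp.sum(axis=0)
        w = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Example usage: fit two well-separated components.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_gaussian_mixture(data, k=2))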
Keywords :
learning (artificial intelligence); maximum entropy methods; maximum likelihood estimation; expectation maximization algorithm; latent maximum entropy principle; learning mixture models; maximum a posteriori probability estimation; Computer science; Entropy; Inference algorithms; Iterative algorithms; Machine learning; Parametric statistics; Robustness; State estimation; Algorithms; Artificial Intelligence; Computer Simulation; Decision Support Techniques; Information Storage and Retrieval; Information Theory; Models, Statistical; Neural Networks (Computer); Pattern Recognition, Automated; Probability Learning
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2004.828755