DocumentCode :
1420578
Title :
Entropy-Based Incremental Variational Bayes Learning of Gaussian Mixtures
Author :
Penalver, A. ; Escolano, F.
Author_Institution :
Dept. de Estadistica, Mat. e Inf., Univ. Miguel Hernandez, Elche, Spain
Volume :
23
Issue :
3
fYear :
2012
fDate :
3/1/2012
Firstpage :
534
Lastpage :
540
Abstract :
Variational approaches to density estimation and pattern recognition using Gaussian mixture models can be used to learn the model and optimize its complexity simultaneously. In this brief, we develop an incremental entropy-based variational learning scheme that does not require any kind of initialization. The key element of the proposal is to exploit the incremental learning approach to perform model selection through efficient iteration over the variational Bayes optimization step, in a way that minimizes the number of splits. The method starts with just one component and iteratively adds new components by splitting the worst-fitted kernel, identified by evaluating its entropy. Our experimental results on synthetic and real data sets show the effectiveness of the approach, which outperforms other state-of-the-art incremental component learners.
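The incremental scheme described in the abstract can be illustrated with a minimal sketch. This is not the authors' method: it substitutes scikit-learn's EM-based `GaussianMixture` plus a BIC stopping rule for the paper's variational Bayes objective, and it uses the closed-form differential entropy of each Gaussian component to pick the "worst-fitted" kernel to split. The function name `incremental_gmm` and the split-along-the-principal-axis heuristic are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gaussian_entropy(cov):
    # Differential entropy of a d-dim Gaussian: 0.5 * log((2*pi*e)^d * det(cov)).
    d = cov.shape[0]
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def incremental_gmm(X, max_components=10, seed=0):
    # Start with a single component, as in the incremental scheme.
    best = GaussianMixture(n_components=1, random_state=seed).fit(X)
    for k in range(2, max_components + 1):
        # Select the highest-entropy (worst-fitted) component to split.
        j = int(np.argmax([gaussian_entropy(c) for c in best.covariances_]))
        # Split it along its principal axis: two means offset by one std. dev.
        vals, vecs = np.linalg.eigh(best.covariances_[j])
        delta = vecs[:, -1] * np.sqrt(vals[-1])
        means = np.vstack([best.means_, best.means_[j] + delta])
        means[j] = means[j] - delta
        weights = np.append(best.weights_, best.weights_[j] / 2.0)
        weights[j] /= 2.0  # halve the split component's weight (weights still sum to 1)
        cand = GaussianMixture(n_components=k, means_init=means,
                               weights_init=weights, random_state=seed).fit(X)
        # BIC stands in for the variational model-selection criterion.
        if cand.bic(X) >= best.bic(X):
            break
        best = cand
    return best
```

On data with two well-separated clusters, the loop accepts the first split and rejects further ones, returning a two-component mixture without any user-supplied initialization of the component count.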
Keywords :
Bayes methods; Gaussian processes; entropy; iterative methods; learning (artificial intelligence); optimisation; pattern recognition; Gaussian mixture models; density estimation; entropy-based incremental variational Bayes learning; incremental component learners; iterative method; model selection; pattern recognition; real data sets; synthetic data sets; variational Bayes optimization step; Bayesian methods; Computational modeling; Covariance matrix; Entropy; Estimation; Kernel; Learning systems; Clustering; entropy estimation; mixture models; model order selection; variational Bayes methods;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
Publisher :
IEEE
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2011.2177670
Filename :
6129512