DocumentCode
1472046
Title
Joint Parsimonious Modeling and Model Order Selection for Multivariate Gaussian Mixtures
Author
Markley, Scott C. ; Miller, David J.
Author_Institution
RKF Eng. LLC, Washington, DC, USA
Volume
4
Issue
3
fYear
2010
fDate
6/1/2010 12:00:00 AM
Firstpage
548
Lastpage
559
Abstract
Multivariate Gaussian mixture models (GMMs) are widely used for density estimation, model-based data clustering, and statistical classification. A difficult problem is estimating the model order, i.e., the number of mixture components, and the model structure. Use of full covariance matrices, with a number of parameters quadratic in the feature dimension, entails high model complexity and thus may lead to underestimated orders, while naive Bayes mixtures may introduce model bias and lead to order overestimates. We develop a parsimonious modeling and model order and structure selection method for GMMs which allows for, and optimizes over, parameter-tying configurations across mixture components, applied to each individual parameter, including the covariates. We derive a generalized expectation-maximization (GEM) algorithm for Bayesian information criterion (BIC)-based penalized likelihood minimization. This, coupled with sequential model order reduction, forms our joint learning and model selection method. Our method searches over a rich space of models and, consistent with minimizing BIC, achieves fine-grained matching of model complexity to the available data. We have found our method to be effective and largely robust in learning accurate model orders and parameter-tying structures for simulated ground-truth mixtures. We compared against naive Bayes and standard full-covariance GMMs on several criteria: 1) model order and structure accuracy (for synthetic data sets); 2) test-set log-likelihood; 3) unsupervised classification accuracy; and 4) accuracy when class-conditional mixtures are used in a plug-in Bayes classifier. Our method, which chooses model orders intermediate between those of standard and naive Bayes GMMs, gives improved accuracy with respect to each of these performance measures.
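As a rough illustration of the complexity gap the abstract describes (not the paper's method), the sketch below counts the free parameters of a K-component, d-dimensional GMM under full versus diagonal (naive Bayes) covariances and evaluates the BIC-penalized likelihood; the function names and the BIC convention (lower is better) are illustrative assumptions:

```python
import math

def gmm_param_count(K, d, covariance="full"):
    """Free parameters in a K-component, d-dimensional GMM.

    weights: K-1 mixing proportions (they sum to 1)
    means:   K * d mean entries
    covs:    K * d*(d+1)/2 for symmetric full covariances (quadratic in d),
             K * d for diagonal (naive Bayes) covariances (linear in d)
    """
    weights = K - 1
    means = K * d
    if covariance == "full":
        covs = K * d * (d + 1) // 2
    elif covariance == "diag":
        covs = K * d
    else:
        raise ValueError("covariance must be 'full' or 'diag'")
    return weights + means + covs

def bic(log_likelihood, n_params, n_samples):
    """BIC = -2*logL + n_params * ln(n); smaller values are preferred."""
    return -2.0 * log_likelihood + n_params * math.log(n_samples)
```

For example, at K = 3 and d = 10 the full-covariance model has 197 free parameters versus 62 for the diagonal model, which is why the BIC penalty tends to push full-covariance fits toward fewer components than naive Bayes fits on the same data; parameter tying across components, as in the paper, yields counts between these extremes.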
Keywords
Bayes methods; Gaussian processes; expectation-maximization algorithm; information theory; pattern classification; pattern clustering; reduced order systems; signal processing; unsupervised learning; Bayes classifier; Bayesian information criterion (BIC); density estimation; full covariance matrix; learning accurate model order; mixture components; model-based data clustering; model order selection; multivariate Gaussian mixture model (GMM); parameter-tying structures; parsimonious modeling; penalized likelihood minimization; sequential model order reduction; simulated ground-truth mixture; statistical classification; structure accuracy; test-set log-likelihood; unsupervised classification accuracy; generalized expectation-maximization (GEM) algorithm
fLanguage
English
Journal_Title
IEEE Journal of Selected Topics in Signal Processing
Publisher
IEEE
ISSN
1932-4553
Type
jour
DOI
10.1109/JSTSP.2009.2038312
Filename
5447637
Link To Document