Title of article :
Bayesian hybrid generative discriminative learning based on finite Liouville mixture models
Author/Authors :
Bouguila, Nizar
Issue Information :
Journal issue, serial year 2011
Pages :
18
From page :
1183
To page :
1200
Abstract :
Recently, hybrid generative discriminative approaches have emerged as efficient knowledge representation and data classification engines. However, little attention has been devoted to the modeling and classification of non-Gaussian, and especially proportional, vectors. Our main goal in this paper is to discover the true structure of this kind of data by building probabilistic kernels from generative mixture models based on the Liouville family, from which we develop the Beta-Liouville distribution, which includes the well-known Dirichlet as a special case. The Beta-Liouville has a more general covariance structure than the Dirichlet, which makes it more practical and useful. Our learning technique is based on a principled, purely Bayesian approach, and the resulting models are used to generate support vector machine (SVM) probabilistic kernels based on information divergence. In particular, we show the existence of closed-form expressions for the Kullback–Leibler and Rényi divergences between two Beta-Liouville distributions, and then between two Dirichlet distributions as a special case. Through extensive simulations and a number of experiments involving synthetic data, visual scene classification, and texture image classification, we demonstrate the effectiveness of the proposed approaches.
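Illustrative note: the abstract states that the Kullback–Leibler divergence between two Dirichlet distributions (the special case of the Beta-Liouville) admits a closed form that can be turned into an SVM probabilistic kernel. The paper's own code is not reproduced in this record; the following is a minimal sketch, assuming a symmetrized KL divergence and an exponential kernel with a hypothetical decay parameter lam, of how such a divergence-based kernel can be computed. It uses the standard closed-form Dirichlet KL divergence, not the Beta-Liouville expression derived in the paper.

    import numpy as np
    from scipy.special import gammaln, digamma

    def kl_dirichlet(alpha, beta):
        """Closed-form KL divergence KL(Dir(alpha) || Dir(beta))."""
        alpha, beta = np.asarray(alpha, float), np.asarray(beta, float)
        a0, b0 = alpha.sum(), beta.sum()
        return (gammaln(a0) - gammaln(alpha).sum()
                - gammaln(b0) + gammaln(beta).sum()
                + np.dot(alpha - beta, digamma(alpha) - digamma(a0)))

    def divergence_kernel(alpha, beta, lam=1.0):
        """Symmetrized divergence mapped to a kernel value (assumption: exp(-lam * D))."""
        d = 0.5 * (kl_dirichlet(alpha, beta) + kl_dirichlet(beta, alpha))
        return np.exp(-lam * d)

    # Example: kernel value between two Dirichlet components fitted to two data groups
    print(divergence_kernel([2.0, 5.0, 3.0], [1.5, 4.0, 4.5]))

Such a kernel matrix, computed between the mixture components fitted to each sample, is what a discriminative SVM would then consume in a generative discriminative pipeline of the kind the abstract describes.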
Keywords :
Mixture models , SVM , Bayesian inference , Conjugate prior , Bayes factor , Image classification , Texture modeling , Liouville family of distributions , Generative models , Gibbs sampling , Discriminative models , Exponential family
Journal title :
PATTERN RECOGNITION
Serial Year :
2011
Record number :
1734035