  • DocumentCode
    3060175
  • Title
    Combining multi-distributed mixture models and Bayesian networks for semi-supervised learning
  • Author
    Stritt, Manuel; Schmidt-Thieme, Lars; Poeppel, Gerhard
  • Author_Institution
    Inf. Syst. & Machine Learning Lab, Hildesheim
  • fYear
    2007
  • fDate
    13-15 Dec. 2007
  • Firstpage
    354
  • Lastpage
    362
  • Abstract
    In many real-world scenarios, mixture models have been used successfully to analyze features in data ([11, 13, 21]). Usually, multivariate Gaussian distributions are applied for continuous data ([2, 8, 4]), or Bayesian networks for nominal data ([15, 16]). In this paper, we combine both approaches in a family of Bayesian models for continuous data that can handle univariate as well as multivariate nodes, different types of distributions (e.g., Gaussian- as well as Poisson-distributed nodes), and dependencies between nodes. The models we introduce can be used for unsupervised, semi-supervised, and fully supervised learning tasks. We evaluate our models empirically on generated synthetic data and on public datasets, showing that they outperform classifiers such as SVMs and logistic regression on mixture data.
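    The semi-supervised use of a mixture model described in the abstract can be illustrated with a minimal sketch. This is not the authors' model (which supports multivariate nodes, Poisson-distributed nodes, and Bayesian-network dependencies); it is a simplified, assumed example of the general technique: EM on a two-component 1-D Gaussian mixture where the responsibilities of the few labeled points are clamped to their known component, so the labels guide the clustering of the unlabeled data. All function and variable names are hypothetical.

    ```python
    import numpy as np

    def semi_supervised_gmm(x, y, n_iter=50):
        """EM for a 1-D, two-component Gaussian mixture.

        x : data points; y : labels in {0, 1}, or -1 for unlabeled.
        Labeled points have their E-step responsibilities clamped to
        the known component (the semi-supervised part); unlabeled
        points get the usual soft responsibilities.
        """
        rng = np.random.default_rng(0)
        mu = rng.choice(x, size=2, replace=False)   # initial means
        sigma = np.full(2, x.std())                 # initial std devs
        pi = np.full(2, 0.5)                        # mixing weights
        for _ in range(n_iter):
            # E-step: r[i, k] proportional to pi_k * N(x_i | mu_k, sigma_k)
            dens = np.stack([
                pi[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2) / sigma[k]
                for k in range(2)
            ], axis=1)
            r = dens / dens.sum(axis=1, keepdims=True)
            # Clamp responsibilities of labeled points to their component.
            for k in range(2):
                r[y == k] = 0.0
                r[y == k, k] = 1.0
            # M-step: re-estimate weights, means, and variances.
            nk = r.sum(axis=0)
            pi = nk / len(x)
            mu = (r * x[:, None]).sum(axis=0) / nk
            sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        return mu, sigma, pi
    ```

    With only a handful of labeled points per component, the clamped responsibilities anchor the component identities, while the bulk of the parameter estimate still comes from the unlabeled data — the same division of labor the paper exploits for its richer model family.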
  • Keywords
    Gaussian processes; belief networks; learning (artificial intelligence); Bayesian networks; Poisson-distributed nodes; fully supervised learning; multi-distributed mixture models; multivariate Gaussian distributions; semi-supervised learning; Bayesian methods; covariance matrix; data analysis; eigenvalues and eigenfunctions; Gaussian distribution; information analysis; information systems; machine learning; supervised learning
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Sixth International Conference on Machine Learning and Applications (ICMLA 2007)
  • Conference_Location
    Cincinnati, OH
  • Print_ISBN
    978-0-7695-3069-7
  • Type
    conf
  • DOI
    10.1109/ICMLA.2007.60
  • Filename
    4457256