• DocumentCode
    1239652
  • Title
    Information theoretic versus cumulant-based contrasts for multimodal source separation
  • Author
    Vrins, Frédéric; Verleysen, Michel
  • Author_Institution
    UCL Machine Learning Group, Univ. Catholique de Louvain, Louvain-la-Neuve, Belgium
  • Volume
    12
  • Issue
    3
  • fYear
    2005
  • fDate
    3/1/2005
  • Firstpage
    190
  • Lastpage
    193
  • Abstract
    Recently, several authors have emphasized the existence of spurious maxima in the usual contrast functions for source separation (e.g., the likelihood and the mutual information) when several sources have multimodal distributions. The aim of this letter is to compare information-theoretic contrasts with cumulant-based ones in terms of robustness to spurious maxima. Although all of these contrasts measure, in some way, the same quantity, namely output independence (or, equivalently, output non-Gaussianity), it is shown that, for a mixture of two sources, the kurtosis-based contrast functions are more robust than the information-theoretic ones when the source distributions are multimodal.
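  • Example (illustrative sketch, not part of the original record)
    The abstract compares information-theoretic contrasts (likelihood, mutual information) with kurtosis-based ones in terms of robustness to spurious maxima when the sources are multimodal. The Python sketch below is a hedged illustration of that setting, not the authors' code: the bimodal source model, the histogram-based entropy estimator, the mixing angle, and all variable names are assumptions made for illustration. It sweeps the demixing rotation angle for a two-source mixture of bimodal signals and evaluates a kurtosis-based contrast alongside a marginal-entropy contrast, which is one way to inspect where each criterion places its maxima.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20_000

    def bimodal(n, sep=3.0, sigma=0.5):
        # Bimodal source: mixture of two well-separated Gaussians (an assumption).
        signs = rng.choice([-1.0, 1.0], size=n)
        return signs * sep / 2 + sigma * rng.standard_normal(n)

    # Two independent, zero-mean, unit-variance bimodal sources.
    s = np.vstack([bimodal(n), bimodal(n)])
    s = (s - s.mean(axis=1, keepdims=True)) / s.std(axis=1, keepdims=True)

    # Mix with a rotation; since the sources are already white, the demixing
    # matrix can be restricted to rotations parameterized by a single angle.
    phi = 0.7
    A = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
    x = A @ s

    def excess_kurtosis(y):
        # Excess kurtosis of a zero-mean, unit-variance signal.
        return np.mean(y**4) - 3.0

    def entropy_hist(y, bins=100):
        # Crude plug-in estimate of differential entropy from a histogram.
        p, edges = np.histogram(y, bins=bins, density=True)
        w = np.diff(edges)
        mask = p > 0
        return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

    thetas = np.linspace(0.0, np.pi, 181)
    kurt_contrast, ent_contrast = [], []
    for t in thetas:
        W = np.array([[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])
        y = W @ x
        y = (y - y.mean(axis=1, keepdims=True)) / y.std(axis=1, keepdims=True)
        # Kurtosis-based contrast: sum of absolute excess kurtoses (to maximize).
        kurt_contrast.append(sum(abs(excess_kurtosis(yi)) for yi in y))
        # Information-theoretic contrast: minus the sum of marginal entropies;
        # for an orthogonal demixing matrix, maximizing it minimizes the
        # estimated mutual information between the outputs.
        ent_contrast.append(-sum(entropy_hist(yi) for yi in y))

    print("kurtosis contrast peaks at theta = %.3f rad" % thetas[int(np.argmax(kurt_contrast))])
    print("entropy  contrast peaks at theta = %.3f rad" % thetas[int(np.argmax(ent_contrast))])
    print("mixing angle phi = %.3f rad (separating angles repeat every pi/2)" % phi)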
  • Keywords
    blind source separation; entropy; higher order statistics; independent component analysis; cumulant-based contrasts; information theoretic; kurtosis-based contrast functions; multimodal distributions; spurious maxima; covariance matrix; Gaussian distribution; machine learning; mutual information; robustness; source separation; contrast function; kurtosis; multimodal sources
  • fLanguage
    English
  • Journal_Title
    IEEE Signal Processing Letters
  • Publisher
    IEEE
  • ISSN
    1070-9908
  • Type
    jour
  • DOI
    10.1109/LSP.2004.840863
  • Filename
    1395937