  • DocumentCode
    66772
  • Title
    Diversity in Independent Component and Vector Analyses: Identifiability, algorithms, and applications in medical imaging
  • Author
    Adali, Tulay; Anderson, Matthew; Fu, Geng-Shen
  • Author_Institution
    Dept. of CS & Electr. Eng., Univ. of Maryland Baltimore County, Baltimore, MD, USA
  • Volume
    31
  • Issue
    3
  • fYear
    2014
  • fDate
    May 2014
  • Firstpage
    18
  • Lastpage
    33
  • Abstract
    Starting with a simple generative model and the assumption of statistical independence of the underlying components, independent component analysis (ICA) decomposes a given set of observations by making use of the diversity in the data, typically in terms of statistical properties of the signal. Most ICA algorithms introduced to date have considered one of two types of diversity: non-Gaussianity, i.e., higher-order statistics (HOS), or sample dependence. A recent generalization of ICA, independent vector analysis (IVA), extends ICA to multiple data sets and adds the use of one more type of diversity, dependence across data sets, to achieve an independent decomposition jointly across all the data sets. Finally, both ICA and IVA, when implemented in the complex domain, enjoy the addition of yet another type of diversity, noncircularity of the sources (the underlying components). Mutual information rate provides a unifying framework in which all of these statistical properties, i.e., types of diversity, can be jointly taken into account to achieve the independent decomposition. Most ICA methods developed to date, as well as the more recently developed IVA methods, can be cast as special cases under this umbrella. In addition, this formulation allows us to make use of maximum likelihood theory to study large-sample properties of the estimator, derive the Cramer-Rao lower bound (CRLB), and determine the conditions for the identifiability of the ICA and IVA models. In this overview article, we first present ICA and then its generalization to multiple data sets, IVA, both using mutual information rate; we present conditions for the identifiability of the given linear mixing model and derive the performance bounds. We address how various methods fall under this umbrella and give examples of performance for a few sample algorithms compared with the performance bound.
    We then discuss the importance of approaching the performance bound depending on the goal, and use medical image analysis as the motivating example.
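    The generative model and decomposition described in the abstract can be illustrated with a minimal, NumPy-only sketch of a FastICA-style separation, one standard ICA algorithm that exploits a single type of diversity (non-Gaussianity/HOS), not the specific methods developed in the article. The sources, mixing matrix, and iteration count below are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    # Two independent non-Gaussian sources (super-Gaussian Laplace, sub-Gaussian uniform)
    s = np.vstack([rng.laplace(size=n), rng.uniform(-1.0, 1.0, size=n)])
    A = np.array([[1.0, 0.6], [0.5, 1.0]])   # hypothetical mixing matrix
    x = A @ s                                # observed linear mixtures x = A s

    # Whiten the observations: z = E diag(d)^(-1/2) E^T (x - mean)
    x = x - x.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(x @ x.T / n)
    z = (E * (1.0 / np.sqrt(d))) @ E.T @ x

    # FastICA fixed-point iterations with tanh nonlinearity and
    # symmetric decorrelation W <- (W W^T)^(-1/2) W after each update
    W = rng.standard_normal((2, 2))
    for _ in range(200):
        g = np.tanh(W @ z)
        W = (g @ z.T) / n - np.diag((1.0 - g**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt

    s_hat = W @ z   # estimated sources, recovered up to scale and permutation
    # Each estimate should correlate strongly with exactly one true source
    c = np.abs(np.corrcoef(np.vstack([s, s_hat]))[:2, 2:])
    print(c.max(axis=1))   # values near 1 indicate successful recovery
    ```

    Note the inherent scale and permutation ambiguity visible in the check: ICA identifies the sources only up to ordering and scaling, which is why performance is assessed via correlation with the best-matching true source.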
  • Keywords
    higher order statistics; independent component analysis; matrix decomposition; maximum likelihood estimation; mixture models; vectors; CRLB; Cramer-Rao lower bound; HOS; ICA algorithm; IVA method; higher order statistics; independent component analyses; independent decomposition; independent vector analysis; linear mixing model; maximum likelihood estimation; mutual information rate; non-Gaussianity; performance bound; statistical analysis; Component analysis; Covariance matrices; Medical image processing; Mutual information; Signal processing algorithms; Source separation; Statistical analysis;
  • fLanguage
    English
  • Journal_Title
    Signal Processing Magazine, IEEE
  • Publisher
    IEEE
  • ISSN
    1053-5888
  • Type
    jour
  • DOI
    10.1109/MSP.2014.2300511
  • Filename
    6784026