DocumentCode
73214
Title
Blind Source Separation by Entropy Rate Minimization
Author
Fu, Geng-Shen; Phlypo, Ronald; Anderson, Matthew; Li, Xi-Lin; Adali, Tulay
Author_Institution
Dept. of CSEE, Univ. of Maryland, Baltimore, MD, USA
Volume
62
Issue
16
fYear
2014
fDate
Aug. 15, 2014
Firstpage
4245
Lastpage
4255
Abstract
By assuming that the latent sources are statistically independent, independent component analysis (ICA) separates the underlying sources from a given linear mixture. Since, in many applications, the latent sources are non-Gaussian and exhibit sample dependence, it is desirable to exploit both properties jointly. In this paper, we use the mutual information rate to construct a general framework for the analysis and derivation of algorithms that take both properties into account. We discuss two types of source models for entropy rate estimation, a Markovian and an invertible filter model, and give the general ICA cost function, update rule, and performance analysis based on each. We also introduce four algorithms based on these two models, and show that their performance can approach the Cramér-Rao lower bound. In addition, we demonstrate that the algorithms with flexible models exhibit very desirable performance for “natural” data.
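For context, the general ICA cost function mentioned in the abstract is built from the mutual information rate of the demixed outputs. A minimal sketch of how such a rate-based cost decomposes is given below; it assumes the standard linear model y = Wx with an invertible demixing matrix W applied sample-wise and uses H_r(.) for a differential entropy rate, notation introduced here for illustration only (the exact form and notation in the paper may differ).

% Mutual information rate of the demixed outputs y = W x (W invertible,
% applied sample-wise); H_r(.) denotes a differential entropy rate.
\begin{align}
  \mathcal{I}_r(y_1,\dots,y_N)
    &= \sum_{n=1}^{N} H_r(y_n) - H_r(\mathbf{y}) \\
    &= \sum_{n=1}^{N} H_r(y_n) - \log\lvert\det \mathbf{W}\rvert - H_r(\mathbf{x}).
\end{align}
% H_r(x) does not depend on W, so minimizing the mutual information rate
% is equivalent to minimizing J(W) = \sum_n H_r(y_n) - \log|det W|.

Under this view, the entropy rate terms H_r(y_n) would be evaluated under a chosen source model, such as the Markovian or invertible filter models discussed in the abstract, which is what allows non-Gaussianity and sample dependence to be exploited jointly.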
Keywords
blind source separation; independent component analysis; Cramér-Rao lower bound; entropy rate estimation; entropy rate minimization; independent component analysis cost function; invertible filter model; linear mixture; Algorithm design and analysis; Analytical models; Cost function; Entropy; Minimization; Mutual information; Signal processing algorithms; Markov model; maximum entropy distribution; mutual information rate
fLanguage
English
Journal_Title
IEEE Transactions on Signal Processing
Publisher
IEEE
ISSN
1053-587X
Type
jour
DOI
10.1109/TSP.2014.2333563
Filename
6845364