DocumentCode :
349621
Title :
Exact entropy series representation for blind source separation
Author :
Salam, F.M. ; Erten, G.
Author_Institution :
Dept. of Electr. & Comput. Eng., Michigan State Univ., East Lansing, MI, USA
Volume :
1
fYear :
1999
fDate :
1999
Firstpage :
553
Abstract :
An explicit infinite series for the marginal entropy of a probability density function is developed. The series includes all orders of statistics and employs both the Gram-Charlier and the Edgeworth expansions in its derivation, exploiting the fact that the two expansions are equivalent for the same probability density. The resulting entropy series can be used to express the averaged mutual information to any desired degree of accuracy. This measure is then used to derive the update laws for blind source separation.
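Illustration (not from the paper): the entropy series described in the abstract is built from Gram-Charlier/Edgeworth cumulant expansions. The Python sketch below shows only the familiar fourth-order truncation of such an expansion for a standardized signal, H(y) ≈ ½ log(2πe) − κ₃²/12 − κ₄²/48, whereas the paper's exact series retains all orders. The function names and the truncation level are assumptions made for this example.

import numpy as np

def marginal_entropy_edgeworth(y):
    """Approximate differential entropy of a signal via a fourth-order
    Gram-Charlier/Edgeworth truncation (assumed example, not the paper's
    exact all-orders series):
        H(y) ~= 0.5*log(2*pi*e) - kappa3^2/12 - kappa4^2/48
    for y standardized to zero mean and unit variance."""
    y = np.asarray(y, dtype=float)
    y = (y - y.mean()) / y.std()            # standardize
    kappa3 = np.mean(y**3)                  # third cumulant (skewness)
    kappa4 = np.mean(y**4) - 3.0            # fourth cumulant (excess kurtosis)
    h_gauss = 0.5 * np.log(2.0 * np.pi * np.e)
    return h_gauss - kappa3**2 / 12.0 - kappa4**2 / 48.0

# Usage: a Gaussian sample should give (approximately) the maximum-entropy
# value 0.5*log(2*pi*e) ~ 1.419, while a non-Gaussian (uniform) sample gives less.
rng = np.random.default_rng(0)
print(marginal_entropy_edgeworth(rng.standard_normal(100000)))
print(marginal_entropy_edgeworth(rng.uniform(-1.0, 1.0, 100000)))

In a blind-source-separation setting, sums of such marginal-entropy approximations enter the averaged mutual information of the separator outputs, which is the contrast the paper's update laws are derived from.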
Keywords :
decorrelation; entropy; series (mathematics); signal reconstruction; signal representation; Edgeworth expansion; Gram-Charlier expression; averaged mutual information; blind source separation; exact entropy series representation; explicit infinite series; independent component analysis; marginal entropy; nonlinear signal processing; probability density function; Blind source separation; Density functional theory; Entropy; Filtering; Independent component analysis; Mutual information; Neural networks; Probability density function; Signal processing; Statistics;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1999 IEEE International Conference on Systems, Man, and Cybernetics (IEEE SMC '99) Conference Proceedings
Conference_Location :
Tokyo
ISSN :
1062-922X
Print_ISBN :
0-7803-5731-0
Type :
conf
DOI :
10.1109/ICSMC.1999.814152
Filename :
814152