DocumentCode :
3495570
Title :
The kernel mutual information
Author :
Gretton, Arthur ; Herbrich, Ralf ; Smola, Alexander J.
Author_Institution :
Max-Planck-Inst. für Biol. Kybernetik, Tübingen, Germany
Volume :
4
fYear :
2003
fDate :
6-10 April 2003
Abstract :
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between discretised approximations of the continuous random variables. We show that the kernel generalised variance (KGV) of F. Bach and M. Jordan (see JMLR, vol.3, p.1-48, 2002) is also an upper bound on the same kernel density estimate, but a looser one. Finally, we show that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation.
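The abstract describes contrast functions built from regularised kernel canonical correlations. A minimal sketch of such a contrast in the spirit of the KGV of Bach and Jordan (2002), not the authors' exact KMI estimator: regularised centred Gram matrices yield canonical correlations rho_i, and the contrast -1/2 * sum(log(1 - rho_i^2)) is near zero for independent variables and grows with dependence. The Gaussian kernel width `sigma` and regularisation constant `kappa` below are illustrative assumptions.

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    # n x n Gaussian Gram matrix for a 1-D sample x
    d = x[:, None] - x[None, :]
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))

def kgv_contrast(x, y, sigma=1.0, kappa=2e-2):
    # Regularised kernel canonical correlations rho_i between samples x, y;
    # contrast = -1/2 * sum(log(1 - rho_i^2)), ~0 under independence.
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n              # centering matrix
    Kc = H @ gaussian_gram(x, sigma) @ H             # centred Gram matrices
    Lc = H @ gaussian_gram(y, sigma) @ H
    reg = (n * kappa / 2.0) * np.eye(n)              # ridge regularisation
    # rho_i are the singular values of (Kr^-1 Kc)(Lr^-1 Lc)^T,
    # where Kr = Kc + reg and Lr = Lc + reg
    C = np.linalg.solve(Kc + reg, Kc) @ np.linalg.solve(Lc + reg, Lc).T
    rho = np.clip(np.linalg.svd(C, compute_uv=False), 0.0, 1.0 - 1e-9)
    return -0.5 * np.sum(np.log1p(-rho ** 2))

rng = np.random.default_rng(0)
x = rng.normal(size=200)
print(kgv_contrast(x, x + 0.3 * rng.normal(size=200)))  # dependent: larger value
print(kgv_contrast(x, rng.normal(size=200)))            # independent: smaller value
```

The regularisation plays the same role as the term the abstract says brings the KGV toward the KMI: without it, the empirical canonical correlations in the kernel feature space are degenerate (all equal to one).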
Keywords :
independent component analysis; parameter estimation; source separation; approximate upper bound; continuous random variables; contrast function; instantaneous ICA; kernel density estimate; kernel generalised variance; kernel mutual information; signal processing; signal separation; upper bound; Covariance matrix; Cybernetics; Density measurement; Kernel; Mutual information; Random variables;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03)
ISSN :
1520-6149
Print_ISBN :
0-7803-7663-3
Type :
conf
DOI :
10.1109/ICASSP.2003.1202784
Filename :
1202784