Title :
The mutual information as a measure of statistical dependence
Author :
Darbellay, Georges A.
Author_Institution :
Inst. of Inf. Theory & Autom., Praha, Czech Republic
Date :
29 Jun-4 Jul 1997
Abstract :
The mutual information I, if appropriately normalised, can serve as a measure of statistical dependence. By encompassing nonlinear dependences, it generalises the classical measures of linear correlation. An efficient nonparametric estimator of I can be derived from Dobrushin's (1963) information theorem.
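The normalisation alluded to above can be illustrated for the Gaussian case. A minimal sketch, assuming the common normalisation λ = sqrt(1 − e^(−2I)) (the information coefficient of correlation; the function names `gaussian_mutual_information` and `normalised_mi` are illustrative, not from the paper): for a bivariate Gaussian with correlation coefficient ρ, the mutual information is I = −(1/2) ln(1 − ρ²), and the normalised quantity recovers |ρ| exactly, showing how the normalised I extends linear correlation.

```python
import math

def gaussian_mutual_information(rho):
    # Mutual information (in nats) of a bivariate Gaussian
    # with correlation coefficient rho, |rho| < 1.
    return -0.5 * math.log(1.0 - rho ** 2)

def normalised_mi(i):
    # Map I in [0, inf) onto [0, 1); for a bivariate Gaussian
    # this recovers |rho|, so the measure reduces to linear
    # correlation in the linear case.
    return math.sqrt(1.0 - math.exp(-2.0 * i))

for rho in (0.0, 0.3, 0.9):
    i = gaussian_mutual_information(rho)
    print(f"rho = {rho}: I = {i:.4f} nats, normalised = {normalised_mi(i):.4f}")
```

For non-Gaussian pairs the normalised value can exceed |ρ|, which is precisely how nonlinear dependence is detected.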
Keywords :
information theory; Dobrushin's information theorem; linear correlation; nonlinear dependence; nonparametric estimator; normalised mutual information; statistical dependence; Automation; Gaussian distribution; Hypercubes; Information theory; Mutual information; Partitioning algorithms; Probability distribution; Random variables; Stochastic processes; Vectors;
Conference_Titel :
Proceedings of the 1997 IEEE International Symposium on Information Theory
Conference_Location :
Ulm, Germany
Print_ISBN :
0-7803-3956-8
DOI :
10.1109/ISIT.1997.613342