DocumentCode
40863
Title
Mutual Information Matrices Are Not Always Positive Semidefinite
Author
Jakobsen, Sune K.
Author_Institution
Sch. of Math. Sci., Queen Mary Univ. of London, London, UK
Volume
60
Issue
5
fYear
2014
fDate
May 2014
Firstpage
2694
Lastpage
2696
Abstract
For discrete random variables X1, ..., Xn, we construct an n by n matrix. In the (i, j)-entry we put the mutual information I(Xi; Xj) between Xi and Xj. In particular, in the (i, i)-entry we put the entropy H(Xi) = I(Xi; Xi) of Xi. This matrix, called the mutual information matrix of (X1, ..., Xn), has been conjectured to be positive semidefinite. In this paper, we give counterexamples to the conjecture, and show that the conjecture holds for up to three random variables.
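Illustrative sketch (not part of the original record or paper): the construction described in the abstract can be reproduced numerically. The Python snippet below estimates the mutual information matrix from samples of discrete random variables and checks positive semidefiniteness via its smallest eigenvalue; all function and variable names are hypothetical, and the empirical estimate is only an approximation of the true matrix.

import numpy as np

def entropy(counts):
    # Shannon entropy in bits of the empirical distribution given by counts.
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    # Empirical I(X;Y) = H(X) + H(Y) - H(X,Y); equals H(X) when x and y coincide.
    _, joint_counts = np.unique(np.stack([x, y], axis=1), axis=0, return_counts=True)
    hx = entropy(np.unique(x, return_counts=True)[1].astype(float))
    hy = entropy(np.unique(y, return_counts=True)[1].astype(float))
    hxy = entropy(joint_counts.astype(float))
    return hx + hy - hxy

def mutual_information_matrix(samples):
    # samples: one column per random variable X1, ..., Xn; one row per joint outcome.
    # The (i, j)-entry is I(Xi; Xj); the diagonal entry (i, i) is the entropy H(Xi).
    n = samples.shape[1]
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            M[i, j] = mutual_information(samples[:, i], samples[:, j])
    return M

# Usage: four ternary random variables sampled independently (an arbitrary example).
rng = np.random.default_rng(0)
samples = rng.integers(0, 3, size=(1000, 4))
M = mutual_information_matrix(samples)
print("smallest eigenvalue:", np.linalg.eigvalsh(M).min())  # >= 0 iff M is PSD (up to rounding)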
Keywords
entropy; matrix algebra; discrete random variables; mutual information matrix; computer science; educational institutions; eigenvalues and eigenfunctions; linear matrix inequalities; mutual information; random variables; information inequalities; linear algebra
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2014.2311434
Filename
6774945
Link To Document