DocumentCode :
1554279
Title :
Maximum independence and mutual information
Author :
Meo, Rosa
Author_Institution :
Dipt. di Informatica, Torino Univ., Italy
Volume :
48
Issue :
1
fYear :
2002
fDate :
1/1/2002 12:00:00 AM
Firstpage :
318
Lastpage :
324
Abstract :
If I1, I2, ..., Ik are random Boolean variables and the joint probabilities up to the (k-1)th order are known, the values of the kth-order probabilities that maximize the overall entropy have been defined as the maximum independence estimate. In this article, some contributions deriving from the definition of maximum independence probabilities are proposed. First, it is shown that the maximum independence values are reached when the product of the probabilities of the minterms i1* i2* ... ik* containing an even number of complemented variables is equal to the product of the probabilities of the other minterms. Second, a new definition of group mutual information, as the difference between the maximum independence entropy and the real entropy, is proposed and discussed. Finally, the new concept of mutual information is applied to the determination of dependencies in data mining problems.
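The abstract's condition can be illustrated numerically. The sketch below is not from the paper: for k = 2 Boolean variables with known first-order probabilities p1 = P(I1=1) and p2 = P(I2=1), it grid-searches the one free joint probability t = P(I1=1, I2=1) for maximum entropy, then checks the stated condition that the product of minterm probabilities with an even number of complemented variables equals the product for the odd ones, which for k = 2 reads p11*p00 = p10*p01 (exactly independence).

```python
import math

def entropy(ps):
    # Shannon entropy in nats; terms with zero probability contribute 0.
    return -sum(p * math.log(p) for p in ps if p > 0)

def max_entropy_joint(p1, p2, steps=200000):
    # The joint distribution of two Boolean variables with fixed marginals
    # has one degree of freedom, t = P(I1=1, I2=1), ranging over the
    # Frechet bounds [max(0, p1+p2-1), min(p1, p2)]. Grid-search t for
    # maximum entropy.
    lo, hi = max(0.0, p1 + p2 - 1.0), min(p1, p2)
    best_t, best_h = lo, -1.0
    for i in range(steps + 1):
        t = lo + (hi - lo) * i / steps
        ps = (t, p1 - t, p2 - t, 1 - p1 - p2 + t)  # p11, p10, p01, p00
        h = entropy(ps)
        if h > best_h:
            best_h, best_t = h, t
    return best_t

p1, p2 = 0.3, 0.6  # hypothetical marginals, chosen for illustration
t = max_entropy_joint(p1, p2)
p11, p10, p01, p00 = t, p1 - t, p2 - t, 1 - p1 - p2 + t

# The maximum-entropy joint coincides with the product (independence)
# estimate, and the even/odd minterm products agree:
print(abs(p11 - p1 * p2) < 1e-4)          # True
print(abs(p11 * p00 - p10 * p01) < 1e-4)  # True
```

For k > 2 the same idea applies with more free parameters; a constrained optimizer would replace the one-dimensional grid search.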
Keywords :
Boolean algebra; data mining; maximum entropy methods; probability; random processes; set theory; entropy; itemsets; joint probabilities; maximum independence entropy; maximum independence estimate; maximum independence probabilities; minterms; mutual information; random Boolean variables; Databases; Neural networks; Vocabulary
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.971763
Filename :
971763