Title :
On essentially conditional information inequalities
Author :
Kaced, Tarik ; Romashchenko, Andrei
Author_Institution :
LIF de Marseille, Univ. Aix-Marseille, Marseille, France
Date :
July 31, 2011 - Aug. 5, 2011
Abstract :
In 1997, Z. Zhang and R.W. Yeung found the first example of a conditional information inequality in four variables that is not "Shannon-type". This linear inequality for entropies is called conditional (or constrained) since it holds only under the condition that certain linear equations are satisfied by the entropies involved. Later, the same authors and other researchers discovered several unconditional information inequalities that do not follow from Shannon's inequalities for entropy. In this paper we show that some non-Shannon-type conditional inequalities are "essentially" conditional, i.e., they cannot be extended to any unconditional inequality. We prove one new essentially conditional information inequality for Shannon's entropy and discuss conditional information inequalities for Kolmogorov complexity.
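For reference, the 1997 Zhang–Yeung conditional inequality mentioned in the abstract is usually stated as follows (a sketch in standard information-theoretic notation, recalled from the literature; the precise formulation should be checked against the original paper):

```latex
% Zhang–Yeung (1997) conditional inequality for four random variables a, b, c, d:
% under the linear constraints I(a;b) = 0 and I(a;b|c) = 0,
% the following non-Shannon-type inequality holds.
\[
  I(a;b) = I(a;b \mid c) = 0
  \;\Longrightarrow\;
  I(c;d) \le I(c;d \mid a) + I(c;d \mid b).
\]
```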
Keywords :
computational complexity; entropy; Kolmogorov complexity; Shannon entropy; conditional information inequality; linear equation; linear inequality; Cloning; Complexity theory; Cramer-Rao bounds; Joints; Mutual information; Random variables
Conference_Titel :
Information Theory Proceedings (ISIT), 2011 IEEE International Symposium on
Conference_Location :
St. Petersburg
Print_ISBN :
978-1-4577-0596-0
Electronic_ISSN :
2157-8095
DOI :
10.1109/ISIT.2011.6033889