Title : 
Proving and disproving information inequalities
         
        
            Author : 
Siu-Wai Ho ; Chee Wei Tan ; Raymond W. Yeung
         
        
            Author_Institution : 
Inst. for Telecommun. Res., Univ. of South Australia, Adelaide, SA, Australia
         
        
        
            fDate : 
June 29, 2014 - July 4, 2014
         
        
        
        
            Abstract : 
Proving an information inequality is a crucial step in establishing the converse results in coding theorems. However, an information inequality involving many random variables is difficult to prove manually. In [1], Yeung developed a framework that uses linear programming to verify linear information inequalities. Under this framework, this paper considers several related problems that can be solved by means of Lagrange duality and convex approximation. We demonstrate how linear programming can be used to find an analytic proof of an information inequality, and we explore how to find a shortest such proof. When a given information inequality cannot be proved, sufficient conditions for a counterexample that disproves it are obtained by linear programming.
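The abstract refers to Yeung's linear-programming framework for verifying linear information inequalities. The sketch below (Python, using scipy; not the authors' ITIP software or the duality-based proof extraction described in the paper) illustrates the basic LP idea for an assumed two-variable setting: an inequality b·h >= 0 in the joint entropies h is Shannon-type if the minimum of b·h over the cone cut out by the elemental Shannon inequalities G h >= 0 equals zero, and it is not implied by them if the LP is unbounded below.

```python
# A minimal sketch, assuming two random variables X and Y with entropy
# vector h = [H(X), H(Y), H(X,Y)]. The rows of G are the elemental
# Shannon inequalities; b.h >= 0 is verified by minimizing b.h over
# the cone { h : G h >= 0 }.
import numpy as np
from scipy.optimize import linprog

G = np.array([
    [0.0, -1.0, 1.0],   # H(X|Y) >= 0:  H(X,Y) - H(Y) >= 0
    [-1.0, 0.0, 1.0],   # H(Y|X) >= 0:  H(X,Y) - H(X) >= 0
    [1.0, 1.0, -1.0],   # I(X;Y) >= 0:  H(X) + H(Y) - H(X,Y) >= 0
])

def is_shannon_type(b):
    """Return True if b.h >= 0 holds for every h with G h >= 0."""
    res = linprog(c=b, A_ub=-G, b_ub=np.zeros(len(G)),
                  bounds=[(None, None)] * G.shape[1])
    # Optimal value 0 => inequality is implied by the elemental
    # inequalities; unbounded LP => it is not implied.
    return res.status == 0 and abs(res.fun) < 1e-9

# Subadditivity H(X) + H(Y) - H(X,Y) >= 0 is provable this way ...
print(is_shannon_type([1.0, 1.0, -1.0]))   # True
# ... while H(X) - H(X,Y) >= 0 is not (and is in fact false).
print(is_shannon_type([1.0, 0.0, -1.0]))   # False
```

In the paper's setting, the dual solution of such an LP yields an analytic proof, i.e., a nonnegative combination of elemental inequalities that implies the target inequality; the sketch above only performs the feasibility check.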
         
        
            Keywords : 
approximation theory; encoding; information theory; linear programming; Lagrange duality; coding theorems; convex approximation; information inequality disproving; information inequality proving; linear information inequality; Channel coding; Cramér-Rao bounds; Entropy; Joints; Random variables
         
        
        
        
            Conference_Title : 
2014 IEEE International Symposium on Information Theory (ISIT)
         
        
            Conference_Location : 
Honolulu, HI, USA
         
        
        
            DOI : 
10.1109/ISIT.2014.6875347