Title of article :
Some bounds on entropy measures in information theory
Original Research Article
Author/Authors :
S.S. Dragomir, C.J. Goh
Issue Information :
Journal issue, 1997
Abstract :
In information theory, the fundamental tool is the entropy function, whose upper bound is derived using Jensen's inequality. In this paper, we extend Jensen's inequality and apply it to derive useful lower bounds for various entropy measures of discrete random variables.
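As a minimal numerical sketch of the classical upper bound the abstract refers to: Jensen's inequality applied to the concave logarithm gives H(X) ≤ log n for any distribution over n outcomes, with equality exactly at the uniform distribution. The distribution below is hypothetical example data, not from the paper.

```python
import math
import random

def entropy(p):
    """Shannon entropy (natural logarithm) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A hypothetical distribution over n = 8 outcomes.
random.seed(0)
n = 8
w = [random.random() for _ in range(n)]
p = [wi / sum(w) for wi in w]

# Jensen's inequality: H(X) = E[log(1/p(X))] <= log E[1/p(X)] = log n.
assert entropy(p) <= math.log(n) + 1e-12

# Equality holds for the uniform distribution.
uniform = [1.0 / n] * n
assert abs(entropy(uniform) - math.log(n)) < 1e-12
```

The paper's contribution, per the abstract, is complementary lower bounds obtained from an extension of this inequality.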
Keywords :
Jensen's inequality , Mutual information , Entropy , Information theory
Journal title :
Applied Mathematics Letters