Abstract:
We investigate a basic question of information theory, namely the evaluation of the Shannon entropy, and of the more general Rényi (1961) entropy, for some discrete distributions (e.g., binomial, negative binomial, etc.). We aim to establish analytic methods (i.e., methods in which complex analysis plays a pivotal role) for such computations, which often yield estimates of unparalleled precision. The main analytic tool used here is analytic poissonization and depoissonization. We illustrate our approach on the entropy evaluation of the binomial distribution; that is, we prove that for the binomial $(n,p)$ distribution the Shannon entropy $h_n$ satisfies $h_n \sim \frac{1}{2}\ln n + \frac{1}{2} + \ln\sqrt{2\pi p(1-p)} + \sum_{k \ge 1} a_k n^{-k}$, where the $a_k$ are explicitly computable constants. Moreover, we argue that analytic methods (e.g., complex asymptotics such as Rice's method and singularity analysis, Mellin transforms, poissonization, and depoissonization) can offer new tools for information theory, especially for studying second-order asymptotics (e.g., redundancy). In fact, there has been a resurgence of interest in, and a few successful applications of, analytic methods to a variety of problems of information theory; we therefore propose to name such investigations analytic information theory.
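The leading terms of the expansion above can be checked numerically. The following sketch (function names are ours, not the paper's) computes the exact Shannon entropy of the binomial $(n,p)$ distribution in nats by direct summation and compares it with the leading terms $\frac{1}{2}\ln n + \frac{1}{2} + \ln\sqrt{2\pi p(1-p)}$:

```python
import math

def binomial_entropy(n, p):
    """Exact Shannon entropy (in nats) of Binomial(n, p) by direct summation."""
    h = 0.0
    for k in range(n + 1):
        # Log of the binomial pmf, via lgamma for numerical stability.
        log_pk = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                  + k * math.log(p) + (n - k) * math.log(1 - p))
        h -= math.exp(log_pk) * log_pk
    return h

def leading_asymptotic(n, p):
    """Leading terms of the expansion: (1/2)ln n + 1/2 + ln sqrt(2*pi*p*(1-p))."""
    return 0.5 * math.log(n) + 0.5 + math.log(math.sqrt(2 * math.pi * p * (1 - p)))

if __name__ == "__main__":
    n, p = 1000, 0.3
    print(binomial_entropy(n, p), leading_asymptotic(n, p))
```

Since the remaining terms are of order $n^{-1}$, the two values agree to a few decimal places already for moderate $n$.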
Keywords:
binomial distribution; negative binomial distribution; discrete distributions; entropy; Shannon entropy; Rényi entropy; entropy computations; information theory; analytic methods; analytic poissonization; analytic depoissonization; complex analysis; Mellin transforms; Rice's method; singularity analysis; second-order asymptotics