Author Institution:
Dept. of Electr. & Comput. Eng., Univ. of Waterloo, Waterloo, ON, Canada
Abstract:
Given an independent and identically distributed source X = {X_i}_{i=1}^∞ with finite Shannon entropy or differential entropy (as the case may be) H(X), the non-asymptotic equipartition property (NEP) with respect to H(X) is established; it characterizes, for any finite block length n, how close −(1/n) ln p(X_1 X_2 ⋯ X_n) is to H(X) by determining the information spectrum of X_1 X_2 ⋯ X_n, i.e., the distribution of −(1/n) ln p(X_1 X_2 ⋯ X_n). Non-asymptotic equipartition properties of a similar nature (with respect to conditional entropy, mutual information, and relative entropy) can also be established [3]. These non-asymptotic equipartition properties are instrumental to the development of non-asymptotic coding results (for both source and channel coding) in information theory, in the same way that the asymptotic equipartition property underlies all asymptotic coding theorems established so far in information theory. As an example, the NEP with respect to H(X) is used to establish a non-asymptotic fixed rate source coding theorem, which reveals, for any finite block length n, a complete picture of the tradeoff between the minimum rate of fixed rate coding of X_1 X_2 ⋯ X_n and the error probability, when the error probability is a constant or goes to 0 with block length n at a sub-polynomial speed n^{−α}, 0 < α < 1, a polynomial speed n^{−α}, α ≥ 1, or a sub-exponential speed e^{−n^α}, 0 < α < 1. In particular, it is shown that for any finite block length n, the minimum rate (in nats per symbol) of fixed rate coding of X_1 X_2 ⋯ X_n with error probability e^{−n^α} is, up to lower-order terms, H(X) + √(2σ_H²(X)/n^{1−α}), where α > 0 and σ_H²(X) = E[(−ln p(X_1))²] − H²(X) is the information variance of X. With the help of the NEP with respect to other information quantities, non-asymptotic channel coding theorems of a similar nature will be established in a separate paper.
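The concentration described by the NEP is easy to observe numerically. The following Python sketch is not from the paper; the source distribution, block length n, exponent alpha, and trial count are assumed purely for illustration. It estimates the information spectrum of an i.i.d. source by Monte Carlo, reads off the (1 − ε)-quantile of −(1/n) ln p(X_1 ⋯ X_n), which is essentially the minimum fixed-coding rate at error probability ε, and compares it with the leading-order expression H(X) + √(2σ_H²(X)/n^{1−α}).

```python
import numpy as np

# Monte Carlo sketch of the information spectrum of an i.i.d. source:
# the distribution of -(1/n) ln p(X_1 X_2 ... X_n).  The source pmf,
# n, alpha, and the trial count below are assumed for illustration only.
rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])                 # hypothetical source pmf
info = -np.log(p)                             # per-symbol self-information (nats)
H = float(p @ info)                           # Shannon entropy H(X)
var_H = float(p @ info**2) - H**2             # information variance sigma_H^2(X)

n, alpha, trials = 100, 0.3, 100_000
eps = np.exp(-(n**alpha))                     # target error probability e^{-n^alpha}

# Draw blocks of length n and evaluate -(1/n) ln p(X^n) for each block.
blocks = rng.choice(len(p), size=(trials, n), p=p)
spectrum = info[blocks].mean(axis=1)

# The minimum fixed-coding rate with error probability eps is essentially
# the (1 - eps)-quantile of the information spectrum.
rate_mc = float(np.quantile(spectrum, 1.0 - eps))
rate_nep = H + np.sqrt(2.0 * var_H / n**(1.0 - alpha))   # leading NEP term

print(f"H(X)                    = {H:.4f} nats/symbol")
print(f"(1-eps)-quantile (MC)   = {rate_mc:.4f}")
print(f"H + sqrt(2*var/n^(1-a)) = {rate_nep:.4f}")
```

Both computed rates exceed H(X) by a margin on the order of n^{−(1−α)/2}, and the gap between the Monte Carlo quantile and the closed-form term shrinks as n grows, since the closed form keeps only the leading correction to H(X).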
Keywords:
channel coding; error statistics; information theory; source coding; non-asymptotic equipartition property (NEP); asymptotic coding theorem; asymptotic equipartition property (AEP); differential entropy; error probability; finite Shannon entropy; finite block length; independent identically distributed source; information quantity; information spectrum; information variance; non-asymptotic channel coding theorem; non-asymptotic fixed rate source coding theorem; entropy; polynomials; conditional entropy; fixed rate coding; mutual information