Title :
On the asymptotic tightness of the Shannon lower bound
Author :
Linder, Tamas ; Zamir, Ram
Author_Institution :
Coordinated Sci. Lab., Illinois Univ., Urbana, IL
Date :
11/1/1994
Abstract :
New results are proved on the convergence of the Shannon (1959) lower bound to the rate distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the Shannon lower bound is asymptotically tight for norm-based distortions when the source vector has a finite differential entropy and a finite αth moment for some α>0 with respect to the given norm. Moreover, a theorem of Linkov (1965) on the asymptotic tightness of the Shannon lower bound for general difference distortion measures is derived under more relaxed conditions on the source density. We also show that the Shannon lower bound relative to a stationary source and a single-letter difference distortion is asymptotically tight under very weak assumptions on the source distribution.
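For background, the bound referenced above can be stated in its standard form (a common textbook formulation, not quoted from the paper): for a source X with differential entropy h(X) and a difference distortion measure d(x, y) = ρ(x − y), the Shannon lower bound on the rate distortion function R(D) is

```latex
R(D) \;\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) \;-\; \max_{Z:\ \mathbb{E}[\rho(Z)]\le D} h(Z),
```

where the maximum is over random vectors Z whose expected distortion does not exceed D. Asymptotic tightness, the subject of the abstract, means R(D) − R_SLB(D) → 0 as D → 0.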
Keywords :
convergence of numerical methods; entropy; rate distortion theory; Linkov theorem; Shannon lower bound; asymptotic tightness; convergence; distortion measures; finite differential entropy; informational divergence; norm-based distortions; rate distortion function; single-letter difference distortion; source density; source distribution; source vector; stationary source; Convergence; Density measurement; Distortion measurement; Entropy; Gaussian processes; Rate distortion theory; Rate-distortion; Redundancy; Stochastic processes; Upper bound;
Journal_Title :
IEEE Transactions on Information Theory