DocumentCode :
1253525
Title :
Gaussian codes and Shannon bounds for multiple descriptions
Author :
Zamir, Ram
Author_Institution :
Dept. of Electr. Eng.-Syst., Tel Aviv Univ., Israel
Volume :
45
Issue :
7
fYear :
1999
fDate :
11/1/1999
Firstpage :
2629
Lastpage :
2636
Abstract :
A pair of well-known inequalities, due to Shannon, upper- and lower-bound the rate-distortion function of a real source by the rate-distortion function of the Gaussian source with the same variance or entropy, respectively. We extend these bounds to multiple descriptions, a problem for which a general “single-letter” solution is not known. We show that the set D_X(R1, R2) of achievable marginal (d1, d2) and central (d0) mean-squared errors in decoding X from two descriptions at rates R1 and R2 satisfies D*(σ_x², R1, R2) ⊆ D_X(R1, R2) ⊆ D*(P_x, R1, R2), where σ_x² and P_x are the variance and the entropy-power of X, respectively, and D*(σ², R1, R2) is the multiple-description distortion region for a Gaussian source with variance σ², found by Ozarow (1980). We further show that, as in the single-description case, a Gaussian random code achieves the outer bound in the limit as d1, d2 → 0; thus the outer bound is asymptotically tight under high-resolution conditions.
Keywords :
Gaussian processes; decoding; random codes; rate distortion theory; source coding; Gaussian codes; Gaussian random code; Gaussian source; Shannon bounds; achievable marginal; asymptotically tight bound; decoding; entropy; high resolution conditions; mean-squared errors; multiple description distortion region; multiple descriptions; outer bound; rate-distortion function; variance; Codes; Communication channels; Decoding; Feedback; Information theory; Memoryless systems; Source coding; Stability; Testing;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.796418
Filename :
796418