Title :
On the Redundancy of Slepian–Wolf Coding
Author :
Da-ke He;Luis A. Lastras-Montaño;En-hui Yang;Ashish Jagmohan;Jun Chen
Author_Institution :
IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
Abstract :
In this paper, the redundancy of both variable rate and fixed rate Slepian-Wolf coding is considered. Given any jointly memoryless source-side information pair {(X_i, Y_i)}_{i=1}^∞ with finite alphabet, the redundancy R_n(ε_n) of variable rate Slepian-Wolf coding of X_1^n with decoder-only side information Y_1^n depends on both the block length n and the decoding block error probability ε_n, and is defined as the difference between the minimum average compression rate of order-n variable rate Slepian-Wolf codes having decoding block error probability less than or equal to ε_n, and the conditional entropy H(X|Y), where H(X|Y) is the conditional entropy rate of the source given the side information. The redundancy of fixed rate Slepian-Wolf coding of X_1^n with decoder-only side information Y_1^n is defined similarly and denoted by R_n^F(ε_n). It is proved that, under mild assumptions on ε_n, R_n(ε_n) = d_v √(-log ε_n / n) + o(√(-log ε_n / n)) and R_n^F(ε_n) = d_f √(-log ε_n / n) + o(√(-log ε_n / n)), where d_v and d_f are two constants completely determined by the joint distribution of the source-side information pair. Since d_v is generally smaller than d_f, our results show that variable rate Slepian-Wolf coding is indeed more efficient than fixed rate Slepian-Wolf coding.
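Main result (display form) :
For readability, the asymptotics stated in the abstract can be restated in display form as follows. This is only a restatement, not an additional result: ε_n is the decoding block error probability, n the block length, and d_v, d_f the constants determined by the joint source-side information distribution as described above.
% Redundancy asymptotics of Slepian-Wolf coding, restated from the abstract.
% R_n(\epsilon_n): redundancy of variable rate coding at block length n and
% block error probability \epsilon_n; R_n^F(\epsilon_n): fixed rate counterpart.
\[
  R_n(\epsilon_n)     = d_v \sqrt{\frac{-\log \epsilon_n}{n}} + o\!\left(\sqrt{\frac{-\log \epsilon_n}{n}}\right),
  \qquad
  R_n^{F}(\epsilon_n) = d_f \sqrt{\frac{-\log \epsilon_n}{n}} + o\!\left(\sqrt{\frac{-\log \epsilon_n}{n}}\right),
\]
% with d_v generally smaller than d_f, so variable rate Slepian-Wolf coding
% incurs strictly less redundancy in the leading term.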
Keywords :
"Decoding","Redundancy","Error probability","Source coding","Information theory","Entropy","Data compression","Random variables","Councils"
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2009.2032803