Abstract:
We characterize the rate distortion function for source coding with decoder side information when the i-th reconstruction symbol is allowed to depend only on the first i + d side information symbols, for some finite lookahead d, in addition to the index from the encoder. For the case of causal side information, i.e., d = 0, we find that the penalty for causality is the omission of the subtracted mutual information term in the Wyner-Ziv rate distortion function. For d > 0, we derive a computable "infinite-letter" expression for the rate distortion function. When specialized to the near-lossless case, our results characterize the best achievable rate for the Slepian-Wolf source coding problem with limited side information lookahead, and have some surprising implications. We find that side information is useless for any fixed d when the joint PMF of the source and side information satisfies the positivity condition P(x, y) > 0 for all (x, y). More generally, the optimal rate depends on the distribution of the pair (X, Y) only through the distribution of X and the bipartite graph whose edges represent the pairs (x, y) for which P(x, y) > 0. On the other hand, if the side information lookahead d_n is allowed to grow faster than logarithmically in the block length n, then H(X|Y) is achievable. Finally, we apply our approach to derive a computable expression for channel capacity when state information is available at the encoder with limited lookahead.
Keywords:
channel capacity; combined source-channel coding; decoding; Slepian-Wolf source coding; Wyner-Ziv rate distortion function; limited side information lookahead; source coding; channel coding; bipartite graph; delay; mutual information; noise reduction; rate-distortion; target tracking; trajectory