Title :
Joint Source–Channel Coding Error Exponent for Discrete Communication Systems With Markovian Memory
Author :
Zhong, Yangfan ; Alajaji, Fady ; Campbell, L. Lorne
Author_Institution :
Dept. of Math. & Stat., Queen's Univ., Kingston, ON
Abstract :
We study the error exponent, EJ, for reliably transmitting a discrete stationary ergodic Markov (SEM) source Q over a discrete channel W with additive SEM noise via a joint source-channel (JSC) code. We first establish an upper bound for EJ in terms of the Rényi entropy rates of the source and noise processes. We next investigate the analytical computation of EJ by comparing our bound with Gallager's lower bound (1968) when the latter is specialized to the SEM source-channel system. We also note that both bounds can be represented in Csiszár's form (1980), as the minimum of the sum of the source and channel error exponents. Our results provide the tools to systematically compare EJ with the tandem (separate) coding exponent ET. We show that, as in the case of memoryless source-channel pairs, EJ ≤ 2ET, and we provide explicit conditions under which EJ > ET. Numerical results indicate that EJ ≈ 2ET for many SEM source-channel pairs, hence illustrating a substantial advantage of JSC coding over tandem coding for systems with Markovian memory.
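As an illustrative sketch (not reproduced from the paper itself), the Csiszár-form representation mentioned in the abstract can be written with e(R,Q) and E(R,W) denoting the source and channel error exponents at coding rate R; this notation, and the normalization to one source symbol per channel use, are assumptions made here for concreteness:
\[
  E_J(Q,W) \;=\; \min_{R}\,\bigl[\, e(R,Q) + E(R,W) \,\bigr],
  \qquad
  E_J(Q,W) \;\le\; 2\,E_T(Q,W),
\]
where E_T(Q,W) is the tandem (separate) coding exponent. Similarly, the Rényi entropy rate appearing in the upper bound admits, for a finite-alphabet SEM source with transition probabilities q_{ij}, the standard closed form
\[
  H_{\alpha}(Q) \;=\; \frac{1}{1-\alpha}\,\log \lambda\!\left(\bigl[\,q_{ij}^{\alpha}\,\bigr]\right),
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
where λ(·) denotes the largest (Perron–Frobenius) eigenvalue of the indicated matrix; the precise way this rate enters the bound on E_J is given in the paper and is not reproduced here.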
Keywords :
Markov processes; combined source-channel coding; entropy; Markovian memory; additive noise; discrete communication systems; error probability; error exponent; joint source-channel (JSC) coding; stationary ergodic Markov source; stationary ergodic Markov source–channel; Markov types; Rényi entropy rate; channel coding; information theory; memoryless systems; upper bound; tandem separate coding
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2007.909092