Title :
On the capacity of Markov sources over noisy channels
Author_Institution :
Div. of Eng. & Appl. Sci., Harvard Univ., Cambridge, MA, USA
Date :
2001
Abstract :
We present an expectation-maximization method for optimizing the transition probabilities of a Markov process so as to increase the mutual information rate achievable when the process is transmitted over a noisy finite-state machine channel. The method provides a tight lower bound on the achievable information rate of a Markov process over a noisy channel, and we conjecture that it in fact maximizes this rate. The conjecture is supported by empirical evidence (not shown in this paper) obtained through brute-force optimization of low-order Markov processes. The proposed expectation-maximization procedure can be used to find tight lower bounds on the capacities of finite-state machine channels (e.g., partial-response channels) or on the noisy capacities of constrained (e.g., run-length-limited) sequences, with the bounds becoming arbitrarily tight as the memory length of the input Markov process approaches infinity. The method links the Arimoto-Blahut algorithm to Shannon's noise-free entropy maximization by introducing the noisy adjacency matrix.
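To make the connection to Shannon's noise-free result concrete, the following minimal sketch computes his classical entropy-maximizing Markov transition probabilities from an ordinary (noise-free) adjacency matrix, the special case that the paper's noisy adjacency matrix generalizes. The RLL(1, infinity) constraint matrix A used here is an illustrative assumption, not an example taken from the paper.

    # Sketch of Shannon's noise-free entropy maximization for a constrained
    # (here: run-length-limited) sequence, driven by the adjacency matrix A.
    # The example constraint A is an assumption for illustration only.
    import numpy as np

    A = np.array([[1.0, 1.0],   # state 0 may emit 0 (stay) or 1 (go to state 1)
                  [1.0, 0.0]])  # state 1 must emit 0 (return to state 0)

    # Largest eigenvalue lam and its eigenvector b (Perron-Frobenius).
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    lam = eigvals[k].real
    b = np.abs(eigvecs[:, k].real)

    # Shannon's max-entropic transitions: P[i, j] = A[i, j] * b[j] / (lam * b[i]).
    P = A * b[None, :] / (lam * b[:, None])

    # The entropy rate of this chain equals the noiseless capacity log2(lam).
    print("capacity =", np.log2(lam))   # ~0.6942 bits/symbol for this constraint
    print("row sums =", P.sum(axis=1))  # each row sums to 1, so P is stochastic

In the noisy setting treated by the paper, the role of A is played by the noisy adjacency matrix, and the transition probabilities are instead updated iteratively by the expectation-maximization procedure; that iteration is not attempted in this sketch.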
Keywords :
"Markov processes","Optimization methods","Random variables","Mutual information","Information rates","Entropy","Symmetric matrices","Partial response channels","Upper bound","H infinity control"
Conference_Titel :
Global Telecommunications Conference, 2001. GLOBECOM '01. IEEE
Print_ISBN :
0-7803-7206-9
DOI :
10.1109/GLOCOM.2001.965977