Title :
Relative entropy at the channel output of a capacity-achieving code
Author :
Polyanskiy, Yury ; Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., MIT, Cambridge, MA, USA
Abstract :
In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the output distribution induced by the code and an auxiliary output distribution. This inequality is then used to show the strong converse and to prove that the output distribution of a code must be close, in relative entropy, to the capacity-achieving output distribution (for discrete memoryless channels and the AWGN channel). One of the key tools in our analysis is concentration of measure (isoperimetry).
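As a hedged illustration of the kind of statement the abstract summarizes (a paraphrase of the headline corollary, not a verbatim result from the paper; here $P_{Y^n}$ denotes the output distribution induced by an $(n, M, \epsilon)$ code, $P_Y^*$ the capacity-achieving output distribution, and $C$ the channel capacity):

% Paraphrased corollary, assumed form: for a code with non-vanishing
% error probability \epsilon < 1 over a DMC with capacity C,
\[
  D\bigl(P_{Y^n} \,\big\|\, (P_Y^*)^{\otimes n}\bigr) \;\le\; nC - \log M + o(n),
\]
% so for any capacity-achieving code sequence, (1/n) \log M \to C forces
% (1/n) D(P_{Y^n} \| (P_Y^*)^{\otimes n}) \to 0.

In words, the per-letter relative entropy at the channel output vanishes as the code rate approaches capacity.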
Keywords :
channel capacity; channel coding; auxiliary output distribution; capacity-achieving code; channel output; relative entropy; AWGN; Entropy; Memoryless systems; Mutual information; Random variables; Shannon theory; additive white Gaussian noise; concentration of measure; discrete memoryless channels; empirical output statistics; general channels; information measures; strong converse
Conference_Title :
2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Conference_Location :
Monticello, IL
Print_ISBN :
978-1-4577-1817-5
DOI :
10.1109/Allerton.2011.6120149