Title :
Some inequalities for channel capacities, mutual informations and mean-squared errors
Author_Institution :
Sch. of Inf. & Sci., Nagoya Univ., Japan
Date :
9/1/1996 12:00:00 AM
Abstract :
We derive information-theoretic inequalities for evaluating channel capacities and mean-squared errors. We prove an inequality for the capacity of an additive-noise channel with feedback, and an inequality relating mutual information to mean-squared error; the latter is applied to bound minimum mean-squared transmission errors. The bounds on the channel capacity and the mean-squared error are expressed in terms of the capacity of the related Gaussian channel, the distortion-rate function of the related Gaussian process, the mutual information, and the relative entropy.
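For orientation, a classical inequality of the kind the abstract describes, stated for the non-feedback additive-noise channel (an illustrative sketch only; the paper's feedback and mean-squared-error bounds are not reproduced here), is
\[
  C_G \;\le\; C \;\le\; C_G + D\!\left(P_Z \,\middle\|\, P_{Z^*}\right),
\]
where $Z$ is the additive noise, $Z^*$ is a Gaussian process with the same covariance as $Z$, $C$ and $C_G$ are the capacities of the channels with noise $Z$ and $Z^*$, respectively, and $D(\cdot\,\|\,\cdot)$ denotes relative entropy.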
Keywords :
Gaussian channels; Gaussian noise; approximation theory; channel capacity; entropy; feedback; rate distortion theory; Gaussian channel; Gaussian process; additive noise channel; distortion rate function; information theoretic inequalities; minimum mean squared transmission errors; mutual information; relative entropy; Additive noise; Gaussian processes; Random variables; Rate-distortion; Stochastic processes
Journal_Title :
Information Theory, IEEE Transactions on