Title :
Hessian and Concavity of Mutual Information, Differential Entropy, and Entropy Power in Linear Vector Gaussian Channels
Author :
Payaró, Miquel ; Palomar, Daniel P.
Author_Institution :
Dept. of Electron. & Comput. Eng., Hong Kong Univ. of Sci. & Technol., Kowloon, China
Abstract :
Within the framework of linear vector Gaussian channels with arbitrary signaling, this paper calculates the Jacobians of the minimum mean square error (MMSE) and Fisher information matrices with respect to arbitrary parameters of the system. Capitalizing on prior research in which the MMSE and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the differential entropy are derived. These expressions are then used to assess the concavity properties of mutual information and differential entropy under different channel conditions, and to derive a multivariate version of an entropy power inequality due to Costa.
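For orientation, the first-order scalar identities that the paper's Hessian and concavity results build on can be sketched as follows; these are standard results (the Guo-Shamai-Verdú I-MMSE relation and Costa's entropy power inequality) quoted as background, not expressions taken from the paper itself:
\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I\bigl(X;\sqrt{\mathsf{snr}}\,X+Z\bigr) \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\mathsf{snr}), \qquad Z \sim \mathcal{N}(0,1),
\]
\[
\mathsf{N}\bigl(\mathbf{X}+\sqrt{t}\,\mathbf{Z}\bigr) \;=\; \frac{1}{2\pi e}\, e^{\,2 h(\mathbf{X}+\sqrt{t}\,\mathbf{Z})/n} \ \text{ is concave in } t \ge 0, \qquad \mathbf{Z} \sim \mathcal{N}(\mathbf{0},\mathbf{I}_n),\ \mathbf{X}\in\mathbb{R}^n.
\]
The paper works with the linear vector Gaussian channel \( \mathbf{y} = \mathbf{H}\mathbf{x} + \mathbf{n} \), computing second-order (Hessian) counterparts of such derivative relations with respect to arbitrary system parameters, and a multivariate extension of the concavity statement above.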
Keywords :
Gaussian channels; Hessian matrices; Jacobian matrices; entropy; least mean squares methods; Fisher information matrices; arbitrary signaling; differential entropy; entropy power; linear vector Gaussian channels; minimum mean square error Jacobian; mutual information Hessian; mutual information concavity; Gaussian noise; Information theory; Linear matrix inequalities; Mean square error methods; Mutual information; System analysis and design; Vectors; Concavity properties; minimum mean-square error (MMSE); nonlinear estimation
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2009.2023749