Title :
Extensions of Fisher Information and Stam's Inequality
Author :
Lutwak, Erwin ; Lv, Songjun ; Yang, Deane ; Zhang, Gaoyong
Author_Institution :
Dept. of Math., Polytech. Inst. of New York Univ., Brooklyn, NY, USA
Date :
3/1/2012
Abstract :
We explain how the classical notions of the Fisher information of a random variable and the Fisher information matrix of a random vector can be extended to a much broader setting. We also show that Stam's inequality for Fisher information and Shannon entropy, as well as the generalized versions proved earlier by the authors, are all special cases of more general sharp inequalities satisfied by random vectors. The extremal random vectors, which we call generalized Gaussians, include Gaussians as a limiting case but are noteworthy because they are heavy-tailed.
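For reference, the classical Stam inequality mentioned in the abstract (not the paper's generalized form) can be stated as follows: for a random vector $X$ in $\mathbb{R}^n$ with differential entropy $h(X)$, entropy power $N(X) = (2\pi e)^{-1} e^{2h(X)/n}$, and Fisher information $J(X)$,
\[
N(X)\, J(X) \ge n,
\]
with equality if and only if $X$ is Gaussian. The paper's extensions are sharp inequalities of this type for more general information measures, whose extremals are the heavy-tailed generalized Gaussians.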
Keywords :
Gaussian processes; entropy; matrix algebra; random processes; Fisher information; Fisher information matrix; Shannon entropy; Rényi entropy; Stam inequality; random variables; random vectors; covariance matrix; linear matrix inequalities; symmetric matrices; transforms; vectors; Shannon theory; information measure; information theory;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2011.2177563