Abstract:
The marginal, joint, and conditional entropies and the trans-information are derived for random variables with lognormal probability distributions, revealing some interesting deviations from the lognormal's sister distribution, the normal. A maximal-entropy property, a capacity theorem for the lognormal channel, and implications for some nonlinear transformations are also presented. Potential applications of these measures to psychophysics are mentioned.
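As a point of reference for the marginal case, the following is a standard result (not quoted from the abstract itself, but consistent with its subject): if \( \ln X \sim N(\mu, \sigma^2) \), where \( \mu \) and \( \sigma^2 \) denote the mean and variance of the underlying normal, the differential entropy of the lognormal variable \( X \) in nats is

\[
h(X) = \mu + \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma^{2}\right).
\]

Unlike the normal case, where \( h = \tfrac{1}{2}\ln(2\pi e\,\sigma^{2}) \) is independent of the mean, the lognormal entropy depends on the location parameter \( \mu \), one example of the kind of deviation from the normal that the abstract alludes to.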