Title :
Families of distributions characterized by entropy
Author_Institution :
Div. of Stat., Northern Illinois Univ., DeKalb, IL, USA
Date :
7/1/2001 12:00:00 AM
Abstract :
In this correspondence, we examine the relation between two random variables that share a common entropy function. Entropy has been used to measure dispersion, uncertainty, risk, and volatility. In general, equality of entropies alone imposes no relationship between two random variables. We present results identifying conditions, based on the entropy, under which two random variables are stochastically equal. We also obtain analogous results for the relative entropy.
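For reference, the quantities named in the abstract are, under the standard definitions (these formulas are not stated in the record itself and are supplied only as a reminder), for a discrete variable with pmf p and a continuous variable with density f:

% Standard definitions assumed; not part of the original record.
\[
  H(X) = -\sum_{x} p(x)\log p(x), \qquad
  h(X) = -\int f(x)\log f(x)\,dx,
\]
\[
  D(f\,\|\,g) = \int f(x)\log\frac{f(x)}{g(x)}\,dx .
\]

As an illustration (not drawn from the paper) of why equal entropies alone do not determine the distribution: the uniform density on [0, e] and the unit-mean exponential both have differential entropy 1 nat, yet the distributions differ.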
Keywords :
entropy; information theory; probability; random processes; stochastic processes; continuous case; discrete case; entropy function; probability distributions; random variables; stochastic equality; Density measurement; Dispersion; Distribution functions; Entropy; Exponential distribution; Measurement uncertainty; Probability distribution; Random variables; Statistical distributions; Stochastic processes
Journal_Title :
Information Theory, IEEE Transactions on