DocumentCode :
1406471
Title :
Divergence measures based on the Shannon entropy
Author :
Lin, Jianhua
Author_Institution :
Dept. of Comput. Sci., Brandeis Univ., Waltham, MA, USA
Volume :
37
Issue :
1
fYear :
1991
fDate :
1/1/1991
Firstpage :
145
Lastpage :
151
Abstract :
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error is established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
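As a minimal sketch of the class of measures the abstract describes, the snippet below computes the Jensen-Shannon divergence, the best-known member of this family, for discrete distributions; the function names are illustrative, not taken from the paper. It demonstrates the property the abstract highlights: the measure stays finite and bounded even when the two distributions have disjoint supports, a case where the Kullback-Leibler divergence is infinite.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; the 0*log(0) terms are taken as 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(p, q):
    """Jensen-Shannon divergence between discrete distributions p and q.

    Defined as H((p+q)/2) - (H(p) + H(q)) / 2.  It is nonnegative,
    finite without any absolute-continuity requirement, and bounded
    above by 1 bit.
    """
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon_entropy(m) - (shannon_entropy(p) + shannon_entropy(q)) / 2

# Two point masses with disjoint supports: KL divergence would be infinite,
# but the Jensen-Shannon divergence attains its maximum of 1 bit.
p = [1.0, 0.0]
q = [0.0, 1.0]
print(jensen_shannon_divergence(p, q))  # prints 1.0
```

For identical distributions the divergence is 0, so the value always lies in [0, 1] when logarithms are taken to base 2.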
Keywords :
entropy; information theory; Shannon entropy; boundedness; divergence measures; finiteness; nonnegativity; probability of misclassification error; semiboundedness; variational distance; Computer science; Entropy; Genetics; Pattern analysis; Pattern recognition; Probability distribution; Signal analysis; Signal processing; Taxonomy; Upper bound
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.61115
Filename :
61115