DocumentCode :
1027253
Title :
On classification with empirically observed statistics and universal data compression
Author :
Ziv, Jacob
Author_Institution :
Dept. of Electr. Eng., Technion-Israel Inst. of Technol., Haifa, Israel
Volume :
34
Issue :
2
fYear :
1988
fDate :
3/1/1988
Firstpage :
278
Lastpage :
286
Abstract :
Classification with empirically observed statistics is studied for finite-alphabet sources. Efficient universal discriminant functions are described and shown to be related to universal data compression. It is demonstrated that if one of the probability measures of the two classes is unknown, it is still possible to define a universal discriminant function that performs as well as the optimal (likelihood-ratio) discriminant function, which can be evaluated only if the probability measures of both classes are available. If neither probability measure is available but training vectors from at least one of the two classes are, it is demonstrated that no discriminant function can perform efficiently if the length of the training vectors does not grow at least linearly with the length of the classified vector. A universal discriminant function is introduced and shown to perform efficiently when the length of the training vectors grows linearly with the length of the classified sequence, in the sense that it yields an error exponent arbitrarily close to that of the optimal discriminant function.
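Illustrative sketch (not part of the original record): the paper relates universal discriminant functions to universal data compression for finite-alphabet sources. As a rough stand-in for that idea, the Python snippet below classifies a test sequence by the incremental code length an off-the-shelf compressor (zlib) assigns to it after seeing each class's training vector. The function names, the toy data, and the use of zlib are assumptions for illustration only; they do not reproduce the Lempel-Ziv-based construction analyzed in the paper.

import zlib

def incremental_code_length(training: bytes, test: bytes) -> int:
    # Extra compressed bytes needed for `test` once `training` has been seen:
    # a crude proxy for the conditional universal code length of test given training.
    return len(zlib.compress(training + test)) - len(zlib.compress(training))

def classify(test: bytes, train_class0: bytes, train_class1: bytes) -> int:
    # Assign the test vector to the class whose training vector "explains" it
    # more cheaply, mimicking a compression-based discriminant function.
    d0 = incremental_code_length(train_class0, test)
    d1 = incremental_code_length(train_class1, test)
    return 0 if d0 <= d1 else 1

# Toy usage: class 0 favors 'a', class 1 favors 'b'; the test sequence is a-heavy.
train0 = (b"a" * 9 + b"b") * 200
train1 = (b"b" * 9 + b"a") * 200
test = (b"a" * 8 + b"bb") * 50
print(classify(test, train0, train1))  # expected output: 0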
Keywords :
data compression; information theory; probability; classification; empirically observed statistics; finite alphabet sources; training vectors; universal data compression; universal discriminant functions; Data compression; Fading; Length measurement; Markov processes; Out of order; Performance evaluation; Probability; Statistics;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.2636
Filename :
2636