Title :
Sufficiency, classification, and the class-specific feature theorem
Author_Institution :
Dept. of Electr. & Comput. Eng., Rhode Island Univ., Kingston, RI, USA
Date :
7/1/2000 12:00:00 AM
Abstract :
A new proof of the class-specific feature theorem is given. The proof works directly with the observed data, rather than with the set of sufficient statistics used in the original formulation. We prove the theorem for the classical case, in which the parameter vector is deterministic and known, as well as for the Bayesian case, in which the parameter vector is modeled as a random vector with a known prior probability density function. The essence of the theorem is that, with a suitable normalization, the probability density function of the sufficient statistic for each probability density function family can be used for optimal classification. One need not have knowledge of the probability density functions of the data under each hypothesis.
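As a concrete numerical sketch of the theorem's content (not an example from the paper itself): suppose the "suitable normalization" is taken to be the density of each class's sufficient statistic under a common reference hypothesis H0. With two hypothetical Gaussian families, one identified by its mean (sufficient statistic: sample mean) and one by its variance (sufficient statistic: sum of squares), the likelihood ratio computed from the data PDFs matches the one computed from the normalized PDFs of the class-specific statistics. All parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm, chi2

rng = np.random.default_rng(0)
n = 50
mu1, sigma2 = 1.5, 2.0            # hypothetical class parameters
x = rng.normal(mu1, 1.0, n)       # data drawn under H1 for the demo

# H1: x_i ~ N(mu1, 1);  H2: x_i ~ N(0, sigma2^2);  reference H0: x_i ~ N(0, 1)

# Full-data log-likelihood ratio: log p(x|H1) - log p(x|H2)
llr_full = (norm.logpdf(x, mu1, 1.0).sum()
            - norm.logpdf(x, 0.0, sigma2).sum())

# Class-specific sufficient statistics
t1 = x.mean()          # sufficient for the mean family (H1 vs H0)
t2 = np.sum(x**2)      # sufficient for the variance family (H2 vs H0)

# PDF of each statistic under its own class, normalized by its PDF
# under the common reference H0
log_r1 = (norm.logpdf(t1, mu1, 1 / np.sqrt(n))      # t1|H1 ~ N(mu1, 1/n)
          - norm.logpdf(t1, 0.0, 1 / np.sqrt(n)))   # t1|H0 ~ N(0, 1/n)
log_r2 = (chi2.logpdf(t2 / sigma2**2, n)            # t2|H2: t2/sigma2^2 ~ chi2_n
          - np.log(sigma2**2)                       # Jacobian of the scaling
          - chi2.logpdf(t2, n))                     # t2|H0 ~ chi2_n

llr_class_specific = log_r1 - log_r2

# By the theorem, the two decision statistics coincide
assert np.isclose(llr_full, llr_class_specific)
```

The agreement follows from the Neyman-Fisher factorization: each ratio p(x|Hj)/p(x|H0) depends on x only through the class's sufficient statistic, so it equals the ratio of the statistic's densities, and the reference H0 cancels when the two classes are compared.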
Keywords :
Bayes methods; optimisation; probability; signal classification; statistical analysis; Bayesian case; PDF normalization; class-specific feature theorem; deterministic parameter vector; observed data; optimal classification; optimal decision rules; probability density function; random vector; sufficient statistics; Bayesian methods; Data models; Information theory; Neural networks; Pattern recognition; Probability density function; Signal detection; Statistical analysis; Statistics; Testing;
Journal_Title :
IEEE Transactions on Information Theory