Title of article
On the selection and classification of independent features
Author/Authors
M. Bressan, Author; J. Vitria, Author
Issue Information
Journal issue with serial year 2003
From page
1312
Abstract
This paper is focused on the problems of feature selection and classification when classes are modeled by statistically independent features. We show that, under the assumption of class-conditional independence, the class separability measure of divergence is greatly simplified, becoming a sum of unidimensional divergences, providing a feature selection criterion where no exhaustive search is required. Since the hypothesis of independence is infrequently met in practice, we also provide a framework making use of class-conditional Independent Component Analyzers where this assumption can be held on stronger grounds. Divergence and the Bayes decision scheme are adapted to this class-conditional representation. An algorithm that integrates the proposed representation, feature selection technique, and classifier is presented. Experiments on artificial, benchmark, and real-world data illustrate our technique and evaluate its performance.
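The key computational idea in the abstract is that, under class-conditional independence, the divergence between two classes decomposes into a sum of one-dimensional divergences, so features can be scored and ranked individually instead of searching over subsets. The sketch below illustrates this with a common simplifying assumption not stated in the abstract: each class-conditional marginal is modeled as a univariate Gaussian, for which the symmetric Kullback-Leibler divergence (J-divergence) has a closed form. The function names `gaussian_divergence_1d` and `rank_features_by_divergence` are illustrative, not from the paper.

```python
import numpy as np

def gaussian_divergence_1d(x0, x1):
    """Symmetric KL (J-)divergence between 1-D Gaussians fitted to two samples.

    Closed form for univariate Gaussians:
    J = 0.5*(v0/v1 + v1/v0 - 2) + 0.5*(m0 - m1)^2 * (1/v0 + 1/v1)
    """
    m0, v0 = x0.mean(), x0.var() + 1e-12  # small floor avoids division by zero
    m1, v1 = x1.mean(), x1.var() + 1e-12
    return 0.5 * ((v0 / v1 + v1 / v0 - 2.0)
                  + (m0 - m1) ** 2 * (1.0 / v0 + 1.0 / v1))

def rank_features_by_divergence(X0, X1):
    """Score each feature independently and rank features by class separability.

    Under class-conditional independence the total divergence is the sum of
    these per-feature scores, so selecting the top-k features by score needs
    no exhaustive subset search.
    """
    scores = np.array([gaussian_divergence_1d(X0[:, j], X1[:, j])
                       for j in range(X0.shape[1])])
    order = np.argsort(scores)[::-1]  # most separable feature first
    return order, scores
```

Because the criterion is additive over features, selecting the k best features reduces to sorting the per-feature scores, which is the practical payoff of the independence assumption described in the abstract.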
Keywords
developable surface , electromagnetic scattering , Physical optics , radar backscatter
Journal title
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Serial Year
2003
Record number
95103