Title :
Learning feature transforms is an easier problem than feature selection
Author_Institution :
Motorola Labs., Tempe, AZ, USA
Abstract :
We argue that optimal feature selection is intrinsically a harder problem than learning discriminative feature transforms, given a suitable criterion for the latter. We discuss mutual information between class labels and transformed features as such a criterion. Instead of Shannon's definition we use measures based on Renyi entropy, which lends itself to an efficient implementation and to an interpretation of "information forces" induced by samples of data that drive the transform.
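The Renyi-entropy measures the abstract refers to are attractive because Renyi's quadratic entropy admits a closed-form Parzen-window estimator: with a Gaussian kernel, the expected density reduces to a sum of pairwise kernel evaluations between samples. The sketch below illustrates that estimator only; the function name and the kernel width `sigma` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    H2(X) = -log integral p(x)^2 dx, estimated as
    -log( (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2) ),
    where G is a Gaussian kernel. The convolution of two Gaussian
    kernels is itself Gaussian (variances add), which is why the
    double sum has this closed form.

    NOTE: illustrative sketch; 'sigma' is an assumed kernel width.
    """
    x = np.atleast_2d(np.asarray(x, dtype=float))   # shape (N, d)
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]            # (N, N, d) pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)            # squared Euclidean distances
    var = 2.0 * sigma ** 2                          # kernel variances add under convolution
    g = np.exp(-sq_dist / (2.0 * var)) / ((2.0 * np.pi * var) ** (d / 2.0))
    information_potential = g.mean()                # (1/N^2) * sum over all pairs
    return -np.log(information_potential)
```

Differentiating the pairwise kernel terms with respect to the sample positions yields the "information forces" mentioned in the abstract: each pair of samples exerts a gradient contribution on the learned transform.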
Keywords :
entropy; feature extraction; transforms; discriminative feature transforms; feature transform learning; optimal feature selection; Discrete transforms; Entropy; Error analysis; Error probability; Force measurement; Mutual information; Random variables; Rivers; Stochastic processes; Training data;
Conference_Title :
Proceedings of the 16th International Conference on Pattern Recognition, 2002
Print_ISBN :
0-7695-1695-X
DOI :
10.1109/ICPR.2002.1048248