DocumentCode :
384266
Title :
Learning feature transforms is an easier problem than feature selection
Author :
Torkkola, Kari
Author_Institution :
Motorola Labs., Tempe, AZ, USA
Volume :
2
fYear :
2002
fDate :
2002
Firstpage :
104
Abstract :
We argue that optimal feature selection is intrinsically a harder problem than learning discriminative feature transforms, provided a suitable criterion exists for the latter. We discuss mutual information between class labels and transformed features as such a criterion. Instead of Shannon's definition we use measures based on Renyi entropy, which lends itself to an efficient implementation and to an interpretation of "information forces" induced by samples of data that drive the transform.
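The abstract's efficiency claim rests on a property of Renyi quadratic entropy: under a Parzen (Gaussian-kernel) density estimate, the integral of the squared density reduces to a closed-form double sum over sample pairs, with no numerical integration. A minimal sketch of that estimator (the kernel width `sigma` and function name are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi quadratic entropy H2 = -log \int p(x)^2 dx of a Parzen
    density estimate with Gaussian kernels of width sigma.

    The convolution of two Gaussians is a Gaussian, so the integral
    collapses to a pairwise sum over the samples -- the property that
    makes the criterion efficiently computable.
    """
    x = np.atleast_2d(np.asarray(x, dtype=float))   # shape (N, d)
    n, d = x.shape
    s = 2.0 * sigma ** 2                            # variance after convolution
    # Pairwise squared Euclidean distances between all samples.
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel evaluated at each pairwise difference.
    kernel = np.exp(-sq / (2.0 * s)) / (2.0 * np.pi * s) ** (d / 2.0)
    # Mean over the N^2 pairs estimates the information potential;
    # its negative log is the Renyi quadratic entropy.
    return -np.log(kernel.mean())
```

Because the estimate is differentiable in the sample positions, its gradient with respect to each transformed sample can be read as a force acting on that sample, which is the "information forces" interpretation the abstract alludes to.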
Keywords :
entropy; feature extraction; transforms; discriminative feature transforms; feature transform learning; optimal feature selection; Discrete transforms; Entropy; Error analysis; Error probability; Force measurement; Mutual information; Random variables; Rivers; Stochastic processes; Training data;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 16th International Conference on Pattern Recognition, 2002
ISSN :
1051-4651
Print_ISBN :
0-7695-1695-X
Type :
conf
DOI :
10.1109/ICPR.2002.1048248
Filename :
1048248