Title :
Nonlinear feature transforms using maximum mutual information
Author_Institution :
Motorola Inc., MD, USA
Abstract :
Finding the right features is an essential part of a pattern recognition system. This can be accomplished either by selection or by a transform from a larger number of "raw" features. In this work we learn nonlinear, dimension-reducing discriminative transforms that are implemented as neural networks, either as radial basis function networks or as multilayer perceptrons. As the criterion, we use the joint mutual information (MI) between the class labels of the training data and the transformed features. Our measure of MI makes use of Renyi entropy as formulated by Principe et al. (1998, 2000). The resulting low-dimensional features enable a classifier to operate with fewer computational resources and less memory without compromising accuracy.
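The abstract does not spell out the criterion, so the following is a minimal sketch, not the authors' implementation, of the general idea: train a nonlinear transform by gradient ascent on a Parzen-window, quadratic Renyi-entropy-based mutual information estimate between transformed features and class labels, in the spirit of Principe et al. The function and variable names (qmi_ed, gaussian_gram, the tiny MLP, sigma, the synthetic data) are illustrative assumptions, and kernel normalization constants are dropped since they do not change the maximizer.

```python
# Sketch: maximize a quadratic (Renyi-entropy-based) MI estimate between
# transformed features y = f_theta(x) and class labels, via Parzen windows.
import torch
import torch.nn as nn

def gaussian_gram(y, sigma):
    """Pairwise Gaussian kernel matrix exp(-||y_i - y_j||^2 / (4 sigma^2)),
    i.e. the Parzen convolution kernel with doubled variance (constants dropped)."""
    d2 = torch.cdist(y, y).pow(2)
    return torch.exp(-d2 / (4.0 * sigma ** 2))

def qmi_ed(y, labels, sigma=1.0):
    """Euclidean-distance quadratic MI estimate between features y (N x d)
    and discrete class labels (N,), built from three information potentials."""
    n = y.shape[0]
    G = gaussian_gram(y, sigma)
    classes = labels.unique()
    # One-hot membership matrix M (N x C); M.T @ G @ M collects class-wise kernel sums.
    M = torch.stack([(labels == c).float() for c in classes], dim=1)
    p = M.sum(dim=0) / n                                   # class priors
    v_in = (M.t() @ G @ M).diag().sum() / n ** 2           # within-class potential
    v_all = (p ** 2).sum() * G.sum() / n ** 2              # marginal-product potential
    v_btw = (p * (M.t() @ G).sum(dim=1)).sum() / n ** 2    # cross potential
    return v_in + v_all - 2.0 * v_btw

# Synthetic example: learn a 10-D -> 2-D nonlinear transform with a small MLP.
torch.manual_seed(0)
x = torch.randn(200, 10)
labels = (x[:, 0] + 0.5 * x[:, 1] > 0).long()

net = nn.Sequential(nn.Linear(10, 16), nn.Tanh(), nn.Linear(16, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = -qmi_ed(net(x), labels, sigma=1.0)  # maximize MI = minimize its negative
    loss.backward()
    opt.step()
```

After training, the 2-D outputs of `net` would serve as the low-dimensional features fed to a downstream classifier; the same objective applies unchanged if the MLP is replaced by a radial basis function network.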
Keywords :
computational complexity; entropy; multilayer perceptrons; optimisation; pattern classification; radial basis function networks; transforms; MI; Renyi entropy; computational resources; joint mutual information; low-dimensional features; maximum mutual information; memory; neural networks; nonlinear dimension reducing discriminative transforms; nonlinear feature transforms; pattern recognition system; raw features; Discrete transforms; Entropy; Error analysis; Multilayer perceptrons; Mutual information; Pattern recognition; Radial basis function networks; Random variables; Uncertainty;
Conference_Title :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.938809