Title :
Neural-network feature selector
Author :
Setiono, Rudy ; Liu, Huan
Author_Institution :
Dept. of Inf. Syst. & Comput. Sci., Nat. Univ. of Singapore, Singapore
fDate :
5/1/1997
Abstract :
Feature selection is an integral part of most learning algorithms. Because data often contain irrelevant and redundant attributes, selecting only the relevant attributes can be expected to yield higher predictive accuracy from a machine learning method. In this paper, we propose the use of a three-layer feedforward neural network to select those input attributes that are most useful for discriminating classes in a given set of input patterns. A network pruning algorithm is the foundation of the proposed method. By adding a penalty term to the error function of the network, redundant connections can be distinguished from relevant ones by their small weights once training is complete. A simple criterion for removing an attribute, based on the accuracy rate of the network, is developed. The network is retrained after each removal, and the selection process is repeated until no remaining attribute meets the criterion for removal. Our experimental results suggest that the proposed method works very well on a wide variety of classification problems.
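The abstract outlines an iterative remove-and-retrain loop: train a penalized three-layer network, test whether dropping an attribute keeps the accuracy acceptable, retrain on the reduced set, and stop when no attribute can be removed. The sketch below illustrates that loop under stated assumptions; it is not the authors' implementation. The network size, learning rate, epoch count, accuracy-drop threshold, and the simple weight-decay term standing in for the paper's penalty function are all assumptions made for the example.

```python
# Illustrative sketch of the remove-and-retrain feature selection loop.
# Assumptions: binary classification, a plain L2 weight-decay penalty as a
# stand-in for the paper's penalty term, and a fixed accuracy-drop threshold.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train(X, y, n_hidden=5, penalty=1e-3, lr=0.1, epochs=2000, seed=0):
    """Train a three-layer (input-hidden-output) network by gradient descent
    on cross-entropy plus a weight-decay penalty; returns the weights."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    t = y.reshape(-1, 1)
    for _ in range(epochs):
        H = sigmoid(X @ W1)                 # hidden-layer activations
        out = sigmoid(H @ W2)               # network output
        d_out = out - t                     # cross-entropy gradient at output
        dW2 = H.T @ d_out / len(X) + penalty * W2
        d_hid = (d_out @ W2.T) * H * (1.0 - H)
        dW1 = X.T @ d_hid / len(X) + penalty * W1
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2


def accuracy(X, y, W1, W2):
    pred = sigmoid(sigmoid(X @ W1) @ W2).ravel() >= 0.5
    return float(np.mean(pred == y.astype(bool)))


def select_features(X, y, max_drop=0.02):
    """Repeatedly drop the attribute whose removal hurts accuracy the least,
    retraining after each removal, until every removal exceeds max_drop."""
    keep = list(range(X.shape[1]))
    W1, W2 = train(X[:, keep], y)
    base_acc = accuracy(X[:, keep], y, W1, W2)
    while len(keep) > 1:
        best_i, best_acc = None, -1.0
        for i in range(len(keep)):
            Xz = X[:, keep].copy()
            Xz[:, i] = 0.0                  # mask one attribute as a removal proxy
            acc = accuracy(Xz, y, W1, W2)
            if acc > best_acc:
                best_i, best_acc = i, acc
        if base_acc - best_acc > max_drop:
            break                           # no attribute meets the removal criterion
        del keep[best_i]
        W1, W2 = train(X[:, keep], y)       # retrain on the reduced attribute set
        base_acc = accuracy(X[:, keep], y, W1, W2)
    return keep


if __name__ == "__main__":
    # Toy data: only the first two attributes carry class information.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 6))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    print("selected attributes:", select_features(X, y))
```

In this sketch an attribute is masked by zeroing its inputs when testing removal, which is a cheap proxy; the paper's criterion is stated in terms of the network's accuracy rate after the attribute is actually removed and the network retrained.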
Keywords :
backpropagation; entropy; feature extraction; feedforward neural nets; minimisation; pattern classification; redundancy; cross entropy; error function; feature selection; input attributes; learning algorithms; multilayer feedforward neural network; network pruning algorithm; penalty term; redundant attribute removal; Accuracy; Backpropagation; Decision trees; Entropy; Feature extraction; Feedforward neural networks; Impurities; Learning systems; Machine learning algorithms; Neural networks
Journal_Title :
IEEE Transactions on Neural Networks