Title :
Effective neural network pruning using cross-validation
Author :
Huynh, Thuan Q. ; Setiono, Rudy
Author_Institution :
School of Computing, National University of Singapore, Singapore
Date :
31 July-4 Aug. 2005
Abstract :
This paper addresses the problem of finding neural networks with an optimal topology so that their generalization capability is maximized. Our approach combines the use of a penalty function during network training with a subset of the training samples held out for cross-validation. The penalty is added to the error function so that the weights of network connections that are not useful have small magnitudes. Such connections can be pruned as long as the resulting accuracy of the network does not deteriorate beyond a preset level. The samples in the cross-validation set determine when pruning should be terminated. Our results on 32 publicly available data sets show that the proposed method outperforms existing neural network and decision tree methods for classification.
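Illustrative sketch :
The following is a minimal, hypothetical sketch of the procedure outlined in the abstract, not the authors' implementation: a single-hidden-layer network is trained with a quadratic weight penalty, then the smallest-magnitude connections are removed one at a time until accuracy on a held-out cross-validation split drops more than a preset tolerance below that of the unpruned network. The toy data, network size, and hyperparameters (lam, lr, tolerance) are assumptions chosen only for illustration.

# Hypothetical illustration only (not the paper's code): prune a penalty-trained
# single-hidden-layer network, smallest-magnitude weight first, and stop when
# accuracy on the cross-validation split falls more than `tolerance` below the
# unpruned network's accuracy.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    H = sigmoid(X @ W1)            # hidden-layer activations
    return sigmoid(H @ W2), H      # network outputs and hidden activations

def accuracy(X, y, W1, W2):
    out, _ = forward(X, W1, W2)
    return np.mean((out.ravel() > 0.5) == y)

def train(X, y, W1, W2, lam=1e-3, lr=0.5, epochs=2000):
    # Batch gradient descent on squared error plus a quadratic weight penalty;
    # the penalty drives the weights of unneeded connections toward small magnitudes.
    for _ in range(epochs):
        out, H = forward(X, W1, W2)
        d_out = (out.ravel() - y)[:, None] * out * (1.0 - out)
        gW2 = H.T @ d_out / len(X) + 2.0 * lam * W2
        d_hid = (d_out @ W2.T) * H * (1.0 - H)
        gW1 = X.T @ d_hid / len(X) + 2.0 * lam * W1
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

# Toy two-class data standing in for one of the benchmark data sets.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
X_tr, y_tr = X[:150], y[:150]      # training split
X_cv, y_cv = X[150:], y[150:]      # cross-validation split

W1 = rng.normal(scale=0.5, size=(4, 6))
W2 = rng.normal(scale=0.5, size=(6, 1))
W1, W2 = train(X_tr, y_tr, W1, W2)

tolerance = 0.02                   # preset allowed drop in accuracy
baseline = accuracy(X_cv, y_cv, W1, W2)

while True:
    # Candidate for removal: the smallest-magnitude nonzero connection.
    candidates = [(abs(W[i, j]), W, i, j)
                  for W in (W1, W2)
                  for i in range(W.shape[0])
                  for j in range(W.shape[1]) if W[i, j] != 0.0]
    if not candidates:
        break
    _, W, i, j = min(candidates, key=lambda c: c[0])
    saved = W[i, j]
    W[i, j] = 0.0                  # tentatively prune the connection
    if accuracy(X_cv, y_cv, W1, W2) < baseline - tolerance:
        W[i, j] = saved            # cross-validation accuracy degraded: undo and stop
        break

print("remaining connections:", int(np.count_nonzero(W1) + np.count_nonzero(W2)))
print("cross-validation accuracy:", accuracy(X_cv, y_cv, W1, W2))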
Keywords :
learning (artificial intelligence); neural nets; decision tree methods; error function; network training; neural network pruning; optimal topology; penalty function; Backpropagation algorithms; Classification tree analysis; Computer networks; Decision trees; Electronic mail; Feedforward neural networks; Heuristic algorithms; Multi-layer neural network; Network topology; Neural networks;
Conference_Title :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
DOI :
10.1109/IJCNN.2005.1555984