DocumentCode :
3598820
Title :
Cross-validation without a validation set in BP-trained neural nets
Author :
Hassoun, Mohamad H. ; Watta, Paul B. ; Shringarpure, Rahul
Author_Institution :
Dept. of Electr. & Comput. Eng., Wayne State Univ., Detroit, MI, USA
Volume :
1
Year :
1995
Firstpage :
369
Abstract :
Generalization in backpropagation-trained multilayer neural networks is discussed for problems where training data are either scarce or extremely costly to obtain. In such cases, the usual method of cross-validation, whereby the data set is partitioned into training, testing, and validation sets, is not feasible. In this paper we demonstrate that it is sometimes possible to use all available data for training a large network (one capable of overfitting the data) and yet still determine an appropriate stopping point to ensure that the network generalizes properly.
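To illustrate the setting the abstract describes, the following minimal NumPy sketch trains a deliberately oversized backprop network on all available data and picks a stopping point from the training error alone. The plateau-based stopping rule, the toy data set, and all parameter values are assumptions for illustration, not necessarily the criterion proposed in the paper.

    # Sketch: backprop training on ALL data, with a stopping point
    # chosen without a validation set.  The plateau heuristic below is
    # an illustrative assumption, not the authors' stated criterion.
    import numpy as np

    rng = np.random.default_rng(0)

    # Scarce, noisy data: 20 samples of a smooth target + Gaussian noise.
    X = np.linspace(-1.0, 1.0, 20).reshape(-1, 1)
    y = np.sin(np.pi * X) + rng.normal(0.0, 0.1, X.shape)

    # Deliberately large hidden layer, so the network CAN overfit.
    n_hidden = 50
    W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

    lr, window, tol = 0.05, 200, 1e-5
    history = []

    for epoch in range(20000):
        # Forward pass: tanh hidden layer, linear output.
        H = np.tanh(X @ W1 + b1)
        out = H @ W2 + b2
        err = out - y
        mse = float(np.mean(err ** 2))
        history.append(mse)

        # Backprop: gradients of the MSE w.r.t. each weight matrix.
        g_out = 2.0 * err / len(X)
        gW2 = H.T @ g_out
        gb2 = g_out.sum(0)
        g_hid = (g_out @ W2.T) * (1.0 - H ** 2)
        gW1 = X.T @ g_hid
        gb1 = g_hid.sum(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

        # Stopping rule from training error only: halt once the
        # improvement over the last `window` epochs is negligible, on
        # the assumption that later epochs mostly fit the noise.
        if epoch > window and history[-window] - mse < tol:
            print(f"stopping at epoch {epoch}, training MSE {mse:.5f}")
            break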
Keywords :
backpropagation; generalisation (artificial intelligence); multilayer perceptrons; backpropagation-trained multilayer neural networks; cross-validation; generalization; Computer networks; Data engineering; Function approximation; Gaussian noise; Intelligent networks; Laboratories; Multi-layer neural network; Neural networks; Testing; Training data
Language :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the IEEE International Conference on Neural Networks, 1995
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.488127
Filename :
488127