Title :
The influence of training sets on generalization in feed-forward neural networks
Author :
Wann, M. ; Hediger, T. ; Greenbaum, N.N.
Abstract :
A nontrivial computation task, recognizing whether a 5-bit string contains two or more clumps, is used to illustrate the important influence of training-set selection on the generalization properties of back-propagation networks. For this problem, the input patterns can be clustered into four groups indexed by their distances from the class boundary. With various combinations of these groups, the authors constructed training sets ranging from those containing only typical patterns of each class to those containing only border patterns. A series of simulation experiments was carried out to study the generalization capability of networks trained with these sets. The results are consistent with the following conclusions: (1) larger training sets do not guarantee better generalization performance; (2) there exists a proper subset of border patterns that constitutes a critical training set for perfect generalization; and (3) a network trained with an arbitrary subset of the border set is not necessarily a better performer than one trained with a typical or other collection of input patterns.
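The two-or-more-clumps predicate on 5-bit strings can be sketched as follows. This is a minimal illustration, not the authors' code; the helper name `clump_count` is our own, and a clump is taken to mean a maximal run of consecutive 1s.

```python
from itertools import product

def clump_count(bits):
    """Count maximal runs ('clumps') of 1s in a bit string."""
    # A clump starts wherever a '1' is not preceded by another '1'.
    return sum(1 for i, b in enumerate(bits)
               if b == "1" and (i == 0 or bits[i - 1] == "0"))

# Enumerate all 32 five-bit strings and label each one:
# class 1 = "two or more clumps", class 0 = otherwise.
patterns = ["".join(p) for p in product("01", repeat=5)]
labels = {p: int(clump_count(p) >= 2) for p in patterns}
```

For example, `clump_count("10110")` is 2, so that pattern belongs to the positive class; the 32 patterns split evenly, 16 per class.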
Keywords :
learning systems; neural nets; back-propagation networks; feed-forward neural networks; generalization properties; nontrivial computation task; simulation experiments; training sets;
Conference_Title :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137836