DocumentCode :
3259023
Title :
A growing network that optimizes between undertraining and overtraining
Author :
Chakraborty, Goutam ; Murakami, Mitsuru ; Shiratori, Norio ; Noguchi, Shoichi
Author_Institution :
Aizu Univ., Fukushima, Japan
Volume :
2
fYear :
1995
fDate :
Nov/Dec 1995
Firstpage :
1116
Abstract :
A feedforward network classifier trained with a finite set of available samples attempts to properly estimate the class boundaries in the input feature space, so that the network can then classify unknown new samples with some confidence. For problems of practical interest, the complexity of the class boundaries, noise in the available sample set, and the finite (usually inadequate) number of available samples make it hard to ensure (1) the optimum network size with respect to the available sample set of unknown distribution and noise characteristics, and (2) the optimum values of the network parameters. Several researchers have tried to estimate a bound on network size as a function of sample size, or to estimate the prediction error in terms of the number of model parameters and the size of the available sample set. Other, more practical approaches either start with a small network and grow it to the proper size, or start with a large network and prune it optimally; these methods are tied to particular network structures. The authors propose a criterion for ascertaining the network size that maximizes generalization as well as correct classification, and an algorithm to grow the network to that size. The algorithm is generic and applicable to any feedforward network. In this paper the authors work with a special variety of Φ-network, proposed earlier
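The growing strategy described in the abstract can be illustrated with a generic sketch: train networks of increasing hidden-layer size and keep the size whose held-out (validation) error is lowest, so the model stops growing before it overtrains. This is not the authors' Φ-network algorithm; the toy data, network, and stopping rule below are all assumptions chosen to make the idea concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-class toy problem: points inside vs. outside a disc.
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(float)
X_tr, y_tr = X[:300], y[:300]   # training set
X_va, y_va = X[300:], y[300:]   # held-out validation set

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(n_hidden, epochs=2000, lr=0.5):
    """Train a one-hidden-layer net by plain gradient descent;
    return (training error, validation error)."""
    W1 = rng.normal(0, 0.5, (2, n_hidden))
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(0, 0.5, n_hidden)
    b2 = 0.0
    n = len(X_tr)
    for _ in range(epochs):
        h = sigmoid(X_tr @ W1 + b1)            # hidden activations
        p = sigmoid(h @ w2 + b2)               # output probability
        d2 = (p - y_tr) / n                    # cross-entropy output gradient
        dh = np.outer(d2, w2) * h * (1 - h)    # back-propagated hidden gradient
        w2 -= lr * (h.T @ d2); b2 -= lr * d2.sum()
        W1 -= lr * (X_tr.T @ dh); b1 -= lr * dh.sum(axis=0)

    def err(Xs, ys):
        p = sigmoid(sigmoid(Xs @ W1 + b1) @ w2 + b2)
        return float(np.mean((p > 0.5) != ys))

    return err(X_tr, y_tr), err(X_va, y_va)

# Grow the hidden layer and select the size with the best generalization,
# i.e. the size that sits between undertraining and overtraining.
best_size, best_val = None, np.inf
for n_hidden in range(1, 11):
    tr_err, va_err = train(n_hidden)
    if va_err < best_val:
        best_size, best_val = n_hidden, va_err
print(f"selected hidden size: {best_size} (validation error {best_val:.3f})")
```

The validation set stands in for the "unknown new samples" of the abstract: training error alone keeps falling as the network grows, so size is chosen by the held-out error instead.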
Keywords :
feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); pattern classification; Φ-network; feedforward network classifier; generalization; growing network; optimum network size; overtraining; undertraining; Artificial neural networks; Character recognition; Cost function; Electronic mail; Feeds; Learning systems; Predictive models;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 1995 IEEE International Conference on Neural Networks
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.487579
Filename :
487579