DocumentCode :
2697749
Title :
Descending epsilon in back-propagation: a technique for better generalization
Author :
Yu, Yeong-Ho ; Simmons, R.F.
Year :
1990
Date :
17-21 June 1990
Firstpage :
167
Abstract :
There are two measures of the optimality of a trained feedforward network for given training patterns: the global error function and the correctness ratio. The authors argue that these two measures are not parallel and present a technique, called descending epsilon, with which back-propagation achieves a high correctness ratio. It is shown that, with this technique, trained networks often exhibit high correctness ratios not only on the training patterns but also on novel patterns.
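The sketch below illustrates one plausible reading of the descending-epsilon idea described in the abstract, not the authors' actual algorithm: an output already within epsilon of its target is treated as correct and contributes no error signal during back-propagation, and epsilon is lowered as training proceeds. The network size, learning rate, epsilon schedule, and XOR task are all illustrative assumptions.

```python
# Minimal sketch of a descending-epsilon back-propagation loop (assumed
# interpretation; hyperparameters and task are illustrative, not from the paper).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)          # XOR targets

W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)        # hidden layer
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)        # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr, epsilon, eps_min, decay = 0.5, 0.4, 0.05, 0.999      # assumed schedule

for epoch in range(20000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # descending epsilon: outputs within epsilon of the target count as
    # correct and are excluded from the error signal
    err = T - Y
    err[np.abs(err) < epsilon] = 0.0

    # standard back-propagation of the masked error
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ dY; b2 += lr * dY.sum(0)
    W1 += lr * X.T @ dH; b1 += lr * dH.sum(0)

    epsilon = max(eps_min, epsilon * decay)               # tighten the tolerance

# correctness ratio: fraction of outputs on the right side of 0.5
correct = np.abs(T - Y) < 0.5
print(f"correctness ratio: {correct.mean():.2f}, final epsilon: {epsilon:.3f}")
```

Under this reading, early training pursues coarse correctness on every pattern rather than minimizing the global squared error, and the shrinking epsilon progressively demands finer agreement with the targets.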
Keywords :
learning systems; neural nets; back-propagation; better generalization; correctness ratio; descending epsilon; global error function; optimality; trained feedforward network; training patterns;
Language :
English
Publisher :
IEEE
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137840
Filename :
5726798