DocumentCode :
286888
Title :
Improved generalization in multi-layer perceptrons with the log-likelihood cost function
Author :
Holt, Murray J.J.
Author_Institution :
Dept. of Electron. & Electr. Eng., Loughborough Univ. of Technol., UK
fYear :
1991
fDate :
22 Nov. 1991
Firstpage :
8/1
Lastpage :
8/4
Abstract :
In supervised training of neural networks, synaptic weights are usually updated by an iterative algorithm that searches for the minimum of some cost function. The most common choice of cost function is the sum of squares (SS); an alternative is the log likelihood (LL). An analytical comparison of SS and LL suggests that the latter should lead to improved generalization when a multi-layer perceptron is trained on non-separable data by back-propagation. This is confirmed by results on simulated data, where LL-trained networks yield a significant reduction in test-set errors. Moreover, a good generalizing solution appears to be achievable in fewer iterations, with a smaller network, or with fewer training samples.
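As an illustrative sketch only (not code from the paper; the function names, the NumPy formulation, and the epsilon guard are assumptions), the two cost functions compared in the abstract can be written as follows, for a sigmoid network output y in (0, 1) and a binary target t:

import numpy as np

def sum_of_squares(y, t):
    """SS cost: 0.5 * sum over outputs of (y - t)^2."""
    return 0.5 * np.sum((y - t) ** 2)

def log_likelihood(y, t):
    """LL (cross-entropy) cost: -sum of t*log(y) + (1-t)*log(1-y)."""
    eps = 1e-12                      # guard against log(0) at saturated outputs
    y = np.clip(y, eps, 1.0 - eps)
    return -np.sum(t * np.log(y) + (1.0 - t) * np.log(1.0 - y))

# For a sigmoid output unit y = 1 / (1 + exp(-a)), the back-propagated
# error signal dC/da differs between the two costs:
#   SS: (y - t) * y * (1 - y)   -- vanishes as the unit saturates
#   LL: (y - t)                 -- stays proportional to the output error
y = np.array([0.9, 0.2, 0.7])
t = np.array([1.0, 0.0, 1.0])
print(sum_of_squares(y, t), log_likelihood(y, t))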
Keywords :
iterative methods; learning systems; neural nets; analytical comparison; back-propagation; fewer iterations; fewer training samples; iterative algorithm; log-likelihood cost function; multi-layer perceptrons; neural network training; non-separable data; smaller network; sum of squares; synaptic weights; test-set errors;
fLanguage :
English
Publisher :
IET
Conference_Title :
Adaptive Filtering, Non-Linear Dynamics and Neural Networks, IEE Colloquium on
Conference_Location :
London
Type :
conf
Filename :
263737