DocumentCode :
3057018
Title :
Comparison of generalization in multi-layer perceptrons with the log-likelihood and least-squares cost functions
Author :
Holt, Murray J. J.
Author_Institution :
Dept. of Electron. & Electr. Eng., Loughborough Univ. of Technol., UK
fYear :
1992
fDate :
30 Aug-3 Sep 1992
Firstpage :
17
Lastpage :
20
Abstract :
The log-likelihood cost function is discussed as an alternative to the least-squares criterion for training feedforward neural networks. An analysis is presented which suggests how the use of this function can improve convergence and generalization. Tests on simulated data using both training algorithms provide evidence of improved generalization with the log-likelihood cost function.
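The paper itself is not reproduced here, but the contrast the abstract draws can be sketched numerically. The snippet below (an illustration assumed from standard theory, not code from the paper) compares the two cost functions for a single sigmoid output unit: under least squares, the gradient with respect to the pre-activation carries a factor y(1 - y) that vanishes when the unit saturates, whereas the log-likelihood (cross-entropy) gradient reduces to y - t and keeps a usable error signal.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation of a single output unit."""
    return 1.0 / (1.0 + np.exp(-z))

def least_squares(y, t):
    """E = 0.5 * (y - t)^2 for output y and binary target t."""
    return 0.5 * (y - t) ** 2

def log_likelihood(y, t):
    """Negative log-likelihood (cross-entropy) for a Bernoulli target."""
    return -(t * np.log(y) + (1.0 - t) * np.log(1.0 - y))

def grads_wrt_preactivation(z, t):
    """Gradients of each cost w.r.t. the pre-activation z.

    Least squares: dE/dz = (y - t) * y * (1 - y)  -- the y(1-y)
    factor shrinks toward 0 as the unit saturates.
    Log-likelihood: dE/dz = y - t  -- no saturation factor.
    """
    y = sigmoid(z)
    g_ls = (y - t) * y * (1.0 - y)
    g_ll = y - t
    return g_ls, g_ll
```

For a badly saturated unit with z = -10 and target t = 1, the least-squares gradient is on the order of 1e-5 while the log-likelihood gradient remains close to -1, which is one mechanism behind the improved convergence the abstract describes.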
Keywords :
feedforward neural nets; learning systems; probability; convergence; feedforward neural networks; log likelihood cost function; multilayer perceptrons; Analytical models; Cost function; Electronic switching systems; Feedforward neural networks; Iterative algorithms; Maximum likelihood estimation; Multilayer perceptrons; Neural networks; Random variables; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 11th IAPR International Conference on Pattern Recognition, 1992. Vol. II, Conference B: Pattern Recognition Methodology and Systems
Conference_Location :
The Hague
Print_ISBN :
0-8186-2915-0
Type :
conf
DOI :
10.1109/ICPR.1992.201712
Filename :
201712