DocumentCode
328276
Title
A regularization method for the minimum estimation error
Author
Yamada, Miki
Author_Institution
Adv. Res. Lab., Toshiba Corp., Kawasaki, Japan
Volume
1
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
497
Abstract
A new regularization cost function for generalization is proposed. The cost function is derived from the maximum likelihood method using a modified sample distribution, and it consists of a sum of squared errors plus a stabilizer in the form of an integrated squared derivative. The regularization parameters that give the minimum estimation error can be obtained nonempirically. Numerical simulation shows that this cost function predicts the true error accurately and is effective in neural network learning.
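The cost function described in the abstract has the general shape of a squared-error data-fit term plus a derivative-based stabilizer weighted by a regularization parameter. The sketch below only illustrates that shape and is not the paper's method: the toy predictor, the integration grid, and the value of lam are assumptions introduced here, and the paper's nonempirical choice of the regularization parameter is not implemented.

```python
# Minimal illustrative sketch (assumptions noted above, not the paper's code):
# cost = sum of squared errors + lam * integral of (d prediction / dx)^2.
import numpy as np

def regularized_cost(predict, x_data, y_data, lam, x_grid):
    """Sum of squared errors plus lam times the integrated squared derivative."""
    # Data-fit term: sum of squared errors on the training sample.
    sse = np.sum((y_data - predict(x_data)) ** 2)

    # Stabilizer: squared derivative of the prediction, integrated over x_grid
    # with a finite-difference derivative and trapezoidal quadrature.
    y_grid = predict(x_grid)
    dydx = np.gradient(y_grid, x_grid)
    stabilizer = np.trapz(dydx ** 2, x_grid)

    return sse + lam * stabilizer

# Toy usage: noisy 1-D data and a fixed quadratic stand-in for a trained network.
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 1.0, 20)
y_data = np.sin(2 * np.pi * x_data) + 0.1 * rng.standard_normal(20)
predict = lambda x: 3.0 * x * (1.0 - x)   # hypothetical predictor, for illustration
x_grid = np.linspace(0.0, 1.0, 200)
print(regularized_cost(predict, x_data, y_data, lam=0.01, x_grid=x_grid))
```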
Keywords
error analysis; generalisation (artificial intelligence); learning (artificial intelligence); maximum likelihood estimation; minimisation; neural nets; cost function; generalization; integrated square derivative; maximum likelihood method; minimum estimation error; neural network learning; regularization; sample distribution; square errors; Cost function; Density functional theory; Equations; Estimation error; Kernel; Mean square error methods; Numerical simulation; Taylor series;
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.713962
Filename
713962