Title :
A note on the effective number of parameters in nonlinear learning systems
Author :
Mao, Jianchang ; Jain, Anil K.
Author_Institution :
IBM Almaden Res. Center, San Jose, CA, USA
Abstract :
Moody's notion of the effective number of parameters in a nonlinear learning system has been used to study the generalization ability of feedforward neural networks. It is more meaningful than the number of free parameters in a nonlinear learning system because it explicitly relates the generalization error to the expected training set error. In this paper, we extend Moody's model to a more general noise model. We show that adding noise both to the sampling points of the test data and to the observations increases the deviation of the expected test set mean-squared error (MSE) from the expected training set MSE, and also increases the effective number of parameters. Our extension makes less restrictive assumptions about the data generation process than Moody's original model. Monte Carlo experiments have been conducted to verify our extension and to demonstrate the role of weight-decay regularization in improving the generalization ability of feedforward networks.
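A minimal Monte Carlo sketch of the idea behind Moody's relation, using the special case of a linear model with weight decay (ridge regression), where the effective number of parameters has the closed form p_eff = trace(X (X^T X + lam I)^-1 X^T); the relation predicted is E[test MSE] ≈ E[train MSE] + 2 sigma^2 p_eff / n. The code, variable names, and parameter values below are illustrative assumptions, not the paper's actual experimental setup (which uses feedforward networks and a more general noise model).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed values, not from the paper):
n, p, sigma, lam, trials = 50, 20, 0.5, 1.0, 2000
w_true = rng.normal(size=p)
X = rng.normal(size=(n, p))                                # fixed sampling points

# Ridge "hat" matrix and its trace, the effective number of parameters.
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
p_eff = np.trace(H)

train_mse, test_mse = [], []
for _ in range(trials):
    y = X @ w_true + sigma * rng.normal(size=n)            # noisy training targets
    y_new = X @ w_true + sigma * rng.normal(size=n)        # fresh noise, same inputs
    y_hat = H @ y                                          # weight-decay (ridge) fit
    train_mse.append(np.mean((y - y_hat) ** 2))
    test_mse.append(np.mean((y_new - y_hat) ** 2))

gap = np.mean(test_mse) - np.mean(train_mse)
print(f"p_eff                   = {p_eff:.2f}")
print(f"observed test-train gap = {gap:.4f}")
print(f"predicted 2*s^2*p_eff/n = {2 * sigma**2 * p_eff / n:.4f}")
```

Increasing the weight-decay parameter lam shrinks p_eff and, per the relation above, the expected gap between test and training MSE, which is the mechanism by which weight decay improves generalization in this simplified setting.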
Keywords :
error analysis; feedforward neural nets; generalisation (artificial intelligence); learning systems; noise; optimisation; parameter estimation; Moody model; feedforward neural networks; generalization; mean-squared-error; noise model; nonlinear learning systems; sampling points; test data; weight-decay regularization; Backpropagation algorithms; Computer errors; Feedforward neural networks; Iterative algorithms; Learning systems; Neural networks; Supervised learning; Testing
Conference_Title :
International Conference on Neural Networks (ICNN), 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.616172