Title :
Optimally generalizing neural networks
Author :
Ogawa, H. ; Oja, E.
Author_Institution :
Dept. of Comput. Sci., Tokyo Inst. of Technol., Japan
Abstract :
The problem of approximating a real function $f$ of $L$ variables, given only its values $y_1, \ldots, y_M$ at a small set of sample points $x_1, \ldots, x_M \in \mathbb{R}^L$, is studied in the context of multilayer neural networks. Using the theory of reproducing kernels of Hilbert spaces, it is shown that this problem is the inverse problem of a linear model relating the values $y_m$ to the function $f$ itself. The authors consider the least-mean-square training criterion for nonlinear multilayer neural network architectures that learn the training set completely. The generalization property of a neural network is defined in terms of function reconstruction, and the concept of the optimally generalizing neural network (OGNN) is proposed: a network that minimizes a criterion given in terms of the true error between the original function $f$ and the reconstruction $f_1$ in the function space, rather than the error at the sample points only. As an example of the OGNN, a projection filter (PF) criterion is considered and the PFGNN is introduced. The network is of the two-layer nonlinear-linear type.
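A minimal sketch of the reconstruction view described above, assuming a Gaussian reproducing kernel, a one-dimensional target function, and random sample points (all illustrative choices, not the paper's construction): the minimum-norm interpolant of the samples is the orthogonal projection of $f$ onto the span of the kernel functions, a projection-filter-style solution, and its quality is judged by the true error over the whole domain rather than the error at the sample points.

```python
# Illustrative sketch only: kernel, target function f, and sample
# locations are assumptions, not the setup used in the paper.
import numpy as np

def gauss_kernel(x, c, width=0.5):
    """Reproducing kernel K(x, c) of an (assumed) Gaussian RKHS."""
    return np.exp(-((x - c) ** 2) / (2.0 * width ** 2))

rng = np.random.default_rng(0)
f = lambda x: np.sin(2.0 * np.pi * x)   # the unknown function f (assumed)
xm = rng.uniform(0.0, 1.0, size=8)      # sample points x_1, ..., x_M
ym = f(xm)                              # observed values y_m = f(x_m)

# Gram matrix G[m, n] = K(x_m, x_n); solving G a = y gives the
# minimum-norm interpolant f1(x) = sum_m a_m K(x, x_m), i.e. the
# orthogonal projection of f onto the span of the kernel functions.
G = gauss_kernel(xm[:, None], xm[None, :])
a = np.linalg.solve(G, ym)

def f1(x):
    return gauss_kernel(x[:, None], xm[None, :]) @ a

# The network "learns the training set completely": the error at the
# sample points is (numerically) zero. Generalization is measured by
# the true error between f and f1 over the function's domain instead.
grid = np.linspace(0.0, 1.0, 1000)
print("sample-point error:", np.max(np.abs(f1(xm) - ym)))
print("true error (sup over grid):", np.max(np.abs(f1(grid) - f(grid))))
```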
Keywords :
function approximation; learning systems; neural nets; Hilbert spaces; least-mean-square training criterion; nonlinear multilayer neural network architectures; optimally generalizing neural network; projection filter criterion; Computer science; Hilbert space; Inverse problems; Kernel; Least squares approximation; Multi-layer neural network; Multilayer perceptrons; Neural networks; Sampling methods; Space technology;
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170648