DocumentCode
3442152
Title
Minimal training set size estimation for neural network-based function approximation
Author
Malinowski, Aleksander ; Zurada, Jacek M. ; Aronhime, Peter B.
Author_Institution
Dept. of Electr. Eng., Louisville Univ., KY, USA
Volume
6
fYear
1994
fDate
30 May-2 Jun 1994
Firstpage
403
Abstract
A new approach to the problem of n-dimensional continuous and sampled-data function approximation using a two-layer neural network is presented. The generalized Nyquist theorem is introduced to determine the optimum number of training examples in the n-dimensional input space. Choosing the smallest set of training vectors that is still sufficient reduces the network's learning time. Analytical formulas and an algorithm for training set size reduction are developed and illustrated with two-dimensional data examples.
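To illustrate the sampling idea summarized above, the following is a minimal sketch (not the authors' analytical formulas or algorithm) of how a Nyquist-rate bound can translate into a minimal training-set size: assuming the highest significant spatial frequency of the target function along each input axis is known or has been estimated (e.g., via an FFT of sampled data), the per-axis sample counts and their product follow from the Nyquist criterion. All names and the example numbers below are hypothetical.

# Minimal sketch, assuming known per-axis maximum frequencies (Python).
import math

def nyquist_training_set_size(f_max, ranges):
    """f_max: per-dimension highest significant frequency (cycles per unit).
    ranges: per-dimension (low, high) input bounds.
    Returns per-axis sample counts and their product (grid size)."""
    counts = []
    for fm, (lo, hi) in zip(f_max, ranges):
        # Nyquist criterion: sample spacing <= 1 / (2 * fm), so at least
        # 2 * fm * (hi - lo) intervals (plus one endpoint) are needed.
        counts.append(math.ceil(2.0 * fm * (hi - lo)) + 1)
    total = math.prod(counts)
    return counts, total

# Example (hypothetical numbers): a 2-D target with f_max = 1.5 cycles/unit
# on x and 0.5 on y, both on [0, 4], needs a 13 x 5 = 65-point training grid
# at the Nyquist rate.
print(nyquist_training_set_size([1.5, 0.5], [(0.0, 4.0), (0.0, 4.0)]))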
Keywords
function approximation; learning (artificial intelligence); minimisation; neural nets; sampled data systems; Nyquist theorem; algorithm; continuous data; learning time; minimal training set size; sampled data; two-dimensional data; two-layer neural network; Electronic mail; Fourier transforms; Frequency estimation; Function approximation; Multi-layer neural network; Multidimensional systems; Neural networks; Sampling methods; Signal restoration; Training data
fLanguage
English
Publisher
IEEE
Conference_Title
1994 IEEE International Symposium on Circuits and Systems (ISCAS '94)
Conference_Location
London
Print_ISBN
0-7803-1915-X
Type
conf
DOI
10.1109/ISCAS.1994.409611
Filename
409611