DocumentCode :
3147067
Title :
On the number of training points needed for adequate training of feedforward neural networks
Author :
Hashemi, K.S. ; Thomas, R.J.
Author_Institution :
Sch. of Electr. Eng., Cornell Univ., Ithaca, NY, USA
fYear :
1991
fDate :
23-26 Jul 1991
Firstpage :
232
Lastpage :
236
Abstract :
The authors address the problem of training neural networks to act as approximations of continuous mappings. In the case where the only representation of the mapping within the training process is a finite set of training points, they show that for this set to provide an adequate representation of the mapping, the number of points it contains must rise at least exponentially with the dimension of the input space. Consequently, the time taken to train the networks also rises at least exponentially with the input dimension. They conclude that if the only available training algorithms rely on a finite training set, then applying neural networks to the approximation problem is impractical whenever the input dimension is large. By extrapolating their experimental results, they estimate that 'large' in this respect means 'greater than ten'.
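The exponential scaling can be illustrated with a generic counting argument (a minimal sketch, not the authors' derivation or experiment): covering the unit hypercube [0, 1]^d with a uniform training grid at a fixed per-axis resolution eps already requires (ceil(1/eps))^d points, a count that grows exponentially with the input dimension d.

# Minimal sketch (assumed illustration, not the paper's method): count the
# grid points needed to sample the unit hypercube [0, 1]^d at a fixed
# per-axis resolution eps. The count (1/eps)^d grows exponentially with
# the input dimension d, which is the scaling behaviour the abstract
# describes.
import math

def grid_points_needed(dim: int, eps: float = 0.1) -> int:
    """Points in a uniform grid with spacing eps per axis in dim dimensions."""
    per_axis = math.ceil(1.0 / eps)
    return per_axis ** dim

for d in (1, 2, 5, 10, 11):
    print(f"d = {d:2d}: {grid_points_needed(d):,} training points")
# At eps = 0.1, d = 10 already needs 10**10 points, which illustrates why
# input dimensions above roughly ten quickly become impractical.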
Keywords :
approximation theory; feedforward neural nets; learning (artificial intelligence); AI; algorithms; approximations; continuous mappings; feedforward neural networks; input space; learning; training points; Acceleration; Algorithm design and analysis; Approximation algorithms; Feedforward neural networks; Mean square error methods; Neural networks; Terminology;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the First International Forum on Applications of Neural Networks to Power Systems, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0065-3
Type :
conf
DOI :
10.1109/ANN.1991.213472
Filename :
213472