DocumentCode :
2615947
Title :
Can multilayer mapping networks with finite number of real parameters harness the computational power of Kolmogorov's representation theorem?
Author :
Kowalczyk, Adam
Author_Institution :
Telecom Australia Res. Lab., Clayton, Vic., Australia
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
2722
Abstract :
It is shown that any continuous function on any compact subset of n-dimensional space can be computed by a simple feedforward neural network after adjusting a single real parameter. For a finite subset, even a single-unit network with continuous dependence on the parameter suffices. However, these dependencies on the parameter cannot be made continuously differentiable, so typical gradient descent methods for parameter adjustment cannot be used. It is shown that the plausible and practical assumption of continuously differentiable dependence of a neural network on its parameters introduces hidden limitations on the capabilities of such a structure.
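To make the finite-subset claim concrete, the following Python sketch (an illustration only, not the construction used in the paper) packs a finite table of target outputs into separate decimal-digit blocks of a single real parameter and reads them back. The decoded outputs depend on that parameter in a way that is not continuously differentiable, which echoes the abstract's point that gradient-based adjustment of such a parameter carries no useful information.

from fractions import Fraction

def encode(targets, digits=4):
    """Pack each target in [0, 1) into its own block of decimal digits of one rational theta."""
    theta = Fraction(0)
    for k, t in enumerate(targets):
        block = round(t * (10**digits - 1))                 # quantise target to `digits` decimal digits
        theta += Fraction(block, 10 ** (digits * (k + 1)))  # shift into the k-th digit block
    return theta

def decode(theta, index, digits=4):
    """Read back the target stored in the `index`-th digit block of theta."""
    block = int(theta * 10 ** (digits * (index + 1))) % (10**digits)
    return block / (10**digits - 1)

if __name__ == "__main__":
    targets = [0.25, 0.7, 0.05]                      # finite set of desired outputs
    theta = encode(targets)
    recovered = [decode(theta, i) for i in range(len(targets))]
    print(float(theta), recovered)                   # recovered values match the targets up to quantisation

The digit-extraction step makes the map from theta to each output piecewise constant with jumps, so its derivative is zero almost everywhere and undefined at the jumps; a gradient step on theta therefore cannot improve a fit, which is the practical obstacle the abstract identifies.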
Keywords :
neural nets; set theory; Kolmogorov's representation theorem; continuously differentiable dependence; multilayer mapping networks; neural network; real parameters; Australia; Computer networks; Feedforward neural networks; History; Laboratories; Multi-layer neural network; Multilayer perceptrons; Neural networks; Nonhomogeneous media; Telecommunication computing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170280
Filename :
170280