Title :
Can multilayer mapping networks with finite number of real parameters harness the computational power of Kolmogorov's representation theorem?
Author_Institution :
Telecom Australia Res. Lab., Clayton, Vic., Australia
Abstract :
It is shown that any continuous function on any compact subset of n-dimensional space can be computed by a simple feedforward neural network after adjusting a single real parameter. For a finite subset, even a single-unit network with continuous dependence on the parameter suffices. However, these dependencies on the parameters cannot be made continuously differentiable, so typical gradient-descent methods for parameter adjustment cannot be used. It is shown that the plausible and practical assumption of continuously differentiable dependence of a neural network on its parameters introduces hidden limitations on the capabilities of such a structure.
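For reference, the representation theorem invoked in the title states, in its standard textbook formulation (which may differ in notation and constants from the paper itself), that every continuous function $f$ on the $n$-dimensional unit cube $[0,1]^n$ admits the representation

\[
  f(x_1,\dots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{q,p}(x_p) \right),
\]

where the inner functions $\psi_{q,p}$ are continuous and independent of $f$, and only the outer functions $\Phi_q$ depend on $f$. The abstract's point is that such representations, while exact, need not depend smoothly on their adjustable parameters.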
Keywords :
neural nets; set theory; Kolmogorov's representation theorem; continuously differentiable dependence; multilayer mapping networks; neural network; real parameters; Australia; Computer networks; Feedforward neural networks; History; Laboratories; Multi-layer neural network; Multilayer perceptrons; Neural networks; Nonhomogeneous media; Telecommunication computing;
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170280