Title :
On function approximators implementable as layered neural networks
Author_Institution :
Research Institute for Informatics, Bucharest, Romania
Abstract :
The paper deals with the approximation of continuous functions by feedforward neural networks. The first part presents some main results of Y. Ito (1992) and of P. Cardaliaguet and G. Euvrard (1992) regarding universal approximators implementable as four-layer neural networks. The second part presents an explicit formula, similar to Cybenko's expression, for approximating a continuous multivariate function, using a characteristic function as a particular bell-shaped function in place of the sigmoidal function. This approximation formula is implementable as a three-layer feedforward neural network that, surprisingly, has in its hidden layer the same number of neurons as the Ito and Cardaliaguet-Euvrard four-layer neural networks have in their second hidden layer.
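A minimal sketch of the kind of formula described above (an illustration only; the paper's exact construction, partition, and constants are assumptions here): Cybenko's expression approximates a continuous f on [0,1]^d by
  G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma(w_j^{\top} x + \theta_j),
with a sigmoidal \sigma. Replacing \sigma by the characteristic function \chi_{C_k} of a cell C_k in a partition of [0,1]^d into N small hypercubes gives the piecewise-constant approximant
  F(x) = \sum_{k=1}^{N} f(x_k) \, \chi_{C_k}(x),  with sample points  x_k \in C_k,
which converges uniformly to f as the cells shrink and is realizable with one hidden unit per cell, i.e. N hidden neurons.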
Keywords :
feedforward neural nets; function approximation; Cybenko expression; continuous functions; feedforward neural networks; function approximators; layered neural networks; neurons; universal approximators; Hypercubes; Informatics; Multi-layer neural network; Neural networks; Polynomials
Conference_Title :
Proceedings of the 24th Euromicro Conference, 1998
Conference_Location :
Västerås
Print_ISBN :
0-8186-8646-4
DOI :
10.1109/EURMIC.1998.708085