Title :
On the (1+1/2) layer neural networks as universal approximators
Author :
Ciuca, I.; Ware, J.A.
Author_Institution :
Res. Inst. for Inf., Bucharest, Romania
Abstract :
This paper deals with the approximation of continuous functions by feedforward neural networks. After presenting one of Ito's main results, the paper seeks a universal approximator implementable as a (1+1/2) layer neural network that uses Heaviside functions as the univariate functions. An explicit formula for function approximation is presented that can be implemented as a three-layer feedforward neural network rather than a four-layer one. These three-layer feedforward networks have the same number of neurons in their hidden layer as the equivalent four-layer networks have in their second hidden layer.
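As an illustrative aside, not taken from the paper, the sketch below shows the basic building block the abstract refers to: a single hidden layer of Heaviside units approximating a continuous one-variable function by a staircase of step functions. The target function, the grid of thresholds, and the weighting scheme are assumptions chosen for this example; the paper's own construction concerns the (1+1/2) layer / three-layer case.

```python
import numpy as np

def heaviside(z):
    """Heaviside step activation: 0 for z < 0, 1 for z >= 0."""
    return (z >= 0).astype(float)

def build_step_approximator(f, a, b, n_hidden):
    """Approximate a continuous f on [a, b] with one hidden layer of
    Heaviside units: f(x) ~ f(a) + sum_k w_k * H(x - t_k), where the
    output weights w_k are the increments of f between consecutive
    thresholds t_k (a staircase approximation)."""
    t = np.linspace(a, b, n_hidden + 1)   # thresholds on [a, b]
    w = f(t[1:]) - f(t[:-1])              # output-layer weights (increments of f)
    bias = f(t[0])                        # output bias

    def net(x):
        x = np.atleast_1d(x)
        # hidden layer: one Heaviside unit per threshold t_k
        h = heaviside(x[:, None] - t[1:][None, :])
        return bias + h @ w

    return net

if __name__ == "__main__":
    f = np.sin
    net = build_step_approximator(f, 0.0, 2 * np.pi, n_hidden=200)
    xs = np.linspace(0.0, 2 * np.pi, 1000)
    print(f"max error with 200 Heaviside units: {np.max(np.abs(net(xs) - f(xs))):.4f}")
```

The error of such a staircase shrinks with the modulus of continuity of f as the number of hidden units grows, which is the univariate mechanism behind Heaviside-based universal approximation; the multivariate three-layer formula discussed in the paper builds on this idea.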
Keywords :
feedforward neural nets; function approximation; multilayer perceptrons; (1+1/2) layer neural networks; Heaviside functions; continuous functions; three-layer feedforward neural network; universal approximators
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings, IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.685947