DocumentCode :
3289616
Title :
Universal approximation using feedforward networks with non-sigmoid hidden layer activation functions
Author :
Stinchcombe, Maxwell ; White, Halbert
Author_Institution :
Dept. of Econ., California Univ., San Diego, CA, USA
fYear :
1989
fDate :
0-0 1989
Firstpage :
613
Abstract :
K.M. Hornik, M. Stinchcombe, and H. White (Univ. of California at San Diego, Dept. of Economics Discussion Paper, June 1988; to appear in Neural Networks) showed that multilayer feedforward networks with as few as one hidden layer, no squashing at the output layer, and an arbitrary sigmoid activation function at the hidden layer are universal approximators: they are capable of arbitrarily accurate approximation to arbitrary mappings, provided sufficiently many hidden units are available. The present authors obtain identical conclusions but do not require the hidden-unit activation to be sigmoid. Instead, it can be a rather general nonlinear function. Thus, multilayer feedforward networks possess universal approximation capabilities by virtue of the presence of intermediate layers with sufficiently many parallel processors; the properties of the intermediate-layer activation function are not so crucial. In particular, sigmoid activation functions are not necessary for universal approximation.
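The single-hidden-layer setup described in the abstract (random nonlinear hidden units, a linear output layer with no squashing) can be illustrated with a minimal numerical sketch. This is not the paper's construction: it uses cosine hidden activations (one of many non-sigmoid choices) with randomly drawn input weights, fitting only the output weights by least squares, and the target function, weight scales, and unit counts below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(3 * x) + 0.5 * x**2  # an arbitrary smooth mapping to approximate

def max_fit_error(n_hidden, activation):
    """Approximate `target` with one hidden layer of `n_hidden` units.

    Input weights/biases are drawn at random; output weights are solved
    by least squares (linear output layer, no squashing).
    Returns the maximum absolute approximation error on the grid.
    """
    w = rng.normal(scale=4.0, size=n_hidden)          # hidden input weights
    b = rng.uniform(-np.pi, np.pi, size=n_hidden)     # hidden biases
    H = activation(np.outer(x, w) + b)                # hidden-layer outputs
    beta, *_ = np.linalg.lstsq(H, target, rcond=None) # output-layer weights
    return float(np.max(np.abs(H @ beta - target)))

# A non-sigmoid activation (cosine) suffices: error shrinks as units are added.
err_few = max_fit_error(5, np.cos)
err_many = max_fit_error(100, np.cos)
```

Increasing the number of hidden units drives the worst-case error down, consistent with the paper's claim that universal approximation stems from having sufficiently many hidden units rather than from sigmoid-shaped activations.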
Keywords :
function approximation; neural nets; multilayer feedforward networks; nonlinear function; nonsigmoid hidden layer activation functions; parallel processors; universal approximation; Approximation methods; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118640
Filename :
118640