DocumentCode
328400
Title
Radon transform and differentiable approximation by neural networks
Author
Ito, Yoshifusa
Author_Institution
Toyohashi Univ. of Technol., Japan
Volume
3
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
2288
Abstract
We treat the problem of simultaneously approximating C^m-functions in several variables and their derivatives by superpositions of a fixed activation function in one variable. The domain of approximation can be either a compact subset of the Euclidean space or the whole space. If the domain is compact, the activation function does not need to be scaled. Even if the domain is the whole space, the activation function can be used without scaling under a certain condition. The approximation can be implemented by a three-layered neural network whose hidden-layer units have the activation function.
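The following is a minimal numerical sketch of the kind of statement the abstract makes, not the paper's Radon-transform construction: a three-layered network g(x) = sum_i c_i * sigma(a_i x + b_i) with a single fixed activation can approximate a smooth target together with its derivative on a compact interval. The choice of tanh as the activation, the random inner parameters a_i, b_i, and the joint least-squares fit of the output weights are all illustrative assumptions, not details from the paper.

```python
# Sketch: simultaneous (C^1-type) approximation of f and f' on a compact
# interval by a one-hidden-layer network with a fixed activation sigma.
# Only the output weights c are fitted; inner parameters are fixed at random.
import numpy as np

rng = np.random.default_rng(0)

def f(x):                            # smooth target function
    return np.exp(-x**2)

def df(x):                           # its derivative
    return -2.0 * x * np.exp(-x**2)

sigma = np.tanh                      # fixed one-variable activation (assumed)
dsigma = lambda z: 1.0 - np.tanh(z)**2

x = np.linspace(-2.0, 2.0, 400)      # compact domain of approximation

n = 60                               # number of hidden units
a = rng.uniform(-3.0, 3.0, n)        # inner weights (the paper also treats an
b = rng.uniform(-6.0, 6.0, n)        # unscaled case; here they vary freely)

Z = np.outer(x, a) + b               # pre-activations, shape (len(x), n)
Phi = sigma(Z)                       # hidden-unit outputs phi_i(x)
dPhi = dsigma(Z) * a                 # their x-derivatives a_i * sigma'(a_i x + b_i)

# Fit output weights c so that both g ~ f and g' ~ f' (stacked least squares).
A = np.vstack([Phi, dPhi])
y = np.concatenate([f(x), df(x)])
c, *_ = np.linalg.lstsq(A, y, rcond=None)

g, dg = Phi @ c, dPhi @ c
print("sup |g  - f |:", np.max(np.abs(g - f(x))))
print("sup |g' - f'|:", np.max(np.abs(dg - df(x))))
```

Both sup-norm errors shrink as the number of hidden units grows, illustrating approximation of the function and its derivative by the same network.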
Keywords
Radon transforms; approximation theory; feedforward neural nets; function approximation; set theory; Euclidean space; Radon transform; differentiable approximation; fixed activation function; hidden layer units; three layered neural network; Computer networks; Differential equations; Fourier transforms; Indium tin oxide; Mathematics; Neural networks; Robots; Space technology
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.714182
Filename
714182