DocumentCode :
288332
Title :
Explicit synthesis of multilayer perceptrons using Walsh expansion
Author :
Xiao-hu Yu
Author_Institution :
Dept. of Radio Eng., Southeast Univ., Nanjing
Volume :
1
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
209
Abstract :
This paper addresses a synthetic method for explicitly designing multilayer perceptrons. The basic idea is to use the hidden units of a perceptron to form the basis functions of a truncated Walsh series expansion. The synthesis of a multilayer perceptron is thereby transformed into a Walsh expansion of the desired mapping, so that the desired perceptron weights can be solved for explicitly. Compared with conventional backpropagation training, the present approach offers the particular advantage that the generalization errors are bounded and easily controllable. Applications of the synthetic method are illustrated.
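The abstract only summarizes the construction. As a rough, hypothetical illustration of the underlying idea (not the paper's actual synthesis procedure), the Python sketch below computes a truncated Walsh expansion of a Boolean mapping: in a scheme of this kind, the retained Walsh basis functions would be realized by hidden units and the expansion coefficients would give explicitly solvable output weights. All names and the choice of example function are illustrative assumptions.

import itertools

def dot(a, b):
    # Inner product of two 0/1 index tuples.
    return sum(ai * bi for ai, bi in zip(a, b))

def walsh_coefficients(f_values, n):
    # Walsh coefficients c_S = 2**-n * sum_x f(x) * (-1)**<S, x> over all index sets S.
    xs = list(itertools.product([0, 1], repeat=n))
    return {S: sum(f_values[x] * (-1) ** dot(S, x) for x in xs) / 2 ** n
            for S in itertools.product([0, 1], repeat=n)}

def truncated_expansion(coeffs, keep):
    # Approximate f by keeping only the `keep` largest-magnitude Walsh terms;
    # the discarded terms bound the approximation error of the truncated series.
    top = sorted(coeffs.items(), key=lambda kv: -abs(kv[1]))[:keep]
    return lambda x: sum(c * (-1) ** dot(S, x) for S, c in top)

# Example: 3-input majority function, truncated to 4 Walsh terms.
n = 3
inputs = list(itertools.product([0, 1], repeat=n))
f = {x: 1.0 if sum(x) >= 2 else 0.0 for x in inputs}
f_hat = truncated_expansion(walsh_coefficients(f, n), keep=4)
for x in inputs:
    print(x, f[x], round(f_hat(x), 3))

Running the sketch prints the desired mapping alongside its truncated-series approximation, making visible how dropping small-magnitude terms trades accuracy for a smaller number of basis functions (i.e., hidden units).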
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; series (mathematics); Walsh expansion; backpropagation training; basis functions; explicit synthesis; generalization errors; hidden units; multilayer perceptrons; truncated Walsh series expansion; Backpropagation algorithms; Convergence; Design engineering; Design methodology; Error correction; Multilayer perceptrons; Network synthesis; Neurons; Nonhomogeneous media; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374164
Filename :
374164