DocumentCode :
396728
Title :
Effect of regularization term upon fault tolerant training
Author :
Takase, Haruhiko ; Kita, Hidehiko ; Hayashi, Tetumine
Author_Institution :
Dept. of Electr. & Electron. Eng., Mie Univ., Japan
Volume :
2
fYear :
2003
fDate :
20-24 July 2003
Firstpage :
1048
Abstract :
To enhance the fault tolerance of multi-layer neural networks (MLNs), we proposed PAWMA (partially adaptive weight minimization approach). This method minimizes not only the output error but also the sum of squares of the weights (the regularization term), aiming to decrease the number of connections whose faults strongly degrade the performance of MLNs (important connections). On the other hand, weight decay, which aims to eliminate unimportant connections, is based on the same idea; it is expected to keep important connections and decay unimportant ones. In this paper, we discuss the apparent contradiction between these two effects of the regularization term. Through experiments, we show that the difference between the two effects arises from the partial application of the regularization term.
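A minimal sketch of the idea summarized in the abstract (not the authors' implementation): a squared-error loss combined with a regularization term that is applied only to a selected subset of weights, which is what distinguishes a partial scheme from uniform weight decay. The linear model, the mask reg_mask, and the coefficient lam are hypothetical placeholders for illustration.

    import numpy as np

    def regularized_loss(w, x, y, reg_mask, lam=0.01):
        # Output error: mean squared error of a simple linear model
        # (a stand-in for the multi-layer network in the paper).
        err = np.mean((x @ w - y) ** 2)
        # Regularization term: sum of squares of only the masked weights.
        reg = np.sum((reg_mask * w) ** 2)
        return err + lam * reg

Setting reg_mask to all ones recovers ordinary weight decay over every connection; restricting it to a subset of connections corresponds to the partial application of the regularization term that the paper attributes the differing effects to.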
Keywords :
backpropagation; fault tolerance; minimisation; multilayer perceptrons; fault tolerance; multilayer neural networks; partially adaptive weight minimization approach; regularization term; weight decay; Artificial neural networks; Degradation; Equations; Fault tolerance; Multi-layer neural network; Neural networks; Neurofeedback; Output feedback; Relays;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks, 2003
ISSN :
1098-7576
Print_ISBN :
0-7803-7898-9
Type :
conf
DOI :
10.1109/IJCNN.2003.1223835
Filename :
1223835