Title :
Enhancing both generalization and fault tolerance of multilayer neural networks
Author :
Haruhiko Takase ; Masahiko Mayumi ; Hidehiko Kita ; Terumine Hayashi
Author_Institution :
Mie University, Mie, Japan
Abstract :
In this paper, we propose a method that enhances both the generalization ability and the fault tolerance of multilayer neural networks. Many methods have been proposed that enhance either generalization ability or fault tolerance, but very few enhance both. We discuss how to combine a method for good generalization with a method for high fault tolerance. To avoid interference between the two, we propose the local augmentation method (LAUG) to enhance fault tolerance: it duplicates hidden units according to the importance of each unit. Since LAUG manipulates a trained network while keeping its input-output relation, it does not interfere with any training algorithm used to enhance generalization ability. Finally, we show the effectiveness of our method through experiments.
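Illustrative_Sketch :
The function-preserving duplication step described in the abstract can be illustrated for a single-hidden-layer network y = W2 · act(W1 · x + b1) + b2. This is a minimal sketch of the general idea, not the paper's published algorithm; the function name duplicate_unit is a hypothetical helper, and the paper's importance measure for selecting which unit to duplicate is not reproduced here. Replicating hidden unit j as k copies and dividing its outgoing weights by k leaves the input-output mapping unchanged, while a failure of any single copy now removes only 1/k of that unit's contribution.

import numpy as np

def duplicate_unit(W1, b1, W2, j, k):
    """Replace hidden unit j by k copies whose outgoing weights are
    scaled by 1/k, preserving the network's input-output relation."""
    # Append k-1 extra copies of unit j's incoming weights and bias.
    W1_new = np.vstack([W1, np.repeat(W1[j:j+1, :], k - 1, axis=0)])
    b1_new = np.concatenate([b1, np.repeat(b1[j:j+1], k - 1)])
    # Give each of the k copies 1/k of the original outgoing weight,
    # so together they contribute exactly what unit j did.
    W2_new = np.hstack([W2, np.repeat(W2[:, j:j+1] / k, k - 1, axis=1)])
    W2_new[:, j] /= k
    return W1_new, b1_new, W2_new

# Quick check: outputs before and after duplication coincide.
rng = np.random.default_rng(0)
W1, b1, W2 = rng.normal(size=(4, 3)), rng.normal(size=4), rng.normal(size=(2, 4))
x = rng.normal(size=3)
h = np.tanh(W1 @ x + b1)
W1d, b1d, W2d = duplicate_unit(W1, b1, W2, j=1, k=3)
hd = np.tanh(W1d @ x + b1d)
assert np.allclose(W2 @ h, W2d @ hd)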
Keywords :
fault tolerance; generalisation (artificial intelligence); multilayer perceptrons; generalization ability; local augmentation method; multilayer neural network generalization; training algorithms; Artificial neural networks; Backpropagation algorithms; Distributed computing; Fault tolerance; Interference; Large scale integration; Multi-layer neural network; Neural network hardware; Neural networks; Redundancy;
Conference_Titel :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371168