DocumentCode :
3313770
Title :
Fault tolerance of neural networks
Author :
Damarla, T. Raju ; Bhagat, P.K.
Author_Institution :
Kentucky Univ., Lexington, KY, USA
fYear :
1989
fDate :
9-12 Apr 1989
Firstpage :
328
Abstract :
The robustness and learning speed of neural networks trained with the backpropagation algorithm are explored. An XOR experiment was performed on networks with one and two hidden layers, and robustness was studied by removing nodes and/or branches in the hidden layers. Networks trained with the final output weights constrained to lie below a specified value showed superior performance, even when structurally damaged. Hence, for a two-hidden-layer net, constraining the weights on the interconnecting links yields a robust and faster-learning network.
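The abstract describes constraining the output-layer weights during backpropagation and then testing the damaged network. The following is a minimal sketch of that idea, not the authors' code: a one-hidden-layer XOR net trained with backpropagation, with output-layer weights clipped to a chosen bound after each update, and hidden nodes zeroed to simulate structural damage. The network size, learning rate, and weight bound are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training data
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one hidden layer; sizes and hyperparameters are assumed, not from the paper
n_hidden = 8
W1 = rng.normal(scale=0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

lr = 0.5
weight_bound = 2.0  # assumed bound on the final (output-layer) weights

for epoch in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass for a squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

    # constrain the output weights to lie below the specified value
    np.clip(W2, -weight_bound, weight_bound, out=W2)

def accuracy(W1, b1, W2, b2, dead_nodes=()):
    """XOR accuracy with selected hidden nodes removed (zeroed)."""
    h = sigmoid(X @ W1 + b1)
    h[:, list(dead_nodes)] = 0.0   # simulate removal of hidden nodes
    out = sigmoid(h @ W2 + b2)
    return np.mean((out > 0.5) == y)

print("intact:", accuracy(W1, b1, W2, b2))
print("node 0 removed:", accuracy(W1, b1, W2, b2, dead_nodes=[0]))
```

Comparing the intact and damaged accuracies, with and without the clipping step, gives a rough sense of the robustness effect the abstract reports.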
Keywords :
fault tolerant computing; learning systems; neural nets; reliability theory; XOR experiment; backpropagation algorithm; branches; fault tolerance; final output weights; interconnecting links; learning speeds; neural networks; nodes; one-hidden layer net; performance; removal; robustness; simulations; two-hidden layer net; weight constraint; Artificial neural networks; Backpropagation algorithms; Biological neural networks; Biology computing; Computer networks; Fault tolerance; Joining processes; Neural networks; Neurons; Robustness;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE Southeastcon '89 Proceedings: Energy and Information Technologies in the Southeast
Conference_Location :
Columbia, SC
Type :
conf
DOI :
10.1109/SECON.1989.132388
Filename :
132388