Title : 
A functional manipulation for improving tolerance against multiple-valued weight faults of feedforward neural networks
         
        
            Author : 
Kamiura, Naotake ; Taniguchi, Yasuyuki ; Matsui, Nobuyuki
         
        
            Author_Institution : 
Dept. of Comput. Eng., Himeji Inst. of Technol., Hyogo, Japan
         
        
        
        
        
        
            Abstract : 
In this paper we propose feedforward neural networks (NNs for short) that tolerate multiple-valued stuck-at faults of connection weights. To improve fault tolerance against faults with small false absolute values, we employ an activation function with a relatively gentle gradient in the last layer and steepen the gradient of the activation function in the intermediate layer. For faults with large false absolute values, a function acting as a filter suppresses their influence by clipping the products of inputs and faulty weights to allowable values. Experimental results show that our NN is superior in fault tolerance and learning time to other NNs based on approaches such as fault injection and forcible weight limiting.
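A minimal code sketch of the manipulation described above: each input-weight product is clipped to an allowable range before summation, a steep sigmoid is used in the intermediate layer, and a gentle sigmoid is used in the last layer. The slope values, clipping limit, layer sizes and function names below are illustrative assumptions, not values taken from the paper.

import numpy as np

# Assumed parameters for illustration only; the paper's actual values are not
# stated in the abstract.
HIDDEN_SLOPE = 4.0   # steep sigmoid gradient in the intermediate layer
OUTPUT_SLOPE = 0.5   # relatively gentle sigmoid gradient in the last layer
CLIP_LIMIT = 2.0     # allowable magnitude for each input*weight product

def sigmoid(x, slope):
    # Sigmoid activation whose steepness is set by `slope`.
    return 1.0 / (1.0 + np.exp(-slope * x))

def filtered_weighted_sum(inputs, weights, limit=CLIP_LIMIT):
    # Clip every input*weight product to the allowable range before summing,
    # so a weight stuck at a large false value cannot dominate the neuron.
    products = np.clip(inputs[None, :] * weights, -limit, limit)
    return products.sum(axis=1)

def forward(x, w_hidden, w_output):
    # Forward pass: steep activation in the intermediate layer, gentle
    # activation in the last layer, with the product filter in both layers.
    h = sigmoid(filtered_weighted_sum(x, w_hidden), HIDDEN_SLOPE)
    return sigmoid(filtered_weighted_sum(h, w_output), OUTPUT_SLOPE)

# Example: one weight stuck at a large false value; the filter bounds the
# faulty product's contribution to the weighted sum.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=4)
w_hidden = rng.uniform(-1.0, 1.0, size=(3, 4))
w_output = rng.uniform(-1.0, 1.0, size=(2, 3))
print("fault-free:", forward(x, w_hidden, w_output))
w_hidden[0, 0] = 50.0  # simulated multiple-valued stuck-at weight fault
print("faulty:    ", forward(x, w_hidden, w_output))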
         
        
            Keywords : 
fault tolerant computing; feedforward neural nets; multivalued logic; fault tolerance; feedforward neural networks; learning time; multiple-valued stuck-at faults; multiple-valued weight fault; tolerance; Backpropagation algorithms; Character recognition; Computer networks; Fault tolerance; Feedforward neural networks; Filters; Hardware; Neural networks; Neurons;
         
        
        
        
Conference_Title : 
Proceedings of the 31st IEEE International Symposium on Multiple-Valued Logic, 2001
         
        
            Conference_Location : 
Warsaw
         
        
        
            Print_ISBN : 
0-7695-1083-3
         
        
        
            DOI : 
10.1109/ISMVL.2001.924593