Title:
Weight shifting techniques for self-recovery neural networks

Author:
Khunasaraphan, C.; Vanapipat, K.; Lursinsap, C.

Author_Institution:
Center for Adv. Comput. Studies, Southwestern Louisiana Univ., Lafayette, LA, USA

fDate:
7/1/1994

Abstract:
In this paper, a self-recovery technique for feedforward neural networks, called weight shifting, and its analytical models are proposed. The technique recovers a network when faulty links and/or neurons occur during operation. If some input links of a neuron are detected as faulty, their weights are shifted to healthy links of the same neuron. If a faulty neuron is encountered, it can be treated as a special case of faulty links by considering all output links of that neuron to be faulty. The aim of this technique is to recover the network in a short time without any retraining or hardware repair. A hardware architecture for implementing this technique is also proposed.
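The weight-shifting idea described in the abstract can be sketched as follows. This is a minimal illustration only: the abstract does not specify the exact redistribution rule, so the rule shown here (adding each faulty link's weight onto a chosen healthy link of the same neuron, which preserves each neuron's weight sum) is an assumption, and `shift_weights` is a hypothetical helper, not the paper's analytical model.

```python
import numpy as np

def shift_weights(W, faulty_inputs):
    """Recover a layer's weight matrix W (n_neurons x n_inputs) after some
    input links become faulty, by shifting each faulty link's weight onto a
    healthy link of the same neuron.

    Assumed redistribution rule (illustrative): faulty weights are added to
    healthy links in round-robin order, so each row's weight sum is preserved.
    """
    W = np.array(W, dtype=float, copy=True)
    healthy = [j for j in range(W.shape[1]) if j not in faulty_inputs]
    if not healthy:
        raise ValueError("no healthy links left to shift weights onto")
    for i in range(W.shape[0]):                  # for each neuron in the layer
        for k, f in enumerate(faulty_inputs):
            target = healthy[k % len(healthy)]   # pick a healthy link
            W[i, target] += W[i, f]              # shift the faulty weight over
            W[i, f] = 0.0                        # faulty link now carries nothing
    return W
```

A faulty neuron would be handled as the abstract describes: treat every output link of that neuron as faulty, i.e. apply the same shifting to the corresponding input column of the next layer's weight matrix.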
Keywords:
built-in self test; feedforward neural nets; neural chips; faulty links; feedforward neural networks; self-recovery neural networks; weight shifting techniques; Analytical models; Computer networks; Fault detection; Fault tolerance; Feedforward neural networks; Neural network hardware; Neural networks; Neurons; Very large scale integration

Journal_Title:
IEEE Transactions on Neural Networks