Title : 
The generalized back-propagation algorithm with convergence analysis
         
        
            Author : 
Ng, S.C. ; Leung, S.H. ; Luk, A.
         
        
            Author_Institution : 
Dept. of Comput. & Math., Hong Kong Tech. Coll., Hong Kong
         
        
            Abstract : 
The conventional back-propagation algorithm is basically a gradient-descent method and suffers from the problems of local minima and slow convergence. A generalized back-propagation algorithm that can effectively speed up the convergence rate has been proposed previously. In this paper, the convergence of that algorithm is analyzed. The generalized back-propagation algorithm changes the derivative of the activation function so as to magnify the back-propagated error signal when the output approaches a wrong value; thus the convergence rate is accelerated and local minima can be escaped. The convergence analysis shows that the generalized back-propagation algorithm improves on the original back-propagation algorithm in terms of faster convergence and global search capability.
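The sketch below illustrates the idea described in the abstract for a one-hidden-layer sigmoid network: the derivative term used in the backward pass is modified so the propagated error does not vanish when a unit saturates at the wrong value. The function names, the offset parameter eps, and the particular magnification form are assumptions for illustration; they are not the exact update rule derived in the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def generalized_deriv(y, target=None, eps=0.1):
    """Standard sigmoid derivative y*(1-y) plus an offset `eps` that keeps the
    error signal alive when y saturates near 0 or 1.  When a target is given,
    the offset is applied only where the unit is saturating toward the wrong
    value, magnifying the back-propagated error there (illustrative choice)."""
    d = y * (1.0 - y)
    if target is None:
        return d + eps                      # hidden units: generic offset
    wrong = np.abs(target - y) > 0.5        # output heading to the wrong extreme
    return d + eps * wrong                  # magnify error only where wrong

def train_step(x, t, W1, W2, lr=0.5, eps=0.1):
    # Forward pass.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)
    # Backward pass with the modified derivative terms.
    delta_out = (t - y) * generalized_deriv(y, target=t, eps=eps)
    delta_hid = (W2.T @ delta_out) * generalized_deriv(h, eps=eps)
    # Gradient descent on the squared output error.
    W2 += lr * np.outer(delta_out, h)
    W1 += lr * np.outer(delta_hid, x)
    return W1, W2, y

Setting eps=0 recovers the conventional back-propagation update, which makes the effect of the magnified error signal easy to compare on problems with flat error regions.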
         
        
            Keywords : 
backpropagation; convergence; generalisation (artificial intelligence); neural nets; activation function; backward propagated error signal; convergence analysis; generalized back-propagation algorithm; global search capability; Acceleration; Algorithm design and analysis; Australia; Convergence; Educational institutions; Equations; Investments; Mathematics; Neural networks; Neurons;
         
        
            Conference_Title : 
Proceedings of the 1999 IEEE International Symposium on Circuits and Systems (ISCAS '99)
         
        
            Conference_Location : 
Orlando, FL
         
        
            Print_ISBN : 
0-7803-5471-0
         
        
            DOI : 
10.1109/ISCAS.1999.777646