Title : 
An empirical paralogism of the backpropagation networks
         
            Author_Institution : 
Dept. of Manage. Inf. Syst., Nat. Chengchi Univ., Taipei, Taiwan
         
            Abstract : 
In the learning process of backpropagation networks, the numerical phenomenon in which training becomes sluggish at some point and terminates there is usually interpreted as the network being trapped at a relative minimum. However, this claim might be wrong, because the same numerical phenomenon can also occur in the vicinity of a saddle stationary point.
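The abstract's claim can be illustrated numerically. The following sketch is not part of the paper: the toy surface f(x, y) = x^2 - y^2, the starting point, the learning rate, and the stopping tolerance are all illustrative choices. Plain gradient descent started close to the attracting axis of the saddle at (0, 0) slows down, the gradient norm falls below a typical stopping tolerance, and the run terminates, which is exactly the behaviour that is usually, and possibly wrongly, attributed to a relative minimum.

import numpy as np

# Illustrative sketch (not from the paper): gradient descent on the toy
# surface f(x, y) = x^2 - y^2, whose only stationary point (0, 0) is a
# saddle, not a minimum.  Starting almost on the attracting x-axis, the
# iterates creep toward the saddle, the gradient norm collapses, and a
# typical stopping rule reports "convergence" even though no minimum exists.

def grad(p):
    x, y = p
    return np.array([2.0 * x, -2.0 * y])   # gradient of x^2 - y^2

p = np.array([1.0, 1e-12])                  # start very close to the y = 0 axis
lr, tol = 0.1, 1e-4                         # illustrative step size and tolerance

for step in range(10_000):
    g = grad(p)
    if np.linalg.norm(g) < tol:             # the usual "trapped at a minimum" criterion
        print(f"stopped at step {step} near {p}; |grad| = {np.linalg.norm(g):.1e}")
        break
    p = p - lr * g
else:
    print(f"iteration budget exhausted at {p}")

If the iteration were allowed to continue instead of stopping, the y component would keep growing and f would decrease without bound, confirming that the point where progress stalled is a saddle rather than a minimum.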
         
        
            Keywords : 
backpropagation; feedforward neural nets; numerical analysis; optimisation; attractor; backpropagation networks; empirical paralogism; feedforward neural network; learning process; minimal point; numerical phenomenon; saddle stationary point; Eigenvalues and eigenfunctions; Electronic mail; Entropy; Feedforward systems; Gradient methods; Iterative algorithms; Management information systems; Optimization methods;
         
            Conference_Title : 
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
         
        
            Print_ISBN : 
0-7803-1421-2
         
            DOI : 
10.1109/IJCNN.1993.714209