Title : 
The computational intractability of training sigmoidal neural networks
         
        
        
            Author_Institution : 
Department of Mathematical Sciences, University of Massachusetts Lowell, Lowell, MA, USA
         
        
        
        
        
            Date : 
1 January 1997
         
        
        
        
            Abstract : 
We demonstrate that the problem of approximately interpolating a target function by a neural network is computationally intractable. In particular, the interpolation training problem for a neural network with two monotone Lipschitzian sigmoidal internal activation functions and one linear output node is shown to be NP-hard, and NP-complete if the internal nodes are, in addition, piecewise ratios of polynomials. This partially answers a question of Blum and Rivest (1992) concerning the NP-completeness of training a logistic sigmoidal 3-node network. An extension of the result is then given for networks with n monotone sigmoidal internal nodes and one convex output node. This indicates that many multivariate nonlinear regression problems may be computationally infeasible.
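For concreteness, a minimal sketch of the 3-node architecture and the approximate-interpolation decision problem described above, written from the abstract alone (the parameter names, the tolerance \varepsilon, and the exact loading formulation used in the paper are assumptions here):

    f_\theta(x) \;=\; w_0 + w_1\,\sigma\!\left(a_1^{\top} x + b_1\right) + w_2\,\sigma\!\left(a_2^{\top} x + b_2\right), \qquad x \in \mathbb{R}^d,

where \sigma is a fixed monotone Lipschitzian sigmoid and \theta = (w_0, w_1, w_2, a_1, a_2, b_1, b_2). Given training data (x_1, y_1), \dots, (x_m, y_m) and a tolerance \varepsilon > 0, the training (loading) question is whether

    \exists\,\theta \;:\; \bigl| f_\theta(x_i) - y_i \bigr| \le \varepsilon \quad \text{for all } i = 1, \dots, m.

Deciding this is shown to be NP-hard; when \sigma is in addition a piecewise ratio of polynomials, the problem becomes NP-complete, presumably because a satisfying \theta can then be certified, placing the problem in NP.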
         
        
            Keywords : 
computational complexity; feedforward neural nets; interpolation; learning (artificial intelligence); statistical analysis; NP-complete problem; NP-hard problem; computational intractability; convex output node; internal nodes; interpolation training problem; linear output node; logistic sigmoidal 3-node network; monotone Lipschitzian sigmoidal internal activation functions; multivariate nonlinear regression problems; piecewise ratios; polynomials; sigmoidal neural networks training; target function; Computer networks; Logistics; Neural networks; Search problems; Vectors
         
        
        
            Journal_Title : 
IEEE Transactions on Information Theory