Title :
On the complexity of training neural networks with continuous activation functions
Author :
DasGupta, Bhaskar ; Siegelmann, Hava T. ; Sontag, Eduardo
Author_Institution :
Dept. of Comput. Sci., Minnesota Univ., Minneapolis, MN, USA
Date :
11/1/1995
Abstract :
Deals with computational issues of loading a fixed-architecture neural network with a set of positive and negative examples. This is the first result on the hardness of loading a simple three-node architecture which does not consist of binary-threshold neurons but rather uses a particular continuous activation function commonly employed in the neural-network literature. The authors observe that the loading problem is solvable in polynomial time when the input dimension is constant; otherwise, however, any possible learning algorithm based on a particular fixed architecture faces severe computational barriers. Similar theorems had previously been proved by Megiddo and by Blum and Rivest, but only for the case of binary-threshold networks. The authors' theoretical results lend further support to the use of incremental (architecture-changing) techniques for training networks, rather than fixed architectures. Furthermore, they imply hardness of learnability in the probably-approximately-correct sense as well.
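Code_Sketch :
As an informal illustration (not taken from the paper): the Python sketch below fixes a "three-node" architecture, two hidden units feeding one output node, with an assumed saturating piecewise-linear activation, and phrases the loading problem as a consistency check against labeled examples. The activation choice, the 0.5 decision threshold, and every identifier here are hypothetical stand-ins for the paper's exact formulation.

    import numpy as np

    def sat(x):
        # Assumed saturating piecewise-linear activation: 0 below 0,
        # the identity on [0, 1], and 1 above 1.
        return float(np.clip(x, 0.0, 1.0))

    class ThreeNodeNet:
        # Fixed architecture: two hidden units feeding a single output unit.
        def __init__(self, w1, b1, w2, b2, v1, v2, c):
            self.w1, self.b1 = np.asarray(w1, float), float(b1)
            self.w2, self.b2 = np.asarray(w2, float), float(b2)
            self.v1, self.v2, self.c = float(v1), float(v2), float(c)

        def output(self, x):
            x = np.asarray(x, float)
            h1 = sat(self.w1 @ x + self.b1)
            h2 = sat(self.w2 @ x + self.b2)
            return sat(self.v1 * h1 + self.v2 * h2 + self.c)

    def loads(net, examples, labels, threshold=0.5):
        # The loading question for one fixed weight setting: is every positive
        # example mapped at or above the threshold, and every negative one below?
        return all((net.output(x) >= threshold) == bool(y)
                   for x, y in zip(examples, labels))

    # One weight setting that loads XOR on binary inputs:
    net = ThreeNodeNet(w1=[1, 1], b1=-1, w2=[1, 1], b2=0, v1=-1, v2=1, c=0)
    print(loads(net, [(0, 0), (0, 1), (1, 0), (1, 1)], [0, 1, 1, 0]))  # True

Note that checking one candidate weight setting, as above, is the easy direction; the hardness results concern deciding whether any consistent weight setting exists for the fixed architecture.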
Keywords :
computational complexity; learning (artificial intelligence); neural nets; complexity; continuous activation functions; fixed-architecture neural network; learning algorithm; neural networks; polynomial-time; three-node architecture; Artificial neural networks; Computer architecture; Computer networks; Computer science; Machine learning; Neural networks; Neurons; Parallel processing; Polynomials; Training data
Journal_Title :
IEEE Transactions on Neural Networks