Title :
Global optimization algorithms for training product unit neural networks
Author :
Ismail, A.; Engelbrecht, A.P.
Author_Institution :
Dept. of Comput. Sci., Univ. of Western Cape, South Africa
Abstract :
Product units in the hidden layer of multilayer neural networks provide a powerful mechanism for efficiently learning higher-order combinations of inputs. Training product unit networks with local optimization algorithms is difficult, however, because of the increased number of local minima and the increased risk of network paralysis. The paper discusses the problems that arise when gradient descent is used to train product unit neural networks, and shows that particle swarm optimization, genetic algorithms and LeapFrog are efficient alternatives that successfully train such networks.
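To illustrate the kind of training setup the abstract describes, the following is a minimal sketch of a one-hidden-layer product unit network trained with basic (gbest) particle swarm optimization on a toy regression task. The network sizes, the toy data, and the PSO parameters below are illustrative assumptions, not the exact configuration used in the paper; the product unit is evaluated via the usual exp/log form with a cosine phase term for negative inputs.

import numpy as np

rng = np.random.default_rng(0)

def pu_forward(x, V, W):
    """Product unit hidden layer followed by a linear summation output.
    x: (n_in,), V: (n_hidden, n_in) exponents, W: (n_out, n_hidden) weights."""
    eps = 1e-12
    log_mag = np.log(np.abs(x) + eps)      # ln|x_i|
    phase = np.where(x < 0, np.pi, 0.0)    # pi phase for negative inputs
    # Real part of prod_i x_i**v_ji = exp(V @ ln|x|) * cos(V @ phase)
    hidden = np.exp(V @ log_mag) * np.cos(V @ phase)
    return W @ hidden

def mse(theta, X, Y, n_in, n_hidden, n_out):
    """Unpack a flat weight vector into (V, W) and return the mean squared error."""
    V = theta[:n_hidden * n_in].reshape(n_hidden, n_in)
    W = theta[n_hidden * n_in:].reshape(n_out, n_hidden)
    preds = np.array([pu_forward(x, V, W) for x in X])
    return float(np.mean((preds - Y) ** 2))

# Toy data: y = x1^2 * x2, a higher-order combination of the inputs.
X = rng.uniform(0.1, 1.0, size=(50, 2))
Y = (X[:, 0] ** 2 * X[:, 1]).reshape(-1, 1)

n_in, n_hidden, n_out = 2, 2, 1
dim = n_hidden * n_in + n_out * n_hidden

# Basic gbest PSO over the flattened weight vector (illustrative parameters).
n_particles, iters = 20, 200
w, c1, c2 = 0.72, 1.49, 1.49
pos = rng.uniform(-1, 1, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([mse(p, X, Y, n_in, n_hidden, n_out) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p, X, Y, n_in, n_hidden, n_out) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best training MSE:", pbest_f.min())

Because the fitness function treats the network as a black box, no gradients of the product units are required, which is the property that lets global optimizers such as PSO, genetic algorithms and LeapFrog sidestep the local minima and paralysis problems noted above.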
Keywords :
genetic algorithms; learning (artificial intelligence); multilayer perceptrons; LeapFrog; global optimization algorithms; gradient descent; hidden layer; particle swarm optimization; product unit neural networks; Africa; Computer architecture; Computer networks; Computer science; Equations; Function approximation; Genetic algorithms; Multi-layer neural network; Neural networks; Particle swarm optimization;
Conference_Title :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como, Italy
Print_ISBN :
0-7695-0619-4
DOI :
10.1109/IJCNN.2000.857826