Title :
MEKA-a fast, local algorithm for training feedforward neural networks
Author :
Shah, Samir ; Palmieri, Francesco
Abstract :
It is noted that training feedforward networks with the conventional backpropagation algorithm is plagued by poor convergence and misadjustment. The authors introduce the multiple extended Kalman algorithm (MEKA) for training feedforward networks. It is based on the idea of partitioning the global problem of finding the weights into a set of manageable nonlinear subproblems, so the algorithm is local at the neuron level. The superiority of MEKA over the global extended Kalman algorithm, in terms of convergence and the quality of the solution obtained, is demonstrated on two benchmark problems. The superior performance can be attributed to the nonlinear localized approach; in fact, the nonconvex nature of the local performance surface reduces the chances of getting trapped in a local minimum.
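Note: the abstract describes the approach only at a high level. As a rough illustration of the neuron-level idea (each neuron's weight vector treated as the state of its own small extended Kalman filter), the sketch below shows a generic EKF update for a single sigmoidal neuron in NumPy. The class name NeuronEKF, the local target d (which in practice would have to come from the backpropagated error), and the noise parameters q and r are assumptions made for illustration, not the paper's notation or exact derivation.

```python
import numpy as np

class NeuronEKF:
    """Generic extended Kalman update for one sigmoidal neuron's weights.

    A minimal sketch of a neuron-local EKF, assuming a local target d is
    available for the neuron (e.g., derived from backpropagated error).
    """

    def __init__(self, n_inputs, p0=100.0, q=1e-4, r=1.0):
        self.w = np.zeros(n_inputs)       # neuron weights (EKF state)
        self.P = p0 * np.eye(n_inputs)    # state error covariance
        self.q = q                        # assumed process-noise level
        self.r = r                        # assumed measurement-noise level

    @staticmethod
    def _sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def update(self, x, d):
        """One EKF step: x is the neuron's input vector, d its local target."""
        y = self._sigmoid(self.w @ x)
        # Jacobian of the neuron output with respect to the weights.
        H = y * (1.0 - y) * x
        # Scalar innovation covariance and Kalman gain.
        s = H @ self.P @ H + self.r
        K = (self.P @ H) / s
        # Weight (state) and covariance updates.
        self.w = self.w + K * (d - y)
        self.P = self.P - np.outer(K, H @ self.P) + self.q * np.eye(len(self.w))
        return y

# Usage: one local update for a 3-input neuron with a hypothetical target.
neuron = NeuronEKF(n_inputs=3)
neuron.update(np.array([0.5, -1.0, 0.2]), d=1.0)
```

In MEKA this kind of update is applied independently at every neuron, which keeps each covariance matrix small compared with a single global extended Kalman filter over all network weights.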
Keywords :
computational complexity; learning systems; neural nets; MEKA; benchmark problems; conventional backpropagation algorithm; convergence; feedforward neural networks; local algorithm; local performance surface; manageable nonlinear subproblems; minima; misadjustment; multiple extended Kalman algorithm; nonconvex nature; nonlinear localized approach
Conference_Title :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137822