Title :
Exponentiated backpropagation algorithm for multilayer feedforward neural networks
Author :
Srinivasan, N. ; Ravichandran, V. ; Chan, K.L. ; Vidhya, J.R. ; Ramakirishnan, S. ; Krishnan, S.M.
Abstract :
The gradient descent backpropagation learning algorithm is based on minimizing the mean square error. An alternative to gradient descent is the exponentiated gradient descent algorithm, which minimizes the relative entropy. Exponentiated gradient descent applied to backpropagation is proposed for a multilayer feedforward neural network. The learning rules for updating the weights of the output layer as well as the hidden layer neurons in the network are developed. Simulations were performed to explore the convergence and learning behavior of the backpropagation algorithm with exponentiated gradient descent. The accuracy obtained with exponentiated gradient descent backpropagation was comparable to that of gradient descent backpropagation, while convergence was faster. The results show that exponentiated gradient descent can be extended to a multilayer feedforward neural network and used in pattern classification applications.
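The core contrast described in the abstract, an additive gradient descent step versus a multiplicative exponentiated gradient step, can be sketched on a single linear neuron trained to minimize mean square error. This is an illustrative toy comparison only, not the paper's derivation for full multilayer networks; the data, learning rate, and (positive) target weights are assumptions made for the example.

```python
import numpy as np

# Toy problem: fit a linear neuron y = X @ w by minimizing mean square error.
# Target weights are chosen positive, since the basic exponentiated gradient
# update keeps weights in the positive orthant.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([0.5, 0.2, 0.3])
y = X @ w_true

def mse_grad(w):
    """Gradient of the mean square error (1/2n)*||Xw - y||^2 w.r.t. w."""
    return X.T @ (X @ w - y) / len(y)

def train(update, w0, eta=0.1, steps=500):
    w = w0.copy()
    for _ in range(steps):
        w = update(w, mse_grad(w), eta)
    return w

# Additive gradient descent step: w <- w - eta * grad
gd_step = lambda w, g, eta: w - eta * g
# Multiplicative exponentiated gradient step: w_i <- w_i * exp(-eta * grad_i)
eg_step = lambda w, g, eta: w * np.exp(-eta * g)

w_gd = train(gd_step, np.full(3, 1 / 3))
w_eg = train(eg_step, np.full(3, 1 / 3))
```

Both rules drive the gradient to zero and so recover `w_true` here; the multiplicative form scales each step by the current weight magnitude, which is the mechanism the relative-entropy view of exponentiated gradient descent gives rise to.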
Keywords :
backpropagation; convergence; entropy; feedforward neural networks; pattern classification; vectors; exponentiated gradient descent algorithm; hidden layer neurons; learning algorithm; mean square error minimization; multilayer neural network; biomedical engineering; neurons
Conference_Titel :
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN :
981-04-7524-1
DOI :
10.1109/ICONIP.2002.1202187