DocumentCode :
2699707
Title :
Optimization techniques applied to neural networks: Line search implementation for back propagation
Author :
Jones, Kim L. ; Lustig, Irvin J. ; Kornhauser, A.L.
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
933
Abstract :
An introductory study of the feasibility and benefits of applying optimization techniques to standard neural-network learning algorithms is presented. A gradient-descent (GD) algorithm that includes a line search to determine a better step size is implemented and compared with the traditional back-propagation (BP) technique, which lacks the search; the comparison is designed to isolate the effect of the line-search procedure. The comparative analysis was performed on four problems. Confidence intervals are presented for the ratios of the mean CPU times and the mean numbers of epochs over all the problems. They show that the line search reduces the mean number of iterations required for learning by an order of magnitude, and that the mean CPU time required by the modified technique decreases threefold on a serial computing machine.
Keywords :
learning systems; neural nets; optimisation; search problems; Line search implementation; back propagation; gradient descent algorithm; learning algorithms; neural networks; optimization techniques; serial computing machine;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137953
Filename :
5726910