DocumentCode :
2778408
Title :
Improving the Convergence of Backpropagation by Opposite Transfer Functions
Author :
Ventresca, Mario ; Tizhoosh, Hamid R.
Author_Institution :
University of Waterloo, Waterloo
fYear :
2006
fDate :
0-0 0
Firstpage :
4777
Lastpage :
4784
Abstract :
The backpropagation algorithm is a very popular approach to learning in feed-forward multi-layer perceptron networks. However, in many scenarios the time required to adequately learn a task is considerable. Many existing approaches improve the convergence rate by altering the learning algorithm. We present a simple alternative approach, inspired by opposition-based learning, that simultaneously considers each network transfer function and its opposite. The effect is an improvement in convergence rate over traditional backpropagation learning with momentum. We use four common benchmark problems to illustrate the improvement in convergence time.
Keywords :
backpropagation; convergence; multilayer perceptrons; transfer functions; backpropagation convergence; feed-forward multi-layer perceptron networks; learning algorithm; opposite transfer functions; opposition-based learning; Backpropagation algorithms; Convergence; Design engineering; Genetic algorithms; Laboratories; Machine intelligence; Neural networks; Pattern analysis; System analysis and design; Transfer functions;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
Type :
conf
DOI :
10.1109/IJCNN.2006.247153
Filename :
1716763
Link To Document :
Return