DocumentCode :
1560694
Title :
Back propagation based on selective attention for fast convergence of training neural network
Author :
Yang, Bo ; Su, Xiaohong ; Wang, Yadong
Author_Institution :
Sch. of Comput. Sci. & Technol., Harbin Inst. of Technol., China
Volume :
3
fYear :
2004
Firstpage :
2009
Abstract :
A back-propagation neural network based on selective attention (SABP) is proposed to improve the learning speed of multi-layer artificial neural networks with sigmoid activation functions. The algorithm embeds a selective-attention model in the network training. Its key lies in two functions: a pre-attention function, in which primitive features of the stimulus are detected in parallel by what may be regarded as feature detectors, implemented with an improved genetic algorithm (GA); and an attention function, in which detailed features are handled under the attention spotlight by back propagation with an enlarged error item in the output layer. The pre-attention function gives SABP the ability to search for the globally optimal solution, lowering the probability of falling into local optima, while the attention function updates the network weights effectively by enlarging the output-layer error, maintaining a high learning rate so that the convergence criteria are met quickly. Simulation experiments demonstrate that the algorithm learns faster than the Momentum algorithm and the Delta-bar-Delta rule, effectively avoids training failures caused by randomly initialized weights and thresholds, and overcomes the slow convergence caused by flat spots, where the error signal becomes too small.
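The abstract's "enlarged error item in the output layer" can be sketched as an amplification factor on the output-layer delta in standard back propagation. The sketch below is only an illustration of that idea, not the authors' exact formulation: the network shape (2-4-1 on XOR), the amplification factor `beta`, and the learning rate are all assumptions.

```python
# Hedged sketch: sigmoid back propagation with an amplified output-layer
# error term, loosely inspired by the paper's "enlarged error item".
# `beta`, the 2-4-1 topology, and all hyperparameters are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(beta=3.0, lr=0.3, epochs=3000, seed=0):
    """Train a 2-4-1 sigmoid net on XOR; return (initial_mse, final_mse)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

    def forward():
        h = sigmoid(X @ W1 + b1)          # hidden activations
        return h, sigmoid(h @ W2 + b2)    # output activations

    _, o = forward()
    initial_mse = float(np.mean((o - y) ** 2))
    for _ in range(epochs):
        h, o = forward()
        # beta > 1 enlarges the output-layer error, countering the
        # flat-spot region where the derivative o*(1-o) is near zero.
        delta_o = beta * (o - y) * o * (1 - o)
        delta_h = (delta_o @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ delta_o)
        b2 -= lr * delta_o.sum(axis=0)
        W1 -= lr * (X.T @ delta_h)
        b1 -= lr * delta_h.sum(axis=0)
    _, o = forward()
    return initial_mse, float(np.mean((o - y) ** 2))

init_mse, final_mse = train_xor()
print(init_mse, final_mse)
```

Setting `beta = 1.0` recovers plain batch back propagation, so the effect of the enlarged error term can be compared directly.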
Keywords :
backpropagation; convergence; genetic algorithms; multilayer perceptrons; transfer functions; Delta-bar-Delta rule; Flat-Spots; Momentum algorithm; back propagation neural network; backpropagation based on selective attention; error signal; fast convergence; feature detectors; genetic algorithms; multilayer artificial neural networks; preattention function; sigmoid activation function; slow convergence; training neural network; Artificial neural networks; Computer hacking; Computer science; Computer vision; Convergence; Detectors; Electronic mail; Genetic algorithms; Multi-layer neural network; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Intelligent Control and Automation, 2004. WCICA 2004. Fifth World Congress on
Print_ISBN :
0-7803-8273-0
Type :
conf
DOI :
10.1109/WCICA.2004.1341934
Filename :
1341934