DocumentCode :
2973064
Title :
VFSR trained artificial neural networks
Author :
Rosen, Bruce E. ; Goodwin, James M.
Author_Institution :
Div. of Math., Comput. Sci. & Stat., Texas Univ., San Antonio, TX, USA
Volume :
3
fYear :
1993
fDate :
25-29 Oct. 1993
Firstpage :
2959
Abstract :
Artificial neural networks are most often trained using backward error propagation (BEP), which works well for training problems whose error function has a single minimum. Although BEP has been successful in many applications, convergence can fail because of local minima and network paralysis. We describe a method for avoiding local minima by combining very fast simulated reannealing (VFSR) with BEP. While convergence to the best training weights can be slower than with gradient descent methods, it is faster than with other simulated annealing (SA) network training methods; more importantly, convergence to the optimal weight set is guaranteed. We demonstrate VFSR network training on a variety of test problems, such as the exclusive-or and parity problems, and compare its performance with that of conjugate-gradient-trained backpropagation networks.
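The paper combines VFSR with BEP; as an illustration only, the minimal Python sketch below shows the VFSR-style annealing component applied directly to the weights of a small 2-2-1 network on the exclusive-or problem. The network architecture, the initial temperature T0, the cooling constant c, and the Metropolis acceptance rule are assumptions for this sketch, not details taken from the paper, and the BEP hybridization described in the abstract is omitted.
```
# Illustrative sketch of VFSR-style weight annealing on XOR.
# Not the authors' code; constants and architecture are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    # 2-2-1 network with sigmoid units; w is a flat 9-element vector.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = 1.0 / (1.0 + np.exp(-(x @ W1.T + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def error(w):
    # Sum-squared error over the training set.
    return float(np.sum((forward(w, X) - t) ** 2))

def vfsr_step(T, dim, rng):
    # VFSR generating function: y = sgn(u - 1/2) * T * ((1 + 1/T)^|2u-1| - 1)
    # for u ~ U(0,1); yields per-dimension steps in [-1, 1] that concentrate
    # near zero as T falls.
    u = rng.random(dim)
    return np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2.0 * u - 1.0) - 1.0)

dim = 9
w = rng.normal(0.0, 1.0, dim)
E = error(w)
T0, c = 1.0, 1.0  # assumed initial temperature and cooling constant

for k in range(1, 20001):
    # VFSR cooling schedule: T_k = T0 * exp(-c * k**(1/D)), D = dimension.
    T = T0 * np.exp(-c * k ** (1.0 / dim))
    w_new = w + vfsr_step(T, dim, rng)
    E_new = error(w_new)
    # Metropolis acceptance at temperature T.
    if E_new < E or rng.random() < np.exp(-(E_new - E) / max(T, 1e-12)):
        w, E = w_new, E_new

print(f"final SSE = {E:.4f}, outputs = {forward(w, X).round(3)}")
```
The exponential-in-k**(1/D) schedule is what distinguishes VFSR from classical SA (logarithmic cooling) and fast annealing (1/k cooling), letting the temperature fall much faster while the heavy-tailed generating function still permits occasional long jumps out of local minima.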
Keywords :
backpropagation; convergence; neural nets; simulated annealing; EXOR problem; artificial neural networks; backward error propagation; exclusive-or problem; local minima; network paralysis; optimal weight set; parity problem; very fast simulated reannealing; Backpropagation algorithms; Computer errors; Computer science; Error analysis; Mathematics; Optimization methods; Temperature distribution;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
Type :
conf
DOI :
10.1109/IJCNN.1993.714343
Filename :
714343