DocumentCode :
2623356
Title :
Multiple training concept for back-propagation networks
Author :
Wang, Yeou-Fang; Cruz, Jose B., Jr.; Mulligan, J.H., Jr.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of California, Irvine, CA, USA
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
535
Abstract :
The multiple training concept, first applied to bidirectional associative memory training, is applied here to the one-sweep back-propagation algorithm; the resulting algorithm is called multiple training back-propagation. Computer simulations show that, by placing different weights on different training pairs in the energy function, the algorithm can increase the training speed of the network. The pair weights are updated during the training phase using the basic differential multiplier method; they are not used during the decoding phase. A sufficient condition for convergence of the training phase is provided, followed by two simulation examples: XOR and a stochastic test.
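As a rough illustration of the idea the abstract describes, below is a minimal sketch (not the authors' code) of back-propagation with per-pair weights in the energy function E = Σ_p c_p E_p: the network weights descend on the weighted energy while each pair weight c_p ascends in proportion to its pair error, standing in for the basic differential multiplier method. The 2-3-1 sigmoid network, learning rates, and update details are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # XOR inputs
T = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, 2-3-1 (sizes are an assumption).
W1 = rng.normal(0, 1, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0, 1, (3, 1)); b2 = np.zeros(1)
c = np.ones(len(X))          # per-pair weights, used only during training
eta_w, eta_c = 0.5, 0.05     # assumed learning rates

for epoch in range(5000):
    # Forward pass: one sweep over all training pairs.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    E_p = 0.5 * np.sum((Y - T) ** 2, axis=1)   # per-pair error

    # Backward pass on the weighted energy sum_p c_p * E_p.
    dY = c[:, None] * (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= eta_w * H.T @ dY;  b2 -= eta_w * dY.sum(0)
    W1 -= eta_w * X.T @ dH;  b1 -= eta_w * dH.sum(0)

    # Differential-multiplier-style ascent: harder pairs gain weight.
    c += eta_c * E_p

# Decoding phase: the pair weights c are NOT used, only the trained network.
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

In this reading, a pair that the network keeps getting wrong accumulates a larger weight c_p, so its error term dominates subsequent weight updates; once the pair is learned, E_p shrinks toward zero and its weight stabilizes.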
Keywords :
learning systems; neural nets; XOR function; back-propagation networks; basic differential multiplier method; energy function; multiple training concept; one-sweep back-propagation algorithm; pair weights; stochastic test; Associative memory; Computer simulation; Convergence; Decoding; Manufacturing automation; Neural networks; Neurons; Stochastic processes; Sufficient conditions; Testing;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170455
Filename :
170455