DocumentCode :
3166691
Title :
Activation of connections to accelerate the learning in recurrent back-propagation
Author :
Kamimura, Ryotaro
Author_Institution :
Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
fYear :
1992
fDate :
4-8 May 1992
Firstpage :
187
Lastpage :
192
Abstract :
A method of accelerating learning in recurrent neural networks is described. To activate and make use of connections, a complexity term defined by an equation is added to the standard quadratic error function. In the experiments, the derivative of the complexity term acted normally on positive connections, while negative connections were pushed toward smaller values; some connections were thus expected to become activated and large enough to speed up learning. The experiments confirmed that the complexity term was effective in increasing the variance of the connections, especially the hidden connections, and that some connections, particularly hidden ones, eventually became activated and grew large enough to speed up learning.
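The abstract does not reproduce the paper's complexity equation, so the following is only a minimal sketch of the general scheme it describes: a complexity term added to a standard quadratic error, with the term's derivative entering the gradient update. The weight-elimination-style penalty w^2/(1 + w^2) (after Weigend et al.) is an assumed stand-in, not the paper's equation; its gradient shrinks small connections but vanishes for large ones, so a few connections can stay large ("activated"), increasing the variance of the weights. LAMBDA and LR are likewise assumed values.

```python
import numpy as np

LAMBDA = 0.01   # assumed regularization strength (not from the paper)
LR = 0.1        # assumed learning rate (not from the paper)

def quadratic_error(targets, outputs):
    """Standard quadratic error E = 0.5 * sum((t - o)^2)."""
    return 0.5 * np.sum((targets - outputs) ** 2)

def complexity_term(weights):
    """Assumed penalty: sum of w^2 / (1 + w^2) over all connections."""
    return np.sum(weights ** 2 / (1.0 + weights ** 2))

def complexity_gradient(weights):
    """Derivative of the assumed penalty: 2w / (1 + w^2)^2.

    The derivative is small for large |w|, so large connections are
    left nearly untouched while small ones are pushed toward zero.
    """
    return 2.0 * weights / (1.0 + weights ** 2) ** 2

def update_weights(weights, error_grad):
    """One gradient step on the augmented cost E + LAMBDA * complexity.

    error_grad is dE/dw from recurrent back-propagation (not shown).
    """
    return weights - LR * (error_grad + LAMBDA * complexity_gradient(weights))
```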
Keywords :
backpropagation; computational complexity; learning (artificial intelligence); recurrent neural nets; complexity term; hidden connections; negative connections; positive connections; recurrent neural networks; standard quadratic error function; Acceleration; Equations; Feedforward systems; Information science; Intelligent networks; Laboratories; Learning systems; Recurrent neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
CompEuro '92: Computer Systems and Software Engineering, Proceedings
Conference_Location :
The Hague, Netherlands
Print_ISBN :
0-8186-2760-3
Type :
conf
DOI :
10.1109/CMPEUR.1992.218512
Filename :
218512