DocumentCode :
1817327
Title :
Activated hidden connections to accelerate the learning in recurrent neural networks
Author :
Kamimura, Ryotaro
Author_Institution :
Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
Volume :
1
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
693
Abstract :
A method of accelerating learning in recurrent neural networks is considered. Because recurrent neural networks can have a large number of connections, they have been expected to converge faster. To activate hidden connections and use hidden units efficiently, a complexity term proposed by D.E. Rumelhart was added to the standard quadratic error function. The complexity term method is modified with a parameter so that it acts in the usual way on positive connection values, while negative values are pushed toward values with larger absolute values. Some hidden connections are thus expected to grow large enough for the hidden units to be used and for learning to be accelerated. The author's experiments confirmed that the complexity term was effective in increasing the variance of the connections, especially the hidden connections, and that some hidden connections were eventually activated and became large enough for the hidden units to contribute to faster learning.
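The abstract does not state the cost function explicitly. As a minimal sketch, assuming the complexity term takes the weight-elimination form associated with Rumelhart (the scale parameter $w_0$ and the regularization weight $\lambda$ below are assumptions, not given in the abstract), the modified objective over connections $w_{ij}$ would read

$$E = \frac{1}{2}\sum_{k}\left(t_{k} - o_{k}\right)^{2} \;+\; \lambda \sum_{i,j} \frac{w_{ij}^{2}/w_{0}^{2}}{1 + w_{ij}^{2}/w_{0}^{2}},$$

where $t_k$ and $o_k$ are the target and actual outputs. On this reading, the sign-dependent modification described in the abstract would amount to applying the penalty's parameters differently to positive and negative $w_{ij}$, so that negative connections are driven toward larger absolute values rather than toward zero.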
Keywords :
computational complexity; learning (artificial intelligence); neural nets; complexity term; learning acceleration; recurrent neural networks; standard quadratic error function; Acceleration; Computer networks; Equations; Feedforward systems; Information science; Intelligent networks; Laboratories; Recurrent neural networks;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.287106
Filename :
287106