Title :
Tuning of learning rate and momentum on backpropagation
Author :
Kamiyama, Naoki ; Izumi, Hiroyuki ; Iijima, Nobukazu ; Mitsui, Hideo ; Yoshida, Masao ; Sone, Mototaka
Author_Institution :
Musashi Inst. of Technol., Tokyo, Japan
Abstract :
Summary form only given. In the backpropagation process, the update of the interconnecting weights attached to the units of the input layer or hidden layer is computed from the momentum (α) and the learning rate (η). The number of training cycles therefore depends on α and η, so it is necessary to choose the most suitable values for both. By varying α and η, the authors searched for the values best suited to learning. The suitable combinations of α and η follow a constant rule, represented by η=K(1-α). Moreover, the constant K is determined by the ratio between the number of output units and the number of hidden units. This conclusion is important for deciding the size of a neural network.
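A minimal sketch of the update rule described in the abstract is given below: a backpropagation step with a momentum term, plus a helper that picks η from α via η = K(1-α). The abstract only states that K depends on the ratio of output units to hidden units, so the exact form K = n_output/n_hidden, the layer sizes, and the placeholder gradient used here are illustrative assumptions, not the authors' reported settings.

```python
import numpy as np

def momentum_eta(alpha, n_output, n_hidden):
    """Pick a learning rate from the momentum via eta = K * (1 - alpha).

    K is taken here as n_output / n_hidden; the abstract only says K is
    determined by the ratio of output to hidden units, so this exact
    form is an assumption for illustration.
    """
    K = n_output / n_hidden
    return K * (1.0 - alpha)

def update_weights(w, grad, prev_delta, alpha, eta):
    """One backpropagation step with a momentum term:
        delta_w(t) = -eta * dE/dw + alpha * delta_w(t-1)
    Returns the updated weights and the new delta for the next step.
    """
    delta = -eta * grad + alpha * prev_delta
    return w + delta, delta

# Example usage with hypothetical layer sizes (8 hidden units, 2 output units).
alpha = 0.9
eta = momentum_eta(alpha, n_output=2, n_hidden=8)   # K = 0.25, eta = 0.025
w = np.zeros(3)
prev_delta = np.zeros(3)
grad = np.array([0.1, -0.2, 0.05])                  # placeholder gradient dE/dw
w, prev_delta = update_weights(w, grad, prev_delta, alpha, eta)
```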
Keywords :
learning systems; neural nets; backpropagation; hidden layer; input layer; interconnecting weights; learning rate; momentum; neural network; training cycles; Backpropagation; Emulation; Expert systems; Hardware; Joining processes; Large-scale systems; Neural networks; Pattern recognition;
Conference_Title :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155608