Title :
Backpropagation algorithm which varies the number of hidden units
Author :
Hirose, Y.; Yamashita, Katsumi; Hijiya, S.
Author_Institution :
Fujitsu Lab. Ltd., Atsugi, Japan
Abstract :
Summary form only given, as follows. A backpropagation algorithm is presented that varies the number of hidden units during training. The algorithm is expected to escape local minima and eliminates the need to decide the number of hidden units in advance. Exclusive-OR training and 8*8 dot alphanumeric font training using this algorithm are explained. In exclusive-OR training, the probability of being trapped in local minima is reduced. In alphanumeric font training, the network converged two to three times faster than the conventional backpropagation algorithm.
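Code_Sketch :
A minimal illustrative sketch, not the authors' method: standard backpropagation on the exclusive-OR task that grows the hidden layer when training error plateaus. The abstract gives summary form only, so the plateau threshold, patience count, initialization ranges, and the growth-only rule (unit removal is omitted) are assumptions made here for illustration.

import numpy as np

# Hedged sketch: backpropagation on XOR with a grow-when-stuck rule.
# The growth criterion below is an illustrative assumption, not the
# paper's exact unit-variation rule (the abstract is summary form only).

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_layer(n_in, n_out):
    return rng.uniform(-0.5, 0.5, (n_in, n_out)), np.zeros(n_out)

n_hidden = 1                      # start small; units are added as needed
W1, b1 = init_layer(2, n_hidden)
W2, b2 = init_layer(n_hidden, 1)

lr, prev_err, patience = 0.5, np.inf, 0
for epoch in range(20000):
    # forward pass through one sigmoid hidden layer
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    err = np.mean((Y - T) ** 2)

    # backward pass (squared-error loss, sigmoid derivatives)
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)

    # grow the hidden layer when the error stops improving (assumed rule)
    patience = patience + 1 if prev_err - err < 1e-5 else 0
    prev_err = err
    if patience > 500:
        W1 = np.hstack([W1, rng.uniform(-0.5, 0.5, (2, 1))])
        b1 = np.concatenate([b1, np.zeros(1)])
        W2 = np.vstack([W2, rng.uniform(-0.5, 0.5, (1, 1))])
        n_hidden += 1
        patience = 0

print("hidden units:", n_hidden, "final MSE:", float(err))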
Keywords :
learning systems; neural nets; backpropagation algorithm; hidden units; local minima; XOR training; alphanumeric font training;
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118518