DocumentCode :
3269769
Title :
Backpropagation algorithm which varies the number of hidden units
Author :
Hirose, Y.; Yamashita, Katsumi; Hijiya, S.
Author_Institution :
Fujitsu Lab. Ltd., Atsugi, Japan
fYear :
1989
fDate :
0-0 1989
Abstract :
Summary form only given, as follows. A backpropagation algorithm is presented that varies the number of hidden units. The algorithm is expected to escape local minima and removes the need to choose the number of hidden units in advance. Exclusive-OR training and 8×8-dot alphanumeric font training using this algorithm are explained. In exclusive-OR training, the probability of being trapped in local minima is reduced. In alphanumeric font training, the network converged two to three times faster than with the conventional backpropagation algorithm.
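Since only summary form is given, the exact growth criterion is not specified; the sketch below illustrates the general idea on the exclusive-OR task: plain backpropagation on a one-hidden-layer network, with a hidden unit added whenever training stalls. The plateau test (`patience`, `min_drop`) and the rule "add one unit with small random weights" are assumptions for illustration, not the authors' published procedure.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Clamp to avoid math.exp overflow on extreme inputs.
    x = max(-60.0, min(60.0, x))
    return 1.0 / (1.0 + math.exp(-x))

class GrowingNet:
    """Tiny 2-h-1 MLP whose hidden layer grows when training stalls.

    Illustrative sketch only: growth criterion is an assumption,
    not the criterion from the paper (summary form only given).
    """
    def __init__(self, n_hidden=1):
        # Each hidden unit: weights for 2 inputs + bias.
        self.w_in = [[random.uniform(-0.5, 0.5) for _ in range(3)]
                     for _ in range(n_hidden)]
        # Output weights, one per hidden unit, + output bias (last slot).
        self.w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hidden + 1)]

    def forward(self, x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in self.w_in]
        o = sigmoid(sum(wo * hj for wo, hj in zip(self.w_out, h))
                    + self.w_out[-1])
        return h, o

    def train_epoch(self, data, lr=0.5):
        """One pass of per-sample gradient descent; returns mean squared error."""
        loss = 0.0
        for x, t in data:
            h, o = self.forward(x)
            err = t - o
            loss += err * err
            delta_o = err * o * (1.0 - o)
            for j, hj in enumerate(h):
                delta_h = delta_o * self.w_out[j] * hj * (1.0 - hj)
                self.w_in[j][0] += lr * delta_h * x[0]
                self.w_in[j][1] += lr * delta_h * x[1]
                self.w_in[j][2] += lr * delta_h
                self.w_out[j] += lr * delta_o * hj
            self.w_out[-1] += lr * delta_o
        return loss / len(data)

    def add_unit(self):
        # Grow the hidden layer: new unit with small random weights.
        self.w_in.append([random.uniform(-0.5, 0.5) for _ in range(3)])
        self.w_out.insert(-1, random.uniform(-0.5, 0.5))  # keep bias last

xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
net = GrowingNet(n_hidden=1)
prev, patience, min_drop = float("inf"), 500, 1e-4
losses = []
for epoch in range(20000):
    losses.append(net.train_epoch(xor))
    if losses[-1] < 0.001:
        break
    if epoch % patience == patience - 1:
        if prev - losses[-1] < min_drop:  # training stalled: grow the net
            net.add_unit()
        prev = losses[-1]
```

Starting from a single hidden unit (too few to represent XOR) and growing only on a plateau means the final size is chosen by the training run itself rather than fixed by the user, which is the property the abstract emphasizes.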
Keywords :
learning systems; neural nets; XOR training; alphanumeric font training; backpropagation algorithm; hidden units; local minima;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1989. IJCNN., International Joint Conference on
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118518
Filename :
118518