DocumentCode :
288390
Title :
Speed up the learning process of feedforward neural networks
Author :
Tseng, L.Y. ; Huang, T.H.
Author_Institution :
Dept. of Appl. Math., Nat. Chung-Hsing Univ., Taichung, Taiwan
Volume :
1
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
578
Abstract :
Among feedforward network learning rules, the backpropagation learning rule has been applied successfully to a variety of problems. However, backpropagation learning is something of a “black box” and suffers from several limitations. We propose a scheme that uses Hamming coding, a partitioned network, and logic design theory to guide the feedforward network's learning and thereby speed up the learning process. Our experimental results reveal that feedforward neural networks need not learn blindly; in fact, they can be taught to learn. The advantages of the proposed scheme are a smaller network size, a higher learning rate, and faster learning.
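The abstract does not detail how the Hamming coding is applied. One plausible reading, offered here only as an assumption and not as the paper's stated construction, is to use Hamming codewords as the network's output targets, so that each output unit learns a simpler dichotomy and a single misfiring output unit can be corrected at decode time. The sketch below in Python (with a hypothetical helper names `hamming_target` and `decode_output`) illustrates that idea for a 16-class problem with the standard (7,4) Hamming code.

import numpy as np

# Standard systematic generator matrix for the (7,4) Hamming code:
# four data bits followed by three parity bits; minimum distance 3,
# so any single flipped output bit can be corrected.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def hamming_target(class_index: int) -> np.ndarray:
    """Map a class index (0..15) to a 7-bit Hamming codeword that serves
    as the training target for a feedforward network's output layer."""
    data_bits = np.array([(class_index >> i) & 1 for i in range(4)])
    return data_bits @ G % 2

def decode_output(activations: np.ndarray) -> int:
    """Threshold the network's output activations and decode to the
    nearest codeword (minimum Hamming distance)."""
    bits = (activations > 0.5).astype(int)
    codewords = np.array([hamming_target(c) for c in range(16)])
    distances = (codewords != bits).sum(axis=1)
    return int(distances.argmin())

# Example: encode class 9 as a target vector, flip one output bit to
# simulate a misfiring output neuron, and still decode the correct class.
target = hamming_target(9)
noisy = target.astype(float)
noisy[2] = 1.0 - noisy[2]
assert decode_output(noisy) == 9

In such a setup the redundancy of the code, rather than extra hidden units, absorbs part of the output error, which is consistent with the abstract's claim of a smaller network; the actual partitioning and logic-design steps of the paper are not reproduced here.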
Keywords :
Hamming codes; backpropagation; feedforward neural nets; logic design; Hamming coding; feedforward neural networks; learning process; learning speed; partitioned network; Backpropagation algorithms; Convergence; Data mining; Data preprocessing; Feedforward neural networks; Image coding; Mathematics; Multi-layer neural network; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374229
Filename :
374229