DocumentCode :
1810463
Title :
Novel fast learning algorithms for time-delay neural networks
Author :
Jiang, Minghu ; Zhu, Xiaoyan
Author_Institution :
Dept. of Comput., Tsinghua Univ., Beijing, China
Volume :
2
fYear :
1999
fDate :
1 July 1999
Firstpage :
1380
Abstract :
To counter the long training time required by Waibel's time-delay neural networks (TDNN) in phoneme recognition, this paper puts forward several improved fast learning methods for the TDNN. Combining the unsupervised Oja rule with a similar error backpropagation algorithm for the initial training of the TDNN weights effectively increases the convergence speed. Improving the error energy function and updating the weight changes according to the size of the output error also increases the training speed. Moving from layer-by-layer backpropagation to averaging the overlapping parts of the backpropagated error of the first hidden layer along a frame, while gradually increasing the number of training samples, further increases the convergence speed. For multi-class phoneme recognition with modular TDNNs, we improve the architecture of Waibel's modular networks and obtain an optimal tree-structured modular TDNN with accelerated learning; its training time is less than that of Waibel's modular TDNNs.
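The first speed-up described in the abstract, initializing the TDNN weights with the unsupervised Oja rule before supervised backpropagation, can be illustrated by a minimal sketch. The function name oja_pretrain, the learning rate, the epoch count, and the use of the multi-unit (subspace) form of Oja's rule below are assumptions for illustration only, not the authors' implementation.

import numpy as np

def oja_pretrain(X, n_units, lr=0.01, epochs=10, seed=0):
    # Unsupervised Oja-rule pass over input frames X (n_samples, n_inputs),
    # producing initial weights W (n_units, n_inputs) for later supervised
    # backpropagation fine-tuning.
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_units, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x                          # linear unit activations
            W += lr * np.outer(y, x - y @ W)   # Oja subspace update: dW = lr*(y x^T - y y^T W)
    return W

# Example: pre-train 8 hidden units on random 16-dimensional "frames"
X = np.random.default_rng(1).normal(size=(200, 16))
W0 = oja_pretrain(X, n_units=8)
print(W0.shape)  # (8, 16)

In such a scheme the Oja pass captures the dominant subspace of the input frames, so the subsequent backpropagation starts from informative rather than random weights, which is the stated source of the convergence speed-up.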
Keywords :
backpropagation; convergence; neural nets; speech recognition; unsupervised learning; Oja rule; Waibel neural networks; convergence; error backpropagation; fast learning algorithms; phoneme recognition; time-delay neural networks; unsupervised learning; Computer errors; Computer networks; Convergence; Counting circuits; IEEE members; Intelligent networks; Intelligent systems; Learning systems; Merging; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 1999. IJCNN '99. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-5529-6
Type :
conf
DOI :
10.1109/IJCNN.1999.831164
Filename :
831164