Title :
Improving speech recognition learning through lazy training
Author :
Rimer, Michael E. ; Martinez, Tony R. ; Wilson, D. Randall
Author_Institution :
Dept. of Comput. Sci., Brigham Young Univ., Provo, UT, USA
Abstract :
Multilayer backpropagation, like most learning algorithms that can create complex decision surfaces, is prone to overfitting. We present a novel approach, called lazy training, for reducing overfitting in multilayer networks. Lazy training consistently reduces the generalization error of optimized neural networks by more than half on a large OCR dataset and on several real-world problems from the UCI machine learning repository. Here, lazy training is shown to be effective in a multilayered adaptive learning system, reducing the error of an optimized backpropagation network in a speech recognition system by 50.0% on the TIDIGITS corpus.
Keywords :
backpropagation; generalisation (artificial intelligence); multilayer perceptrons; optimisation; speech recognition; OCR; TIDIGITS corpus; complex decision surfaces; generalization error reduction; lazy training; multilayer backpropagation; multilayer neural networks; multilayered adaptive learning system; multiple-layer networks; optimized neural networks; overfitting; speech recognition learning; Backpropagation algorithms; Computer science; Databases; Hidden Markov models; Machine learning; Neural networks; Optical character recognition software; Robustness; Speech recognition; Training data;
Conference_Titel :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI, USA
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1007548