Title :
Speed training: improving the rate of backpropagation learning through stochastic sample presentation
Author :
Rimer, Michael E. ; Andersen, Timothy L. ; Martinez, Tony R.
Author_Institution :
Dept. of Comput. Sci., Brigham Young Univ., Provo, UT, USA
Abstract :
Artificial neural networks provide an effective empirical predictive model for pattern classification. However, using complex neural networks to learn very large training sets is often problematic, imposing prohibitive time constraints on the training process. We present four practical methods for dramatically decreasing training time through dynamic stochastic sample presentation, a technique we call speed training. These methods are shown to be robust, retaining generalization accuracy over a diverse collection of real-world data sets. In particular, the stochastic presentation with error threshold technique achieves a training speedup of 4278% on a large OCR database with no detectable loss in generalization.
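The abstract's "stochastic presentation with error threshold" idea can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the paper's actual algorithm: a single logistic unit is trained by on-line gradient descent, samples are presented in random order each epoch, and a sample's weight update is skipped whenever its current output error is already below a threshold, so well-learned patterns stop consuming training time. The function name, data, and all hyperparameters are illustrative assumptions.

```python
import math
import random

def train_with_error_threshold(samples, epochs=50, lr=0.5, threshold=0.2, seed=0):
    """Toy sketch (assumed, simplified): train one sigmoid unit, skipping
    backprop updates for samples whose error is below `threshold`."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(samples)  # stochastic (random-order) sample presentation
        for x, target in samples:
            out = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid output
            err = target - out
            if abs(err) < threshold:
                continue  # pattern already well-learned; skip the update
            grad = err * out * (1.0 - out)  # delta rule with sigmoid derivative
            w += lr * grad * x
            b += lr * grad
    return w, b

# Toy linearly separable data: negative x -> class 0, positive x -> class 1.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b = train_with_error_threshold(list(data))
preds = [int(1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) for x, _ in data]
```

As more patterns fall under the threshold in later epochs, fewer updates are performed per pass, which is the source of the claimed speedup; a full implementation would apply the same skip test inside standard multi-layer backpropagation.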
Keywords :
backpropagation; character recognition; generalisation (artificial intelligence); neural nets; pattern classification; backpropagation learning; error threshold; generalization; neural networks; optical character recognition; pattern classification; stochastic sample presentation; Backpropagation algorithms; Computer science; Convergence; Neural networks; Predictive models; Robustness; Stochastic processes; Testing; Time factors; Training data;
Conference_Titel :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.938790