• DocumentCode
    2722643
  • Title
    Training speed-up methods for neural networks applied to word recognition
  • Author
    Thierer, Gebhard; Krause, Andreas; Hackbarth, Heidi
  • Author_Institution
    SEL Alcatel, Stuttgart, Germany
  • fYear
    1991
  • fDate
    8-14 Jul 1991
  • Firstpage
    865
  • Abstract
    Two methods for speeding up the training of neural networks for word recognition are presented. The first method reduces the number of training patterns: the number of vocabulary repetitions can be cut from seven to one if a pretrained network, rather than a randomly initialized one, is used as the basis for further learning. The second method does not require a pretrained network; instead, training alternates between the entire training set and a subset thereof, which saves unnecessary backpropagation cycles for patterns that have already been learned. Depending on the training data, network training time is reduced by at least a factor of seven in the first case and a factor of two in the second. Moreover, the error rate on a 100-word vocabulary can be lowered by one fourth by applying the second method.
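    The second method lends itself to a short illustration. Below is a minimal sketch in Python/NumPy of alternating between full-set epochs and epochs restricted to not-yet-learned patterns. The toy softmax model, the per-pattern cross-entropy criterion, the 0.5 threshold, and the three-epoch subset schedule are all illustrative assumptions; the abstract does not specify the paper's actual selection rule. (The first method amounts to loading pretrained weights instead of the random initialization shown here.)

```python
# Sketch of alternating full-set / subset training, as described in the
# abstract: extra backpropagation passes go only to patterns whose error
# is still high. Model and hyperparameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 patterns, 16 features, 10 classes (stand-ins for words).
X = rng.normal(size=(100, 16))
y = rng.integers(0, 10, size=100)
W = rng.normal(scale=0.1, size=(16, 10))  # single-layer softmax "network"

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def pattern_errors(W):
    """Per-pattern cross-entropy, used to decide which patterns are 'learned'."""
    p = softmax(X @ W)
    return -np.log(p[np.arange(len(y)), y] + 1e-12)

def epoch(W, idx, lr=0.1):
    """One gradient step over the patterns in idx (stands in for a backprop epoch)."""
    p = softmax(X[idx] @ W)
    t = np.eye(10)[y[idx]]
    return W - lr * X[idx].T @ (p - t) / len(idx)

threshold = 0.5  # assumed cutoff: error below this counts as "learned"
for cycle in range(20):
    W = epoch(W, np.arange(len(y)))              # one epoch over the entire set
    hard = np.where(pattern_errors(W) > threshold)[0]
    for _ in range(3):                           # extra epochs on unlearned patterns only
        if len(hard) == 0:
            break
        W = epoch(W, hard)
    print(f"cycle {cycle}: {len(hard)} patterns above threshold")
```

    The saving comes from the subset epochs: already-learned patterns are excluded from the inner loop, so most gradient work concentrates on the patterns that still contribute error.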
  • Keywords
    artificial intelligence; learning systems; neural nets; speech recognition; 100-word vocabulary; backpropagation cycles; error rate; learning; neural networks; training speed-up; vocabulary repetitions; word recognition; Artificial neural networks; Cepstrum; Databases; Error analysis; Multilayer perceptrons; Neural networks; Optimizing compilers; Radio access networks; Speech recognition; Vocabulary
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
  • Conference_Location
    Seattle, WA
  • Print_ISBN
    0-7803-0164-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.1991.155448
  • Filename
    155448