DocumentCode :
3280423
Title :
Design of hierarchical perceptron structures and their application to the task of isolated-word recognition
Author :
Kämmerer, Bernhard R. ; Küpper, Wolfgang A.
Author_Institution :
Siemens AG, Munich, West Germany
fYear :
1989
fDate :
0-0 1989
Firstpage :
243
Abstract :
Several design strategies for feedforward networks are examined within the scope of pattern classification. Single- and two-layer perceptron models are adapted for experiments in isolated-word recognition. Direct (one-step) classification and several hierarchical (two-step) schemes are considered. For a vocabulary of 20 English words spoken repeatedly by 11 speakers, the word classes are found to be separable by hyperplanes in the chosen feature space. Since the underlying database contains only a small training set for speaker-dependent word recognition, an automatic expansion of the training material improves the generalization properties of the networks. This method accounts for a wide variety of observable temporal structures for each word and gives a better overall estimate of the network parameters, leading to a recognition rate of 99.5%. For speaker-independent word recognition, a hierarchical structure with pairwise training of two-class models is superior to a single uniform network (98% average recognition rate).
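The pairwise (one-vs-one) scheme described in the abstract can be illustrated with a minimal sketch: train one two-class perceptron per pair of word classes, then classify by majority vote over all pairwise models. This is not the authors' implementation; the function names and the toy 2D "feature vectors" below are hypothetical stand-ins for the speech features used in the paper.

```python
def train_perceptron(X, y, epochs=20, lr=0.1):
    """Train a single-layer perceptron (bias folded in) for labels in {-1, +1}."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            x = xi + [1.0]  # append constant bias input
            s = sum(wj * xj for wj, xj in zip(w, x))
            if yi * s <= 0:  # misclassified: standard perceptron update
                w = [wj + lr * yi * xj for wj, xj in zip(w, x)]
    return w

def predict(w, xi):
    """Sign of the linear discriminant: +1 or -1."""
    s = sum(wj * xj for wj, xj in zip(w, xi + [1.0]))
    return 1 if s > 0 else -1

def train_pairwise(X, y, classes):
    """One two-class perceptron per unordered class pair (one-vs-one)."""
    models = {}
    for i, a in enumerate(classes):
        for b in classes[i + 1:]:
            Xp = [x for x, c in zip(X, y) if c in (a, b)]
            yp = [1 if c == a else -1 for c in y if c in (a, b)]
            models[(a, b)] = train_perceptron(Xp, yp)
    return models

def classify(models, xi, classes):
    """Majority vote over all pairwise two-class models."""
    votes = {c: 0 for c in classes}
    for (a, b), w in models.items():
        votes[a if predict(w, xi) == 1 else b] += 1
    return max(votes, key=votes.get)

# Toy linearly separable "feature vectors" for three hypothetical word classes
X = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [2.0, 0.0], [2.1, 0.2]]
y = ["yes", "yes", "no", "no", "stop", "stop"]
classes = ["yes", "no", "stop"]
models = train_pairwise(X, y, classes)
```

For K classes this trains K(K-1)/2 small models instead of one K-way network; each pairwise problem is easier to separate with a hyperplane, which matches the abstract's finding that pairwise training outperforms a single uniform network for the speaker-independent task.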
Keywords :
neural nets; speech recognition; feature space; feedforward networks; hierarchical perceptron structures; hyperplanes; isolated-word recognition; network parameters; observable temporal structures; pattern classification; recognition rate; speaker-dependent word recognition; two-class models; vocabulary; Neural networks; Speech recognition;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118587
Filename :
118587