Title :
Noisy hidden Markov models for speech recognition
Author :
Audhkhasi, Kartik; Osoba, Osonde; Kosko, Bart
Author_Institution :
Electr. Eng. Dept., Univ. of Southern California, Los Angeles, CA, USA
Abstract :
We show that noise can speed up training of hidden Markov models (HMMs). The new Noisy Expectation-Maximization (NEM) algorithm injects noise while learning the maximum-likelihood estimate of the HMM parameters; it applies because the underlying Baum-Welch training algorithm is a special case of the Expectation-Maximization (EM) algorithm. The NEM theorem gives a sufficient condition for such an average noise boost. This condition reduces to a simple quadratic constraint on the noise when the HMM uses a Gaussian mixture model at each state. Simulations show that a noisy HMM converges faster than a noiseless HMM on the TIMIT data set.
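The quadratic noise constraint mentioned in the abstract can be sketched in code. For Gaussian densities it reduces componentwise to n² + 2n(y − μ) ≤ 0, which is equivalent to requiring that the noisy sample y + n lie no farther from the mean μ than y does. The sketch below (assuming 1-D data, a fixed set of mixture means, and a simple rejection scheme that zeroes violating noise draws — all illustrative choices, not the authors' implementation) shows how such noise could be generated before an EM-style update:

```python
import numpy as np

rng = np.random.default_rng(0)

def nem_noise(y, means, sigma):
    """Sample additive noise satisfying the NEM quadratic condition.

    For Gaussian mixture densities the sufficient condition is, componentwise,
        n**2 + 2 * n * (y - mu_j) <= 0   for every mixture mean mu_j,
    i.e. the noise must move each sample y toward all the means.
    Draws that violate the condition are zeroed (n = 0 trivially satisfies it).
    """
    n = rng.normal(scale=sigma, size=np.shape(y))
    # Check the quadratic constraint against every mean mu_j.
    ok = np.all([n**2 + 2 * n * (y - mu) <= 0 for mu in means], axis=0)
    return np.where(ok, n, 0.0)

# Toy 1-D illustration: perturb data before a Baum-Welch / EM iteration.
means = np.array([0.0, 5.0])        # hypothetical GMM state means
y = rng.normal(loc=2.5, scale=1.0, size=200)
y_noisy = y + nem_noise(y, means, sigma=0.1)
```

Because n² + 2n(y − μ) ≤ 0 is equivalent to (y + n − μ)² ≤ (y − μ)², every injected perturbation shrinks (or preserves) the squared distance to each mean, which is what yields the average likelihood boost during training.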
Keywords :
expectation-maximisation algorithm; hidden Markov models; learning (artificial intelligence); speech recognition; Baum-Welch training algorithm; Gaussian mixture model; HMM; NEM theorem; TIMIT data set; average noise boost; maximum-likelihood estimate; noisy expectation-maximization algorithm; noisy hidden Markov models; quadratic constraint; sufficient condition; maximum likelihood estimation; noise measurement; signal processing algorithms; training; noise injection; noisy EM algorithm; stochastic resonance
Conference_Titel :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6707088