Title :
Gradient steepness metrics using extended Baum-Welch transformations for universal pattern recognition tasks
Author :
Sainath, Tara N. ; Kanevsky, Dimitri ; Ramabhadran, Bhuvana
Author_Institution :
Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA
fDate :
March 31 to April 4, 2008
Abstract :
In many pattern recognition tasks, given some input data and a family of models, the "best" model is defined as the one that maximizes the likelihood of the data given the model. Extended Baum-Welch (EBW) transformations are most commonly used as a discriminative technique for estimating the parameters of Gaussian mixtures. In this paper, we use the EBW transformations to derive a novel gradient steepness measurement that identifies which model best explains the data, and from this measurement we derive a variety of EBW metrics that quantify model fit. We apply these EBW metrics to audio segmentation via Hidden Markov Models (HMMs) and show that our gradient steepness measurement is robust across different EBW metrics and model complexities.
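For context on the transformations the abstract refers to: the standard EBW re-estimation formulas for the mean and variance of a Gaussian mixture component are well known in the literature (this sketch is not taken from the paper itself; the symbols \(\gamma_t\) for occupancy counts, \(D\) for the EBW smoothing constant, and \(x_t\) for the observations are the conventional ones):

```latex
% Standard EBW updates for a single Gaussian component, given
% current parameters (mu, sigma^2), observations x_t, per-frame
% occupancy counts gamma_t, and smoothing constant D > 0:
\hat{\mu} = \frac{\sum_t \gamma_t \, x_t + D\,\mu}{\sum_t \gamma_t + D},
\qquad
\hat{\sigma}^2 = \frac{\sum_t \gamma_t \, x_t^2 + D\,(\sigma^2 + \mu^2)}
                      {\sum_t \gamma_t + D} - \hat{\mu}^2 .
```

As \(D \to \infty\), the updated parameters approach the current ones, which is what makes the rate (steepness) of likelihood change under the transformation usable as a measure of model fit, the idea the abstract describes.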
Keywords :
Gaussian processes; audio signal processing; hidden Markov models; maximum likelihood estimation; speech recognition; Gaussian mixture parameter estimation; audio segmentation; extended Baum-Welch transformation; gradient steepness metrics; pattern recognition; gradient methods; parameter estimation; robustness
Conference_Titel :
2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008)
Conference_Location :
Las Vegas, NV
Print_ISBN :
978-1-4244-1483-3
Electronic_ISSN :
1520-6149
DOI :
10.1109/ICASSP.2008.4518664