Author_Institution :
AT&T Labs-Research, Florham Park, NJ, USA
Abstract :
Hidden Markov models (HMMs) are widely used in applications such as automatic speech recognition, control theory, biology, communication over channels with bursts of errors, and queueing theory, among many others. It is therefore important to have robust and fast methods for fitting HMMs to experimental data (training). Standard statistical methods for maximum-likelihood parameter estimation (such as Newton-Raphson and conjugate gradients) are not robust and are difficult to apply to HMMs with many parameters. The Baum-Welch algorithm, on the other hand, is robust but slow. In this paper, we present a parallel version of the Baum-Welch algorithm. We also consider unidirectional procedures which, in contrast with the well-known forward-backward algorithm, use an amount of memory that is independent of the length of the observation sequence.
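To illustrate the memory contrast drawn in the abstract, the following is a minimal sketch (not the authors' parallel or unidirectional algorithms) of the classical forward-backward E-step for a discrete-output HMM, written in Python/NumPy. The variable names (alpha, beta, gamma) and the helper function are illustrative assumptions; note that the backward recursion requires retaining the full T-by-N table of forward variables, so memory grows with the observation sequence length T.

```python
# Classical forward-backward E-step for a discrete-output HMM (illustrative sketch).
# A: N x N transition matrix, B: N x M emission matrix, pi: initial distribution,
# obs: length-T sequence of symbol indices. No scaling is applied, so this plain
# version underflows on long sequences; it is shown only to highlight that the
# alpha/beta tables are O(T * N) in memory, i.e. proportional to sequence length.
import numpy as np

def forward_backward(A, B, pi, obs):
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))   # forward variables, stored for all T steps
    beta = np.zeros((T, N))    # backward variables, stored for all T steps

    # Forward pass: alpha[t, j] = P(o_1..o_t, state_t = j)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = P(o_{t+1}..o_T | state_t = i)
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # State posteriors used in Baum-Welch re-estimation
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma
```

A unidirectional procedure of the kind mentioned in the abstract would avoid storing these per-time-step tables, accumulating the re-estimation statistics in a single sweep so that memory no longer depends on T.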
Keywords :
hidden Markov models; HMM; parallel algorithms; parallel Baum-Welch algorithms; probability; automatic speech recognition; computational biology; communication theory; control theory; error bursts; error correction; experimental data; training; maximum likelihood estimation; parameter estimation; observation sequence length; unidirectional procedures; queueing theory; queueing analysis; robustness; statistical analysis