Abstract:
A hidden Markov model (HMM) consists of a nonobservable Markov chain X = (X_0, X_1, ...) and an observed process Y = (Y_0, Y_1, ...) whose distribution is determined by X. To estimate the hidden path of X up to time n, that is X_0, X_1, ..., X_n, from the observations Y_0, Y_1, ..., Y_n, one usually applies the maximum a posteriori probability path estimator (MAP path estimator). An effective means of computing this estimator is the Viterbi algorithm, which is widely employed in coding theory, the correction of intersymbol interference, and text recognition. Here, properties of the MAP path estimator are derived. Under a certain Condition C, it is shown that the limiting process U = (U_0, U_1, ...) is a regenerative process. In particular, this means that U has an asymptotic distribution, satisfies the central limit theorem, and possesses a mean error. Furthermore, Condition C is satisfied for a broad class of HMMs, including the case most important for applications, the HMM with additive white Gaussian noise.
Keywords:
Additive white Gaussian noise (AWGN); central limit theorem; coding theory; estimation error; hidden Markov model (HMM); intersymbol interference; maximum a posteriori (MAP) path estimation; maximum-likelihood sequence estimation; mean error; regenerative process; renewal process; state-space methods; text recognition; Viterbi algorithm
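As the abstract notes, the MAP path estimator is computed with the Viterbi algorithm. The sketch below is purely illustrative and not taken from the paper: it assumes a finite-state HMM with discrete emissions, and the names `viterbi`, `pi`, `A`, and `B` are placeholders chosen for this example.

```python
import numpy as np

def viterbi(pi, A, B, y):
    """MAP (Viterbi) path estimate of the hidden states X_0, ..., X_n
    given observations y = (Y_0, ..., Y_n).

    pi : (K,)   initial distribution of X_0
    A  : (K, K) transition matrix, A[i, j] = P(X_{t+1} = j | X_t = i)
    B  : (K, M) emission matrix,   B[i, k] = P(Y_t = k | X_t = i)
    y  : (n+1,) observed symbol indices
    """
    K, T = len(pi), len(y)
    # Work in the log domain so long observation sequences do not underflow.
    with np.errstate(divide="ignore"):
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    delta = np.empty((T, K))             # delta[t, j]: max log-prob of a path ending in state j at time t
    psi = np.empty((T, K), dtype=int)    # back-pointers to the best predecessor state
    delta[0] = log_pi + log_B[:, y[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A        # scores[i, j]: best path through predecessor i into j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(K)] + log_B[:, y[t]]

    # Backtrack the arg-max path: this is the MAP estimate of X_0, ..., X_n.
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmax(delta[-1]))
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1][path[t + 1]]
    return path

# Example with two hidden states and three observation symbols (values are made up).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
print(viterbi(pi, A, B, np.array([0, 1, 2, 2])))   # prints the MAP state sequence for this run
```

The dynamic-programming recursion keeps, for each time t and state j, only the best log-probability and a back-pointer, so the MAP path over all K^(n+1) candidate sequences is found in O(nK^2) time.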