Title :
Hidden Gauss-Markov models for signal classification
Author :
Ainsleigh, Phillip L. ; Kehtarnavaz, Nasser ; Streit, Roy L.
Author_Institution :
Naval Undersea Warfare Center, Newport, RI, USA
fDate :
6/1/2002
Abstract :
Continuous-state hidden Markov models (CS-HMMs) are developed as a tool for signal classification. Analogs of the Baum (1972), Viterbi (1962), and Baum-Welch algorithms are formulated for this class of models. The CS-HMM algorithms are then specialized to hidden Gauss-Markov models (HGMMs) with linear Gaussian state-transition and output densities. A new Gaussian refactorization lemma is used to show that the Baum and Viterbi algorithms for HGMMs are implemented by two different formulations of the fixed-interval Kalman smoother. The measurement likelihoods obtained from the forward pass of the HGMM Baum algorithm and from the Kalman-filter innovation sequence are shown to be equal. A direct link between the Baum-Welch training algorithm and an existing expectation-maximization (EM) algorithm for Gaussian models is demonstrated. A new expression for the cross covariance between time-adjacent states in HGMMs is derived from the off-diagonal block of the conditional joint covariance matrix. A parameter invariance structure is noted for the HGMM likelihood function. CS-HMMs and HGMMs are extended to incorporate mixture densities for the a priori density of the initial state. Application of HGMMs to signal classification is demonstrated with a three-class test simulation.
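To illustrate the innovation-based likelihood computation that the abstract equates with the HGMM Baum forward pass, the following minimal sketch (plain NumPy) accumulates the measurement log-likelihood of a linear-Gaussian state-space model from the Kalman-filter innovation sequence. It assumes the standard model x_t = F x_{t-1} + w_t, y_t = H x_t + v_t with Gaussian noise covariances Q and R; the function name kalman_loglik and all matrix symbols are illustrative conventions, not notation taken from the paper.

```python
import numpy as np

def kalman_loglik(y, F, Q, H, R, x0, P0):
    """Log-likelihood of observations y under a linear-Gaussian state-space
    model, accumulated from the Kalman-filter innovation sequence.
    y: (T, m) observations; F, Q: state transition matrix and process-noise
    covariance; H, R: output matrix and measurement-noise covariance;
    x0, P0: mean and covariance of the a priori density of the initial state."""
    x, P = x0, P0
    loglik = 0.0
    for yt in y:
        # Time update (predict the next state from the transition model)
        x = F @ x
        P = F @ P @ F.T + Q
        # Innovation (one-step prediction error) and its covariance
        e = yt - H @ x
        S = H @ P @ H.T + R
        # Gaussian log-density of the innovation adds this step's contribution
        loglik += -0.5 * (len(yt) * np.log(2.0 * np.pi)
                          + np.linalg.slogdet(S)[1]
                          + e @ np.linalg.solve(S, e))
        # Measurement update (correct the prediction with the observation)
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ e
        P = (np.eye(len(x)) - K @ H) @ P
    return loglik
```

In a classification setting of the kind the abstract describes, one such model would be trained per class (e.g., via Baum-Welch/EM), and a test signal would be assigned to the class whose model yields the largest value of this log-likelihood.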
Keywords :
Gaussian processes; Kalman filters; covariance matrices; filtering theory; hidden Markov models; optimisation; signal classification; Baum algorithm; Baum-Welch algorithm; Baum-Welch training algorithm; CS-HMM algorithms; EM algorithm; Gaussian models; Gaussian output density; HGMM Baum algorithm; HGMM likelihood function; Kalman-filter innovation sequence; Viterbi algorithm; a priori density; conditional joint covariance matrix; continuous-state hidden Markov models; cross covariance; expectation-maximization algorithm; forward pass; hidden Gauss-Markov models; linear Gaussian state-transition; measurement likelihoods; mixture densities; parameter invariance structure; three-class test simulation; Covariance matrix; Data mining; Pattern classification; Signal processing algorithms; Technological innovation; Training data
Journal_Title :
IEEE Transactions on Signal Processing
DOI :
10.1109/TSP.2002.1003060