Title :
Prior-updating ensemble learning for discrete HMM
Author :
Heo, Gyeongyong ; Gader, Paul
Author_Institution :
Comput. & Inf. Sci. & Eng., Univ. of Florida, FL, USA
Abstract :
Ensemble learning is a variational Bayesian method in which an intractable distribution is approximated through a lower bound. It yields models with better generalization and is less likely to fall into local maxima than Baum-Welch learning. However, it does not fully exploit the statistics of the training data. In this paper, we propose a prior-updating variant that combines the data-driven property of Baum-Welch learning with the generalization property of ensemble learning. We first present experimental results suggesting that ensemble learning outperforms Baum-Welch learning in the respects mentioned above, and then introduce a prior-updating method based on the training data. In experiments with an artificial and a real data set, prior-updating ensemble learning performs better than both Baum-Welch learning and pure ensemble learning.
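The abstract describes combining the data-driven statistics of Baum-Welch learning with variational Bayesian (ensemble) learning by updating the priors from training data. The sketch below is a minimal illustration of that idea for a discrete HMM with Dirichlet priors, not the authors' exact algorithm: it runs standard variational-Bayes updates (hyperparameters = prior + expected counts from forward-backward) and then, as one hypothetical reading of "prior updating", re-seeds the Dirichlet hyperparameters from the data-driven expected counts before a second run. All function names, the strength parameter, and the two-stage scheme are assumptions made for illustration.

# Minimal VB ("ensemble learning") sketch for a discrete HMM with Dirichlet
# priors, plus a hypothetical data-driven prior update. Illustrative only.
import numpy as np
from scipy.special import digamma


def expected_log(w):
    # E[log theta] under a Dirichlet; each row of w parameterizes one distribution.
    return digamma(w) - digamma(w.sum(axis=1, keepdims=True))


def forward_backward(log_pi, log_A, log_B, obs):
    # Standard forward-backward in log space using (expected) log parameters.
    T, K = len(obs), log_pi.shape[0]
    log_alpha = np.zeros((T, K))
    log_alpha[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        log_alpha[t] = log_B[:, obs[t]] + np.logaddexp.reduce(
            log_alpha[t - 1][:, None] + log_A, axis=0)
    log_beta = np.zeros((T, K))
    for t in range(T - 2, -1, -1):
        log_beta[t] = np.logaddexp.reduce(
            log_A + log_B[:, obs[t + 1]] + log_beta[t + 1], axis=1)
    log_gamma = log_alpha + log_beta
    log_gamma -= np.logaddexp.reduce(log_gamma, axis=1, keepdims=True)
    gamma = np.exp(log_gamma)                      # state posteriors, T x K
    xi = np.zeros((K, K))                          # summed transition posteriors
    for t in range(T - 1):
        log_xi_t = (log_alpha[t][:, None] + log_A +
                    log_B[:, obs[t + 1]] + log_beta[t + 1])
        xi += np.exp(log_xi_t - np.logaddexp.reduce(log_xi_t, axis=None))
    return gamma, xi


def vb_hmm(obs, K, M, u_pi=1.0, u_A=1.0, u_B=1.0, n_iter=50, seed=0):
    # Ensemble (variational Bayes) learning: E-step using exp(E[log theta]),
    # then Dirichlet hyperparameter updates w = prior + expected counts.
    rng = np.random.default_rng(seed)
    w_pi = u_pi + rng.random(K)
    w_A = u_A + rng.random((K, K))
    w_B = u_B + rng.random((K, M))
    for _ in range(n_iter):
        log_pi = expected_log(w_pi[None, :])[0]
        log_A = expected_log(w_A)
        log_B = expected_log(w_B)
        gamma, xi = forward_backward(log_pi, log_A, log_B, obs)
        emit = np.zeros((K, M))
        for t, o in enumerate(obs):
            emit[:, o] += gamma[t]
        w_pi = u_pi + gamma[0]
        w_A = u_A + xi
        w_B = u_B + emit
    return (w_pi, w_A, w_B), (gamma, xi, emit)


def update_priors_from_counts(gamma, xi, emit, strength=1.0):
    # Hypothetical "prior update": turn Baum-Welch-style expected counts into
    # Dirichlet hyperparameters so the prior reflects training-data statistics.
    u_pi = 1.0 + strength * gamma[0]
    u_A = 1.0 + strength * xi / (xi.sum(axis=1, keepdims=True) + 1e-12)
    u_B = 1.0 + strength * emit / (emit.sum(axis=1, keepdims=True) + 1e-12)
    return u_pi, u_A, u_B


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    obs = rng.integers(0, 4, size=200)     # toy discrete observation sequence
    # Stage 1: plain ensemble learning with flat priors.
    _, (gamma, xi, emit) = vb_hmm(obs, K=3, M=4)
    # Stage 2: re-run with data-driven priors (one reading of "prior updating").
    u_pi, u_A, u_B = update_priors_from_counts(gamma, xi, emit)
    (w_pi, w_A, w_B), _ = vb_hmm(obs, K=3, M=4, u_pi=u_pi, u_A=u_A, u_B=u_B)

The two-stage driver at the bottom shows only one plausible way to use data-driven priors; the paper's actual update rule and evaluation are given in the full text.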
Keywords :
Bayes methods; hidden Markov models; learning (artificial intelligence); variational techniques; Baum-Welch learning; data driven property; discrete HMM; generalization; intractable distribution; local maxima; lower bound; prior updating method; prior updating variant; prior-updating ensemble learning; training data statistics; variational Bayesian method; Bayesian methods; Distributed computing; Hidden Markov models; Information science; Laplace equations; Maximum likelihood estimation; Sampling methods; Statistical distributions; Taylor series; Training data;
Conference_Title :
19th International Conference on Pattern Recognition (ICPR 2008)
Conference_Location :
Tampa, FL
Print_ISBN :
978-1-4244-2174-9
ISSN :
1051-4651
DOI :
10.1109/ICPR.2008.4761920