DocumentCode :
441951
Title :
A variational EM algorithm for large databases
Author :
Huang, Hao ; Bi, Le-Peng ; Song, Han-tao ; Lu, Yu-Chang
Author_Institution :
Dept. of Comput. Sci., Beijing Inst. of Technol., China
Volume :
5
fYear :
2005
fDate :
18-21 Aug. 2005
Firstpage :
3048
Abstract :
The EM algorithm is one of the most popular statistical learning algorithms. It is a method for parameter estimation in a variety of problems involving missing data. However, it is a batch learning method and often requires significant computational resources, so more elaborate methods are needed to handle databases with a large number of records or high dimensionality. In this paper, we present an algorithm that significantly reduces the computational burden. The algorithm is based on partial E-steps and retains the standard convergence guarantee of EM. It is a version of the incremental EM algorithm that cycles through the data cases in blocks. Through its application to large databases, we confirm that the algorithm reduces computational costs considerably.
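The abstract only sketches the idea of partial E-steps over blocks of records. The following is a minimal illustrative sketch of block-wise incremental EM for a univariate Gaussian mixture, assuming the common formulation in which each partial E-step refreshes the responsibilities of one block while the sufficient statistics of the other blocks are reused; the function names, the two-component mixture, and the block layout are illustrative assumptions, not the paper's implementation.

import numpy as np

def gaussian_pdf(x, mean, var):
    # Univariate normal density, evaluated elementwise.
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def incremental_em(x, n_components=2, n_blocks=10, n_passes=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise mixture weights, means, and variances.
    weights = np.full(n_components, 1.0 / n_components)
    means = rng.choice(x, n_components, replace=False)
    variances = np.full(n_components, np.var(x))

    blocks = np.array_split(np.arange(n), n_blocks)
    # Per-block sufficient statistics: responsibility sums, weighted sums,
    # and weighted sums of squares (one row per block, one column per component).
    S0 = np.zeros((n_blocks, n_components))
    S1 = np.zeros((n_blocks, n_components))
    S2 = np.zeros((n_blocks, n_components))

    for _ in range(n_passes):
        for b, idx in enumerate(blocks):
            xb = x[idx][:, None]  # shape (block_size, 1)
            # Partial E-step: recompute responsibilities for this block only.
            dens = weights * gaussian_pdf(xb, means, variances)
            resp = dens / dens.sum(axis=1, keepdims=True)
            # Replace this block's contribution to the global statistics.
            S0[b] = resp.sum(axis=0)
            S1[b] = (resp * xb).sum(axis=0)
            S2[b] = (resp * xb ** 2).sum(axis=0)
            # M-step from the accumulated statistics of all blocks seen so far.
            N_k = S0.sum(axis=0)
            weights = N_k / N_k.sum()
            means = S1.sum(axis=0) / N_k
            variances = S2.sum(axis=0) / N_k - means ** 2
    return weights, means, variances

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
    print(incremental_em(data))

Because only one block's responsibilities are recomputed per update, the cost of a parameter update is proportional to the block size rather than to the full database, which is the source of the savings the abstract claims for large record counts.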
Keywords :
expectation-maximisation algorithm; learning (artificial intelligence); statistical analysis; variational techniques; very large databases; EM algorithm; batch learning method; expectation maximization; incremental EM; large databases; parameter estimation; partial E-steps; statistical learning; Acceleration; Computational efficiency; Computer science; Convergence; Databases; Hidden Markov models; Iterative algorithms; Machine learning algorithms; Maximum likelihood estimation; Parameter estimation; EM algorithm; ILEM; incremental EM; lazy EM;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Machine Learning and Cybernetics, 2005. Proceedings of 2005 International Conference on
Conference_Location :
Guangzhou, China
Print_ISBN :
0-7803-9091-1
Type :
conf
DOI :
10.1109/ICMLC.2005.1527465
Filename :
1527465