DocumentCode
424088
Title
An improved EM algorithm for Bayesian networks parameter learning
Author
Zhang, Shao-Zhang ; Zhang, Zeng-Nian ; Yang, Nan-Hai ; Zhang, Jian-Ying ; Wang, Xiu-Kun
Author_Institution
Inst. of Electron. & Inf., Zhejiang Wanli Univ., Ningbo, China
Volume
3
fYear
2004
fDate
26-29 Aug. 2004
Firstpage
1503
Abstract
The automated creation of Bayesian networks can be separated into two tasks: structure learning, which constructs the structure of the Bayesian network from the collected data, and parameter learning, which computes the numerical parameters for a given structure. The EM algorithm is a standard method for parameter learning from incomplete data, but the traditional EM algorithm has several shortcomings: it cannot handle large data sets, its convergence is slow, and it easily becomes trapped in local maxima. This paper improves the E step and the M step respectively. It divides a large data set into several small blocks and carries out the optimization within these blocks, and an improved simulated annealing algorithm is used in the E step. Policies for dynamic iteration, for the starting temperature of the simulated annealing, and for the termination condition are proposed, and the Cauchy distribution is adopted to generate adjacent candidate values. Experimental results indicate that the improved EM algorithm proposed in this paper outperforms the standard EM algorithm.
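For orientation only, the following is a minimal Python sketch of the general kind of procedure the abstract describes, not the authors' implementation: EM parameter learning for a toy two-node network A -> B with missing values of A, where the data is processed in small blocks and the E step applies a simulated-annealing-style perturbation with Cauchy-distributed candidate steps. The toy network, block size, cooling schedule, and all names are assumptions introduced here.

# Illustrative sketch only; network, block size, and cooling schedule are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: true P(A=1)=0.7, P(B=1|A=0)=0.2, P(B=1|A=1)=0.9; 30% of A missing.
N = 2000
a_true = (rng.random(N) < 0.7).astype(int)
b = (rng.random(N) < np.where(a_true == 1, 0.9, 0.2)).astype(int)
a_obs = np.where(rng.random(N) < 0.3, -1, a_true)            # -1 marks a missing value
data = np.column_stack([a_obs, b])

def e_step(block, p_a, p_b_given_a, temperature):
    """Expected counts for one data block; a missing A is filled in expectation and
    then perturbed with a Cauchy step accepted by a Metropolis rule."""
    counts_a = np.zeros(2)
    counts_ab = np.zeros((2, 2))
    for a, bv in block:
        if a >= 0:
            post = np.eye(2)[a]                               # A observed
        else:
            post = p_a * p_b_given_a[:, bv]                   # posterior over missing A
            post /= post.sum()
            score = lambda q: q @ np.log(p_a * p_b_given_a[:, bv])
            cand = np.clip(post + temperature * rng.standard_cauchy(2), 1e-6, None)
            cand /= cand.sum()                                # Cauchy-perturbed candidate
            if rng.random() < np.exp(min(0.0, (score(cand) - score(post)) / temperature)):
                post = cand                                   # Metropolis acceptance
        counts_a += post
        counts_ab[:, bv] += post
    return counts_a, counts_ab

def m_step(counts_a, counts_ab):
    counts_a = counts_a + 1e-3                                # tiny pseudo-count for safety
    counts_ab = counts_ab + 1e-3
    return counts_a / counts_a.sum(), counts_ab / counts_ab.sum(axis=1, keepdims=True)

p_a = np.array([0.5, 0.5])
p_b_given_a = np.full((2, 2), 0.5)
temperature = 1.0
for it in range(20):
    for block in np.array_split(data, 20):                    # optimize block by block
        ca, cab = e_step(block, p_a, p_b_given_a, temperature)
        p_a, p_b_given_a = m_step(ca, cab)
    temperature *= 0.8                                         # simple cooling schedule
print("P(A=1) ~", round(p_a[1], 3), " P(B=1|A) ~", np.round(p_b_given_a[:, 1], 3))

A practical block-wise scheme would smooth the sufficient statistics across blocks (as in stepwise/online EM) rather than re-estimating from each block alone; the sketch keeps that part deliberately simple.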
Keywords
belief networks; iterative methods; learning (artificial intelligence); simulated annealing; Bayesian networks; EM algorithm; dynamic iteration; parameter learning; simulated annealing algorithm; structure learning; Bayesian methods; Computer science; Convergence; Data mining; Electronic mail; Equations; Graphical models; Machine learning; Simulated annealing; Temperature;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of 2004 International Conference on Machine Learning and Cybernetics
Print_ISBN
0-7803-8403-2
Type
conf
DOI
10.1109/ICMLC.2004.1382011
Filename
1382011
Link To Document