DocumentCode
401667
Title
Asymptotic normality of posterior in consistent Bayesian learning
Author
Zhen-Yu ; Lin, Shi-Min ; Lu, Yu-Chang
Author_Institution
Dept. of Comput. Sci., Guangxi Normal Univ., Guilin, China
Volume
3
fYear
2003
fDate
2-5 Nov. 2003
Firstpage
1400
Abstract
This paper studies the asymptotic normality of the posterior distribution in Bayesian learning from the viewpoint of computational learning theory. Three reduced regularity conditions, which are more convenient to apply than Walker's and Heyde's, are presented. A theorem is proved showing that, under these regularity conditions, the posterior distribution is not only consistent but also approximately normal. Since computation with the normal distribution is relatively simple, these results can serve as a theoretical foundation for further work, for instance, assigning prior distributions and simplifying computation in Bayesian learning.
Keywords
Bayes methods; computational complexity; learning (artificial intelligence); normal distribution; Bayesian learning; asymptotic normality; computational learning theory; machine learning; posterior distribution; Approximation algorithms; Bayesian methods; Computational efficiency; Computer networks; Computer science; Data mining; Distributed computing; Gaussian distribution; Machine learning; Probability
fLanguage
English
Publisher
IEEE
Conference_Titel
2003 International Conference on Machine Learning and Cybernetics
Print_ISBN
0-7803-8131-9
Type
conf
DOI
10.1109/ICMLC.2003.1259711
Filename
1259711