DocumentCode :
3563754
Title :
Tight lower bound of generalization error in ensemble learning
Author :
Uchida, Masato
Author_Institution :
Fac. of Eng., Chiba Inst. of Technol., Narashino, Japan
fYear :
2014
Firstpage :
1130
Lastpage :
1133
Abstract :
A machine learning method that integrates multiple component predictors into a single predictor is referred to as ensemble learning. In this paper, we derive a tight lower bound on the generalization error in ensemble learning through asymptotic analysis of a mathematical model that combines an exponential mixture of probability distributions with the Kullback-Leibler divergence. In addition, we derive an explicit expression for the weight parameter of the exponential mixture that minimizes this tight lower bound and discuss the properties of the derived weight parameter.
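For context, the exponential mixture and the error measure named in the abstract are commonly written in the following standard form; this is a sketch of the usual notation, and the paper's exact definitions and symbols may differ:

\[
  p_{\mathbf{w}}(x) \;=\; \frac{1}{Z(\mathbf{w})} \prod_{k=1}^{K} p_k(x)^{w_k},
  \qquad
  Z(\mathbf{w}) \;=\; \int \prod_{k=1}^{K} p_k(x)^{w_k}\,dx,
  \qquad
  w_k \ge 0,\;\; \sum_{k=1}^{K} w_k = 1,
\]

where $p_1, \dots, p_K$ are the component predictors' distributions and $Z(\mathbf{w})$ is the normalizing constant. The generalization error is then measured by the Kullback-Leibler divergence from the true distribution $q$ to the mixture,

\[
  D\bigl(q \,\|\, p_{\mathbf{w}}\bigr) \;=\; \int q(x) \log \frac{q(x)}{p_{\mathbf{w}}(x)}\,dx,
\]

and the weight parameter discussed in the paper is the $\mathbf{w}$ that minimizes the derived lower bound on this quantity.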
Keywords :
exponential distribution; learning (artificial intelligence); Kullback-Leibler divergence; asymptotic analysis; ensemble learning; exponential mixture; generalization error; machine learning method; mathematical model; probability distribution; tight lower bound; Analytical models; Boosting; Mathematical model; Probability density function; Probability distribution; Upper bound; asymptotic analysis; ensemble learning; exponential mixture model; generalization error; parameter estimation;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2014 Joint 7th International Conference on Soft Computing and Intelligent Systems (SCIS) and 15th International Symposium on Advanced Intelligent Systems (ISIS)
Type :
conf
DOI :
10.1109/SCIS-ISIS.2014.7044723
Filename :
7044723