Title :
Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence
Author_Institution :
Fac. of Eng., Chiba Inst. of Technol., Narashino, Japan
Abstract :
When multiple component predictors are available, it is promising to integrate them into a single predictor for more advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a principled way to integrate them. However, the weight parameters of the exponential mixture model are difficult to estimate when no data are available for performance evaluation. As a suboptimal solution to this problem, the weight parameters may be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distances from/to all component probability distributions. In this paper, we propose a weight parameter estimation method that formalizes this concept using the symmetric Kullback-Leibler divergence, and we discuss the properties of this method.
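Illustration :
The following is a minimal numeric sketch of the balance-point idea described above, not the paper's exact formulation. It assumes three hypothetical Gaussian components discretized on a grid, forms the exponential (geometric) mixture q proportional to the product of p_i raised to w_i, and searches for weights under which the symmetric Kullback-Leibler divergences between q and all components become equal. The softmax parametrization of the weights and the variance-of-divergences equilibrium criterion are assumptions made for this sketch.

import numpy as np
from scipy.optimize import minimize

# Discrete grid and three hypothetical component distributions
# (Gaussian densities evaluated on the grid, then normalized).
x = np.linspace(-10.0, 10.0, 2001)

def gauss(mu, sigma):
    p = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return p / p.sum()

components = [gauss(-2.0, 1.0), gauss(0.5, 1.5), gauss(3.0, 0.8)]

def exp_mixture(weights):
    # Exponential mixture: q(x) proportional to prod_i p_i(x)^{w_i}.
    log_q = sum(w * np.log(p) for w, p in zip(weights, components))
    q = np.exp(log_q - log_q.max())  # subtract max for numerical stability
    return q / q.sum()

def sym_kl(p, q):
    # Symmetric Kullback-Leibler divergence: D(p||q) + D(q||p).
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

def imbalance(z):
    # Softmax keeps the weights positive and summing to one.
    w = np.exp(z - z.max())
    w /= w.sum()
    q = exp_mixture(w)
    d = np.array([sym_kl(p, q) for p in components])
    # Equilibrium (balance point): all symmetric divergences are
    # equal, i.e. their variance is zero.
    return np.var(d)

res = minimize(imbalance, np.zeros(len(components)), method="Nelder-Mead")
w = np.exp(res.x - res.x.max())
w /= w.sum()
print("estimated weights:", np.round(w, 4))

At the optimum the three symmetric divergences approximately coincide, which is the equilibrium condition the abstract describes; components that lie farther from the others receive correspondingly different weights.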
Keywords :
exponential distribution; inference mechanisms; parameter estimation; advanced reasoning; component probability distributions; exponential mixture model; exponential mixture distribution; performance evaluation; probability distribution; stochastic model; symmetric Kullback-Leibler divergence; unsupervised weight parameter estimation; Boosting; Mathematical model; Predictive models; ensemble learning
Conference_Title :
2014 Joint 7th International Conference on Soft Computing and Intelligent Systems (SCIS) and 15th International Symposium on Advanced Intelligent Systems (ISIS)
DOI :
10.1109/SCIS-ISIS.2014.7044722