Title :
Exploring the parameter state space of stacking
Author :
Seewald, Alexander K.
Author_Institution :
Austrian Res. Inst. for Artificial Intelligence, Wien, Austria
Abstract :
Ensemble learning schemes are a new field in data mining. While current research concentrates mainly on improving the performance of single learning algorithms, an alternative is to combine learners with different biases. Stacking is the best-known such scheme; it combines learners' predictions or confidences via another learning algorithm. However, the adoption of stacking by the data mining community is hampered by its large parameter space, consisting mainly of other learning algorithms: (1) the set of learning algorithms to combine; (2) the meta-learner responsible for the combining; and (3) the type of meta-data to use, confidences or predictions. None of these parameters are obvious choices. Furthermore, little is known about the relation between stacking's parameter settings and its performance. By exploring all of stacking's parameter settings and their interdependencies, we attempt to make stacking a suitable choice for mainstream data mining applications.
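The three parameters named in the abstract map directly onto the interface of a generic stacking implementation. The sketch below is not the paper's own setup; it is a minimal illustration using scikit-learn's `StackingClassifier`, where the base learners, the meta-learner (`final_estimator`), and the meta-data type (`stack_method`, class confidences vs. crisp predictions) are the three choices the paper explores. The particular learners and dataset are arbitrary placeholders.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = StackingClassifier(
    # Parameter (1): the set of base learners with different biases.
    estimators=[
        ("rf", RandomForestClassifier(random_state=0)),
        ("nb", GaussianNB()),
    ],
    # Parameter (2): the meta-learner that combines the base-level outputs.
    final_estimator=LogisticRegression(max_iter=1000),
    # Parameter (3): meta-data type; "predict_proba" feeds the meta-learner
    # class confidences, "predict" would feed it crisp predictions instead.
    stack_method="predict_proba",
)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Internally, the base learners' outputs on cross-validated folds of the training data form the meta-level training set, so the meta-learner never sees outputs the base learners produced on their own training examples.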
Keywords :
data mining; learning (artificial intelligence); meta data; probability; state-space methods; confidences; learning algorithms; parameter state space; probability distribution; stacking; artificial intelligence; bagging; boosting; learning systems; linear regression; machine learning;
Conference_Titel :
Data Mining, 2002. ICDM 2002. Proceedings. 2002 IEEE International Conference on
Print_ISBN :
0-7695-1754-4
DOI :
10.1109/ICDM.2002.1184029