Title :
An empirical evaluation of bagging and boosting for artificial neural networks
Author :
Opitz, David W. ; Maclin, Richard E.
Author_Institution :
Dept. of Comput. Sci., Montana Univ., Missoula, MT, USA
Abstract :
Bagging and boosting are two relatively new but popular methods for producing classifier ensembles. An ensemble consists of a set of independently trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying instances. Previous research suggests that an ensemble as a whole is often more accurate than any single classifier in the ensemble. In this paper we evaluate bagging and boosting as methods for creating an ensemble of neural networks. We also include results from Quinlan's (1996) decision tree evaluation of these methods. Our results indicate that the ensemble methods can indeed produce very accurate classifiers for some datasets, but that these gains may depend on aspects of the dataset. In particular, we find that bagging is probably appropriate for most problems, but that, when properly applied, boosting may produce even larger gains in accuracy.
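As a minimal sketch of the bagging procedure the abstract describes (each network trained independently on a bootstrap resample of the training set, predictions combined by majority vote), the following Python example may help; the dataset, network architecture, and ensemble size are illustrative assumptions, not the authors' experimental setup, and the paper's boosting experiments (e.g. AdaBoost-style reweighting) are not reproduced here.

```python
# Sketch of bagging an ensemble of small neural networks.
# Assumptions: scikit-learn MLPs, a built-in binary dataset,
# ensemble size 10 -- none of these come from the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
nets = []
for i in range(10):  # 10 independently trained networks
    # Bootstrap resample: draw |training set| examples with replacement.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                        random_state=i).fit(X_tr[idx], y_tr[idx])
    nets.append(net)

# Combine predictions by majority vote (labels are 0/1).
votes = np.stack([net.predict(X_te) for net in nets])
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)

print(f"single net:      {nets[0].score(X_te, y_te):.3f}")
print(f"bagged ensemble: {(ensemble_pred == y_te).mean():.3f}")
```

On many datasets the voted ensemble matches or beats the single network, which is the pattern the paper evaluates empirically; boosting differs in that later classifiers are trained on examples reweighted toward earlier mistakes rather than on uniform bootstrap samples.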
Keywords :
neural nets; pattern classification; probability; bagging; boosting; decision tree; neural classifier; neural networks; Artificial neural networks; Classification tree analysis; Computer science; Decision trees; Testing
Conference_Titel :
International Conference on Neural Networks, 1997
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.613999