Author :
Johansson, Ulf ; Löfström, Tuve ; Boström, Henrik
Author_Institution :
Sch. of Bus. & IT, Univ. of Borås, Borås, Sweden
Abstract :
In this paper, we introduce and evaluate a novel method, called random brains, for producing neural network ensembles. The suggested method, which is heavily inspired by the random forest technique, produces diversity implicitly by combining bootstrap training with randomized architectures. More specifically, for each base classifier (a multilayer perceptron), a number of randomly selected links between the input layer and the hidden layer are removed prior to training, resulting in potentially weaker but more diverse base classifiers. Experimental results on 20 UCI data sets show that random brains obtained significantly higher accuracy and AUC than standard bagging of similar neural networks without randomized architectures. The analysis shows that the main reason for the improved ensemble performance is the ability to produce effective diversity, as indicated by the increase in the difficulty diversity measure.
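As an illustration of the approach described in the abstract, the following is a minimal Python sketch (not the authors' implementation) of the random-brains idea: each ensemble member is a single-hidden-layer MLP trained on a bootstrap sample, with a randomly chosen fraction of input-to-hidden links removed (masked to zero) before and throughout training. The class and function names, the network size, and the 30% link-removal rate are illustrative assumptions.

# Sketch of the random-brains idea: bagged MLPs with randomly removed
# input-to-hidden links. Names, sizes and removal rate are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MaskedMLP:
    """One-hidden-layer MLP whose input-to-hidden weights are element-wise
    masked, so the removed links stay at zero throughout training."""

    def __init__(self, n_in, n_hidden, removal_rate=0.3):
        self.W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
        self.b2 = np.zeros(1)
        # Randomly remove a fraction of the input-to-hidden links.
        self.mask = (rng.random((n_in, n_hidden)) >= removal_rate).astype(float)

    def forward(self, X):
        h = sigmoid(X @ (self.W1 * self.mask) + self.b1)
        return sigmoid(h @ self.W2 + self.b2), h

    def fit(self, X, y, epochs=200, lr=0.5):
        y = y.reshape(-1, 1)
        for _ in range(epochs):
            out, h = self.forward(X)
            # Backpropagation for cross-entropy loss with sigmoid output.
            d_out = out - y
            dW2 = h.T @ d_out / len(X)
            d_h = (d_out @ self.W2.T) * h * (1 - h)
            dW1 = (X.T @ d_h / len(X)) * self.mask  # keep removed links at zero
            self.W2 -= lr * dW2
            self.b2 -= lr * d_out.mean(axis=0)
            self.W1 -= lr * dW1
            self.b1 -= lr * d_h.mean(axis=0)
        return self

    def predict_proba(self, X):
        return self.forward(X)[0].ravel()

def random_brains(X, y, n_members=15, n_hidden=10, removal_rate=0.3):
    """Train an ensemble of masked MLPs, each on a bootstrap sample."""
    members = []
    n = len(X)
    for _ in range(n_members):
        idx = rng.integers(0, n, size=n)  # bootstrap sample with replacement
        net = MaskedMLP(X.shape[1], n_hidden, removal_rate)
        members.append(net.fit(X[idx], y[idx]))
    return members

def ensemble_predict(members, X):
    # Average member probabilities; threshold at 0.5 for the class label.
    p = np.mean([m.predict_proba(X) for m in members], axis=0)
    return (p >= 0.5).astype(int), p

In this sketch the masked input-to-hidden links are the only architectural difference from standard bagging of identical MLPs; averaging the member probabilities yields the ensemble prediction.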
Keywords :
learning (artificial intelligence); multilayer perceptrons; pattern classification; UCI data sets; base classifier multilayer perceptron; bootstrap training; diversity measure; diversity production; machine learning; neural network ensembles; random brains; random forest technique; randomized architectures; Accuracy; Artificial neural networks; Bagging; Diversity methods; Diversity reception; Standards; Training;
Conference_Title :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6707026