DocumentCode
423679
Title
Sharing training patterns in neural network ensembles
Author
Dara, Rozita A.; Kamel, Mohamed
Author_Institution
Pattern Analysis & Machine Intelligence Lab., University of Waterloo, Ont., Canada
Volume
2
fYear
2004
fDate
25-29 July 2004
Firstpage
1157
Abstract
The need to design complex and incremental training algorithms for multiple neural network systems has motivated us to study combining methods from the cooperation perspective. One way of achieving effective cooperation is through sharing resources such as information and components. The degree to which, and the method by which, multiple classifier systems share training resources can serve as a measure of cooperation. Despite growing interest in data modification techniques such as bagging and k-fold cross-validation, there is no guidance on whether sharing training patterns results in higher accuracy, or under what conditions. We implemented several partitioning techniques and examined the effect of sharing training patterns by varying the overlap between training subsets from 0% to 100% of the subset size. Under most of the conditions studied, multinet systems showed improved performance in the presence of larger overlap between subsets.
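The partitioning with controlled overlap described in the abstract can be made concrete with a short sketch. The following Python fragment is illustrative only; the function name overlapping_subsets and its shared-pool-plus-private-parts scheme are our own construction, not the authors' implementation. It builds one pool of patterns shared by every subset plus disjoint private parts, so that an overlap fraction of 0.0 yields disjoint training subsets and 1.0 yields identical ones.

import numpy as np

def overlapping_subsets(n_patterns, n_classifiers, subset_size, overlap, seed=0):
    """Split pattern indices into n_classifiers subsets of size subset_size.

    overlap (0.0-1.0) is the fraction of each subset drawn from a pool
    shared by all classifiers; the remainder is private to each subset.
    Illustrative scheme, not the paper's exact partitioning procedure.
    """
    rng = np.random.default_rng(seed)
    indices = rng.permutation(n_patterns)

    n_shared = int(round(overlap * subset_size))   # patterns every subset sees
    n_private = subset_size - n_shared             # patterns unique to one subset

    shared = indices[:n_shared]
    private_pool = indices[n_shared:]

    if n_private * n_classifiers > len(private_pool):
        raise ValueError("not enough patterns for disjoint private parts")

    subsets = []
    for k in range(n_classifiers):
        private = private_pool[k * n_private:(k + 1) * n_private]
        subsets.append(np.concatenate([shared, private]))
    return subsets

# Example: 1000 patterns, 4 networks, subsets of 200 with 50% overlap.
parts = overlapping_subsets(1000, 4, 200, overlap=0.5)

Sweeping the overlap argument from 0.0 to 1.0 reproduces the 0-100% range of sharing examined in the study.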
Keywords
learning (artificial intelligence); neural nets; pattern classification; bagging technique; data modification techniques; k-fold cross-validation technique; multiple classifier systems; multiple neural networks; neural network ensembles; sharing resources; training pattern sharing; training subsets; Algorithm design and analysis; Bagging; Boosting; Diversity reception; Intelligent networks; Intelligent systems; Laboratories; Machine intelligence; Neural networks; Pattern analysis
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 2004 IEEE International Joint Conference on Neural Networks
ISSN
1098-7576
Print_ISBN
0-7803-8359-1
Type
conf
DOI
10.1109/IJCNN.2004.1380100
Filename
1380100