Title :
Effect of sharing training patterns on the performance of classifier ensemble
Author :
Dara, Rozita A. ; Kamel, Mohamed
Author_Institution :
Pattern Anal. & Machine Intelligence Lab., Waterloo Univ., Ont., Canada
Abstract :
The availability of large and complex data sets has shifted the focus of pattern recognition toward developing techniques that can handle such data efficiently. Multiple classifier systems promise to reduce error and complexity by partitioning the data space and combining classifier predictions. However, difficulties arise in generating these partitions and using them effectively. In this paper, several partitioning methods are studied and compared by examining implicit and explicit sharing of training patterns among multiple classifiers. We implemented several partitioning techniques, using both random selection and a more informed approach, clustering, to gain insight into the effect of shared versus disjoint data representation across the training subsets. Improved classification accuracy suggests that implicit sharing of training patterns is always beneficial, while explicit sharing is useful for small training sets.
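The distinction the abstract draws between disjoint and explicitly shared training subsets can be illustrated with a minimal sketch. The function below (a hypothetical helper, not the paper's actual scheme) randomly splits sample indices into k disjoint subsets, then optionally lets each subset draw a fraction of the other subsets' patterns, modeling explicit sharing:

```python
import random

def partition_indices(n, k, share_frac=0.0, seed=0):
    """Split n sample indices into k training subsets for an ensemble.

    share_frac = 0.0 yields disjoint partitions; a positive value makes
    each subset additionally sample that fraction of the other subsets'
    patterns, so training patterns are explicitly shared.
    Illustrative sketch only; the paper's exact partitioning (e.g. the
    clustering-based variant) is not reproduced here.
    """
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    # disjoint base partitions via round-robin over the shuffled indices
    base = [idx[i::k] for i in range(k)]
    if share_frac <= 0:
        return base
    shared = []
    for i in range(k):
        # pool of patterns belonging to the other k-1 subsets
        others = [j for p in base[:i] + base[i + 1:] for j in p]
        extra = rng.sample(others, int(share_frac * len(others)))
        shared.append(sorted(base[i] + extra))
    return shared

disjoint = partition_indices(100, 4)          # no overlap between subsets
overlapping = partition_indices(100, 4, 0.1)  # each subset borrows ~10%
```

Each subset would then train one base classifier, with predictions combined by, e.g., majority voting; varying `share_frac` is one way to probe how much overlap between training subsets helps the ensemble.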
Keywords :
learning (artificial intelligence); pattern classification; classifier ensemble performance; classifier predictions; data representation; multiple classifier systems; partitioning techniques; pattern recognition; training pattern sharing; training patterns; Boosting; Diversity methods; Machine intelligence; Pattern analysis; Pattern recognition; Training data; Voting;
Conference_Titel :
2004 IEEE International Conference on Systems, Man and Cybernetics
Print_ISBN :
0-7803-8566-7
DOI :
10.1109/ICSMC.2004.1399791