Title :
Large margin rectangle learning: an alternative way to learn interpretable and representative models
Author :
Kirmse, Matthias ; Petersohn, Uwe
Author_Institution :
Dept. of Comput. Sci., Dresden Univ. of Technol., Dresden, Germany
Abstract :
In this paper we propose a new hyperrectangle-based learning method called Large Margin Rectangle Learning (LMRL). The goal of LMRL is to combine the interpretability of decision trees and other rectangle-based learning models with the accuracy gain enabled by the large margin principle known from support vector machines. LMRL consists of two basic steps: a supervised clustering step that creates an initial rectangle-based generalization of the training data, and a rectangle growing step that increases both the representativeness of the rectangle configuration and the margins between individual rectangles. Our experiments show that, in contrast to nearest rectangle learning methods such as LearnRight, LMRL provides an interpretable and representative rectangle configuration while performing as well as or better than the decision tree and rule learners it was compared against. These results support our assumption that the presented combination of supervised clustering and the large margin principle is at least as well suited to producing interpretable decision boundaries as the greedy local heuristics used in most decision tree and rule learning algorithms today.
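The two steps named in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' LMRL algorithm: the one-bounding-box-per-class "clustering" and the symmetric margin-splitting growth rule are simplifying assumptions, and all function names are hypothetical.

```python
# Hedged sketch of the two LMRL steps from the abstract (assumptions, not
# the published algorithm): build per-class rectangles, then grow them so
# that the empty margin between neighboring rectangles is split equally.
import numpy as np


def initial_rectangles(X, y):
    """Step 1 (supervised clustering, heavily simplified): one axis-aligned
    bounding box (lo, hi) per class label."""
    rects = {}
    for label in np.unique(y):
        pts = X[y == label]
        rects[label] = (pts.min(axis=0), pts.max(axis=0))
    return rects


def rectangle_gap(r1, r2):
    """Largest per-axis separation between two axis-aligned rectangles;
    0.0 if they overlap (or touch) on every axis."""
    (lo1, hi1), (lo2, hi2) = r1, r2
    per_axis = np.maximum(lo2 - hi1, lo1 - hi2)  # positive where separated
    return max(float(per_axis.max()), 0.0)


def grow_rectangles(rects):
    """Step 2 (rectangle growing, heavily simplified): pad each rectangle
    by half its smallest gap to any other rectangle, so two neighbors
    meet in the middle and share the margin equally."""
    grown = {}
    labels = list(rects)
    for a in labels:
        lo_a, hi_a = rects[a]
        gaps = [rectangle_gap(rects[a], rects[b]) for b in labels if b != a]
        pad = (min(gaps) / 2.0) if gaps else 0.0
        grown[a] = (lo_a - pad, hi_a + pad)
    return grown
```

For example, with class 0 points in [0, 1]^2 and class 1 points in [3, 3]-[4, 4], the initial boxes are 2 apart, so each grows by 1 and they meet at the midline x = y = 2, mimicking a large-margin boundary between rectangles.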
Keywords :
decision trees; learning (artificial intelligence); pattern clustering; support vector machines; decision tree; hyperrectangle based learning method; interpretable models; large margin rectangle learning; representative models; rule learning; supervised clustering; support vector machines; Accuracy; Clustering algorithms; Clustering methods; Computational modeling; Decision trees; Support vector machines; Training;
Conference_Titel :
2011 International Conference of Soft Computing and Pattern Recognition (SoCPaR)
Conference_Location :
Dalian
Print_ISBN :
978-1-4577-1195-4
DOI :
10.1109/SoCPaR.2011.6089133