DocumentCode :
2664606
Title :
Combining Classifiers in a Tree Structure
Author :
Woloszynski, Tomasz ; Kurzynski, Marek
Author_Institution :
Dept. of Syst. & Comput. Networks, Wroclaw Univ. of Technol., Wroclaw, Poland
fYear :
2008
fDate :
10-12 Dec. 2008
Firstpage :
785
Lastpage :
790
Abstract :
This paper presents a method for combining classifiers in a tree structure, where each node of the tree contains a single hypothesis trained on a respective region of the feature space. All base classifiers are then combined using a weighted average. Majority vote and Newton-Raphson numerical optimization are used for fitting the coefficients in the additive model. Two loss functions (quadratic and boosting-like exponential) as well as new splitting criteria for inducing the tree are examined within the proposed framework. The idea of combining classifiers in a tree structure is then compared with other iteratively built classifiers: AdaBoost.MH and MART (multiple additive regression trees). The experiments were conducted using well-known databases from the UCI Repository and the ELENA project.
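The following is a minimal sketch (not the authors' implementation) of the general idea described in the abstract: a tree is grown over regions of the feature space, one base classifier is trained per node, and all node classifiers are combined by a weighted average of their class probabilities. The split rule (median of the highest-variance feature), the geometric decay of weights with depth, and the logistic-regression base learners are illustrative assumptions; the paper itself fits the combination coefficients with majority vote and Newton-Raphson optimization, which is omitted here.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


class TreeOfClassifiers:
    """Recursively split the feature space, train one base classifier per node,
    and combine all node classifiers by a weighted average of class probabilities."""

    def __init__(self, max_depth=3, decay=0.5):
        self.max_depth = max_depth
        self.decay = decay        # illustrative: node weight shrinks geometrically with depth
        self.members = []         # list of (classifier, weight) pairs, one per tree node
        self.classes_ = None

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self._grow(X, y, depth=0, weight=1.0)
        return self

    def _grow(self, X, y, depth, weight):
        if len(np.unique(y)) < 2:
            return                # region is pure; no hypothesis needed here
        clf = LogisticRegression(max_iter=5000).fit(X, y)
        self.members.append((clf, weight))
        if depth >= self.max_depth:
            return
        # Illustrative split criterion: median threshold on the highest-variance feature.
        feature = int(np.argmax(X.var(axis=0)))
        threshold = np.median(X[:, feature])
        mask = X[:, feature] <= threshold
        if 0 < mask.sum() < len(y):   # only split if both child regions are non-empty
            self._grow(X[mask], y[mask], depth + 1, weight * self.decay)
            self._grow(X[~mask], y[~mask], depth + 1, weight * self.decay)

    def predict(self, X):
        # Weighted average of the probability estimates of every node classifier.
        total = np.zeros((X.shape[0], len(self.classes_)))
        for clf, w in self.members:
            p = clf.predict_proba(X)
            # Align each classifier's class order with the full class set.
            for i, c in enumerate(clf.classes_):
                total[:, np.searchsorted(self.classes_, c)] += w * p[:, i]
        return self.classes_[np.argmax(total, axis=1)]


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = TreeOfClassifiers(max_depth=3).fit(X_tr, y_tr)
    print("test accuracy:", accuracy_score(y_te, model.predict(X_te)))
```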
Keywords :
Newton-Raphson method; decision making; pattern classification; regression analysis; tree data structures; AdaBoost.MH; MART; Newton-Raphson numerical optimization; classification problems; classifier combining; coefficient fitting; iteratively built classifiers; loss functions; majority vote; multiple additive regression trees; real-life decision making processes; tree structure; weighted average; Additives; Boosting; Classification tree analysis; Computer networks; Databases; Pattern recognition; Regression tree analysis; Space technology; Tree data structures; Voting; Combining Classifiers
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Computational Intelligence for Modelling Control & Automation, 2008 International Conference on
Conference_Location :
Vienna
Print_ISBN :
978-0-7695-3514-2
Type :
conf
DOI :
10.1109/CIMCA.2008.22
Filename :
5172725