DocumentCode :
3420577
Title :
Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees
Author :
Mac Aodha, Oisin ; Brostow, Gabriel J.
Author_Institution :
Univ. Coll. London, London, UK
fYear :
2013
fDate :
1-8 Dec. 2013
Firstpage :
193
Lastpage :
200
Abstract :
Typical approaches to classification treat class labels as disjoint. For each training example, it is assumed that there is only one class label that correctly describes it and that all other labels are equally bad. We know, however, that such a binary view of good and bad labels is too simplistic in many scenarios and hurts accuracy. In example dependent cost-sensitive learning, each label is instead a vector representing a data point's affinity for each of the classes. At test time, our goal is not to minimize the misclassification rate but to maximize that affinity. We propose a novel example dependent cost-sensitive impurity measure for decision trees. Our experiments show that this new impurity measure improves test performance while retaining the fast test times of standard classification trees. We compare our approach to classification trees and other cost-sensitive methods on three computer vision problems: tracking, descriptor matching, and optical flow, and show improvements in all three domains.
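Code_Sketch :
The abstract does not spell out the impurity measure's exact form. Below is a minimal Python sketch of one plausible example dependent cost-sensitive impurity over per-class affinity vectors; the formulation (impurity as the affinity lost by forcing all examples in a node to share one label) and all names are illustrative assumptions, not the paper's definition.

    import numpy as np

    def cost_sensitive_impurity(affinities):
        # affinities: (n_examples, n_classes) array; entry [i, c] says
        # how well class c describes example i (higher is better).
        # A leaf commits to a single label, so score the node by the
        # affinity lost when every example receives the node's best
        # shared label instead of its own best label. Zero impurity
        # means one class is optimal for every example in the node.
        node_label = affinities.sum(axis=0).argmax()
        achievable = affinities.max(axis=1).sum()
        achieved = affinities[:, node_label].sum()
        return float(achievable - achieved)

    def split_gain(parent, left, right):
        # Impurity decrease of a candidate split; a tree grower would
        # pick the split maximizing this, as with Gini or entropy.
        return cost_sensitive_impurity(parent) - (
            cost_sensitive_impurity(left) + cost_sensitive_impurity(right))

    # Example: two examples prefer class 0, one prefers class 1.
    A = np.array([[0.9, 0.1],
                  [0.8, 0.2],
                  [0.2, 0.8]])
    print(cost_sensitive_impurity(A))   # 0.6 (third example loses 0.6 affinity)
    print(split_gain(A, A[:2], A[2:]))  # 0.6 (this split is affinity-pure)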
Keywords :
computer vision; decision trees; image classification; learning (artificial intelligence); computer vision problems; data point affinity; decision trees; descriptor matching; example dependent cost-sensitive impurity measure; example dependent cost-sensitive learning; optical flow; standard classification trees; tracking; Decision trees; Impurities; Standards; Tracking; Training; Vectors; Vegetation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 IEEE International Conference on Computer Vision (ICCV)
Conference_Location :
Sydney, NSW, Australia
ISSN :
1550-5499
Type :
conf
DOI :
10.1109/ICCV.2013.31
Filename :
6751133