Revisiting Example Dependent Cost-Sensitive Learning with Decision Trees

Oisin Mac Aodha, Gabriel J. Brostow

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Typical approaches to classification treat class labels as disjoint. For each training example, it is assumed that there is only one class label that correctly describes it, and that all other labels are equally bad. We know, however, that good and bad labels are too simplistic in many scenarios, hurting accuracy. In the realm of example dependent cost-sensitive learning, each label is instead a vector representing a data point's affinity for each of the classes. At test time, our goal is not to minimize the misclassification rate, but to maximize that affinity. We propose a novel example dependent cost-sensitive impurity measure for decision trees. Our experiments show that this new impurity measure improves test performance while still retaining the fast test times of standard classification trees. We compare our approach to classification trees and other cost-sensitive methods on three computer vision problems: tracking, descriptor matching, and optical flow, and show improvements in all three domains.
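The abstract does not spell out the impurity measure itself, so the following is only a hypothetical sketch of what an example-dependent cost-sensitive split criterion can look like, not the paper's actual formulation. Each training example carries an affinity vector over the classes (rather than a single hard label), and a node's impurity is taken here as the affinity lost by committing the whole node to its single best class; the function names and the specific impurity are assumptions for illustration.

```python
import numpy as np

def affinity_impurity(affinities):
    """Illustrative example-dependent impurity (not the paper's measure).

    affinities: (n_samples, n_classes) array, each row a data point's
    affinity for each class. A node that predicts one class for all its
    samples loses whatever affinity those samples had for other classes,
    so impurity = 1 - (best achievable mean affinity at this node).
    """
    mean_affinity = affinities.mean(axis=0)   # average affinity per class
    return 1.0 - mean_affinity.max()          # 0 when one class fits all

def split_gain(affinities, left_mask):
    """Impurity decrease of a candidate binary split.

    left_mask: boolean array selecting the samples sent to the left child.
    Returns parent impurity minus the sample-weighted child impurity;
    this is non-negative for the concave impurity above.
    """
    n = len(affinities)
    left, right = affinities[left_mask], affinities[~left_mask]
    weighted_child = (len(left) / n) * affinity_impurity(left) \
                   + (len(right) / n) * affinity_impurity(right)
    return affinity_impurity(affinities) - weighted_child
```

For example, a node mixing samples with high affinity for class 0 and samples with high affinity for class 1 has high impurity, and a split that separates the two groups yields a positive gain, so a greedy tree builder would prefer it over splits that leave the affinities mixed.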
Original language: English
Title of host publication: 2013 IEEE International Conference on Computer Vision
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 193-200
Number of pages: 8
ISBN (Electronic): 978-1-4799-2840-8
DOIs
Publication status: Published - 3 Mar 2014
Event: 2013 IEEE International Conference on Computer Vision - Sydney, Australia
Duration: 1 Dec 2013 - 8 Dec 2013
http://www.pamitc.org/iccv13/index.php

Publication series

Name
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN (Print): 1550-5499
ISSN (Electronic): 2380-7504

Conference

Conference: 2013 IEEE International Conference on Computer Vision
Abbreviated title: ICCV 2013
Country/Territory: Australia
City: Sydney
Period: 1/12/13 - 8/12/13
Internet address

Keywords

  • computer vision
  • decision trees
  • image classification
  • learning (artificial intelligence)
  • example dependent cost-sensitive learning
  • data point affinity
  • example dependent cost-sensitive impurity measure
  • standard classification trees
  • computer vision problems
  • tracking
  • descriptor matching
  • optical flow
