L1 Graph Based Sparse Model for Label De-noising

Xiaobin Chang, Tao Xiang, Timothy Hospedales

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

The abundant images and user-provided tags available on social media websites provide an intriguing opportunity to scale vision problems beyond the limits imposed by manual dataset collection and annotation. However, exploiting user-tagged data in practice is challenging since it contains many noisy (incorrect and missing) labels. In this work, we propose a novel robust graph-based approach to label de-noising. Specifically, the proposed model is built upon (i) label smoothing via a visual similarity graph in the form of an L1 graph regulariser, which is more robust against visual outliers than the conventional L2 regulariser, and (ii) explicit modelling of the label noise pattern, which further improves de-noising performance. An efficient algorithm is formulated to optimise the proposed model, whose objective function contains multiple robust L1 terms and is therefore non-trivial to optimise. We demonstrate our model’s superior de-noising performance across the spectrum of problems, from multi-class classification with label noise to real social media data with more complex multi-label structured label noise patterns.
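A minimal sketch of the kind of objective the abstract describes (the symbols are illustrative assumptions rather than the paper's exact notation: Y is the observed noisy label matrix, Z the recovered labels, E the explicitly modelled label noise, z_i the label vector of image i, W_ij the visual similarity graph weights, and \lambda, \gamma trade-off weights):

\min_{Z,\,E} \; \lVert Y - Z - E \rVert_F^2 \;+\; \lambda \sum_{i,j} W_{ij}\,\lVert z_i - z_j \rVert_1 \;+\; \gamma\,\lVert E \rVert_1

Under these assumptions, the L1 graph term encourages visually similar images to share labels while remaining robust to outlier neighbours, and the sparse term on E absorbs incorrect or missing labels instead of forcing them into the smoothed solution.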
Original language: English
Title of host publication: British Machine Vision Conference (BMVC 2016, Oral)
Publisher: BMVA Press
Pages: 74.1-74.12
Number of pages: 12
ISBN (Electronic): 1-901725-59-6
DOIs
Publication status: Published - 22 Sept 2016
Event: 27th British Machine Vision Conference - York, United Kingdom
Duration: 19 Sept 2016 - 22 Sept 2016
http://bmvc2016.cs.york.ac.uk/

Conference

Conference: 27th British Machine Vision Conference
Abbreviated title: BMVC 2016
Country/Territory: United Kingdom
City: York
Period: 19/09/16 - 22/09/16
Internet address: http://bmvc2016.cs.york.ac.uk/
