Approximate Nearest Neighbor Search to Support Manual Image Annotation of Large Domain-specific Datasets

Bastiaan J. Boom, Phoenix X. Huang, Robert B. Fisher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The annotation of large datasets of domain-specific images is both time-consuming and difficult. Yet computer vision and machine learning methods must deal with ever-increasing amounts of data, and annotation of this data is essential: annotated images allow such methods to learn the variation within large datasets and to be evaluated on them. This paper presents a method for annotating domain-specific (fish species) images that uses approximate nearest neighbor search to retrieve similar fish species from a large set of images (216,501). Approximate nearest neighbor search allows us to find a ranked set of similar images in large datasets, and presenting these similar images to users lets them annotate much more efficiently. Our user interface presents the images in such a way that users do not need knowledge of the specific domain to contribute to the annotation.
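The paper does not specify its approximate nearest neighbor implementation in this abstract, so the following is only an illustrative sketch of one common ANN technique, signed random-projection LSH: feature vectors are hashed into buckets, and a query is ranked against candidates from its own bucket rather than against the whole dataset. All names, dimensions, and parameters here are hypothetical.

```python
import random

random.seed(0)

DIM = 16     # toy feature dimension (a real system would use image descriptors)
N_BITS = 8   # hash length; more bits give finer, smaller buckets

# Random hyperplanes for signed random-projection hashing.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_BITS)]

def lsh_hash(vec):
    """Map a vector to a binary code: the sign of its dot product with each plane."""
    return tuple(int(sum(p * v for p, v in zip(plane, vec)) >= 0)
                 for plane in planes)

def build_index(vectors):
    """Bucket all vectors by their hash code."""
    index = {}
    for i, vec in enumerate(vectors):
        index.setdefault(lsh_hash(vec), []).append(i)
    return index

def query(index, vectors, q, k=5):
    """Return up to k candidate indices from the query's bucket,
    ranked by exact squared Euclidean distance to q."""
    candidates = list(index.get(lsh_hash(q), []))
    candidates.sort(key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(vectors[i], q)))
    return candidates[:k]
```

Because only one bucket is searched, the result is approximate: speed is traded for the chance of missing a true neighbor that hashed elsewhere. This is the trade-off that makes retrieval over hundreds of thousands of images fast enough to drive an interactive annotation interface.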
Original language: English
Title of host publication: Proceedings of the International Workshop on Video and Image Ground Truth in Computer Vision Applications
Place of publication: New York, NY, USA
Publisher: ACM
Pages: 4:1-4:8
Number of pages: 8
ISBN (Print): 978-1-4503-2169-3
Publication status: Published - 2013

Keywords

  • approximate nearest neighbor search, large-scale clustering of fish images, manual image annotation

