Edinburgh Research Explorer

Approximate Nearest Neighbor Search to Support Manual Image Annotation of Large Domain-specific Datasets

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

Open Access permissions: Open

Documents

http://doi.acm.org/10.1145/2501105.2501112
Original language: English
Title of host publication: Proceedings of the International Workshop on Video and Image Ground Truth in Computer Vision Applications
Place of Publication: New York, NY, USA
Publisher: ACM
Pages: 4:1-4:8
Number of pages: 8
ISBN (Print): 978-1-4503-2169-3
DOIs: 10.1145/2501105.2501112
Publication status: Published - 2013

Abstract

The annotation of large datasets containing domain-specific images is both time-consuming and difficult. However, computer vision and machine learning methods now have to deal with ever-increasing amounts of data, and annotation of this data is essential: annotated images allow such methods to learn the variation present in large datasets and provide a basis for evaluating them. This paper presents a method for annotating domain-specific (fish species) images that uses approximate nearest neighbor search to retrieve similar fish species from a large set of 216,501 images. The approximate nearest neighbor search returns a ranked set of similar images from the large dataset, and presenting these similar images to users allows them to annotate much more efficiently. Our user interface presents the images in such a way that users do not need knowledge of the specific domain to contribute to the annotation of images.
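
The abstract describes retrieving a ranked set of visually similar images from the 216,501-image collection via approximate nearest neighbor search. The following is a minimal sketch of that retrieval step, assuming per-image feature descriptors and the Annoy library; the descriptor dimensionality, the library choice, and the random placeholder data are illustrative assumptions, not the feature extraction or indexing pipeline used in the paper.

```python
# Sketch: build an approximate nearest neighbor index over image descriptors
# and return a ranked list of candidate matches for a query image.
# The Annoy library, the 256-dimensional descriptors, and the random data
# below are assumptions for illustration only.
import numpy as np
from annoy import AnnoyIndex

DIM = 256          # assumed length of the per-image feature descriptor
N_IMAGES = 10_000  # stand-in for the full 216,501-image collection
N_TREES = 50       # more trees -> better recall, larger index

# Placeholder descriptors; in practice these come from the fish images.
features = np.random.rand(N_IMAGES, DIM).astype(np.float32)

index = AnnoyIndex(DIM, "euclidean")
for image_id, vec in enumerate(features):
    index.add_item(image_id, vec.tolist())
index.build(N_TREES)

def rank_similar(query_vec, k=20):
    """Return (image_id, distance) pairs for the k most similar images."""
    ids, dists = index.get_nns_by_vector(query_vec.tolist(), k,
                                         include_distances=True)
    return list(zip(ids, dists))

# Example: retrieve ranked candidates for one query image, to be shown
# to an annotator alongside the image being labelled.
candidates = rank_similar(features[0], k=10)
```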

Research areas

• approximate nearest neighbor search, large-scale clustering of fish images, manual image annotation

