Annotating large datasets of domain-specific images is both time-consuming and difficult. Yet computer vision and machine learning methods must cope with ever-increasing amounts of data, and annotation of this data is essential: annotated images allow such methods to learn the variation in large datasets and to be evaluated on them. This paper presents a method for annotating domain-specific (fish species) images using approximate nearest neighbor search to retrieve similar fish species in a large set of 216,501 images. The approximate nearest neighbor search lets us find a ranked set of similar images in large datasets, and presenting these similar images to users allows them to annotate much more efficiently. Our user interface presents the images in such a way that users need no knowledge of the specific domain to contribute to the annotation.
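The retrieval step described above can be sketched as follows. This is a minimal illustration only: the paper's actual approximate nearest neighbor method and image features are not specified here, so a brute-force exact k-nearest-neighbor search over hypothetical feature vectors stands in for it, and the function name `retrieve_similar` is an assumption.

```python
import numpy as np

def retrieve_similar(features, query, k=5):
    """Return indices of the k images whose feature vectors are
    closest (Euclidean distance) to the query -- a ranked candidate
    set to show the annotator.  Brute-force stand-in for the paper's
    approximate nearest neighbor search."""
    dists = np.linalg.norm(features - query, axis=1)
    return np.argsort(dists)[:k]

# Toy example: 1000 random "image feature" vectors, 64-dimensional.
rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 64))
query = feats[42]          # query with a known image's features
ranked = retrieve_similar(feats, query, k=5)
```

In practice the brute-force scan would be replaced by an approximate index (e.g. a tree- or hashing-based structure) so the ranked set can be retrieved quickly from hundreds of thousands of images.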
Title of host publication: Proceedings of the International Workshop on Video and Image Ground Truth in Computer Vision Applications
Place of publication: New York, NY, USA
Number of pages: 8
Publication status: Published - 2013
Keywords: approximate nearest neighbor search, large-scale clustering of fish images, manual image annotation