Abstract / Description of output
State-of-the-art computer-vision algorithms rely on large, accurately annotated datasets, which are expensive, laborious and time-consuming to generate. The task is even more challenging for microbiological images, because accurate annotation requires specialized expertise. Previous studies show that crowdsourcing and assistive annotation tools are two potential solutions to this challenge. In this work, we have developed a web-based platform that enables crowdsourced annotation of image data; the platform is powered by a semi-automated assistive tool that supports non-expert annotators and improves annotation efficiency. The behavior of annotators with and without the assistive tool is analyzed using biological images of varying complexity. More specifically, non-experts were asked to use the platform to annotate microbiological images of gut parasites, and their annotations are compared with those produced by experts. A quantitative evaluation of the results confirms that the assistive tool can noticeably reduce the cost of non-expert annotation (time, clicks, interactions, etc.) while preserving or even improving annotation quality. The annotation quality of non-experts is investigated using IoU (intersection over union), precision and recall; based on this analysis, we propose ideas for better designing similar crowdsourcing and assistive platforms.
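The three quality metrics named above can be computed directly from pixel masks. The following is a minimal sketch, not taken from the paper; the function name, array names and toy data are illustrative assumptions showing how IoU, precision and recall would be derived for a non-expert mask measured against an expert reference mask.

```python
import numpy as np

def mask_agreement(non_expert_mask: np.ndarray, expert_mask: np.ndarray):
    """Return (IoU, precision, recall) for two boolean masks of equal shape."""
    non_expert = non_expert_mask.astype(bool)
    expert = expert_mask.astype(bool)

    tp = np.logical_and(non_expert, expert).sum()    # pixels marked by both
    fp = np.logical_and(non_expert, ~expert).sum()   # marked by non-expert only
    fn = np.logical_and(~non_expert, expert).sum()   # marked by expert only

    union = tp + fp + fn
    iou = tp / union if union else 1.0               # two empty masks agree fully
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return iou, precision, recall

# Toy example: the two masks overlap on 2 of the 3 annotated pixels.
a = np.zeros((4, 4), dtype=bool); a[1, 1] = a[1, 2] = True
b = np.zeros((4, 4), dtype=bool); b[1, 1] = b[1, 2] = b[2, 2] = True
print(mask_agreement(a, b))  # -> (0.666..., 1.0, 0.666...)
```

In instance-segmentation settings such as this paper's, these per-mask scores would typically be aggregated over all annotated objects and annotators before comparing the assisted and unassisted conditions.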
| Original language | English |
| --- | --- |
| Article number | 104204 |
| Number of pages | 15 |
| Journal | Computers in Biology and Medicine |
| Volume | 130 |
| Early online date | 2 Jan 2021 |
| DOIs | |
| Publication status | Published - 1 Mar 2021 |
Keywords
- Semi-auto segmentation
- Object detection
- Computational biology
- Crowdsourcing
- Image annotation
- Instance segmentation