Title: Exploring how weak supervision can assist the annotation of computer vision datasets
Authors: Abela, Andrea
Seychell, Dylan
Bugeja, Mark
Keywords: Artificial intelligence -- Case studies
Computer vision
Neural networks (Computer science)
Issue Date: 2022
Publisher: Institute of Electrical and Electronics Engineers
Citation: Abela, A., Seychell, D., & Bugeja, M. (2022). Exploring How Weak Supervision Can Assist the Annotation of Computer Vision Datasets. 2022 IEEE 21st Mediterranean Electrotechnical Conference (MELECON), Palermo, Italy, 960-965.
Abstract: Current artificial intelligence (AI) workflows depend on researchers performing laborious annotation work. In computer vision, crowdsourcing is a popular way to alleviate this effort, and existing frameworks help the general public provide even more trustworthy annotations. This paper proposes an annotation helper for image datasets that uses weak supervision, and explains how class activation maps (CAMs) are integrated with deep image classifiers to produce weakly supervised localisers that can further improve human image annotation performance. Comparing these models against primary crowdsourcing data revealed that they outperform human annotators by 9.7% on localisation error, a metric that accounts for both false positives (FPs) and false negatives (FNs). The models can also save up to 36% of the time required for manual image annotation. This confirms the potential of CAM-empowered models to further improve the image annotation experience.
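
The pipeline described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the ResNet-18 backbone, the 224x224 input size, and the 0.2 activation threshold are assumptions chosen for illustration. It computes a classic CAM (the class-weighted sum of the final convolutional feature maps), upsamples it to image resolution, and thresholds it into a candidate bounding box of the kind a weakly supervised localiser could offer a human annotator.

```python
# Hedged sketch of CAM-based weak localisation (after Zhou et al., 2016).
# Backbone, input size, and threshold are illustrative assumptions, not
# the paper's reported configuration.
import numpy as np
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

# Pretrained classifier standing in for the paper's (unspecified) models.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

# Forward hook to capture the last convolutional feature maps (1, 512, 7, 7).
feats = {}
model.layer4.register_forward_hook(lambda m, i, o: feats.update(maps=o))

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def cam_box(image: Image.Image, thresh: float = 0.2):
    """Return the top-1 class, its CAM heatmap, and a box around the
    activated region."""
    x = preprocess(image.convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
        cls = logits.argmax(1).item()
        # CAM: weight each feature map by the FC weight of the predicted class.
        w = model.fc.weight[cls]                          # shape (512,)
        cam = torch.einsum("c,chw->hw", w, feats["maps"][0])
        cam = F.interpolate(cam[None, None], size=(224, 224),
                            mode="bilinear", align_corners=False)[0, 0]
        cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    # Threshold the heatmap; the tight box around surviving pixels is the
    # weakly supervised localisation proposal shown to the annotator.
    ys, xs = np.where(cam.numpy() >= thresh)
    box = (xs.min(), ys.min(), xs.max(), ys.max()) if xs.size else None
    return cls, cam.numpy(), box
```

In this sketch the human annotator would only need to accept or adjust the proposed box rather than draw it from scratch, which is the time saving the abstract quantifies.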
URI: https://www.um.edu.mt/library/oar/handle/123456789/102997
Appears in Collections: Scholarly Works - FacICTAI

Files in This Item:
Exploring_how_weak_supervision_can_assist_the_annotation_of_computer_vision_datasets.pdf (1.65 MB, Adobe PDF, Restricted Access)
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.