Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/121547
Full metadata record
DC Field | Value | Language
dc.contributor.author | Barthet, Matthew | -
dc.contributor.author | Trivedi, Chintan | -
dc.contributor.author | Pinitas, Kosmas | -
dc.contributor.author | Xylakis, Emmanouil | -
dc.contributor.author | Makantasis, Konstantinos | -
dc.contributor.author | Liapis, Antonios | -
dc.contributor.author | Yannakakis, Georgios N. | -
dc.date.accessioned | 2024-04-29T12:43:18Z | -
dc.date.available | 2024-04-29T12:43:18Z | -
dc.date.issued | 2023 | -
dc.identifier.citation | Barthet, M., Trivedi, C., Pinitas, K., Xylakis, E., Makantasis, K., Liapis, A., & Yannakakis, G. N. (2023). Knowing your annotator: Rapidly testing the reliability of affect annotation. 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), Cambridge, MA, USA. | en_GB
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/121547 | -
dc.description.abstract | The laborious and costly nature of affect annotation is a key detrimental factor for obtaining large-scale corpora with valid and reliable affect labels. Motivated by the lack of tools that can effectively determine an annotator's reliability, this paper proposes general quality assurance (QA) tests for real-time continuous annotation tasks. Assuming that the annotation tasks rely on stimuli with audiovisual components, such as videos, we propose and evaluate two QA tests: a visual and an auditory QA test. We validate the QA tool with 20 annotators who are asked to complete the test, followed by a lengthy task of annotating the engagement of gameplay videos. Our findings suggest, unsurprisingly, that trained annotators are more reliable than the best untrained crowdworkers we could employ. Importantly, the introduced QA tool can effectively predict the reliability of an affect annotator with 80% accuracy, thereby saving resources, effort and cost, and maximizing the reliability of labels solicited in affective corpora. The introduced QA tool is available and accessible through the PAGAN annotation platform. | en_GB
dc.language.iso | en | en_GB
dc.publisher | Institute of Electrical and Electronics Engineers | en_GB
dc.rights | info:eu-repo/semantics/openAccess | en_GB
dc.subject | Deep learning (Machine learning) | en_GB
dc.subject | Video games -- Design | en_GB
dc.subject | Application software | en_GB
dc.subject | Artificial intelligence | en_GB
dc.subject | Computer science | en_GB
dc.subject | Database management | en_GB
dc.title | Knowing your annotator : rapidly testing the reliability of affect annotation | en_GB
dc.type | conferenceObject | en_GB
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB
dc.bibliographicCitation.conferencename | 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW) | en_GB
dc.bibliographicCitation.conferenceplace | Cambridge, MA, United States. 10-13/09/2023 | en_GB
dc.description.reviewed | peer-reviewed | en_GB
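The abstract above does not detail how a QA test scores an annotator, so the following is a minimal illustrative sketch rather than the paper's actual method: it assumes a QA stimulus whose ground-truth intensity signal is known in advance, records the annotator's real-time continuous trace, and deems the annotator reliable when the two signals correlate strongly. The function names (`reliability_score`, `passes_qa`), the 0.5 threshold, and the use of Pearson correlation are all assumptions made for this example.

```python
import numpy as np

def reliability_score(annotation: np.ndarray, target: np.ndarray) -> float:
    """Pearson correlation between an annotator's continuous trace and the
    known ground-truth signal of the QA stimulus (both assumed resampled
    to a common length)."""
    a = (annotation - annotation.mean()) / (annotation.std() + 1e-9)
    t = (target - target.mean()) / (target.std() + 1e-9)
    return float(np.mean(a * t))

def passes_qa(annotation, target, threshold=0.5):
    """Illustrative pass criterion; the paper defines its own."""
    return reliability_score(np.asarray(annotation, dtype=float),
                             np.asarray(target, dtype=float)) >= threshold

# Toy QA stimulus: intensity ramps up, then back down.
target = np.concatenate([np.linspace(0.0, 1.0, 100),
                         np.linspace(1.0, 0.0, 100)])
# A faithful annotator reproduces the ramp with some noise.
trace = target + np.random.default_rng(0).normal(0.0, 0.1, target.size)
print(reliability_score(trace, target))  # high correlation for this trace
print(passes_qa(trace, target))          # True
```

A real deployment would additionally handle differing sampling rates and reaction-time lag between stimulus and trace; this sketch only conveys the core signal-comparison idea.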
Appears in Collections: Scholarly Works - InsDG

Files in This Item:
File | Description | Size | Format
knowing_your_annotator_rapidly_testing_the_reliability_of_affect_annotation.pdf | - | 4.11 MB | Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.