Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/121547
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Barthet, Matthew | - |
dc.contributor.author | Trivedi, Chintan | - |
dc.contributor.author | Pinitas, Kosmas | - |
dc.contributor.author | Xylakis, Emmanouil | - |
dc.contributor.author | Makantasis, Konstantinos | - |
dc.contributor.author | Liapis, Antonios | - |
dc.contributor.author | Yannakakis, Georgios N. | - |
dc.date.accessioned | 2024-04-29T12:43:18Z | - |
dc.date.available | 2024-04-29T12:43:18Z | - |
dc.date.issued | 2023 | - |
dc.identifier.citation | Barthet, M., Trivedi, C., Pinitas, K., Xylakis, E., Makantasis, K., Liapis, A., & Yannakakis, G. N. (2023). Knowing your annotator : rapidly testing the reliability of affect annotation. 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW). Cambridge, MA, USA. | en_GB |
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/121547 | - |
dc.description.abstract | The laborious and costly nature of affect annotation is a key detrimental factor for obtaining large-scale corpora with valid and reliable affect labels. Motivated by the lack of tools that can effectively determine an annotator's reliability, this paper proposes general quality assurance (QA) tests for real-time continuous annotation tasks. Assuming that the annotation tasks rely on stimuli with audiovisual components, such as videos, we propose and evaluate two QA tests: a visual and an auditory QA test. We validate the QA tool across 20 annotators who are asked to go through the test followed by a lengthy task of annotating the engagement of gameplay videos. Our findings suggest that the proposed QA tool reveals, unsurprisingly, that trained annotators are more reliable than the best of the untrained crowdworkers we could employ. Importantly, the introduced QA tool can effectively predict the reliability of an affect annotator with 80% accuracy, thereby saving resources, effort and cost, and maximizing the reliability of labels solicited in affective corpora. The introduced QA tool is available and accessible through the PAGAN annotation platform. | en_GB |
dc.language.iso | en | en_GB |
dc.publisher | Institute of Electrical and Electronics Engineers | en_GB |
dc.rights | info:eu-repo/semantics/openAccess | en_GB |
dc.subject | Deep learning (Machine learning) | en_GB |
dc.subject | Video games -- Design | en_GB |
dc.subject | Application software | en_GB |
dc.subject | Artificial intelligence | en_GB |
dc.subject | Computer science | en_GB |
dc.subject | Database management | en_GB |
dc.title | Knowing your annotator : rapidly testing the reliability of affect annotation | en_GB |
dc.type | conferenceObject | en_GB |
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB |
dc.bibliographicCitation.conferencename | 11th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW) | en_GB |
dc.bibliographicCitation.conferenceplace | Cambridge, MA, United States. 10-13/09/2023 | en_GB |
dc.description.reviewed | peer-reviewed | en_GB |
Appears in Collections: | Scholarly Works - InsDG |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
knowing_your_annotator_rapidly_testing_the_reliability_of_affect_annotation.pdf | | 4.11 MB | Adobe PDF |
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.