Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/104343
Full metadata record
DC Field | Value | Language
dc.contributor.author | Paggio, Patrizia | -
dc.contributor.author | Navarretta, Costanza | -
dc.date.accessioned | 2022-12-12T14:04:54Z | -
dc.date.available | 2022-12-12T14:04:54Z | -
dc.date.issued | 2013 | -
dc.identifier.citation | Paggio, P., & Navarretta, C. (2013). Head movements, facial expressions and feedback in conversations: Empirical evidence from Danish multimodal data. Journal on Multimodal User Interfaces, 7(1-2), 29-37. | en_GB
dc.identifier.issn | 1783-8738 | -
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/104343 | -
dc.description.abstract | This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour—more specifically head movements and facial expressions—and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process. | en_GB
dc.language.iso | en | en_GB
dc.publisher | Springer | en_GB
dc.rights | info:eu-repo/semantics/restrictedAccess | en_GB
dc.subject | Modality (Linguistics) | en_GB
dc.subject | Corpora (Linguistics) | en_GB
dc.subject | Body language -- Research | en_GB
dc.subject | Facial expression -- Data processing | en_GB
dc.subject | Machine learning | en_GB
dc.subject | Linguistics -- Methodology | en_GB
dc.subject | Human-computer interaction | en_GB
dc.subject | Speech and gesture | en_GB
dc.subject | Conversation analysis | en_GB
dc.subject | Danish language -- Discourse analysis | en_GB
dc.title | Head movements, facial expressions and feedback in conversations : empirical evidence from Danish multimodal data | en_GB
dc.type | article | en_GB
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB
dc.description.reviewed | peer-reviewed | en_GB
dc.identifier.doi | 10.1007/s12193-012-0105-9 | -
dc.publication.title | Journal on Multimodal User Interfaces | en_GB
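The abstract describes classifying feedback from non-verbal features such as head movements and facial expressions. As a purely illustrative sketch (not the paper's actual method, classifiers, or features), the setup can be imagined as a toy nearest-neighbour classifier over invented binary gesture features:

```python
# Hypothetical sketch: distinguishing "feedback" from other behaviours using
# invented binary non-verbal features (head_nod, head_shake, smile).
# The feature set and classifier here are illustrative assumptions only.

def classify(sample, examples):
    """Return the label of the training example closest in Hamming distance."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    _, label = min((hamming(sample, feats), label) for feats, label in examples)
    return label

# Invented toy training data: (features, label) pairs.
TRAIN = [
    ((1, 0, 1), "feedback"),  # nod + smile
    ((1, 0, 0), "feedback"),  # nod alone
    ((0, 1, 0), "feedback"),  # head shake (negative feedback)
    ((0, 0, 0), "other"),     # no gesture
    ((0, 0, 1), "other"),     # smile alone
]

print(classify((1, 0, 0), TRAIN))  # a plain nod is classified as feedback
```

In the actual study, the authors combined such non-verbal features with speech features, which the abstract reports improves disambiguation in both directions.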
Appears in Collections: Scholarly Works - InsLin



Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.