Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/22963
Title: Mining multimodal sequential patterns : a case study on affect detection
Authors: Martinez, Hector P.
Yannakakis, Georgios N.
Keywords: Human-computer interaction
Computer games
Issue Date: 2011
Publisher: ACM Publications
Citation: Martinez, H. P., & Yannakakis, G. N. (2011). Mining multimodal sequential patterns: A case study on affect detection. In Proceedings of the 13th International Conference on Multimodal Interfaces (ICMI '11), Alicante, Spain (pp. 3-10). ACM.
Abstract: Temporal data from multimodal interaction such as speech and bio-signals cannot be easily analysed without a preprocessing phase through which some key characteristics of the signals are extracted. Typically, standard statistical signal features such as average values are calculated prior to the analysis and, subsequently, are presented either to a multimodal fusion mechanism or a computational model of the interaction. This paper proposes a feature extraction methodology which is based on frequent sequence mining within and across multiple modalities of user input. The proposed method is applied for the fusion of physiological signals and gameplay information in a game survey dataset. The obtained sequences are analysed and used as predictors of user affect resulting in computational models of equal or higher accuracy compared to the models built on standard statistical features.
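The core idea of the abstract, mining frequent sequential patterns over a fused stream of events from several modalities and using them as features, can be illustrated with a minimal Apriori-style sketch. This is not the authors' implementation; the event labels (e.g. 'HR+' for a heart-rate increase, 'JUMP' for a gameplay event) and the support threshold are hypothetical, chosen only to show how patterns spanning physiological and gameplay modalities could be counted.

```python
def is_subsequence(pattern, seq):
    """True if `pattern` occurs in `seq` in order, allowing gaps."""
    it = iter(seq)
    # `sym in it` advances the iterator past the first match, so
    # successive symbols must appear in order.
    return all(sym in it for sym in pattern)

def mine_frequent_sequences(sequences, min_support):
    """Return {pattern: support} for all patterns whose support
    (fraction of sequences containing them) is >= min_support."""
    n = len(sequences)
    alphabet = sorted({sym for seq in sequences for sym in seq})
    frequent = {}
    current = [(sym,) for sym in alphabet]  # length-1 candidates
    while current:
        survivors = {}
        for pat in current:
            support = sum(is_subsequence(pat, s) for s in sequences) / n
            if support >= min_support:
                survivors[pat] = support
        frequent.update(survivors)
        # Grow only frequent patterns by one symbol (Apriori pruning:
        # any extension of an infrequent pattern is also infrequent).
        current = [pat + (sym,) for pat in survivors for sym in alphabet]
    return frequent

# Hypothetical fused event streams: physiological ('HR+', 'HR-') and
# gameplay ('JUMP') events interleaved in time order.
sessions = [
    ['HR+', 'JUMP', 'HR-'],
    ['HR+', 'JUMP'],
    ['JUMP', 'HR+', 'JUMP'],
]
patterns = mine_frequent_sequences(sessions, min_support=0.6)
# The cross-modal pattern ('HR+', 'JUMP') appears in every session,
# so it would survive as a candidate feature for an affect model.
```

Each surviving pattern's support (or its occurrence count per session) can then serve as a numeric predictor of user affect, in place of or alongside standard statistical features such as signal averages.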
URI: https://www.um.edu.mt/library/oar/handle/123456789/22963
Appears in Collections:Scholarly Works - InsDG

Files in This Item:
Mining_multimodal_sequential_patterns_A_case_study.pdf (240.62 kB, Adobe PDF)


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.