Title: Predicting player engagement in Tom Clancy’s The Division 2: a multimodal approach via pixels and gamepad actions
Authors: Pinitas, Kosmas
Renaudie, David
Thomsen, Mike
Barthet, Matthew
Makantasis, Konstantinos
Liapis, Antonios
Yannakakis, Georgios N.
Keywords: Computer games -- Design
Computer games -- Psychological aspects
Computer games -- Social aspects
Level design (Computer science)
Machine learning
Issue Date: 2023
Publisher: Association for Computing Machinery
Citation: Pinitas, K., Renaudie, D., Thomsen, M., Barthet, M., Makantasis, K., Liapis, A., & Yannakakis, G. N. (2023). Predicting player engagement in Tom Clancy’s The Division 2: a multimodal approach via pixels and gamepad actions. 25th ACM International Conference on Multimodal Interaction, Paris, France.
Abstract: This paper introduces a large-scale multimodal corpus collected for the purpose of analysing and predicting player engagement in commercial-standard games. The corpus is solicited from 25 players of the action role-playing game Tom Clancy’s The Division 2, who annotated their level of engagement using a time-continuous annotation tool. The cleaned and processed corpus presented in this paper consists of nearly 20 hours of annotated gameplay videos accompanied by logged gamepad actions. We report preliminary results on predicting long-term player engagement based on in-game footage and game controller actions using Convolutional Neural Network architectures. The results obtained suggest that we can predict player engagement with up to 72% accuracy on average (88% at best) when we fuse information from the game footage and the player’s controller input. Our findings validate the hypothesis that long-term (i.e. 1 hour of play) engagement can be predicted efficiently solely from pixels and gamepad actions.
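
Note: The abstract describes fusing game footage (pixels) with gamepad actions in Convolutional Neural Network architectures to classify engagement. As a rough illustration of that general idea only, the sketch below shows a minimal two-branch "pixels + gamepad" fusion classifier in PyTorch. The class name, input shapes, layer sizes, number of action channels, and late-fusion strategy are all assumptions made for this sketch; they are not the architecture or preprocessing reported in the paper.

    # Illustrative sketch only: a two-branch fusion classifier over a gameplay
    # frame and a vector of gamepad action frequencies. All shapes and sizes
    # are assumptions, not the paper's reported architecture.
    import torch
    import torch.nn as nn

    class PixelGamepadFusion(nn.Module):
        def __init__(self, n_actions: int = 20, n_classes: int = 2):
            super().__init__()
            # Branch 1: small CNN over a downscaled RGB gameplay frame (assumed 3x64x64).
            self.pixels = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (batch, 32)
            )
            # Branch 2: MLP over gamepad action frequencies within the same time window.
            self.actions = nn.Sequential(
                nn.Linear(n_actions, 32), nn.ReLU(),
            )
            # Late fusion: concatenate both embeddings, then predict high/low engagement.
            self.head = nn.Linear(32 + 32, n_classes)

        def forward(self, frame: torch.Tensor, action_vec: torch.Tensor) -> torch.Tensor:
            fused = torch.cat([self.pixels(frame), self.actions(action_vec)], dim=1)
            return self.head(fused)

    if __name__ == "__main__":
        model = PixelGamepadFusion()
        frames = torch.randn(4, 3, 64, 64)   # batch of downscaled gameplay frames
        actions = torch.rand(4, 20)          # per-window gamepad action frequencies
        print(model(frames, actions).shape)  # torch.Size([4, 2])

The single-frame input and simple concatenation here are deliberate simplifications; a time-continuous engagement signal would in practice be windowed and aggregated before training such a classifier.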
URI: https://www.um.edu.mt/library/oar/handle/123456789/121435
Appears in Collections: Scholarly Works - InsDG