Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/80607
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Papavlasopoulou, Sofia | - |
dc.contributor.author | Sharma, Kshitij | - |
dc.contributor.author | Melhart, David | - |
dc.contributor.author | Schellekens, Jasper | - |
dc.contributor.author | Lee-Cultura, Serena | - |
dc.contributor.author | Giannakos, Michail | - |
dc.contributor.author | Yannakakis, Georgios N. | - |
dc.date.accessioned | 2021-09-02T07:45:05Z | - |
dc.date.available | 2021-09-02T07:45:05Z | - |
dc.date.issued | 2021-12 | - |
dc.identifier.citation | Papavlasopoulou, S., Sharma, K., Melhart, D., Schellekens, J., Lee-Cultura, S., Giannakos, M. N., & Yannakakis, G. N. (2021). Investigating gaze interaction to support children’s gameplay. International Journal of Child-Computer Interaction, 30, 100349. | en_GB |
dc.identifier.issn | 2212-8689 | - |
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/80607 | - |
dc.description.abstract | Gaze interaction has become an affordable option in the development of innovative interaction methods for user input. Gaze holds great promise as an input modality, offering increased immersion and opportunities for combined interactions (e.g., gaze and mouse, or gaze and touch). However, the use of gaze as an input modality to support children’s gameplay has not been examined to unveil those opportunities. To investigate the potential of gaze interaction to support children’s gameplay, we designed and developed a game that enables children to use gaze interaction as an input modality. We then conducted a between-subjects study with 28 children using a mouse as an input mechanism and 29 children using their gaze (ages 8–14). During the study, we collected children’s attitudes (via a self-reported questionnaire) and actual usage behavior (using facial video, physiological data, and computer logs). The results show no significant difference between the two conditions in children’s attitudes regarding ease of use and enjoyment, or in the scores achieved and the number of sessions played. Usage data from children’s facial video and physiological measurements show that sadness and stress were significantly higher in the mouse condition, while joy, surprise, physiological arousal, and emotional arousal were significantly higher in the gaze condition. In addition, our findings highlight the benefits of using multimodal data to reveal children’s behavior while playing the game, complementing self-reported measures. We also identify a need for further studies examining gaze as an input mechanism. | en_GB |
dc.language.iso | en | en_GB |
dc.publisher | Elsevier | en_GB |
dc.rights | info:eu-repo/semantics/openAccess | en_GB |
dc.subject | Games -- Design | en_GB |
dc.subject | Eye tracking | en_GB |
dc.subject | Children's plays | en_GB |
dc.subject | Computer systems | en_GB |
dc.subject | Human-computer interaction | en_GB |
dc.title | Investigating gaze interaction to support children’s gameplay | en_GB |
dc.type | article | en_GB |
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB |
dc.description.reviewed | peer-reviewed | en_GB |
dc.identifier.doi | 10.1016/j.ijcci.2021.100349 | - |
dc.publication.title | International Journal of Child-Computer Interaction | en_GB |
Appears in Collections: | Scholarly Works - InsDG |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
1-s2.0-S2212868921000611-main.pdf | | 2.05 MB | Adobe PDF | View/Open |
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.