Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/102281
Title: AffectGAN: affect-based generative art driven by semantics
Authors: Galanos, Theodoros
Liapis, Antonios
Yannakakis, Georgios N.
Keywords: Art
Generative art
Emotion recognition
Deep learning (Machine learning)
Issue Date: 2021
Publisher: Institute of Electrical and Electronics Engineers
Citation: Galanos, T., Liapis, A. & Yannakakis, G. N. (2021). AffectGAN: affect-based generative art driven by semantics. 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), Nara.
Abstract: This paper introduces a novel method for generating artistic images that express particular affective states. Leveraging state-of-the-art deep learning methods for visual generation (through generative adversarial networks), semantic models from OpenAI, and the annotated dataset of the visual art encyclopedia WikiArt, our AffectGAN model is able to generate images based on specific or broad semantic prompts and intended affective outcomes. A small dataset of 32 images generated by AffectGAN is annotated by 50 participants in terms of the particular emotion they elicit, as well as their quality and novelty. Results show that, for most instances, the intended emotion used as a prompt for image generation matches the participants' responses. This small-scale study brings forth a new vision towards blending affective computing with computational creativity, enabling generative systems with intentionality in terms of the emotions they wish their output to elicit.
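As a rough illustration of the approach described in the abstract (not the authors' implementation), the sketch below assumes the semantic model is OpenAI's CLIP and pairs it with a placeholder GAN generator: a latent code is optimised so that the generated image moves closer, in CLIP's embedding space, to an affect-laden text prompt. The ToyGenerator class, latent size, prompt text, and optimisation settings are illustrative assumptions only; a real system would use a generator trained on WikiArt imagery and apply CLIP's image preprocessing.

    # Minimal sketch: CLIP-guided latent optimisation toward an affect-laden prompt.
    # Requires: pip install torch git+https://github.com/openai/CLIP.git
    import torch
    import clip

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model, preprocess = clip.load("ViT-B/32", device=device)
    model = model.float()                      # keep everything in fp32 for simplicity
    for p in model.parameters():
        p.requires_grad_(False)                # only the latent code is optimised

    # Placeholder generator: maps a latent vector z to a 3x224x224 image in [0, 1].
    # Stands in for a GAN generator pretrained on art imagery (hypothetical).
    class ToyGenerator(torch.nn.Module):
        def __init__(self, latent_dim=128):
            super().__init__()
            self.net = torch.nn.Sequential(
                torch.nn.Linear(latent_dim, 3 * 224 * 224),
                torch.nn.Sigmoid(),
            )
        def forward(self, z):
            return self.net(z).view(-1, 3, 224, 224)

    G = ToyGenerator().to(device)

    # The intended emotional outcome is expressed as part of the text prompt.
    prompt = "a gloomy painting that evokes deep sadness"
    with torch.no_grad():
        text_features = model.encode_text(clip.tokenize([prompt]).to(device))
        text_features = text_features / text_features.norm(dim=-1, keepdim=True)

    # Optimise the latent code so the generated image matches the prompt in CLIP space.
    z = torch.randn(1, 128, device=device, requires_grad=True)
    optimizer = torch.optim.Adam([z], lr=0.05)

    for step in range(200):
        image = G(z)                                           # (1, 3, 224, 224), values in [0, 1]
        image_features = model.encode_image(image)             # note: skips CLIP's normalisation
        image_features = image_features / image_features.norm(dim=-1, keepdim=True)
        loss = 1.0 - (image_features * text_features).sum()    # cosine distance to the prompt
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

The same loop can be rerun with prompts targeting other emotions (e.g. joy or fear) to steer the generator's output toward different intended affective responses.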
URI: https://www.um.edu.mt/library/oar/handle/123456789/102281
Appears in Collections: Scholarly Works - InsDG

Files in This Item:
File: AffectGAN_Affect-Based_Generative_Art_Driven_by_Semantics_2021.pdf (Restricted Access)
Size: 5 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.