Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/29585
Title: Modelling affect for horror soundscapes
Authors: Lopes, Phil
Liapis, Antonios
Yannakakis, Georgios N.
Keywords: Crowdsourcing
Human computation
Human-computer interaction
Video games -- Psychological aspects
Horror
Issue Date: 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Lopes, P., Liapis, A., & Yannakakis, G. N. (2017). Modelling affect for horror soundscapes. IEEE Transactions on Affective Computing, 1-14.
Abstract: The feeling of horror within movies or games relies on the audience's perception of a tense atmosphere, often achieved through sound accompanying the on-screen drama, guiding their emotional experience throughout the scene or game-play sequence. These progressions are often crafted through a priori knowledge of how a scene or game-play sequence will play out, and the intended emotional patterns a game director wants to transmit. The appropriate design of sound becomes even more challenging once the scenery and the general context are autonomously generated by an algorithm. Towards realizing sound-based affective interaction in games, this paper explores the creation of computational models capable of ranking short audio pieces based on crowdsourced annotations of tension, arousal and valence. Affect models are trained via preference learning on over a thousand annotations with the use of support vector machines, whose inputs are low-level features extracted from the audio assets of a comprehensive sound library. The models constructed in this work are able to predict the tension, arousal and valence elicited by sound, respectively, with an accuracy of approximately 65%, 66% and 72%.
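The abstract describes training affect models via preference learning with support vector machines over low-level audio features. As a rough illustration of that general approach (not the authors' code), the sketch below uses the standard reduction of pairwise preferences to binary classification on feature differences, a common way to realize rank/preference learning with a linear SVM. All data, feature dimensions and parameters here are hypothetical placeholders; the paper's actual features, annotations and SVM configuration may differ.

```python
# Minimal sketch of SVM-based preference learning for ranking audio clips by
# elicited tension. Assumes precomputed low-level audio features and pairwise
# crowdsourced annotations (clip i judged more tense than clip j).
import numpy as np
from sklearn.svm import LinearSVC

def pairs_to_classification(features, preferences):
    """Turn pairwise preferences into a binary classification dataset.

    features:    (n_clips, n_features) array of low-level audio descriptors.
    preferences: list of (i, j) pairs, meaning clip i was preferred over clip j.
    """
    X, y = [], []
    for i, j in preferences:
        diff = features[i] - features[j]
        X.append(diff);  y.append(1)    # preferred minus non-preferred -> +1
        X.append(-diff); y.append(-1)   # reversed difference -> -1
    return np.asarray(X), np.asarray(y)

# Hypothetical data: 8 clips with 20 features each, and a few annotated pairs.
rng = np.random.default_rng(0)
features = rng.normal(size=(8, 20))
preferences = [(0, 1), (2, 3), (4, 5), (6, 7), (0, 3)]

X, y = pairs_to_classification(features, preferences)

# A linear SVM trained on feature differences (no intercept) yields a weight
# vector w; the score w.x then orders unseen clips by predicted tension.
model = LinearSVC(fit_intercept=False, C=1.0, max_iter=10000).fit(X, y)
scores = features @ model.coef_.ravel()
ranking = np.argsort(-scores)  # clip indices from most to least tense
print(ranking)
```

The same pipeline would be repeated per affective dimension (tension, arousal, valence), each trained on its own set of pairwise annotations; model accuracy can then be reported as the fraction of held-out preference pairs whose order the learned scores reproduce.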
URI: https://www.um.edu.mt/library/oar/handle/123456789/29585
Appears in Collections: Scholarly Works - InsDG

Files in This Item:
File: Modelling_affect_for_horror_soundscapes.pdf | Size: 1.51 MB | Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.