Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/47339
Title: Fusing level and ruleset features for multimodal learning of gameplay outcomes
Authors: Liapis, Antonios
Karavolos, Daniel
Makantasis, Konstantinos
Sfikas, Konstantinos
Yannakakis, Georgios N.
Keywords: Computer games -- Design
Machine learning
Human-computer interaction
Level design (Computer science)
Issue Date: 2019
Publisher: Institute of Electrical and Electronics Engineers
Citation: Liapis, A., Karavolos, D., Makantasis, K., Sfikas, K., & Yannakakis, G. N. (2019). Fusing level and ruleset features for multimodal learning of gameplay outcomes. Proceedings of the IEEE Conference on Games, London.
Abstract: Which features of a game influence the dynamics of players interacting with it? Can a level’s architecture change the balance between two competing players, or is it mainly determined by the character classes and roles that players choose before the game starts? This paper assesses how quantifiable gameplay outcomes, such as score, duration, and heatmap features, can be predicted from different facets of the initial game state, specifically the architecture of the level and the character classes of the players. Experiments in this paper explore how different representations of a level and of class parameters in a shooter game affect a deep learning model that attempts to predict gameplay outcomes in a large corpus of simulated matches. Findings indicate that a few features of the ruleset (i.e., character class parameters) are the main drivers of the model’s accuracy for all tested gameplay outcomes, but the levels (especially when processed) can further improve the model.
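
As a rough illustration of the kind of multimodal fusion the abstract describes, the sketch below combines a convolutional branch over the level grid with a dense branch over the character class parameters, concatenating the two before regressing the gameplay outcomes. This is a hypothetical minimal example, not the paper's architecture: the input shapes, channel counts, layer sizes, and the choice of three outcome targets are all assumptions made for illustration.

import torch
import torch.nn as nn

class FusionModel(nn.Module):
    """Minimal sketch of a multimodal model: a convolutional branch for the
    level layout and a dense branch for the ruleset (character class
    parameters), fused to predict gameplay outcomes. All shapes and layer
    sizes are illustrative assumptions, not the paper's architecture."""

    def __init__(self, level_channels=4, num_class_params=30, num_outcomes=3):
        super().__init__()
        # Convolutional branch: processes the level as a multi-channel grid
        # (e.g. one channel per tile type). Assumed input: 4 x 32 x 32.
        self.level_branch = nn.Sequential(
            nn.Conv2d(level_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                      # -> 16 x 16 x 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                      # -> 32 x 8 x 8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64),
            nn.ReLU(),
        )
        # Dense branch: processes the flat vector of class parameters
        # (e.g. hit points, speed, weapon damage for both players).
        self.ruleset_branch = nn.Sequential(
            nn.Linear(num_class_params, 64),
            nn.ReLU(),
        )
        # Fusion head: concatenate both feature vectors and regress the
        # gameplay outcomes (e.g. score difference, match duration, ...).
        self.head = nn.Sequential(
            nn.Linear(64 + 64, 64),
            nn.ReLU(),
            nn.Linear(64, num_outcomes),
        )

    def forward(self, level, ruleset):
        fused = torch.cat([self.level_branch(level),
                           self.ruleset_branch(ruleset)], dim=1)
        return self.head(fused)

# Usage with random stand-in data (batch of 8 simulated matches):
model = FusionModel()
level = torch.rand(8, 4, 32, 32)       # tile-type grids
ruleset = torch.rand(8, 30)            # class parameter vectors
outcomes = model(level, ruleset)       # -> tensor of shape (8, 3)
print(outcomes.shape)

Concatenating the two feature vectors is one simple fusion strategy; the paper's experiments vary the level representation and how it is processed, which in a setup like this would change only the convolutional branch.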
URI: https://www.um.edu.mt/library/oar/handle/123456789/47339
Appears in Collections:Scholarly Works - InsDG

Files in This Item:
File: Fusing_level_and_ruleset_features_for_multimodal_learning_of_gameplay_outcomes_2019.pdf (Restricted Access)
Size: 2.18 MB
Format: Adobe PDF


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.