Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/81846
Title: Obstacle tower : a generalization challenge in vision, control, and planning
Authors: Juliani, Arthur
Khalifa, Ahmed
Berges, Vincent-Pierre
Harper, Jonathan
Teng, Ervin
Henry, Hunter
Crespi, Adam
Togelius, Julian
Lange, Danny
Keywords: Computer games -- Design
Artificial intelligence
Machine learning
Issue Date: 2019
Publisher: International Joint Conference on Artificial Intelligence
Citation: Juliani, A., Khalifa, A., Berges, V.-P., Harper, J., Teng, E., Henry, H., Crespi, A., Togelius, J., & Lange, D. (2019). Obstacle tower : a generalization challenge in vision, control, and planning. 28th International Joint Conference on Artificial Intelligence, Macao.
Abstract: The rapid pace of recent research in AI has been driven in part by the presence of fast and challenging simulation environments. These environments often take the form of games, with tasks ranging from simple board games to competitive video games. We propose a new benchmark, Obstacle Tower: a high-fidelity, 3D, third-person, procedurally generated environment. An agent playing Obstacle Tower must learn to solve both low-level control and high-level planning problems in tandem while learning from pixels and a sparse reward signal. Unlike other benchmarks such as the Arcade Learning Environment, evaluation of agent performance in Obstacle Tower is based on an agent's ability to perform well on unseen instances of the environment. In this paper we outline the environment and provide a set of baseline results produced by current state-of-the-art Deep RL methods as well as human players. These algorithms fail to produce agents capable of performing near human level.
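The evaluation protocol the abstract describes — training on one set of procedurally generated instances and scoring on held-out, unseen instances — can be sketched as follows. This is a minimal illustration, not the paper's actual harness; `make_env`, `agent.act`, and the environment's `reset`/`step` interface are hypothetical stand-ins for whatever Gym-style API the benchmark exposes.

```python
# Hypothetical seed-split evaluation for a procedurally generated benchmark:
# the agent trains on TRAIN_SEEDS only and is scored on disjoint TEST_SEEDS,
# measuring generalization rather than memorization of specific instances.
TRAIN_SEEDS = list(range(100))        # seeds the agent may train on
TEST_SEEDS = list(range(1001, 1006))  # held-out seeds, never seen in training


def evaluate(agent, make_env, seeds):
    """Return the mean episode return over a set of environment seeds."""
    returns = []
    for seed in seeds:
        env = make_env(seed)          # each seed fixes one tower instance
        obs, done, total = env.reset(), False, 0.0
        while not done:
            action = agent.act(obs)
            obs, reward, done = env.step(action)
            total += reward
        returns.append(total)
    return sum(returns) / len(returns)


# The split must be disjoint, or the score no longer measures generalization.
assert set(TRAIN_SEEDS).isdisjoint(TEST_SEEDS)
```

The key design point is the disjointness check: reporting performance on training seeds, as fixed-instance benchmarks like the Arcade Learning Environment effectively do, conflates generalization with memorization.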
URI: https://www.um.edu.mt/library/oar/handle/123456789/81846
Appears in Collections:Scholarly Works - InsDG

Files in This Item:
File: Obstacle_tower_a_generalization_challenge_in_vision_control_and_planning_2019.pdf (2.81 MB, Adobe PDF)


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.