Title: Monte Carlo elites : quality-diversity selection as a multi-armed bandit problem
Authors: Sfikas, Konstantinos
Liapis, Antonios
Yannakakis, Georgios N.
Keywords: Computer games -- Design
Computer games -- Decision making
Level design (Computer science)
Labyrinths
Issue Date: 2021
Publisher: ArXiv
Citation: Sfikas, K., Liapis, A., & Yannakakis, G. N. (2021). Monte Carlo elites: quality-diversity selection as a multi-armed bandit problem. arXiv preprint arXiv:2104.08781.
Abstract: A core challenge of evolutionary search is the need to balance between exploration of the search space and exploitation of highly fit regions. Quality-diversity search has explicitly walked this tightrope between a population's diversity and its quality. This paper extends a popular quality-diversity search algorithm, MAP-Elites, by treating the selection of parents as a multi-armed bandit problem. Using variations of the upper confidence bound to select parents from under-explored but potentially rewarding areas of the search space can accelerate the discovery of new regions as well as improve the archive's total quality. The paper tests an indirect measure of quality for parent selection: the survival rate of a parent's offspring. Results show that maintaining a balance between exploration and exploitation leads to the most diverse and high-quality set of solutions in three different testbeds.
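Note: the following is a minimal, hypothetical Python sketch (not the authors' code) of the idea described in the abstract: treating parent selection over a MAP-Elites archive as a multi-armed bandit, where each elite's reward is the survival rate of its offspring and an upper-confidence-bound score balances exploration and exploitation. The names `Elite`, `ucb_score`, `select_parent`, and the constant `c` are illustrative assumptions.

    import math
    import random

    class Elite:
        """One archive cell's occupant, with bandit statistics."""
        def __init__(self, solution):
            self.solution = solution
            self.offspring = 0   # times this elite was chosen as a parent
            self.survivors = 0   # offspring that entered (or replaced a cell in) the archive

    def ucb_score(elite, total_selections, c=math.sqrt(2)):
        """Upper confidence bound on the elite's offspring survival rate."""
        if elite.offspring == 0:
            return float("inf")  # prioritise elites that have never been selected
        exploit = elite.survivors / elite.offspring          # empirical survival rate
        explore = c * math.sqrt(math.log(total_selections) / elite.offspring)
        return exploit + explore

    def select_parent(archive):
        """Pick the elite with the highest UCB score; break ties randomly."""
        total = sum(e.offspring for e in archive.values()) + 1  # +1 avoids log(0)
        return max(archive.values(),
                   key=lambda e: (ucb_score(e, total), random.random()))

In use, each generation would call `select_parent`, mutate the chosen elite's solution, then increment the parent's `offspring` counter and, if the child is added to the archive, its `survivors` counter, so the bandit statistics track offspring survival over time.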
URI: https://www.um.edu.mt/library/oar/handle/123456789/80770
Appears in Collections:Scholarly Works - InsDG

Files in This Item:
File: 2104.08781.pdf (1.39 MB, Adobe PDF)


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.