Automatic Evolution of Multimodal Behavior with Multi-Brain HyperNEAT
Item
-
Title
Automatic Evolution of Multimodal Behavior with Multi-Brain HyperNEAT
-
Description
This is an Accepted Manuscript of an article published in Proceedings of the 2016 on Genetic and Evolutionary Computation Conference Companion (pp. 21–22). New York, NY, USA: ACM. https://doi.org/10.1145/2908961.2908965
-
Creator
Schrum, Jacob
Lehman, Joel
Risi, Sebastian
-
Identifier
Schrum, J., Lehman, J., & Risi, S. (2016). Automatic Evolution of Multimodal Behavior with Multi-Brain HyperNEAT. In Proceedings of the 2016 on Genetic and Evolutionary Computation Conference Companion (pp. 21–22). New York, NY, USA: ACM. https://doi.org/10.1145/2908961.2908965
-
URI
https://collections.southwestern.edu/s/suscholar/item/223
-
Abstract
An important challenge in neuroevolution is to evolve multimodal behavior. Indirect network encodings can potentially answer this challenge, yet in practice they have not yielded effective multimodal controllers. This paper introduces novel multimodal extensions to HyperNEAT, a popular indirect encoding. A previous multimodal approach, situational policy geometry, assumes that multiple brains benefit from being embedded within an explicit geometric space. In contrast, this paper introduces HyperNEAT extensions for evolving many brains without assuming geometric relationships between them. The resulting Multi-Brain HyperNEAT can exploit human-specified task divisions, or it can automatically discover when each brain should be used and how many brains are needed. Experiments show that multi-brain approaches are more effective than HyperNEAT without multimodal extensions, and that brains without a geometric relation to each other outperform geometrically embedded ones.
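The core idea above can be illustrated with a minimal sketch. This is not the paper's implementation: the CPPN here is a hypothetical stand-in (a random weighted sum through tanh rather than an evolved network), and the function names are invented. It shows one way brains can share a single indirect encoding without any geometric relationship between them: the CPPN is queried once per substrate connection, and a separate output head supplies the weight for each brain.

```python
import math
import random

def make_cppn(num_brains, seed=0):
    # Hypothetical stand-in for an evolved CPPN: maps substrate
    # coordinates (x1, y1, x2, y2) to one weight output per brain.
    rng = random.Random(seed)
    coeffs = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(num_brains)]

    def cppn(x1, y1, x2, y2):
        return [math.tanh(c[0] * x1 + c[1] * y1 + c[2] * x2 + c[3] * y2)
                for c in coeffs]
    return cppn

def build_brains(cppn, coords, num_brains):
    # Query the shared CPPN for every pair of substrate nodes; output
    # head b gives the connection weight for brain b, so the brains
    # need no explicit geometric embedding relative to one another.
    brains = [dict() for _ in range(num_brains)]
    for (x1, y1) in coords:
        for (x2, y2) in coords:
            weights = cppn(x1, y1, x2, y2)
            for b in range(num_brains):
                brains[b][((x1, y1), (x2, y2))] = weights[b]
    return brains

# Two brains generated from one shared encoding over a tiny substrate.
cppn = make_cppn(num_brains=2, seed=1)
brains = build_brains(cppn, [(0.0, 0.0), (1.0, 0.0)], num_brains=2)
```

Under this sketch, a human-specified task division would dispatch control to a fixed brain per task, while the automatic variants described in the abstract would instead let evolution decide which brain acts and how many brains exist.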
-
Subject
Multimodal Behavior
HyperNEAT
Multi-Brain HyperNEAT
Indirect Encoding