Big Archive-Assisted Ensemble of Many-Objective Evolutionary Algorithms

Bibliographic Details
Main Authors: Wen Zhong, Jian Xiong, Anping Lin, Lining Xing, Feilong Chen, Yingwu Chen
Format: Article
Language: English
Published: Wiley 2021-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/2021/6614283
Description
Summary: Multiobjective evolutionary algorithms (MOEAs) have witnessed prosperity in solving many-objective optimization problems (MaOPs) over the past three decades. Unfortunately, no single MOEA, equipped with given parameter settings, a mating-variation operator, and an environmental selection mechanism, is suitable for obtaining a set of solutions with excellent convergence and diversity for all types of MaOPs. In reality, different MOEAs differ greatly in how well they handle certain types of MaOPs. To exploit these differences, this paper proposes a flexible ensemble framework, named ASES, which is highly scalable and can embed any number of MOEAs to combine their advantages. To alleviate the undesirable phenomenon that promising solutions are discarded during the evolutionary process, a big archive, whose number of solutions is far larger than the population size, is integrated into the ensemble framework to record large-scale nondominated solutions, and an efficient maintenance strategy is developed to update this archive. Furthermore, the knowledge gained from updating the archive is exploited to guide the evolutionary process of the different MOEAs, allocating the limited computational resources to the more efficient algorithms. Extensive numerical experiments demonstrate the superior performance of the proposed ASES: among 52 test instances, ASES performs better than all six baseline algorithms on at least half of the test instances with respect to both the hypervolume and inverted generational distance metrics.
ISSN: 1076-2787
1099-0526
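
The archive-assisted ensemble loop described in the summary can be sketched roughly as follows. This is a minimal illustration based only on the abstract, assuming a credit-based allocation rule and a nearest-neighbour pruning step for archive maintenance; the function names (update_archive, run_ases, random_search) and the placeholder component "MOEA" are hypothetical and do not reflect the authors' actual operators or maintenance strategy.

# Rough sketch of an archive-assisted ensemble of MOEAs (minimization assumed).
# All names here are illustrative placeholders, not the paper's implementation.
import random

def dominates(a, b):
    """Pareto dominance for minimization: a dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def crowding(s, archive):
    """Negative distance to the nearest other archive member: larger = more crowded."""
    return -min(sum((x - y) ** 2 for x, y in zip(s, o))
                for o in archive if o is not s)

def update_archive(archive, candidates, capacity):
    """Insert nondominated candidates into the big archive and return how many
    survived; this count serves as the 'knowledge' used for resource allocation."""
    added = 0
    for c in candidates:
        if any(dominates(a, c) for a in archive):
            continue
        archive[:] = [a for a in archive if not dominates(c, a)]
        archive.append(c)
        added += 1
    # Simple maintenance: when the archive overflows its (large) capacity,
    # repeatedly drop the most crowded solution.
    while len(archive) > capacity:
        archive.remove(max(archive, key=lambda s: crowding(s, archive)))
    return added

def run_ases(moeas, n_objectives=3, pop_size=50, capacity=500, generations=100):
    """Ensemble loop: each generation is granted to one component MOEA, chosen in
    proportion to how many archive updates it has contributed so far."""
    archive = []
    credit = [1.0] * len(moeas)
    for _ in range(generations):
        k = random.choices(range(len(moeas)), weights=credit)[0]
        offspring = moeas[k](pop_size, n_objectives, archive)
        credit[k] += update_archive(archive, offspring, capacity)
    return archive

# Placeholder component "MOEA": just samples random objective vectors.
def random_search(pop_size, n_objectives, archive):
    return [[random.random() for _ in range(n_objectives)] for _ in range(pop_size)]

if __name__ == "__main__":
    final = run_ases([random_search, random_search])
    print(len(final), "nondominated solutions in the big archive")

In the actual framework the embedded algorithms would be full MOEAs with their own mating-variation operators and environmental selection, rather than the random-search placeholder used here to keep the sketch self-contained, and the archive capacity would be set far larger than the population size, as the abstract describes.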