TY - GEN
T1 - Dimension Dropout for Evolutionary High-Dimensional Expensive Multiobjective Optimization
AU - Lin, Jianqing
AU - He, Cheng
AU - Cheng, Ran
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Over the past decades, a number of surrogate-assisted evolutionary algorithms (SAEAs) have been developed to solve expensive multiobjective optimization problems (EMOPs). However, most existing SAEAs focus on low-dimensional problems, since building an accurate surrogate model for a high-dimensional problem requires a large number of training samples, which is unrealistic for EMOPs. In this paper, an SAEA with Dimension Dropout is proposed to solve high-dimensional EMOPs. At each iteration, the proposed algorithm randomly selects a subset of the decision variables by Dimension Dropout and then optimizes the selected decision variables with the assistance of surrogate models. To balance convergence and diversity, candidate solutions with good diversity are modified by replacing their selected decision variables with the optimized ones (i.e., decision variables from better-converged candidate solutions). Eventually, the new candidate solutions are evaluated using the expensive functions to update the archive. Empirical studies on ten benchmark problems with up to 200 decision variables demonstrate the competitiveness of the proposed algorithm.
AB - Over the past decades, a number of surrogate-assisted evolutionary algorithms (SAEAs) have been developed to solve expensive multiobjective optimization problems (EMOPs). However, most existing SAEAs focus on low-dimensional problems, since building an accurate surrogate model for a high-dimensional problem requires a large number of training samples, which is unrealistic for EMOPs. In this paper, an SAEA with Dimension Dropout is proposed to solve high-dimensional EMOPs. At each iteration, the proposed algorithm randomly selects a subset of the decision variables by Dimension Dropout and then optimizes the selected decision variables with the assistance of surrogate models. To balance convergence and diversity, candidate solutions with good diversity are modified by replacing their selected decision variables with the optimized ones (i.e., decision variables from better-converged candidate solutions). Eventually, the new candidate solutions are evaluated using the expensive functions to update the archive. Empirical studies on ten benchmark problems with up to 200 decision variables demonstrate the competitiveness of the proposed algorithm.
KW - Dimension dropout
KW - High-dimensional
KW - Multiobjective optimization
KW - Surrogate-assisted optimization
UR - https://www.scopus.com/pages/publications/85107271850
U2 - 10.1007/978-3-030-72062-9_45
DO - 10.1007/978-3-030-72062-9_45
M3 - Conference article published in proceeding or book
AN - SCOPUS:85107271850
SN - 9783030720612
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 567
EP - 579
BT - Evolutionary Multi-Criterion Optimization - 11th International Conference, EMO 2021, Proceedings
A2 - Ishibuchi, Hisao
A2 - Zhang, Qingfu
A2 - Cheng, Ran
A2 - Li, Ke
A2 - Li, Hui
A2 - Wang, Handing
A2 - Zhou, Aimin
PB - Springer Science and Business Media Deutschland GmbH
T2 - 11th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2021
Y2 - 28 March 2021 through 31 March 2021
ER -