Expensive Many-Objective Optimization by Learning of the Strengthened Dominance Relation

Jiangtao Shen, Peng Wang, Huachao Dong, Wenxin Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Expensive many-objective optimization problems (EMaOPs), whose objective values must be computed by time-consuming simulations or costly experiments, are common in the real world. A popular approach to EMaOPs is to guide the optimization process with surrogate models. In this paper, a feedforward neural network (FNN)-assisted evolutionary algorithm is proposed for solving EMaOPs more effectively. Concretely, an FNN is constructed by learning the strengthened dominance relation (SDR) between pairs of solutions, and promising samples are then generated by an evolutionary search based on the constructed FNN. The proposed method is compared with four state-of-the-art peer algorithms on a set of benchmark problems, and the experimental results demonstrate its superiority.
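The abstract only outlines the approach, so the following is a minimal illustrative sketch (not the authors' implementation): it trains a feedforward neural network on pairwise comparison labels between solutions and uses it to screen candidate samples. For simplicity, classical Pareto dominance is used as a stand-in for the SDR label, and the toy objectives, network sizes, and scoring rule are all assumptions made for illustration.

```python
# Illustrative sketch: FNN surrogate for a pairwise dominance-style relation.
# NOTE: Pareto dominance below is a stand-in; the paper learns the SDR instead.
import numpy as np
from sklearn.neural_network import MLPClassifier

def pareto_dominates(f1, f2):
    """Stand-in pairwise relation: classical Pareto dominance (minimization)."""
    return np.all(f1 <= f2) and np.any(f1 < f2)

rng = np.random.default_rng(0)
n_var, n_obj, n_samples = 10, 5, 200

# Expensive evaluations would come from a simulator; here a toy objective set.
X = rng.random((n_samples, n_var))
F = np.stack([np.sum((X - i / n_obj) ** 2, axis=1) for i in range(n_obj)], axis=1)

# Pairwise training data: input is two concatenated decision vectors,
# label is whether the first solution dominates the second.
pairs, labels = [], []
for _ in range(2000):
    i, j = rng.integers(n_samples, size=2)
    pairs.append(np.concatenate([X[i], X[j]]))
    labels.append(int(pareto_dominates(F[i], F[j])))

fnn = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
fnn.fit(np.asarray(pairs), np.asarray(labels))

# Surrogate-assisted screening: score each candidate by how often the FNN
# predicts it dominates the evaluated samples, and keep the most promising.
candidates = rng.random((50, n_var))
scores = [
    fnn.predict(np.concatenate([np.tile(c, (n_samples, 1)), X], axis=1)).mean()
    for c in candidates
]
best = candidates[np.argsort(scores)[-5:]]
print("Most promising candidates selected:", best.shape)
```

In a surrogate-assisted loop, only the selected candidates would be evaluated with the expensive simulator and then added back to the training data.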

Original language: English
Title of host publication: 2023 IEEE Congress on Evolutionary Computation, CEC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350314588
DOIs
State: Published - 2023
Event: 2023 IEEE Congress on Evolutionary Computation, CEC 2023 - Chicago, United States
Duration: 1 Jul 2023 - 5 Jul 2023

Publication series

Name: 2023 IEEE Congress on Evolutionary Computation, CEC 2023

Conference

Conference: 2023 IEEE Congress on Evolutionary Computation, CEC 2023
Country/Territory: United States
City: Chicago
Period: 1/07/23 - 5/07/23

Keywords

  • evolutionary algorithm
  • expensive optimization
  • feedforward neural network
  • many-objective optimization
