TY - GEN
T1 - Effective Training of PINNs by Combining CMA-ES with Gradient Descent
AU - Liu, Lin
AU - Yuan, Yuan
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Physics-Informed Neural Networks (PINNs) have recently received increasing attention; however, optimizing the loss function of PINNs is notoriously difficult, as its landscape is often highly non-convex and rugged. Local optimization methods based on gradient information converge quickly but are prone to being trapped in local minima when training PINNs. Evolutionary algorithms (EAs) are well known for their global search ability, which can help escape local minima, and it has been reported in the literature that EAs show some advantages over gradient-based methods in training PINNs. Inspired by the memetic algorithm, we combine global-search-based EAs with local-search-based batch gradient descent to make the best of both worlds. In addition, since the PINN loss function is composed of multiple terms, balancing these terms is also a challenging issue; we therefore also combine EAs with the multiple-gradient descent algorithm for multi-objective optimization. Our experiments provide strong evidence for the superiority of the above algorithms.
AB - Physics-Informed Neural Networks (PINNs) have recently received increasing attention; however, optimizing the loss function of PINNs is notoriously difficult, as its landscape is often highly non-convex and rugged. Local optimization methods based on gradient information converge quickly but are prone to being trapped in local minima when training PINNs. Evolutionary algorithms (EAs) are well known for their global search ability, which can help escape local minima, and it has been reported in the literature that EAs show some advantages over gradient-based methods in training PINNs. Inspired by the memetic algorithm, we combine global-search-based EAs with local-search-based batch gradient descent to make the best of both worlds. In addition, since the PINN loss function is composed of multiple terms, balancing these terms is also a challenging issue; we therefore also combine EAs with the multiple-gradient descent algorithm for multi-objective optimization. Our experiments provide strong evidence for the superiority of the above algorithms.
KW - evolutionary algorithm
KW - gradient descent
KW - memetic algorithm
KW - multi-objective optimization
KW - Physics-informed neural networks
UR - http://www.scopus.com/inward/record.url?scp=85201731895&partnerID=8YFLogxK
U2 - 10.1109/CEC60901.2024.10611964
DO - 10.1109/CEC60901.2024.10611964
M3 - Conference contribution
AN - SCOPUS:85201731895
T3 - 2024 IEEE Congress on Evolutionary Computation, CEC 2024 - Proceedings
BT - 2024 IEEE Congress on Evolutionary Computation, CEC 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE Congress on Evolutionary Computation, CEC 2024
Y2 - 30 June 2024 through 5 July 2024
ER -