Adaboost-based ensemble of polynomial chaos expansion with adaptive sampling

Yicheng Zhou, Zhenzhou Lu, Kai Cheng

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

In this study, we propose a new polynomial chaos expansion (PCE) surrogate modeling method based on Adaboost (Adaboost-PCE) for uncertainty quantification. Adaboost is an ensemble learning technique originating from the machine learning field. The idea of Adaboost-PCE is to repeatedly construct "weak" PCE models, with a weight assigned to each sample point. At each iteration, the weight of a particular sample point in the training set depends on how well the current surrogate models perform on that sample. A weighted least-squares approximation then exploits these sample weights in order to reduce the effect of outliers. Adaboost-PCE is appealing because the ensemble weights can be estimated without any explicit error metric, unlike most existing ensemble methods. Moreover, it provides an estimate of the expected prediction error, which enables efficient adaptive sampling. The proposed method is validated through numerical comparisons of performance on a series of test problems, including partial differential equations with high-dimensional inputs.
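As a rough illustration of the iterative reweighting idea described in the abstract, the sketch below combines weighted least-squares fits of simple polynomial surrogates with an AdaBoost.R2-style update of the sample weights. The basis (plain monomials), the update rule, and the helper names (poly_basis, fit_weighted_pce, adaboost_pce, predict) are illustrative assumptions, not the authors' actual formulation, which the abstract does not specify.

```python
import numpy as np

# Hedged sketch of an AdaBoost-style ensemble of "weak" PCE surrogates.
# The reweighting follows a generic AdaBoost.R2-type rule with a plain
# monomial basis; the paper's actual update rule, basis and ensemble-weight
# estimator are not given in the abstract, so every detail here is an
# illustrative assumption.


def poly_basis(x, degree):
    """1-D monomial basis up to `degree` (stand-in for a PCE basis)."""
    return np.vander(x, degree + 1, increasing=True)


def fit_weighted_pce(x, y, w, degree):
    """Weighted least-squares fit of one 'weak' PCE model."""
    Phi = poly_basis(x, degree)
    W = np.diag(w)
    return np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)


def adaboost_pce(x, y, degree=3, n_rounds=10):
    """Train the ensemble; returns member coefficients and ensemble weights."""
    n = len(x)
    w = np.full(n, 1.0 / n)                  # start with uniform sample weights
    models, betas = [], []
    for _ in range(n_rounds):
        coeffs = fit_weighted_pce(x, y, w, degree)
        err = np.abs(poly_basis(x, degree) @ coeffs - y)
        if err.max() == 0.0:                 # perfect fit: keep it and stop
            models.append(coeffs)
            betas.append(1e-10)
            break
        loss = err / err.max()               # normalized loss in [0, 1]
        avg_loss = float(np.sum(w * loss))
        if avg_loss >= 0.5:                  # learner too weak to be useful
            if not models:                   # keep it only if it is the first
                models.append(coeffs)
                betas.append(1e-10)
            break
        beta = avg_loss / (1.0 - avg_loss)
        w *= beta ** (1.0 - loss)            # well-predicted samples lose weight
        w /= w.sum()
        models.append(coeffs)
        betas.append(beta)
    ens_w = np.log(1.0 / np.array(betas))    # ensemble weights (larger = better)
    return models, ens_w


def predict(models, ens_w, x, degree=3):
    """Weighted-average prediction of the ensemble at points x."""
    preds = np.array([poly_basis(x, degree) @ c for c in models])
    return ens_w @ preds / ens_w.sum()


# Toy usage: approximate a 1-D response surface
rng = np.random.default_rng(0)
x_tr = rng.uniform(-1.0, 1.0, 40)
y_tr = np.sin(np.pi * x_tr) + 0.05 * rng.standard_normal(40)
models, ens_w = adaboost_pce(x_tr, y_tr)
print(predict(models, ens_w, np.array([0.25, 0.75])))
```

In this sketch the ensemble weights are derived from the per-round weighted losses themselves, which loosely mirrors the abstract's point that no separate, explicit error metric is needed to combine the ensemble members.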

Original language: English
Article number: 114238
Journal: Computer Methods in Applied Mechanics and Engineering
Volume: 388
DOIs
State: Published - 1 Jan 2022

Keywords

  • Adaptive sampling
  • Ensemble learning
  • Polynomial chaos expansion
  • Uncertainty quantification

