
VNAS: Variational Neural Architecture Search

  • Benteng Ma
  • Jing Zhang
  • Yong Xia
  • Dacheng Tao
  • Northwestern Polytechnical University Xi'an
  • University of Sydney

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

Differentiable neural architecture search delivers a point estimate of the optimal architecture, which assigns arbitrarily high confidence to the learned architecture. This approach thus suffers in calibration and robustness, in contrast with the maximum a posteriori estimation scheme. In this paper, we propose a novel Variational Neural Architecture Search (VNAS) method that estimates and exploits the weight variability in the following three steps. VNAS first learns the weight distribution through variational inference, maximizing the evidence lower bound on the marginal likelihood of the architecture using unbiased Monte Carlo gradient estimation. A group of optimal architecture candidates is then drawn from the learned weight distribution subject to a complexity constraint. The optimal architecture is further inferred with a novel training-free architecture-performance estimator, designed to score network architectures at initialization without training, which significantly reduces the computational cost of architecture selection. Extensive experiments show that VNAS significantly outperforms state-of-the-art methods in classification performance and adversarial robustness.
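The three steps in the abstract can be sketched in toy form. The snippet below is an illustrative NumPy mock-up, not the paper's implementation: it learns a Gaussian variational distribution over the mixing weights of three candidate operations with a reparameterization-trick Monte Carlo gradient of the ELBO, samples candidate architectures from the learned distribution, and picks a winner under a stand-in training-free proxy score. All names (`elbo_mc_grad`, `proxy_score`, the toy likelihood) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: variational distribution q(alpha) = N(mu, sigma^2) over the
# mixing weights of 3 candidate operations on one edge (toy setting).
mu, log_sigma = np.zeros(3), np.zeros(3)
target = np.array([0.0, 2.0, 0.0])  # toy likelihood peak: prefer op 1

def elbo_mc_grad(mu, log_sigma, n_samples=64):
    """Unbiased Monte Carlo ELBO gradient via the reparameterization trick.

    Toy log-likelihood: log p(D | alpha) = -sum((alpha - target)^2),
    so the optimal q is N(target, 1/2). Entropy gradient w.r.t.
    log_sigma is +1 per dimension.
    """
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal((n_samples, mu.size))
    alpha = mu + sigma * eps                      # alpha ~ q(alpha)
    dl_dalpha = -2.0 * (alpha - target)           # d log p / d alpha
    g_mu = dl_dalpha.mean(axis=0)                 # d alpha / d mu = 1
    g_ls = (dl_dalpha * sigma * eps).mean(axis=0) + 1.0
    return g_mu, g_ls

# Gradient ascent on the Monte Carlo ELBO estimate.
for _ in range(500):
    g_mu, g_ls = elbo_mc_grad(mu, log_sigma)
    mu += 0.05 * g_mu
    log_sigma += 0.01 * g_ls

# Step 2: draw candidate architectures (here: the argmax operation of a
# sample from q); a real complexity constraint would filter this list.
candidates = [int(np.argmax(mu + np.exp(log_sigma) * rng.standard_normal(3)))
              for _ in range(20)]

# Step 3: select the best candidate under a training-free proxy score
# (a placeholder lookup here; the paper designs a real estimator that
# scores architectures at initialization).
proxy_score = {0: 0.1, 1: 0.9, 2: 0.3}
best = max(set(candidates), key=proxy_score.get)
```

Under this toy likelihood, `mu` converges toward `target` and `sigma` toward `1/sqrt(2)`, matching the exact variational posterior, so the sampled candidates concentrate on operation 1.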

Original language: English
Pages (from-to): 3689-3713
Number of pages: 25
Journal: International Journal of Computer Vision
Volume: 132
Issue number: 9
DOI
Publication status: Published - Sep 2024

