Bayesian uncertainty analysis for underwater 3D reconstruction with neural radiance fields

Haojie Lian, Xinhao Li, Yilin Qu, Jing Du, Zhuxuan Meng, Jie Liu, Leilei Chen

Research output: Contribution to journal › Article › peer-review


Abstract

Neural radiance fields (NeRFs) are a deep learning technique for generating novel views of 3D scenes from multi-view images. As an extension of NeRFs, SeaThru-NeRF mitigates the effects of scattering media on structural appearance and geometric information. However, like most deep learning models, SeaThru-NeRF carries inherent uncertainty in its predictions and produces artifacts in its renderings, which limits its practical deployment in underwater unmanned autonomous navigation. To address this issue, we introduce a spatial perturbation field based on Bayes' Rays into SeaThru-NeRF and perform a Laplace approximation to obtain a Gaussian distribution over the perturbation parameters, so that the uncertainty at each spatial location can be evaluated. Additionally, because artifacts inherently correspond to regions of high uncertainty, we remove them by thresholding the uncertainty field. Numerical experiments demonstrate the effectiveness of this approach.
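
The following PyTorch sketch illustrates the general scheme the abstract describes, under loudly stated assumptions: a voxel grid of spatial offsets plays the role of the perturbation field, a squared-gradient (empirical-Fisher) accumulation stands in for the Hessian of the Laplace approximation, and a toy MLP stands in for a trained SeaThru-NeRF. The grid resolution, prior precision, quantile threshold, and all function names (query_offsets, render_points, uncertainty_at) are illustrative choices, not the paper's implementation.

```python
# Illustrative sketch of a Bayes' Rays-style uncertainty field (assumed details,
# not the authors' released code). A trainable 3D perturbation grid is attached
# to a frozen radiance field; a Laplace approximation around the zero
# perturbation yields per-voxel variances, which are thresholded to mask
# artifact-prone regions. The toy MLP below stands in for SeaThru-NeRF.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
G = 16                                   # resolution of the perturbation grid
prior_precision = 1e-2                   # Gaussian prior on the perturbations

# Frozen stand-in for a trained SeaThru-NeRF: maps 3D points to (rgb, sigma).
field = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 4))
for p in field.parameters():
    p.requires_grad_(False)

# Spatial perturbation field: one 3D offset vector per voxel, initialised to 0.
offsets = nn.Parameter(torch.zeros(1, 3, G, G, G))

def query_offsets(pts):
    """Trilinearly interpolate voxel offsets at points in [-1, 1]^3."""
    grid = pts.view(1, -1, 1, 1, 3)                       # (1, P, 1, 1, 3)
    out = F.grid_sample(offsets, grid, align_corners=True)
    return out.view(3, -1).t()                            # (P, 3)

def render_points(pts):
    """Evaluate the frozen field at perturbed positions (proxy for rendering)."""
    return field(pts + query_offsets(pts))

# --- Laplace approximation around zero perturbation --------------------------
# Accumulate squared gradients of the data loss as a diagonal curvature
# estimate (an empirical-Fisher stand-in for the Hessian).
hess_diag = torch.zeros_like(offsets)
for _ in range(50):                                       # toy "training rays"
    pts = torch.rand(256, 3) * 2 - 1                      # sample points
    target = torch.rand(256, 4)                           # fake supervision
    loss = F.mse_loss(render_points(pts), target)
    grad, = torch.autograd.grad(loss, offsets)
    hess_diag += grad ** 2

# Posterior variance per voxel under the Laplace (Gaussian) approximation.
variance = 1.0 / (hess_diag + prior_precision)            # (1, 3, G, G, G)
uncertainty_vol = variance.norm(dim=1, keepdim=True)      # scalar per voxel

def uncertainty_at(pts):
    """Interpolate the scalar uncertainty field at query points."""
    grid = pts.view(1, -1, 1, 1, 3)
    u = F.grid_sample(uncertainty_vol, grid, align_corners=True)
    return u.view(-1)                                     # (P,)

# --- Artifact removal by thresholding -----------------------------------------
# Samples whose uncertainty exceeds a chosen quantile are masked out before
# compositing, so high-uncertainty "floaters" do not contribute to the render.
pts = torch.rand(1024, 3) * 2 - 1
u = uncertainty_at(pts)
threshold = torch.quantile(u, 0.9)
keep = u <= threshold
rgb_sigma = render_points(pts[keep])
print(f"kept {keep.sum().item()} / {len(pts)} samples below the threshold")
```

In an actual pipeline, the curvature would presumably be accumulated over the training rays of the fitted underwater scene rather than synthetic targets, and the threshold chosen by inspecting rendered uncertainty maps; the quantile used here is purely illustrative.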

Original language: English
Article number: 115806
Journal: Applied Mathematical Modelling
Volume: 138
DOIs
State: Published - Feb 2025

Keywords

  • Neural radiance fields
  • Uncertainty quantification
  • Underwater scenes
