PNRNet: Physically-Inspired Neural Rendering for Any-to-Any Relighting

Zhongyun Hu, Ntumba Elie Nsampi, Xue Wang, Qing Wang

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Existing any-to-any relighting methods suffer from task-aliasing effects and the loss of local details, such as shading and attached shadows, in the image generation process. In this paper, we present PNRNet, a novel neural architecture that decomposes the any-to-any relighting task into three simpler sub-tasks, i.e., lighting estimation, color temperature transfer, and lighting direction transfer, to avoid task-aliasing effects. These sub-tasks are easy to learn and can be trained independently with direct supervision. To better preserve local shading and attached-shadow details, we propose a parallel multi-scale network that incorporates multiple physical attributes to model local illumination for lighting direction transfer. We also introduce a simple yet effective color temperature transfer network that learns a pixel-level non-linear function, allowing color temperature adjustment beyond the predefined color temperatures and generalizing well to real images. Extensive experiments demonstrate that our proposed approach achieves better results, both quantitatively and qualitatively, than prior works.
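
The abstract describes a three-stage decomposition: lighting estimation on the guide image, followed by color temperature transfer and lighting direction transfer on the source image. Below is a minimal, hypothetical PyTorch sketch of how such a pipeline could be wired together; the module names, layer sizes, conditioning scheme, and the use of a normal map as the auxiliary physical attribute are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of the three-stage decomposition described in the abstract.
# All module names, channel sizes, and inputs are assumptions for illustration.
import torch
import torch.nn as nn

class LightingEstimator(nn.Module):
    """Predicts the guide image's lighting settings (color temperature + direction)."""
    def __init__(self, num_temps: int = 5, num_dirs: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.temp_head = nn.Linear(64, num_temps)  # color-temperature logits
        self.dir_head = nn.Linear(64, num_dirs)    # lighting-direction logits

    def forward(self, guide):
        f = self.features(guide)
        return self.temp_head(f), self.dir_head(f)

class ColorTemperatureTransfer(nn.Module):
    """Pixel-level non-linear mapping conditioned on the target color temperature."""
    def __init__(self, num_temps: int = 5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + num_temps, 32, 1), nn.ReLU(),
            nn.Conv2d(32, 3, 1),
        )

    def forward(self, image, temp_logits):
        cond = temp_logits.softmax(-1)[:, :, None, None].expand(-1, -1, *image.shape[2:])
        return self.net(torch.cat([image, cond], dim=1))

class LightingDirectionTransfer(nn.Module):
    """Parallel multi-scale branches that re-render shading and attached shadows."""
    def __init__(self, num_dirs: int = 8):
        super().__init__()
        in_ch = 3 + 3 + num_dirs  # image + (assumed) normal map + direction condition
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(32, 3, 3, padding=1)),
            nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=2, dilation=2), nn.ReLU(),
                          nn.Conv2d(32, 3, 3, padding=1)),
        ])

    def forward(self, image, normals, dir_logits):
        cond = dir_logits.softmax(-1)[:, :, None, None].expand(-1, -1, *image.shape[2:])
        x = torch.cat([image, normals, cond], dim=1)
        return sum(branch(x) for branch in self.branches)  # fuse multi-scale outputs

if __name__ == "__main__":
    src = torch.rand(1, 3, 128, 128)      # source image to be relit
    guide = torch.rand(1, 3, 128, 128)    # guide image carrying the target lighting
    normals = torch.rand(1, 3, 128, 128)  # assumed auxiliary physical attribute

    estimator = LightingEstimator()
    temp_net = ColorTemperatureTransfer()
    dir_net = LightingDirectionTransfer()

    temp_logits, dir_logits = estimator(guide)       # 1) lighting estimation
    recolored = temp_net(src, temp_logits)           # 2) color temperature transfer
    relit = dir_net(recolored, normals, dir_logits)  # 3) lighting direction transfer
    print(relit.shape)  # torch.Size([1, 3, 128, 128])
```

Because each stage has its own inputs and targets, a pipeline of this shape can be trained stage by stage with direct supervision, which is the independence property the abstract emphasizes.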

Original language: English
Pages (from-to): 3935-3948
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 31
DOI
Publication status: Published - 2022
