DGAN: Disentangled Representation Learning for Anisotropic BRDF Reconstruction

Zhongyun Hu, Xue Wang, Qing Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-reviewed

Abstract

Accurate reconstruction of real-world materials' appearance from a very limited number of samples remains a major challenge in computer vision and graphics. In this paper, we present a novel deep architecture, the Disentangled Generative Adversarial Network (DGAN), which performs anisotropic Bidirectional Reflectance Distribution Function (BRDF) reconstruction from the single BRDF subspace with the maximum entropy. In contrast to previous approaches that directly map known samples to a full BRDF using a CNN, disentangled representation learning is applied to guide the reconstruction process. In order to learn the different physical factors of the BRDF, the generator of the DGAN mainly consists of a Fresnel estimator module (FEM) and a directional module (DM). Considering the fact that the entropy of different BRDF subspaces varies, we further divide the BRDF into He-BRDF and Le-BRDF to reconstruct the interior part and the exterior part of the directional factor. Experimental results show that our approach outperforms state-of-the-art methods.
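The factorization sketched in the abstract — a Fresnel factor estimated by one module and a directional factor estimated by another, multiplied to form the reconstructed BRDF — can be illustrated with a toy analytic stand-in. This is a minimal sketch, not the authors' network: the Schlick approximation stands in for the FEM output, a GGX-style lobe stands in for the DM output, and all function names and parameters here are hypothetical.

```python
import math

def schlick_fresnel(f0, cos_theta_d):
    # Schlick approximation: a stand-in for the Fresnel factor the
    # FEM would estimate. f0 is reflectance at normal incidence.
    return f0 + (1.0 - f0) * (1.0 - cos_theta_d) ** 5

def directional_factor(cos_theta_h, roughness):
    # GGX-style microfacet lobe: a stand-in for the directional
    # factor the DM would estimate.
    a2 = roughness ** 2
    d = (cos_theta_h ** 2) * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d ** 2)

def toy_brdf(f0, roughness, cos_theta_d, cos_theta_h):
    # Disentangled reconstruction: the BRDF value is the product of
    # the two independently modeled physical factors.
    return schlick_fresnel(f0, cos_theta_d) * directional_factor(cos_theta_h, roughness)

# At normal incidence the Fresnel factor reduces to f0; at grazing
# angles it approaches 1, so the product is dominated by the lobe shape.
val = toy_brdf(f0=0.04, roughness=0.3, cos_theta_d=0.7, cos_theta_h=0.95)
```

Disentangling the two factors lets each module specialize: the Fresnel term varies slowly with the difference angle, while the directional lobe carries the high-frequency (and, for anisotropic materials, orientation-dependent) structure.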

Original language: English
Host publication title: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4397-4401
Number of pages: 5
ISBN (electronic): 9781509066315
DOI
Publication status: Published - May 2020
Event: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Barcelona, Spain
Duration: 4 May 2020 - 8 May 2020

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2020-May
ISSN (print): 1520-6149

Conference

Conference: 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020
Country/Territory: Spain
City: Barcelona
Period: 4/05/20 - 8/05/20
