TY - CONF
T1 - Learning Exposure Correction Via Consistency Modeling
AU - Nsampi, Ntumba Elie
AU - Hu, Zhongyun
AU - Wang, Qing
N1 - Publisher Copyright:
© 2021. The copyright of this document resides with its authors.
PY - 2021
Y1 - 2021
N2 - Existing works on exposure correction have focused exclusively on either underexposure or overexposure. Recent work targeting both under- and over-exposure achieved state-of-the-art results. However, it tends to produce images with inconsistent correction and occasional color artifacts. In this paper, we propose a novel neural network architecture for exposure correction. The proposed network targets both under- and over-exposure. We introduce a deep feature matching loss that enables the network to learn exposure-invariant representations in the feature space, which guarantees image exposure consistency. Moreover, we leverage a global attention mechanism to allow long-range interactions between distant pixels for exposure correction. This results in consistently corrected images, free of localized color distortions. Through extensive quantitative and qualitative experiments, we demonstrate that the proposed network outperforms the existing state of the art. Code: https://github.com/elientumba2019/Exposure-Correction-BMVC-2021.
AB - Existing works on exposure correction have focused exclusively on either underexposure or overexposure. Recent work targeting both under- and over-exposure achieved state-of-the-art results. However, it tends to produce images with inconsistent correction and occasional color artifacts. In this paper, we propose a novel neural network architecture for exposure correction. The proposed network targets both under- and over-exposure. We introduce a deep feature matching loss that enables the network to learn exposure-invariant representations in the feature space, which guarantees image exposure consistency. Moreover, we leverage a global attention mechanism to allow long-range interactions between distant pixels for exposure correction. This results in consistently corrected images, free of localized color distortions. Through extensive quantitative and qualitative experiments, we demonstrate that the proposed network outperforms the existing state of the art. Code: https://github.com/elientumba2019/Exposure-Correction-BMVC-2021.
UR - http://www.scopus.com/inward/record.url?scp=85173926485&partnerID=8YFLogxK
M3 - Paper
AN - SCOPUS:85173926485
T2 - 32nd British Machine Vision Conference, BMVC 2021
Y2 - 22 November 2021 through 25 November 2021
ER -