TY - JOUR
T1 - Zero-shot illumination adaption for improved real-time underwater visual perception
AU - Mao, Ruiqi
AU - Cui, Rongxin
AU - Yan, Weisheng
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2025/5/1
Y1 - 2025/5/1
N2 - High-quality underwater images provide strong visual support for numerous undersea investigations, such as underwater archaeological exploration and underwater topographic surveying. However, underwater images often suffer from distortions at various depths due to harsh conditions, leading to inconsistencies that compromise their reliability. Existing approaches to underwater image enhancement (UIE) often struggle with limited adaptability and may even fail when confronted with previously unseen real-world underwater conditions. Meanwhile, inefficient model inference significantly constrains their feasibility for demanding real-time underwater visual perception. To address these issues, we reduce the reliance on extremely limited labeled data and develop a lightweight scheme that adapts to diverse scenes. Specifically, we present a joint CNN-Transformer framework, termed AIDUIE, comprising a zero-shot illumination disentanglement network (ZSIdNet) and a self-attention-based dynamic camera response function (SADCRF) with long-range dependency modeling capability. We develop a multi-step iterative training mechanism for illumination disentanglement, which employs zero-shot learning at each step using pseudo labels generated by an image degradation module. The disentangled illumination is then adjusted by SADCRF, using parameter maps of an adaptive pixel retention factor (PRF), to reconstruct the enhanced image. Comprehensive experiments substantiate the advantages of our scheme over existing methods, demonstrating its superiority in both quality and efficiency under unseen and complex underwater conditions.
AB - High-quality underwater images provide strong visual support for numerous undersea investigations, such as underwater archaeological exploration and underwater topographic surveying. However, underwater images often suffer from distortions at various depths due to harsh conditions, leading to inconsistencies that compromise their reliability. Existing approaches to underwater image enhancement (UIE) often struggle with limited adaptability and may even fail when confronted with previously unseen real-world underwater conditions. Meanwhile, inefficient model inference significantly constrains their feasibility for demanding real-time underwater visual perception. To address these issues, we reduce the reliance on extremely limited labeled data and develop a lightweight scheme that adapts to diverse scenes. Specifically, we present a joint CNN-Transformer framework, termed AIDUIE, comprising a zero-shot illumination disentanglement network (ZSIdNet) and a self-attention-based dynamic camera response function (SADCRF) with long-range dependency modeling capability. We develop a multi-step iterative training mechanism for illumination disentanglement, which employs zero-shot learning at each step using pseudo labels generated by an image degradation module. The disentangled illumination is then adjusted by SADCRF, using parameter maps of an adaptive pixel retention factor (PRF), to reconstruct the enhanced image. Comprehensive experiments substantiate the advantages of our scheme over existing methods, demonstrating its superiority in both quality and efficiency under unseen and complex underwater conditions.
KW - Camera response function (CRF)
KW - Illumination disentanglement
KW - Joint CNN-Transformer framework
KW - Pixel retention factor (PRF)
KW - Underwater image enhancement
KW - Underwater visual perception
UR - http://www.scopus.com/inward/record.url?scp=85216656731&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2025.126616
DO - 10.1016/j.eswa.2025.126616
M3 - Article
AN - SCOPUS:85216656731
SN - 0957-4174
VL - 271
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 126616
ER -