TY - JOUR
T1 - Dynamic feature distillation and pyramid split large kernel attention network for lightweight image super-resolution
AU - Liu, Bingzan
AU - Ning, Xin
AU - Ma, Shichao
AU - Yang, Yizhen
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024.
PY - 2024/10
Y1 - 2024/10
N2 - With the development of intelligent edge devices such as unmanned aerial vehicles (UAVs), the demand for high-resolution (HR) images has increased significantly. However, noise and blurring from finite detector sizes and optics make HR images difficult to acquire directly. Therefore, lightweight super-resolution (SR) of optical images based on convolutional neural networks (CNNs) has become a research hot spot. However, most state-of-the-art methods focus on local features along a particular dimension, and even when composite attention mechanisms are employed, the conflict among features of different attention types degrades SR performance. In this paper, we propose a dynamic feature distillation and pyramid split large kernel attention network (DPLKA) to address these problems. In particular, a pyramid split large kernel attention module (PSLKA) is introduced to capture multi-scale global information and long-range dependencies. Subsequently, by constructing a global-to-local feature extraction block (GL-FEB), a global-to-local feature extraction approach similar to the Swin Transformer with multi-scale self-attention is established. Furthermore, a dynamic feature distillation block (DFDB) is incorporated into the model to exploit hierarchical features from different layers and achieve adaptive recalibration of different responses. Specifically, DPLKA adopts lightweight components such as depth-wise separable convolution (SDC) and a distillation feature extraction module (DFEM), which greatly improve the efficiency of the method. Extensive experimental results on five benchmark datasets indicate that DPLKA achieves superior reconstruction accuracy (0.21 ~ 2 dB on the Urban100 dataset at scale ×4), excellent running time (0.047 s on the Urban100 dataset), and competitive parameters and FLOPs.
AB - With the development of intelligent edge devices such as unmanned aerial vehicles (UAVs), the demand for high-resolution (HR) images has increased significantly. However, noise and blurring from finite detector sizes and optics make HR images difficult to acquire directly. Therefore, lightweight super-resolution (SR) of optical images based on convolutional neural networks (CNNs) has become a research hot spot. However, most state-of-the-art methods focus on local features along a particular dimension, and even when composite attention mechanisms are employed, the conflict among features of different attention types degrades SR performance. In this paper, we propose a dynamic feature distillation and pyramid split large kernel attention network (DPLKA) to address these problems. In particular, a pyramid split large kernel attention module (PSLKA) is introduced to capture multi-scale global information and long-range dependencies. Subsequently, by constructing a global-to-local feature extraction block (GL-FEB), a global-to-local feature extraction approach similar to the Swin Transformer with multi-scale self-attention is established. Furthermore, a dynamic feature distillation block (DFDB) is incorporated into the model to exploit hierarchical features from different layers and achieve adaptive recalibration of different responses. Specifically, DPLKA adopts lightweight components such as depth-wise separable convolution (SDC) and a distillation feature extraction module (DFEM), which greatly improve the efficiency of the method. Extensive experimental results on five benchmark datasets indicate that DPLKA achieves superior reconstruction accuracy (0.21 ~ 2 dB on the Urban100 dataset at scale ×4), excellent running time (0.047 s on the Urban100 dataset), and competitive parameters and FLOPs.
KW - Large kernel attention
KW - Lightweight network
KW - Pyramid split module
KW - Single image super-resolution
UR - http://www.scopus.com/inward/record.url?scp=85186409084&partnerID=8YFLogxK
U2 - 10.1007/s11042-024-18501-8
DO - 10.1007/s11042-024-18501-8
M3 - Article
AN - SCOPUS:85186409084
SN - 1380-7501
VL - 83
SP - 79963
EP - 79984
JO - Multimedia Tools and Applications
JF - Multimedia Tools and Applications
IS - 33
ER -