Dynamic feature distillation and pyramid split large kernel attention network for lightweight image super-resolution

Bingzan Liu, Xin Ning, Shichao Ma, Yizhen Yang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

With the development of edge intelligent devices such as unmanned aerial vehicles (UAVs), the demand for high-resolution (HR) images has increased significantly. However, noise and blurring caused by finite detector sizes and optics make HR images difficult to acquire directly. Therefore, lightweight super-resolution (SR) of optical images based on convolutional neural networks (CNNs) has become a research hotspot. However, most state-of-the-art methods attend mainly to local features in a particular dimension, and although composite attention mechanisms have been employed in them, conflicts among features of different attention types severely degrade SR performance. In this paper, we propose a dynamic feature distillation and pyramid split large kernel attention network (DPLKA) to solve these problems. In particular, a pyramid split large kernel attention module (PSLKA) is introduced to capture multi-scale global information and long-range dependencies. Subsequently, by constructing a global-to-local feature extraction block (GL-FEB), a global-to-local feature extraction approach similar to the Swin Transformer with multi-scale self-attention is established. Furthermore, a dynamic feature distillation block (DFDB) is incorporated with the purpose of exploiting hierarchical features from different layers and adaptively recalibrating different responses. Specifically, DPLKA adopts lightweight components such as depth-wise separable convolution (DSC) and a distillation feature extraction module (DFEM), which greatly improve the efficiency of the method. Extensive experimental results on five benchmark datasets indicate that DPLKA is superior in reconstruction accuracy (gains of 0.21 ~ 2 dB on the Urban100 dataset at ×4 scale) and running time (0.047 s on the Urban100 dataset), with moderate parameter counts and FLOPs.
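The abstract credits depth-wise separable convolution for much of the network's lightweight footprint. A minimal sketch of why, using assumed channel sizes (64 input/output channels, 3×3 kernels; these numbers are illustrative and not taken from the paper), compares the parameter count of a standard convolution against its depth-wise separable factorization:

```python
# Parameter-count comparison: standard vs. depth-wise separable convolution.
# Channel and kernel sizes below are assumptions for illustration only.

def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameters of a standard k x k convolution (no bias)."""
    return c_in * c_out * k * k

def dsc_params(c_in: int, c_out: int, k: int) -> int:
    """Depth-wise separable: k x k depth-wise conv (one filter per input
    channel) followed by a 1 x 1 point-wise conv (no bias)."""
    return c_in * k * k + c_in * c_out

standard = conv_params(64, 64, 3)   # 64 * 64 * 9  = 36864
separable = dsc_params(64, 64, 3)   # 64 * 9 + 64 * 64 = 4672
print(standard, separable, round(standard / separable, 1))  # ~7.9x fewer
```

The roughly 8× reduction at this setting is what makes such factorizations attractive for lightweight SR models deployed on edge devices.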

Original language: English
Pages (from-to): 79963-79984
Number of pages: 22
Journal: Multimedia Tools and Applications
Volume: 83
Issue number: 33
DOI
Publication status: Published - Oct 2024
