Dynamic feature distillation and pyramid split large kernel attention network for lightweight image super-resolution

Bingzan Liu, Xin Ning, Shichao Ma, Yizhen Yang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

With the development of edge intelligent devices such as unmanned aerial vehicles (UAVs), the demand for high-resolution (HR) images has increased significantly. However, noise and blurring caused by finite detector sizes and optics make HR images difficult to acquire directly. Lightweight super-resolution of optical images based on convolutional neural networks (CNNs) has therefore become a research hotspot. Most state-of-the-art methods, however, attend mainly to local features along a single dimension, and although composite attention mechanisms have been employed, conflicts among features produced by different attention types substantially degrade super-resolution (SR) performance. In this paper, we propose a dynamic feature distillation and pyramid split large kernel attention network (DPLKA) to address these problems. In particular, a pyramid split large kernel attention module (PSLKA) is introduced to capture multi-scale global information and long-range dependencies. Subsequently, a global-to-local feature extraction block (GL-FEB) establishes a global-to-local feature extraction scheme, similar to the Swin Transformer, with multi-scale self-attention. Furthermore, a dynamic feature distillation block (DFDB) is incorporated to exploit hierarchical features from different layers and adaptively recalibrate their responses. In addition, DPLKA adopts lightweight components such as depth-wise separable convolution (SDC) and a distillation feature extraction module (DFEM), which greatly improve the efficiency of the method. Extensive experiments on five benchmark datasets show that DPLKA achieves superior reconstruction accuracy (gains of 0.21–2 dB on the Urban100 dataset at scale ×4), excellent running time (0.047 s on Urban100), and competitive parameter counts and FLOPs.
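
To make the attention design concrete, below is a minimal sketch of the generic large kernel attention decomposition (a 5×5 depth-wise convolution, a 7×7 depth-wise dilated convolution, and a 1×1 point-wise convolution) on which modules such as PSLKA build. The channel width, kernel sizes, and class name are illustrative assumptions, not the authors' exact DPLKA configuration, and the pyramid split of channels across kernel scales described in the paper is omitted.

# Hedged sketch of a large kernel attention unit for lightweight SR.
# All sizes are assumptions for illustration, not the published DPLKA design.
import torch
import torch.nn as nn

class LargeKernelAttention(nn.Module):
    """Approximates a large receptive field by chaining a 5x5 depth-wise conv,
    a 7x7 depth-wise dilated conv (dilation 3), and a 1x1 point-wise conv,
    then uses the result as an attention map over the input features."""
    def __init__(self, channels: int):
        super().__init__()
        self.dw_conv = nn.Conv2d(channels, channels, 5, padding=2, groups=channels)
        self.dw_dilated = nn.Conv2d(channels, channels, 7, padding=9,
                                    dilation=3, groups=channels)
        self.pw_conv = nn.Conv2d(channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.pw_conv(self.dw_dilated(self.dw_conv(x)))
        return x * attn  # long-range context modulates the local features

if __name__ == "__main__":
    feats = torch.randn(1, 48, 64, 64)   # a typical lightweight-SR feature map
    out = LargeKernelAttention(48)(feats)
    print(out.shape)                      # torch.Size([1, 48, 64, 64])

Because all spatial filtering is depth-wise, such a unit adds far fewer parameters and FLOPs than a dense convolution with the same effective receptive field, which is what makes large kernel attention attractive for lightweight super-resolution.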

Original language: English
Pages (from-to): 79963-79984
Number of pages: 22
Journal: Multimedia Tools and Applications
Volume: 83
Issue number: 33
DOIs
State: Published - Oct 2024

Keywords

  • Large kernel attention
  • Lightweight network
  • Pyramid split module
  • Single image super-resolution
