Hyperspectral image denoising by low-rank models with hyper-Laplacian total variation prior

Shuang Xu, Jiangshe Zhang, Chunxia Zhang

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Total variation (TV) regularized low-rank models have emerged as a powerful tool for hyperspectral image (HSI) denoising. TV, defined by the ℓ1-norm of gradients, implicitly assumes from a statistical point of view that gradients obey a Laplacian distribution. By investigating the histograms of HSI gradients, we find that gradients in real HSIs actually follow a hyper-Laplacian distribution with power parameter q=1/2. Taking this prior into account, a hyper-Laplacian spectral-spatial total variation (HTV), defined by the ℓ1/2-norm of gradients, is proposed for HSI denoising. Furthermore, by incorporating HTV as the regularizer, a low-rank matrix model and a low-rank tensor model are proposed. Both models can be solved by the augmented Lagrange multiplier algorithm. To validate the effectiveness of HTV, we formulate baseline models by replacing HTV with ℓ1-norm and ℓ0-norm based TV regularizations, and the results show that the proposed HTV outperforms them. Moreover, compared with several popular HSI denoising algorithms, experiments conducted on both simulated and real data demonstrate the superiority of the proposed models.
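As a reading aid only (the paper's full text is not reproduced here), the prior and regularizer described in the abstract can be sketched as follows; the difference operators D_h, D_v, and D_s below are illustrative notation for horizontal, vertical, and spectral finite differences, not the paper's own symbols:

\[
p(g) \propto \exp\!\big(-k\,|g|^{q}\big), \qquad q = \tfrac{1}{2},
\]
\[
\mathrm{HTV}(\mathcal{X}) = \|D_h\mathcal{X}\|_{1/2}^{1/2} + \|D_v\mathcal{X}\|_{1/2}^{1/2} + \|D_s\mathcal{X}\|_{1/2}^{1/2},
\qquad \text{where } \|\mathcal{Y}\|_{1/2}^{1/2} = \sum_i |y_i|^{1/2}.
\]

Under this sketch, the ℓ1-norm TV corresponds to the maximum a posteriori estimate for a Laplacian gradient prior (q=1), while the observed heavier-tailed gradient statistics motivate the ℓ1/2 penalty (q=1/2) used in HTV.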

Original language: English
Article number: 108733
Journal: Signal Processing
Volume: 201
DOIs
State: Published - Dec 2022

Keywords

  • Hyperspectral image denoising
  • Low-rank matrix factorization
  • Low-rank tensor factorization
