Hyperspectral Anomaly Detection via Structured Sparsity Plus Enhanced Low-Rankness

Yin Ping Zhao, Hongyan Li, Yongyong Chen, Zhen Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Hyperspectral anomaly detection (HAD), which distinguishes anomalous pixels or subpixels from the background, has received increasing attention in recent years. Low-rank representation (LRR)-based methods have advanced rapidly for HAD, but they face three challenges: 1) they adopt the nuclear norm as a convex approximation of the rank function, which yields only a suboptimal solution; 2) they overlook the structured spatial correlation of anomalous pixels; and 3) they fail to comprehensively explore the local structural details of the original background. To address these challenges, in this article we propose the structured sparsity plus enhanced low-rankness ($\text{S}^{2}$ELR) method for HAD. Specifically, our $\text{S}^{2}$ELR method adopts the weighted tensor Schatten-$p$ norm, a tighter approximation of the rank function than the tensor nuclear norm (TNN), and the structured sparse norm to characterize the low-rank property of the background and the sparsity of the anomalous pixels, respectively. To preserve local structural details, a position-based Laplacian regularizer is incorporated. An iterative algorithm is derived from the popular alternating direction method of multipliers (ADMM). Experimental results demonstrate the superiority of the proposed $\text{S}^{2}$ELR method over existing state-of-the-art HAD methods.
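To give intuition for the rank surrogate the abstract mentions, the following is a minimal sketch of a weighted Schatten-$p$ quasi-norm for a single matrix: the weighted sum of singular values raised to the power $p$. With $p < 1$ this penalizes small singular values more aggressively than the nuclear norm ($p = 1$, uniform weights), which is why it is considered a tighter rank approximation. This is illustrative only; the paper's $\text{S}^{2}$ELR operates on tensors via a tensor Schatten-$p$ norm, not on a single matrix, and the function and weight choices below are assumptions, not the authors' implementation.

```python
import numpy as np

def weighted_schatten_p(X, weights=None, p=0.5):
    """Weighted Schatten-p quasi-norm (to the p-th power):
    sum_i w_i * sigma_i(X)^p, where sigma_i are singular values.

    p=1 with uniform weights recovers the nuclear norm; p < 1 gives a
    nonconvex, tighter surrogate of the rank function.
    Illustrative sketch only (matrix case, not the paper's tensor case).
    """
    sigma = np.linalg.svd(X, compute_uv=False)
    if weights is None:
        weights = np.ones_like(sigma)  # uniform weights by default
    return float(np.sum(weights * sigma ** p))

# A rank-1 matrix has one nonzero singular value, so for any p the
# penalty stays small, while a full-rank matrix accumulates a term
# per singular value.
low_rank = np.outer(np.arange(1.0, 4.0), np.ones(3))   # rank 1
full_rank = np.eye(3)                                   # rank 3
```
In practice the weights are often chosen inversely proportional to the singular values, so that large (informative) singular values are shrunk less than small (noise-dominated) ones.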

Original language: English
Article number: 5515115
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 61
DOIs
State: Published - 2023

Keywords

  • Anomaly detection
  • Laplacian graph
  • low-rank
  • structure tensor
  • tensor decomposition
