Proportional Perturbation Model for Hyperspectral Unmixing Accounting for Endmember Variability

Wei Gao, Jingyu Yang, Jie Chen

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

During the last decade, many methods have been proposed to enhance the performance of hyperspectral unmixing (HU) for linear mixing problems. However, most do not account for spectral variability, which limits their unmixing performance. We therefore propose a proportional perturbation model (PPM) for HU that accounts for endmember variability. The PPM characterizes both the proportional variations of endmembers and the local fluctuations found in real-world scenarios by incorporating scaling factors and a perturbation term. In addition, we design an unmixing network based on the PPM, called PPM-Net, which learns more accurate endmember parameters from the latent representation of input pixels while simultaneously estimating abundances. Specifically, we constrain the abundances with a traditional method during the pretraining phase to further enhance robustness. Experimental results on synthetic and real data indicate that the proposed PPM-Net outperforms state-of-the-art unmixing methods, improving by more than 5.9% in average root-mean-square error of abundance estimates (aRMSE_A) over the second-best method. The source code is available at https://github.com/yjysimply/PPM-Net.
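The abstract describes a mixing model in which each endmember is modified by a per-pixel scaling factor and an additive perturbation term before linear mixing. The following is a minimal NumPy sketch of that idea; the symbol names (`psi` for scaling factors, `D` for the perturbation, `a` for abundances) are illustrative assumptions, not taken from the paper, and this is not the authors' implementation.

```python
import numpy as np

# Illustrative sketch of a proportional-perturbation mixing model:
# each reference endmember m_k is scaled by psi_k and shifted by a
# small perturbation d_k, then mixed linearly by the abundances a.
rng = np.random.default_rng(0)
L, R = 50, 3                            # spectral bands, endmembers
M = rng.random((L, R))                  # reference endmember matrix (L x R)
psi = rng.uniform(0.8, 1.2, R)          # per-endmember scaling factors
D = 0.01 * rng.standard_normal((L, R))  # additive perturbation term
a = rng.random(R)
a /= a.sum()                            # abundances: nonnegative, sum to one

M_pix = M * psi + D                     # pixel-specific endmembers
x = M_pix @ a                           # observed mixed pixel (noise-free)
print(x.shape)                          # (50,)
```

Under this sketch, recovering `psi`, `D`, and `a` jointly from observed pixels is the inverse problem the proposed PPM-Net addresses with a learned latent representation.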

Original language: English
Article number: 5501405
Pages (from-to): 1-5
Number of pages: 5
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 21
DOIs
State: Published - 2024

Keywords

  • Deep learning (DL)
  • endmember variability
  • hyperspectral unmixing (HU)
  • variational inference
