A hyperspectral spatial-spectral enhancement algorithm

Chen Yi, Yongqiang Zhao, Jingxiang Yang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Hyperspectral images with low spatial and spectral resolution degrade the performance of subsequent applications such as classification and object detection. The desired hyperspectral image is assumed to be reconstructed from both high-spatial and high-spectral features, which are typically represented by endmembers and their abundances. In this paper, we propose a hyperspectral spatial and spectral resolution enhancement algorithm based on spectral unmixing and spatial constraints that simultaneously yields a high spatial-spectral resolution result. An intermediate HSI with high spatial but low spectral resolution is introduced to establish a mapping scheme for abundances and endmembers between the low-resolution input and the desired high spatial-spectral resolution result. Experiments on the San Diego dataset show that the proposed method is comparable or superior to other state-of-the-art methods.
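
The fusion scheme the abstract describes can be illustrated with a short sketch. The code below is a minimal, hypothetical example of unmixing-based fusion in the spirit of coupled nonnegative matrix factorization, not the authors' exact algorithm: the function name, the spectral response matrix R, and the spatial downsampling matrix S are all assumptions made for illustration.

```python
# Minimal sketch of unmixing-based spatial-spectral fusion (not the
# paper's exact algorithm); names and operators are assumed for
# illustration only.
import numpy as np

def unmixing_fusion(Y_h, Y_m, R, S, n_end=10, n_iter=200, eps=1e-9):
    """Fuse a low-spatial/high-spectral HSI with a high-spatial/
    low-spectral image via alternating multiplicative updates.

    Y_h : (bands_h, pixels_lr)   low-spatial, high-spectral input
    Y_m : (bands_m, pixels_hr)   high-spatial, low-spectral input
    R   : (bands_m, bands_h)     assumed spectral response operator
    S   : (pixels_hr, pixels_lr) assumed spatial downsampling operator
    """
    rng = np.random.default_rng(0)
    E = rng.random((Y_h.shape[0], n_end))    # endmembers: high spectral res
    A = rng.random((n_end, Y_m.shape[1]))    # abundances: high spatial res
    for _ in range(n_iter):
        # Update the high-resolution abundances against the high-spatial
        # image, with the endmembers degraded to its bands via R.
        Em = R @ E
        A *= (Em.T @ Y_m) / (Em.T @ Em @ A + eps)
        # Update the endmembers against the low-spatial HSI, with the
        # abundances degraded to its spatial grid via S.
        As = A @ S
        E *= (Y_h @ As.T) / (E @ As @ As.T + eps)
    # Desired result: high spatial and high spectral resolution.
    return E @ A

# Toy usage with synthetic shapes: 100 HSI bands, 4 broad bands,
# 16x spatial downsampling via block averaging.
Y_h = np.random.rand(100, 64)
Y_m = np.random.rand(4, 1024)
R = np.random.rand(4, 100)
S = np.kron(np.eye(64), np.ones((16, 1))) / 16
X = unmixing_fusion(Y_h, Y_m, R, S)
print(X.shape)  # (100, 1024)
```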

Original language: English
Title of host publication: 2016 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 7228-7231
Number of pages: 4
ISBN (Electronic): 9781509033324
DOIs
State: Published - 1 Nov 2016
Event: 36th IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2016 - Beijing, China
Duration: 10 Jul 2016 – 15 Jul 2016

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)
Volume: 2016-November

Conference

Conference: 36th IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2016
Country/Territory: China
City: Beijing
Period: 10/07/16 – 15/07/16

Keywords

  • Hyperspectral image
  • spatial-spectral super-resolution
  • unmixing
