Interband prediction compressed sensing reconstruction algorithm for hyperspectral image

Ying Hou, Yanning Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

The block compressed sensing algorithm is simple, effective, and has a low memory burden. Exploiting the strong spatial and spectral correlation of hyperspectral images, this paper proposes an effective block compressed sensing reconstruction algorithm for hyperspectral images based on interband residual prediction. Furthermore, an improved noise variance estimation method and an optimal distribution strategy for spectral sampling rates are presented, derived from comparison and analysis of the experimental results. Experiments demonstrate that the reconstruction performance of the proposed algorithm surpasses that of several state-of-the-art compressed sensing algorithms.
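As a rough illustration of the pipeline the abstract describes (block-wise measurement of each band, followed by reconstruction of interband prediction residuals), below is a minimal Python sketch. It is not the authors' implementation: the block size, sampling rate, Gaussian block measurement matrix, previous-band prediction, and the plain Landweber solver are all illustrative assumptions, and the paper's noise variance estimation and per-band sampling rate allocation steps are omitted.

```python
# Hedged sketch of block compressed sensing with interband residual
# prediction. All parameter choices below are assumptions for
# illustration, not the published algorithm's settings.
import numpy as np

rng = np.random.default_rng(0)

B = 8                                # block size (assumption)
RATE = 0.5                           # per-band sampling rate (assumption)
M = int(RATE * B * B)                # measurements per block
PHI = rng.standard_normal((M, B * B)) / np.sqrt(M)   # block sensing matrix
STEP = 1.0 / np.linalg.norm(PHI, ord=2) ** 2         # safe Landweber step

def measure(band):
    """Sample each non-overlapping BxB block of one band: y = Phi x."""
    H, W = band.shape
    blocks = band.reshape(H // B, B, W // B, B).transpose(0, 2, 1, 3)
    x = blocks.reshape(-1, B * B).T  # columns are vectorized blocks
    return PHI @ x

def reconstruct(y, n_iter=200):
    """Landweber iterations x <- x + step * Phi^T (y - Phi x).
    A simple stand-in for the paper's actual reconstruction."""
    x = PHI.T @ y
    for _ in range(n_iter):
        x += STEP * (PHI.T @ (y - PHI @ x))
    return x

def blocks_to_band(x, H, W):
    """Reassemble vectorized blocks into a full band."""
    blocks = x.T.reshape(H // B, W // B, B, B).transpose(0, 2, 1, 3)
    return blocks.reshape(H, W)

def reconstruct_cube(cube):
    """Reconstruct band 0 directly; recover each later band by
    predicting it from the previous reconstructed band and solving
    only for the residual, using y - Phi * prediction."""
    n_bands, H, W = cube.shape
    recon = np.empty_like(cube, dtype=float)
    recon[0] = blocks_to_band(reconstruct(measure(cube[0])), H, W)
    for k in range(1, n_bands):
        y = measure(cube[k])
        pred = recon[k - 1]              # interband prediction
        y_res = y - measure(pred)        # residual measurements
        recon[k] = pred + blocks_to_band(reconstruct(y_res), H, W)
    return recon

# Toy usage: a smooth synthetic 4-band, 32x32 cube.
cube = np.stack([np.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32)) + 0.1 * k
                 for k in range(4)])
recon = reconstruct_cube(cube)
print("reconstruction RMSE:", np.sqrt(np.mean((recon - cube) ** 2)))
```

Because adjacent bands are highly correlated, the residual has far less energy than the band itself, so recovering the residual at a fixed sampling rate is easier than recovering the band directly; this is the intuition behind interband residual prediction.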

Original language: English
Title of host publication: 4th International Workshop on Earth Observation and Remote Sensing Applications, EORSA 2016 - Proceedings
Editors: Paolo Gamba, George Xian, Shunlin Liang, Qihao Weng, Jing Ming Chen
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 465-469
Number of pages: 5
ISBN (Electronic): 9781509014798
DOIs
State: Published - 25 Aug 2016
Event: 4th International Workshop on Earth Observation and Remote Sensing Applications, EORSA 2016 - Guangzhou, China
Duration: 4 Jul 2016 - 6 Jul 2016

Publication series

Name: 4th International Workshop on Earth Observation and Remote Sensing Applications, EORSA 2016 - Proceedings

Conference

Conference: 4th International Workshop on Earth Observation and Remote Sensing Applications, EORSA 2016
Country/Territory: China
City: Guangzhou
Period: 4/07/16 - 6/07/16

Keywords

  • block compressed sensing
  • hyperspectral image
  • interband prediction
