Hyperspectral Image Classification With CapsNet and Markov Random Fields

Xuefeng Jiang, Yue Zhang, Wenbo Liu, Junyu Gao, Junrui Liu, Yanning Zhang, Jianzhe Lin

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)

Abstract

Hyperspectral image (HSI) classification is one of the most challenging problems in understanding HSI. Convolutional neural networks (CNNs), with their strong ability to extract features through hidden layers, have been introduced to solve this problem. However, several fully connected layers are usually appended at the end of a CNN, which dramatically reduces space efficiency and makes the classification algorithm hard to converge. Recently, a new network architecture called the capsule network (CapsNet) was presented to improve on the CNN. It uses groups of neurons, called capsules, in place of the individual neurons of traditional neural networks. Since capsules can extract superior spectral features and spatial information, CapsNet outperforms state-of-the-art CNNs in some fields. Motivated by this idea, a new remote sensing HSI classification algorithm called Conv-Caps is proposed to make full use of the advantages of both. We integrate spectral and spatial information into the proposed framework and combine Conv-Caps with a Markov random field (MRF), which uses the graph-cut expansion method to solve the classification task; the resulting method is called Caps-MRF. First, an initial feature extractor is selected, which is a CNN without fully connected layers. Then, the initial feature map is fed into the newly designed CapsNet to obtain a probability map. Finally, the MRF model is used to compute the refined labels. The presented method is trained on three real HSI datasets and compared with the latest methods. We find that the framework produces competitive classification performance.
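The three stages in the abstract (capsule activations → per-pixel probability map → MRF label refinement) can be illustrated with a minimal NumPy sketch. All names, shapes, and data here are hypothetical; the `squash` nonlinearity follows the standard CapsNet formulation (capsule length encodes class probability), and a simple iterated majority vote stands in for the paper's graph-cut expansion step, which solves the MRF exactly rather than approximately:

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # CapsNet squash nonlinearity: shrinks short vectors toward 0,
    # long vectors toward unit length, so the norm acts as a probability.
    sq = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * v / np.sqrt(sq + eps)

# Toy capsule output from a hypothetical Conv-Caps stage:
# one 4-D capsule per class per pixel on an 8x8 image, 3 classes.
rng = np.random.default_rng(0)
H, W, C = 8, 8, 3
caps = rng.normal(size=(H, W, C, 4))

# Capsule length after squashing ~ per-pixel class probability map.
probs = np.linalg.norm(squash(caps), axis=-1)   # shape (H, W, C)
labels = probs.argmax(axis=-1)                  # initial per-pixel labels

def mrf_smooth(labels, n_classes, iters=2):
    # Stand-in for MRF refinement: each pixel takes the majority label
    # of its 3x3 neighborhood, encouraging spatially smooth label maps.
    lab = labels.copy()
    for _ in range(iters):
        out = lab.copy()
        for i in range(lab.shape[0]):
            for j in range(lab.shape[1]):
                i0, i1 = max(i - 1, 0), min(i + 2, lab.shape[0])
                j0, j1 = max(j - 1, 0), min(j + 2, lab.shape[1])
                window = lab[i0:i1, j0:j1].ravel()
                out[i, j] = np.bincount(window, minlength=n_classes).argmax()
        lab = out
    return lab

smoothed = mrf_smooth(labels, C)
```

The capsule length is bounded in [0, 1) by construction, which is why it can be read directly as a probability map without a softmax; the actual paper's refinement minimizes an MRF energy with graph cuts instead of the local vote used here.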

Original language: English
Pages (from-to): 191956-191968
Number of pages: 13
Journal: IEEE Access
Volume: 8
DOI
Publication status: Published - 2020
