Change detection of SAR images based on supervised contractive autoencoders and fuzzy clustering

Jie Geng, Hongyu Wang, Jianchao Fan, Xiaorui Ma

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

35 Scopus citations

Abstract

In this paper, supervised contractive autoencoders (SCAEs) combined with fuzzy c-means (FCM) clustering are developed for change detection in synthetic aperture radar (SAR) images, with the aim of exploiting deep neural networks to capture changed features. Given two original SAR images, a Lee filter is applied in preprocessing and the difference image (DI) is obtained by the log-ratio method. FCM is then used to analyse the DI, yielding pseudo-labels that guide the training of the SCAEs. Finally, the SCAEs learn changed features from the bitemporal images and the DI, which produces discriminative features and improves detection accuracy. Experiments on three datasets demonstrate that the proposed method outperforms several related approaches.
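The pseudo-labelling stage of the pipeline (log-ratio difference image followed by FCM clustering) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the function names (`log_ratio_di`, `fcm`), the fuzzifier `m=2.0`, and the use of pixel intensities as 1-D samples are assumptions; the Lee filtering and SCAE training steps are omitted.

```python
import numpy as np

def log_ratio_di(img1, img2, eps=1e-6):
    """Log-ratio difference image of two co-registered SAR intensity images."""
    return np.abs(np.log((img1 + eps) / (img2 + eps)))

def fcm(x, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means on 1-D samples x, returning centers and the
    (c, N) membership matrix. Pseudo-labels are the argmax over memberships."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)   # weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        # standard FCM membership update: u_ij ∝ d_ij^(-2/(m-1))
        dm = d ** (-2.0 / (m - 1.0))
        u_new = dm / dm.sum(axis=0, keepdims=True)
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return centers, u
```

In the paper's setting, the DI pixels would be flattened into `x`, and the argmax of the membership matrix would give the changed/unchanged pseudo-labels used to supervise the SCAEs.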

Original language: English
Title of host publication: RSIP 2017 - International Workshop on Remote Sensing with Intelligent Processing, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538619902
DOIs
State: Published - 23 Jun 2017
Externally published: Yes
Event: 2017 International Workshop on Remote Sensing with Intelligent Processing, RSIP 2017 - Shanghai, China
Duration: 19 May 2017 - 21 May 2017

Publication series

Name: RSIP 2017 - International Workshop on Remote Sensing with Intelligent Processing, Proceedings

Conference

Conference: 2017 International Workshop on Remote Sensing with Intelligent Processing, RSIP 2017
Country/Territory: China
City: Shanghai
Period: 19/05/17 - 21/05/17

Keywords

  • autoencoder
  • Change detection
  • deep neural network
  • synthetic aperture radar (SAR) image
