Research on marine floating raft aquaculture SAR image target recognition based on deep collaborative sparse coding network

Jie Geng, Jian Chao Fan, Jia Lan Chu, Hong Yu Wang

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

Floating raft aquaculture is widely distributed along China's offshore waters. Because raft information cannot be obtained accurately from visible remote sensing images, actively acquired synthetic aperture radar (SAR) images are used instead. However, oceanic SAR images are heavily contaminated by speckle noise and offer few effective features, which makes recognition difficult. To overcome these problems, a deep collaborative sparse coding network (DCSCN) is proposed to extract features and perform recognition automatically. The proposed method first extracts texture and contour features from the pre-processed image. It then segments the image into patches and learns the features of each patch collaboratively through the DCSCN. Finally, the optimized features are used for recognition. Experiments on an artificial SAR image and on images of Beidaihe demonstrate that the proposed DCSCN can accurately delineate floating raft aquaculture areas. Because the network learns discriminative features and integrates correlated neighboring pixels, it improves recognition accuracy and is more robust to speckle noise contamination.
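The abstract does not give implementation details of the DCSCN, but its building block is the sparse auto-encoder applied to image patches (see the Keywords). The following is a minimal illustrative sketch, not the authors' code: a single sparse auto-encoder with a KL-divergence sparsity penalty trained on flattened patches. All hyper-parameters (patch size, hidden size, sparsity target, learning rate) are assumptions for demonstration only.

```python
# Illustrative sketch (assumed details, not the paper's DCSCN): a single sparse
# auto-encoder trained on SAR image patches with a KL-divergence sparsity penalty.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SparseAutoencoder:
    def __init__(self, n_visible, n_hidden, rho=0.05, beta=3.0, lam=1e-4):
        # rho: target mean activation, beta: sparsity weight, lam: weight decay
        self.W1 = rng.normal(0, 0.01, (n_hidden, n_visible))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.01, (n_visible, n_hidden))
        self.b2 = np.zeros(n_visible)
        self.rho, self.beta, self.lam = rho, beta, lam

    def encode(self, X):
        # Hidden-layer activations are the learned patch features
        return sigmoid(X @ self.W1.T + self.b1)

    def step(self, X, lr=0.1):
        m = X.shape[0]
        H = self.encode(X)                        # hidden activations
        Y = sigmoid(H @ self.W2.T + self.b2)      # reconstruction of the patches
        rho_hat = H.mean(axis=0)                  # mean activation per hidden unit

        # Output-layer error for the squared reconstruction loss
        dY = (Y - X) * Y * (1 - Y)
        # Back-propagate through the hidden layer, adding the KL sparsity term
        sparse_grad = self.beta * (-self.rho / rho_hat + (1 - self.rho) / (1 - rho_hat))
        dH = (dY @ self.W2 + sparse_grad) * H * (1 - H)

        # Gradient-descent updates with weight decay
        self.W2 -= lr * (dY.T @ H / m + self.lam * self.W2)
        self.b2 -= lr * dY.mean(axis=0)
        self.W1 -= lr * (dH.T @ X / m + self.lam * self.W1)
        self.b1 -= lr * dH.mean(axis=0)

# Toy usage: 8x8 patches flattened to 64-dim vectors, values scaled to [0, 1].
patches = rng.random((500, 64))
sae = SparseAutoencoder(n_visible=64, n_hidden=32)
for epoch in range(100):
    sae.step(patches)
features = sae.encode(patches)  # sparse patch features, e.g. input to a classifier
```

In the paper's setting, such patch-level features would be learned collaboratively across neighboring patches and combined with the texture and contour features before the final recognition step; the sketch above only shows the sparse-coding component in isolation.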

Original language: English
Pages (from-to): 593-604
Number of pages: 12
Journal: Zidonghua Xuebao/Acta Automatica Sinica
Volume: 42
Issue number: 4
DOIs
State: Published - 1 Apr 2016
Externally published: Yes

Keywords

  • Deep learning
  • Floating raft aquaculture
  • Sparse auto-encoders
  • Synthetic aperture radar (SAR)
  • Target recognition
