Ultrasound image de-speckling by a hybrid deep network with transferred filtering and structural prior

Xiangfei Feng, Qinghua Huang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

Deep neural networks have been widely used in natural image denoising. However, the lack of labeled real ultrasound (US) B-mode images for de-speckling greatly restricts deep neural networks in US image de-speckling. In this paper, we propose to use transfer learning and two types of prior knowledge to construct a hybrid neural network structure for de-speckling. First, under a given US image model, the speckle noise approximately follows a Gaussian distribution in the logarithmic transformation domain, which we call the Gaussian prior knowledge. The distribution parameters are estimated in the logarithmic domain by maximum likelihood, based on the outputs of four typical traditional US image de-speckling methods. Second, using these prior parameters, a transferable denoising network is trained on a clean natural-image dataset. Finally, a VGGNet is used to extract the structural boundaries before and after de-speckling by the transferred network, which we call the structural prior knowledge. The structural boundaries of a US image should remain unchanged after de-speckling, so we use this constraint to fine-tune the transferred network. The proposed de-speckling framework is verified on artificially generated phantom (AGP) images and real US images, and the results demonstrate its effectiveness.
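
To make the Gaussian prior and the maximum-likelihood step concrete, here is a minimal worked sketch assuming the standard multiplicative speckle model common in the US de-speckling literature; the paper's exact image model may differ, and the symbols \(\hat{f}\), \(r_i\), and \(N\) are illustrative notation:

```latex
% Multiplicative speckle model: g is the observed B-mode image,
% f the noise-free image, \eta the speckle component.
g(x, y) = f(x, y)\,\eta(x, y)

% The logarithmic transformation makes the noise additive, and
% n = \log\eta is approximately Gaussian -- the Gaussian prior:
\log g(x, y) = \log f(x, y) + n(x, y), \qquad n \sim \mathcal{N}(\mu, \sigma^{2})

% With \hat{f} a reference produced by a traditional de-speckling
% filter, the log-domain residuals r_i = \log g_i - \log \hat{f}_i
% over the N pixels give the maximum-likelihood estimates:
\hat{\mu} = \frac{1}{N}\sum_{i=1}^{N} r_i, \qquad
\hat{\sigma}^{2} = \frac{1}{N}\sum_{i=1}^{N} \left(r_i - \hat{\mu}\right)^{2}
```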
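
The structural-prior constraint can likewise be sketched in PyTorch, assuming a pretrained VGG16 from torchvision as the boundary feature extractor. The layer cut-off, the single-channel-to-RGB handling, and the loss form are illustrative assumptions, since the abstract does not specify the paper's VGGNet configuration or its full fine-tuning objective:

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights

# Frozen pretrained VGG16 used as a structure/boundary feature extractor.
# Cutting at layer 16 (up to relu3_3) is an illustrative choice.
vgg = vgg16(weights=VGG16_Weights.DEFAULT).features[:16].eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def structural_prior_loss(before: torch.Tensor, after: torch.Tensor) -> torch.Tensor:
    """Penalize differences between VGG features of the US image
    before and after de-speckling, so structural boundaries stay unchanged."""
    # VGG expects 3-channel input; replicate the single-channel US image.
    f_before = vgg(before.repeat(1, 3, 1, 1))
    f_after = vgg(after.repeat(1, 3, 1, 1))
    return F.mse_loss(f_after, f_before)

# Usage while fine-tuning the transferred network (one term of the overall
# objective; the abstract does not spell out the complete loss):
#   out = transferred_net(noisy_log_image)
#   loss = ... + lam * structural_prior_loss(noisy_log_image, out)
```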

Original language: English
Pages (from-to): 346-355
Number of pages: 10
Journal: Neurocomputing
Volume: 414
State: Published - 13 Nov 2020

Keywords

  • Gaussian distribution prior
  • Hybrid neural network
  • Structural prior
  • Transfer learning
  • US image de-speckling
