Learning shape statistics for hierarchical 3D medical image segmentation

Wuxia Zhang, Yuan Yuan, Xuelong Li, Pingkun Yan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Accurate image segmentation is important for many medical imaging applications, but it remains challenging due to the complexity of medical images, such as intricate organ shapes and varied neighboring structures. This paper proposes a new hierarchical 3D image segmentation method based on a patient-specific shape prior and a surface patch shape statistics (SURPASS) model. For the segmentation process, a coarse-to-fine, two-stage strategy is designed, consisting of a global segmentation stage followed by a local refinement stage. In the global stage, a patient-specific shape prior is estimated using manifold learning techniques to achieve an overall segmentation. In the second stage, the SURPASS model is computed to correct poor segmentation at certain surface patches. The effectiveness of the proposed 3D image segmentation method has been demonstrated by experiments on segmenting the prostate from a series of MR images.
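The abstract describes the pipeline only at a high level. The sketch below is a minimal illustration of the coarse-to-fine idea under stated assumptions, not the authors' implementation: the shape representation (corresponding surface points flattened to vectors), the choice of Isomap as the manifold learning method, the fixed patch partition, the per-patch PCA statistics standing in for SURPASS, and all sizes and parameters are assumptions for illustration.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.decomposition import PCA

# Assumed training data: N shapes, each a set of P corresponding surface
# points in 3D, flattened to vectors laid out as (x0, y0, z0, x1, ...).
rng = np.random.default_rng(0)
N, P = 40, 500
train_shapes = rng.normal(size=(N, 3 * P))

# --- Stage 1: global segmentation with a patient-specific shape prior ---
# Embed the training shapes on a low-dimensional manifold. Isomap is an
# illustrative stand-in; the paper only says manifold learning is used.
embedding = Isomap(n_neighbors=8, n_components=2)
coords = embedding.fit_transform(train_shapes)

def patient_specific_prior(initial_shape, k=5):
    """Estimate a prior by averaging the k training shapes whose manifold
    coordinates lie closest to the embedded initial segmentation."""
    q = embedding.transform(initial_shape.reshape(1, -1))
    nearest = np.argsort(np.linalg.norm(coords - q, axis=1))[:k]
    return train_shapes[nearest].mean(axis=0)

# --- Stage 2: local refinement with surface patch shape statistics ---
# Build one PCA model per surface patch; patches are assumed here to be
# fixed, disjoint groups of point indices.
n_patches = 10
patch_models = []
for ids in np.array_split(np.arange(P), n_patches):
    # Column indices of the x, y, z coordinates of this patch's points.
    cols = np.stack([3 * ids, 3 * ids + 1, 3 * ids + 2], axis=1).ravel()
    patch_models.append((cols, PCA(n_components=3).fit(train_shapes[:, cols])))

def refine(shape):
    """Project each patch of a coarse segmentation onto its local PCA
    subspace, constraining the surface to statistically plausible shapes."""
    out = shape.copy()
    for cols, pca in patch_models:
        out[cols] = pca.inverse_transform(pca.transform(shape[cols][None]))[0]
    return out

coarse = patient_specific_prior(train_shapes[0])  # stage 1: global estimate
fine = refine(coarse)                             # stage 2: patch refinement
```

In this reading, the global stage constrains the whole surface with shapes similar to the patient's, while the local stage independently regularizes each patch, which is why poorly segmented patches can be corrected without disturbing the rest of the surface.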

Original language: English
Title of host publication: ICIP 2011
Subtitle of host publication: 2011 18th IEEE International Conference on Image Processing
Pages: 2189-2192
Number of pages: 4
DOIs
State: Published - 2011
Externally published: Yes
Event: 2011 18th IEEE International Conference on Image Processing, ICIP 2011 - Brussels, Belgium
Duration: 11 Sep 2011 - 14 Sep 2011

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 2011 18th IEEE International Conference on Image Processing, ICIP 2011
Country/Territory: Belgium
City: Brussels
Period: 11/09/11 - 14/09/11

Keywords

  • 3D image segmentation
  • manifold learning
  • shape modeling
  • surface patch shape statistics
