CEFM: A heuristic mesh segmentation method based on convexity estimation and fast marching

Jun Zhang, Zhouhui Lian, Zhenbao Liu, Jianguo Xiao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Mesh segmentation is a fundamental approach to shape analysis and understanding of 3D mesh models. In this paper, we propose an effective heuristic mesh segmentation algorithm based on concave-area detection and heuristic two-category classification via fast marching. The algorithm has several merits. First, the boundary between each pair of segments is close to the natural seams of 3D objects. Second, it is robust against pose variations and isometric transformations. Finally, our algorithm decomposes non-rigid 3D models into a set of rigid components in a short time, and the procedure is fully automatic. Extensive experiments demonstrate that the proposed method outperforms state-of-the-art mesh segmentation approaches.
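
The abstract describes the method only at a high level, so the following is a minimal Python sketch of that general idea: detect concave areas from how adjacent faces bend relative to each other, flood-fill the remaining convex patches, and then assign concave-area faces to the nearest patch with a Dijkstra-style front propagation, used here as a simple discrete stand-in for fast marching. The function names, the concavity threshold, and the distance metric are illustrative assumptions and do not reproduce the authors' CEFM implementation or its convexity and local-depth descriptors.

import heapq
import numpy as np

def face_normals(vertices, faces):
    """Unit normal of each triangular face."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    n = np.cross(v1 - v0, v2 - v0)
    return n / np.maximum(np.linalg.norm(n, axis=1, keepdims=True), 1e-12)

def face_adjacency(faces):
    """Pairs of face indices that share an edge."""
    edge_owner, pairs = {}, []
    for fi, tri in enumerate(faces):
        for a, b in ((0, 1), (1, 2), (2, 0)):
            key = (min(tri[a], tri[b]), max(tri[a], tri[b]))
            if key in edge_owner:
                pairs.append((edge_owner[key], fi))
            else:
                edge_owner[key] = fi
    return pairs

def segment(vertices, faces, concave_thresh=0.05):
    """Per-face segment labels (illustrative sketch, not the CEFM pipeline)."""
    vertices = np.asarray(vertices, dtype=float)
    faces = np.asarray(faces, dtype=int)
    normals = face_normals(vertices, faces)
    centers = vertices[faces].mean(axis=1)
    n_faces = len(faces)
    neighbors = [[] for _ in range(n_faces)]
    in_concave_area = np.zeros(n_faces, dtype=bool)
    for f0, f1 in face_adjacency(faces):
        step = float(np.linalg.norm(centers[f1] - centers[f0]))
        neighbors[f0].append((f1, step))
        neighbors[f1].append((f0, step))
        # Concave-edge heuristic (an assumption, not the paper's descriptor):
        # the neighbouring face centre lies above this face's tangent plane.
        if np.dot(normals[f0], centers[f1] - centers[f0]) > concave_thresh * step:
            in_concave_area[f0] = in_concave_area[f1] = True
    # Step 1: flood-fill convex patches, i.e. connected components of faces
    # that do not touch a concave edge; each patch gets its own label.
    labels = np.full(n_faces, -1)
    next_label = 0
    for seed in range(n_faces):
        if in_concave_area[seed] or labels[seed] != -1:
            continue
        labels[seed] = next_label
        stack = [seed]
        while stack:
            f = stack.pop()
            for g, _ in neighbors[f]:
                if not in_concave_area[g] and labels[g] == -1:
                    labels[g] = next_label
                    stack.append(g)
        next_label += 1
    # Step 2: propagate labels across concave areas with a Dijkstra-style
    # front (a crude discrete stand-in for fast marching); each unlabelled
    # face joins the patch whose front reaches it first.
    dist = np.where(labels >= 0, 0.0, np.inf)
    heap = [(0.0, f) for f in range(n_faces) if labels[f] >= 0]
    heapq.heapify(heap)
    while heap:
        d, f = heapq.heappop(heap)
        if d > dist[f]:
            continue
        for g, step in neighbors[f]:
            if d + step < dist[g]:
                dist[g] = d + step
                labels[g] = labels[f]
                heapq.heappush(heap, (d + step, g))
    return labels

As a quick sanity check, calling segment on the 12 triangles of a convex cube yields a single label for every face, since no edge is concave; meshes with pronounced concave seams split into one label per roughly convex part.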

Original language: English
Title of host publication: GRAPP 2015 - 10th International Conference on Computer Graphics Theory and Applications; VISIGRAPP, Proceedings
Editors: Jose Braz, Julien Pettre, Paul Richard
Publisher: SciTePress
Pages: 114-121
Number of pages: 8
ISBN (Electronic): 9789897580871
DOIs
State: Published - 2015
Event: 10th International Conference on Computer Graphics Theory and Applications, GRAPP 2015 - Berlin, Germany
Duration: 11 Mar 2015 - 14 Mar 2015

Publication series

Name: GRAPP 2015 - 10th International Conference on Computer Graphics Theory and Applications; VISIGRAPP, Proceedings

Conference

Conference: 10th International Conference on Computer Graphics Theory and Applications, GRAPP 2015
Country/Territory: Germany
City: Berlin
Period: 11/03/15 - 14/03/15

Keywords

  • Convexity
  • Fast marching
  • Local depth
  • Mesh segmentation
  • Shape descriptor
