Instance Mask Growing on Leaf

Research output: Contribution to conference › Paper › peer-review


Abstract

Contour-based instance segmentation methods represent masks as a series of points. However, the number of points is fixed once the model is trained, which limits the model's flexibility in handling diverse instances. We address this issue and present an idea for dynamically predicting an appropriate number of points according to instance shapes. Concretely, we observe that a leaf locates its coarse margins via major veins and grows minor veins to refine twisty parts, which allows it to cover any mask accurately. Meanwhile, major and minor veins share the same growth mode, which makes it possible to generate minor veins dynamically according to the trained major-vein mode. Building on these advantages, we propose VeinMask, which formulates instance segmentation as a simulation of the vein growth process and predicts major and minor veins in polar coordinates. Besides, centroidness is introduced for instance segmentation tasks to help suppress low-quality instances. Furthermore, a surroundings cross-correlation sensitive (SCCS) module is designed to enhance feature expression by exploiting the surroundings of each pixel. Additionally, a Residual IoU (RIoU) loss is formulated to effectively supervise the regression of major and minor veins. Experiments demonstrate the effectiveness of VeinMask; in particular, our method outperforms existing one-stage contour-based methods on the COCO dataset with almost half the number of trained points.
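The abstract describes predicting contour points in polar coordinates, i.e. a distance along each of several rays cast from an instance center. The snippet below is a minimal, generic sketch of that decoding step (evenly spaced ray angles are assumed for illustration); the function name and interface are hypothetical and do not come from the paper.

```python
import math

def decode_polar_contour(center, distances):
    """Convert per-ray distances (a polar mask representation) into
    2-D contour points around a center.

    Hypothetical helper: contour-based methods predict a distance
    along N rays from a center point, and this sketch assumes the
    rays are evenly spaced over the full circle.
    """
    cx, cy = center
    n = len(distances)
    points = []
    for i, d in enumerate(distances):
        theta = 2.0 * math.pi * i / n  # angle of the i-th ray
        points.append((cx + d * math.cos(theta),
                       cy + d * math.sin(theta)))
    return points
```

Under this formulation, a finer mask simply means more rays, which is why a method that can grow extra "minor veins" at inference time gains flexibility over a fixed point count.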

Original language: English
State: Published - 2023
Event: 34th British Machine Vision Conference, BMVC 2023 - Aberdeen, United Kingdom
Duration: 20 Nov 2023 – 24 Nov 2023


