TY - JOUR
T1 - Intra- and Inter-Pair Consistency for Semi-Supervised Gland Segmentation
AU - Xie, Yutong
AU - Zhang, Jianpeng
AU - Liao, Zhibin
AU - Verjans, Johan
AU - Shen, Chunhua
AU - Xia, Yong
N1 - Publisher Copyright:
© 1992-2012 IEEE.
PY - 2022
Y1 - 2022
N2 - Accurate gland segmentation in histology tissue images is a critical but challenging task. Although deep models have demonstrated superior performance in medical image segmentation, they commonly require a large amount of annotated data, which are hard to obtain due to the extensive labor costs and expertise required. In this paper, we propose an intra- and inter-pair consistency-based semi-supervised (I2CS) model that can be trained on both labeled and unlabeled histology images for gland segmentation. Considering that each image contains glands and hence different images could potentially share consistent semantics in the feature space, we introduce a novel intra- and inter-pair consistency module to explore such consistency for learning with unlabeled data. It first characterizes the pixel-level relation between a pair of images in the feature space to create an attention map that highlights the regions with the same semantics but on different images. Then, it imposes a consistency constraint on the attention maps obtained from multiple image pairs, and thus filters low-confidence attention regions to generate refined attention maps that are then merged with original features to improve their representation ability. In addition, we also design an object-level loss to address the issues caused by touching glands. We evaluated our model against several recent gland segmentation methods and three typical semi-supervised methods on the GlaS and CRAG datasets. Our results not only demonstrate the effectiveness of the proposed dual consistency module and Obj-Dice loss, but also indicate that the proposed I2CS model achieves state-of-the-art gland segmentation performance on both benchmarks.
AB - Accurate gland segmentation in histology tissue images is a critical but challenging task. Although deep models have demonstrated superior performance in medical image segmentation, they commonly require a large amount of annotated data, which are hard to obtain due to the extensive labor costs and expertise required. In this paper, we propose an intra- and inter-pair consistency-based semi-supervised (I2CS) model that can be trained on both labeled and unlabeled histology images for gland segmentation. Considering that each image contains glands and hence different images could potentially share consistent semantics in the feature space, we introduce a novel intra- and inter-pair consistency module to explore such consistency for learning with unlabeled data. It first characterizes the pixel-level relation between a pair of images in the feature space to create an attention map that highlights the regions with the same semantics but on different images. Then, it imposes a consistency constraint on the attention maps obtained from multiple image pairs, and thus filters low-confidence attention regions to generate refined attention maps that are then merged with original features to improve their representation ability. In addition, we also design an object-level loss to address the issues caused by touching glands. We evaluated our model against several recent gland segmentation methods and three typical semi-supervised methods on the GlaS and CRAG datasets. Our results not only demonstrate the effectiveness of the proposed dual consistency module and Obj-Dice loss, but also indicate that the proposed I2CS model achieves state-of-the-art gland segmentation performance on both benchmarks.
KW - deep convolutional neural network
KW - Gland segmentation
KW - pairwise learning
KW - semi-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=85122074458&partnerID=8YFLogxK
U2 - 10.1109/TIP.2021.3136716
DO - 10.1109/TIP.2021.3136716
M3 - Article
C2 - 34951847
AN - SCOPUS:85122074458
SN - 1057-7149
VL - 31
SP - 894
EP - 905
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
ER -