TY - GEN
T1 - An Internal-External Constrained Distillation Framework for Continual Semantic Segmentation
AU - Yan, Qingsen
AU - Liu, Shengqiang
AU - Zhang, Xing
AU - Zhu, Yu
AU - Sun, Jinqiu
AU - Zhang, Yanning
N1 - Publisher Copyright:
© 2024, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
PY - 2024
Y1 - 2024
N2 - Deep neural networks suffer from a notorious catastrophic forgetting problem when trained on sequential tasks in image semantic segmentation: previously learned knowledge is forgotten due to the plasticity-stability dilemma and the background shift inherent in the segmentation task. Continual Semantic Segmentation (CSS) has emerged to address this challenge. Previous distillation-based methods only consider knowledge from features at the same level and neglect the relationships between different levels. To alleviate this problem, in this paper we propose a mixed distillation framework called Internal-external Constrained Distillation (ICD), which combines multi-information-based internal feature distillation with attention-based external feature distillation. Specifically, we utilize the statistical information of features to perform internal distillation between the old and new models, which effectively avoids interference at the same scale. Furthermore, for the external distillation of features at different scales, we employ multi-scale convolutional attention to capture the relationships among features of different scales and ensure their consistency across old and new tasks. We evaluate our method on standard semantic segmentation datasets, such as Pascal-VOC2012 and ADE20K, and demonstrate significant performance improvements in various scenarios.
AB - Deep neural networks suffer from a notorious catastrophic forgetting problem when trained on sequential tasks in image semantic segmentation: previously learned knowledge is forgotten due to the plasticity-stability dilemma and the background shift inherent in the segmentation task. Continual Semantic Segmentation (CSS) has emerged to address this challenge. Previous distillation-based methods only consider knowledge from features at the same level and neglect the relationships between different levels. To alleviate this problem, in this paper we propose a mixed distillation framework called Internal-external Constrained Distillation (ICD), which combines multi-information-based internal feature distillation with attention-based external feature distillation. Specifically, we utilize the statistical information of features to perform internal distillation between the old and new models, which effectively avoids interference at the same scale. Furthermore, for the external distillation of features at different scales, we employ multi-scale convolutional attention to capture the relationships among features of different scales and ensure their consistency across old and new tasks. We evaluate our method on standard semantic segmentation datasets, such as Pascal-VOC2012 and ADE20K, and demonstrate significant performance improvements in various scenarios.
KW - Continual Learning
KW - Feature Distillation
KW - Semantic Segmentation
UR - http://www.scopus.com/inward/record.url?scp=85180761387&partnerID=8YFLogxK
U2 - 10.1007/978-981-99-8435-0_26
DO - 10.1007/978-981-99-8435-0_26
M3 - Conference contribution
AN - SCOPUS:85180761387
SN - 9789819984343
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 325
EP - 336
BT - Pattern Recognition and Computer Vision - 6th Chinese Conference, PRCV 2023, Proceedings
A2 - Liu, Qingshan
A2 - Wang, Hanzi
A2 - Ji, Rongrong
A2 - Ma, Zhanyu
A2 - Zheng, Weishi
A2 - Zha, Hongbin
A2 - Chen, Xilin
A2 - Wang, Liang
PB - Springer Science and Business Media Deutschland GmbH
T2 - 6th Chinese Conference on Pattern Recognition and Computer Vision, PRCV 2023
Y2 - 13 October 2023 through 15 October 2023
ER -