TY - JOUR
T1 - Federated Cross-Incremental Self-Supervised Learning for Medical Image Segmentation
AU - Zhang, Fan
AU - Liu, Huiying
AU - Cai, Qing
AU - Feng, Chun Mei
AU - Wang, Binglu
AU - Wang, Shanshan
AU - Dong, Junyu
AU - Zhang, David
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - Federated cross learning has shown impressive performance in medical image segmentation. However, it suffers from catastrophic forgetting caused by data heterogeneity across clients, an issue that is particularly pronounced when a pixelwise label deficiency problem arises at the same time. In this article, we propose a novel federated cross-incremental self-supervised learning method, coined FedCSL, which not only enables any client in the federation to learn incrementally yet effectively from other clients without inducing knowledge forgetting or requiring massive labeled samples, but also preserves maximum data privacy. Specifically, to overcome the catastrophic forgetting issue, a novel cross-incremental collaborative distillation (CCD) mechanism is proposed, which distills explicit knowledge learned from previous clients to subsequent clients based on secure multiparty computation (MPC). In addition, an effective retrospect mechanism is designed to rearrange the training sequence of the clients in each round, further releasing the power of CCD by enforcing interclient knowledge propagation. Moreover, to alleviate the need for large-scale, densely annotated medical pretraining datasets, we propose a two-stage training framework: a federated cross-incremental self-supervised pretraining paradigm first extracts robust yet general image-level patterns across multi-institutional data silos via a novel round-robin distributed masked image modeling (MIM) pipeline; the resulting visual concepts, e.g., semantics, are then transferred to a federated cross-incremental supervised fine-tuning paradigm, benefiting various cross-silo medical image segmentation tasks. Experimental results on public datasets demonstrate the effectiveness of the proposed method and its consistently superior quantitative and qualitative performance over most state-of-the-art methods.
KW - Catastrophic forgetting
KW - cross-incremental collaborative distillation
KW - federated learning (FL)
KW - medical image segmentation
KW - retrospect mechanism
UR - http://www.scopus.com/inward/record.url?scp=85207712616&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2024.3469962
DO - 10.1109/TNNLS.2024.3469962
M3 - Article
AN - SCOPUS:85207712616
SN - 2162-237X
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
ER -