TY - JOUR
T1 - Efficient Statistical Sampling Adaptation for Exemplar-Free Class Incremental Learning
AU - Cheng, De
AU - Zhao, Yuxin
AU - Wang, Nannan
AU - Li, Guozhang
AU - Zhang, Dingwen
AU - Gao, Xinbo
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Deep learning systems typically suffer from catastrophic forgetting of old knowledge when learning continually from new data. Recently, various class incremental learning (CIL) methods have been proposed to address this issue, and some achieve promising performance by rehearsing the training data of previous tasks. However, storing data from previous tasks raises data privacy and memory issues in real-world applications. In this paper, we propose a statistical sampling adaptation method for efficient Exemplar-Free Class-Incremental Learning (EFCIL). Instead of preserving the images or features of previous tasks/classes themselves, we store image feature statistics of previous classes to maintain the decision boundary, which is both memory-efficient and semantically representative. To utilize the old-class feature statistics, we build a statistical feature adaptation network (SFAN) with manifold consistency regularization and train it in a transductive learning paradigm, mapping the outdated statistics onto the current feature space to facilitate subsequent compatible and balanced classifier training. In this way, the final classifier can be jointly optimized on the old-class features projected by SFAN together with the current new-class features, thus alleviating the classification bias problem in EFCIL. Experimental results demonstrate the effectiveness of the proposed method, which achieves superior performance over state-of-the-art approaches. Our source code is released at https://github.com/yxzhcv/ESSA-EFCIL.
AB - Deep learning systems typically suffer from catastrophic forgetting of old knowledge when learning continually from new data. Recently, various class incremental learning (CIL) methods have been proposed to address this issue, and some achieve promising performance by rehearsing the training data of previous tasks. However, storing data from previous tasks raises data privacy and memory issues in real-world applications. In this paper, we propose a statistical sampling adaptation method for efficient Exemplar-Free Class-Incremental Learning (EFCIL). Instead of preserving the images or features of previous tasks/classes themselves, we store image feature statistics of previous classes to maintain the decision boundary, which is both memory-efficient and semantically representative. To utilize the old-class feature statistics, we build a statistical feature adaptation network (SFAN) with manifold consistency regularization and train it in a transductive learning paradigm, mapping the outdated statistics onto the current feature space to facilitate subsequent compatible and balanced classifier training. In this way, the final classifier can be jointly optimized on the old-class features projected by SFAN together with the current new-class features, thus alleviating the classification bias problem in EFCIL. Experimental results demonstrate the effectiveness of the proposed method, which achieves superior performance over state-of-the-art approaches. Our source code is released at https://github.com/yxzhcv/ESSA-EFCIL.
KW - adaptation
KW - catastrophic forgetting
KW - class incremental learning
KW - Exemplar-free
KW - feature statistics
UR - http://www.scopus.com/inward/record.url?scp=85197494390&partnerID=8YFLogxK
U2 - 10.1109/TCSVT.2024.3421587
DO - 10.1109/TCSVT.2024.3421587
M3 - Article
AN - SCOPUS:85197494390
SN - 1051-8215
VL - 34
SP - 11451
EP - 11463
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 11
ER -