TY - JOUR
T1 - A masking, linkage and guidance framework for online class incremental learning
AU - Liang, Guoqiang
AU - Chen, Zhaojie
AU - Su, Shibin
AU - Zhang, Shizhou
AU - Zhang, Yanning
N1 - Publisher Copyright:
© 2024 Elsevier Ltd
PY - 2025/4
Y1 - 2025/4
N2 - Due to its ability to acquire new knowledge while preserving previously learned concepts from a dynamic data stream, continual learning has recently garnered substantial interest. Since each training sample can be used only once, online class incremental learning (OCIL) is more practical yet more difficult. Although replay-based OCIL methods have made great progress, a severe class imbalance problem remains. Specifically, limited by the small memory size, the number of samples for new classes is much larger than that for old classes, which ultimately leads to task recency bias and abrupt feature drift. To alleviate this problem, we propose a masking, linkage, and guidance framework (MLG) for OCIL, which consists of three effective modules, i.e., a batch-level logit mask (BLM, masking), batch-level feature cross fusion (BFCF, linkage), and accumulative mean feature distillation (AMFD, guidance). The former two address the class imbalance problem, while the last aims to alleviate abrupt feature drift. In BLM, we activate only the logits of classes occurring in a batch, which makes the model learn knowledge within each batch. The BFCF module employs a transformer encoder layer to fuse the sample features within a batch, which rebalances the gradients of the classifier's weights and implicitly learns relationships among samples. Instead of the strict regularization used in traditional feature distillation, the proposed AMFD guides previously learned features to move purposefully, which reduces abrupt feature drift and produces a clearer boundary in feature space. Extensive experiments on four popular OCIL datasets demonstrate the effectiveness of the proposed MLG framework.
AB - Due to its ability to acquire new knowledge while preserving previously learned concepts from a dynamic data stream, continual learning has recently garnered substantial interest. Since each training sample can be used only once, online class incremental learning (OCIL) is more practical yet more difficult. Although replay-based OCIL methods have made great progress, a severe class imbalance problem remains. Specifically, limited by the small memory size, the number of samples for new classes is much larger than that for old classes, which ultimately leads to task recency bias and abrupt feature drift. To alleviate this problem, we propose a masking, linkage, and guidance framework (MLG) for OCIL, which consists of three effective modules, i.e., a batch-level logit mask (BLM, masking), batch-level feature cross fusion (BFCF, linkage), and accumulative mean feature distillation (AMFD, guidance). The former two address the class imbalance problem, while the last aims to alleviate abrupt feature drift. In BLM, we activate only the logits of classes occurring in a batch, which makes the model learn knowledge within each batch. The BFCF module employs a transformer encoder layer to fuse the sample features within a batch, which rebalances the gradients of the classifier's weights and implicitly learns relationships among samples. Instead of the strict regularization used in traditional feature distillation, the proposed AMFD guides previously learned features to move purposefully, which reduces abrupt feature drift and produces a clearer boundary in feature space. Extensive experiments on four popular OCIL datasets demonstrate the effectiveness of the proposed MLG framework.
KW - Class incremental learning
KW - Feature distillation
KW - Logit mask
UR - http://www.scopus.com/inward/record.url?scp=85209549358&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2024.111185
DO - 10.1016/j.patcog.2024.111185
M3 - Article
AN - SCOPUS:85209549358
SN - 0031-3203
VL - 160
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 111185
ER -