A masking, linkage and guidance framework for online class incremental learning

Guoqiang Liang, Zhaojie Chen, Shibin Su, Shizhou Zhang, Yanning Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

Due to its powerful ability to acquire new knowledge while preserving previously learned concepts from a dynamic data stream, continual learning has recently garnered substantial interest. Since training data can only be used once, online class incremental learning (OCIL) is more practical and more difficult. Although replay-based OCIL methods have made great progress, they still suffer from a severe class imbalance problem. Specifically, limited by the small memory size, the number of samples for new classes is much larger than that for old classes, which ultimately leads to task recency bias and abrupt feature drift. To alleviate this problem, we propose a masking, linkage, and guidance framework (MLG) for OCIL, which consists of three effective modules, i.e., a batch-level logit mask (BLM, masking), batch-level feature cross fusion (BFCF, linkage), and accumulative mean feature distillation (AMFD, guidance). The former two focus on the class imbalance problem, while the last aims to alleviate abrupt feature drift. In BLM, we only activate the logits of classes occurring in a batch, which makes the model learn knowledge within each batch. The BFCF module employs a transformer encoder layer to fuse the sample features within a batch, which rebalances the gradients of the classifier's weights and implicitly learns the relationships among samples. Instead of the strict regularization used in traditional feature distillation, the proposed AMFD guides previously learned features to move on purpose, which reduces abrupt feature drift and produces a clearer boundary in the feature space. Extensive experiments on four popular OCIL datasets demonstrate the effectiveness of the proposed MLG framework.
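
To make the masking idea concrete, the following is a minimal PyTorch-style sketch of a batch-level logit mask as described in the abstract; it is not the authors' implementation, and all function and variable names are hypothetical. It assumes a model that outputs logits over all classes seen so far and suppresses, before the cross-entropy loss, every class that does not occur in the current batch.

```python
import torch
import torch.nn.functional as F


def batch_level_logit_mask_loss(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Cross-entropy restricted to the classes present in the current batch.

    A sketch of a batch-level logit mask (BLM): logits of classes that do not
    occur in this batch are suppressed, so the update only uses knowledge
    available within the batch.
    """
    # Classes that actually occur in this batch.
    present = torch.unique(labels)

    # Boolean mask over the full class dimension: True for present classes.
    mask = torch.zeros(logits.size(1), dtype=torch.bool, device=logits.device)
    mask[present] = True

    # Suppress absent classes with a large negative value so they contribute
    # (almost) nothing to the softmax and receive no meaningful gradient.
    masked_logits = logits.masked_fill(~mask, -1e9)

    return F.cross_entropy(masked_logits, labels)


# Example usage with hypothetical shapes: a batch of 8 samples, 100 seen classes.
logits = torch.randn(8, 100)
labels = torch.randint(0, 10, (8,))
loss = batch_level_logit_mask_loss(logits, labels)
```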

Original language: English
Article number: 111185
Journal: Pattern Recognition
Volume: 160
DOI
Publication status: Published - Apr 2025
