CLIP-guided continual novel class discovery

Qingsen Yan, Yiting Yang, Yutong Dai, Xing Zhang, Katarzyna Wiltos, Marcin Woźniak, Wei Dong, Yanning Zhang

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Continual Novel Class Discovery (CNCD) aims to adapt a trained classification model to a new task while maintaining its performance on the old task. This presents two main challenges: (1) unsupervised learning of new tasks and (2) avoiding forgetting old classes when data from previous tasks is unavailable. Some prior works use task IDs to identify old and novel classes for parameter isolation, while others waive the requirement of task IDs by combining novel class discovery and old knowledge preservation into a single training process. However, this often leads to interference in the feature space and makes it difficult to balance old and new knowledge. This work proposes a method that does not require task IDs and argues that decoupling the training process is beneficial. We find that a simple semi-supervised learning strategy with prototype adaptation can transfer the strong generalization ability of the CLIP model to a small CNCD model for novel class discovery. However, this operation may deteriorate the performance on old classes. To address this issue, CutMix is utilized to improve the network's representation and preserve old knowledge. Compared to the baseline method, our method not only surpasses it on the novel classes by a significant margin (33.1% on TinyImageNet) but also exhibits more accurate predictions on old classes (2.9% on TinyImageNet). These advantages are further boosted when multiple novel class discovery steps are required (31.2%→56.1% on TinyImageNet in overall performance). Code will be made available.
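The abstract mentions CutMix as the mechanism for preserving old knowledge. As background, a minimal sketch of the standard CutMix augmentation (Yun et al.) is shown below — this is an illustrative implementation in NumPy, not the authors' code; function and parameter names are our own.

```python
import numpy as np

def cutmix(img_a, lab_a, img_b, lab_b, alpha=1.0, rng=None):
    """Standard CutMix: paste a random patch of img_b into img_a and
    mix the (one-hot) labels in proportion to the patch area."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)                 # mixing ratio in (0, 1)
    h, w = img_a.shape[:2]
    # patch side lengths chosen so its area ratio is roughly (1 - lam)
    cut_h = int(h * np.sqrt(1.0 - lam))
    cut_w = int(w * np.sqrt(1.0 - lam))
    cy, cx = rng.integers(h), rng.integers(w)    # random patch center
    top, bot = max(cy - cut_h // 2, 0), min(cy + cut_h // 2, h)
    lef, rig = max(cx - cut_w // 2, 0), min(cx + cut_w // 2, w)
    mixed = img_a.copy()
    mixed[top:bot, lef:rig] = img_b[top:bot, lef:rig]
    # recompute lambda from the actual (clipped) patch area
    lam_adj = 1.0 - (bot - top) * (rig - lef) / (h * w)
    label = lam_adj * lab_a + (1.0 - lam_adj) * lab_b
    return mixed, label
```

In a continual setting, mixing replayed or pseudo-labeled old-class samples with new-class samples in this way regularizes the features shared between tasks.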

Original language: English
Article number: 112920
Journal: Knowledge-Based Systems
Volume: 310
DOI
Publication status: Published - 15 Feb 2025

