Nearest-neighbor class prototype prompt and simulated logits for continual learning

Yue Lu, Jie Tan, Shizhou Zhang, Yinghui Xing, Guoqiang Liang, Yanning Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

Continual learning allows a single model to acquire knowledge from a sequence of tasks within a non-static data stream without succumbing to catastrophic forgetting. Vision transformers, pre-trained on extensive datasets, have recently made prompt-based methods viable as exemplar-free alternatives to methods reliant on rehearsal. Nonetheless, the majority of these methods employ a key–value query system for integrating pertinent prompts, which might result in the keys becoming stuck in local minima. To counter this, we suggest a straightforward nearest-neighbor class prototype search approach for deducing task labels, which improves the alignment with appropriate prompts. Additionally, we boost task label inference accuracy by embedding prompts within the query function itself, thereby enabling better feature extraction from the samples. To further minimize inter-task confusion in cross-task classification, we incorporate simulated logits into the classifier during training. These logits emulate strong responses from other tasks, aiding in the refinement of the classifier's decision boundaries. Our method outperforms many existing prompt-based approaches, setting a new state-of-the-art record on three widely-used class-incremental learning datasets.
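The task-inference step described above can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal NumPy illustration of nearest-neighbor class-prototype task inference, assuming prototypes are per-class mean features and each prototype carries the label of the task it belongs to (all names here are hypothetical).

```python
import numpy as np

def infer_task(query_feat, prototypes, proto_task_ids):
    """Infer a task label by nearest-neighbor search over class prototypes.

    query_feat:     (D,) feature vector of the test sample
    prototypes:     (C, D) per-class mean features accumulated during training
    proto_task_ids: (C,) task label associated with each class prototype
    """
    # Euclidean distance from the query to every class prototype
    dists = np.linalg.norm(prototypes - query_feat, axis=1)
    # The task of the closest prototype is taken as the inferred task,
    # which then selects the matching prompt set
    return int(proto_task_ids[np.argmin(dists)])

# Toy example: two tasks, one prototype each
prototypes = np.array([[0.0, 0.0], [10.0, 10.0]])
proto_task_ids = np.array([0, 1])
task = infer_task(np.array([9.0, 9.0]), prototypes, proto_task_ids)  # → 1
```

The paper's method additionally embeds prompts in the query function that produces `query_feat`, and augments training with simulated logits; neither refinement is shown in this sketch.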

Original language: English
Article number: 111933
Journal: Pattern Recognition
Volume: 170
DOIs
State: Published - Feb 2026

Keywords

  • Class incremental learning
  • Class prototype
  • Continual learning
  • Prompt tuning
  • Simulated logits

