Abstract
Continual learning enables a single model to acquire knowledge from a sequence of tasks in a non-stationary data stream without succumbing to catastrophic forgetting. Vision transformers pre-trained on large-scale datasets have recently made prompt-based methods viable as exemplar-free alternatives to rehearsal-based approaches. However, most of these methods rely on a key–value query mechanism to select relevant prompts, which can cause the learned keys to settle into poor local minima. To address this, we propose a simple nearest-neighbor class-prototype search for inferring task labels, which improves the match between samples and the appropriate prompts. We further improve task-label inference accuracy by embedding prompts within the query function itself, allowing more discriminative features to be extracted from the samples. To reduce inter-task confusion in cross-task classification, we additionally inject simulated logits into the classifier during training; these logits emulate strong responses from other tasks and help refine the classifier's decision boundaries. Our method outperforms many existing prompt-based approaches, setting a new state of the art on three widely used class-incremental learning datasets.
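As a rough illustration of two of the ideas above, the PyTorch sketch below shows (a) a nearest-neighbor class-prototype search over extracted features to infer a task label, and (b) simulated logits appended to a classifier's outputs during training to mimic strong responses from other tasks. This is a minimal sketch under assumed shapes and names; `infer_task`, `add_simulated_logits`, `proto_task_ids`, and the sampling parameters `mu` and `sigma` are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch only; all names and hyperparameters are assumptions,
# not the authors' implementation.
import torch
import torch.nn.functional as F

def infer_task(features: torch.Tensor,
               prototypes: torch.Tensor,
               proto_task_ids: torch.Tensor) -> torch.Tensor:
    """Nearest-neighbor class-prototype search for task-label inference.

    features:       (B, D) query features from the prompt-conditioned encoder
    prototypes:     (P, D) one prototype per class seen so far
    proto_task_ids: (P,)   task id each class prototype belongs to
    """
    # Cosine similarity between each query feature and every class prototype.
    sims = F.cosine_similarity(features.unsqueeze(1),
                               prototypes.unsqueeze(0), dim=-1)  # (B, P)
    nearest = sims.argmax(dim=1)    # index of the closest prototype
    return proto_task_ids[nearest]  # task label of that prototype

def add_simulated_logits(logits: torch.Tensor,
                         num_other_classes: int,
                         mu: float = 5.0, sigma: float = 1.0) -> torch.Tensor:
    """Append sampled logits that emulate confident responses from classes of
    other tasks, encouraging tighter decision boundaries for the current task."""
    fake = mu + sigma * torch.randn(logits.size(0), num_other_classes,
                                    device=logits.device)
    return torch.cat([logits, fake], dim=1)  # (B, C_current + C_other)
```

In such a setup, the inferred task label would select the matching prompt set before classification, and the augmented logits would feed the usual cross-entropy loss during training.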
| Original language | English |
|---|---|
| Article number | 111933 |
| Journal | Pattern Recognition |
| Volume | 170 |
| DOIs | |
| State | Published - Feb 2026 |
Keywords
- Class incremental learning
- Class prototype
- Continual learning
- Prompt tuning
- Simulated logits