Abstract
Traditional machine learning algorithms perform well only when the training and test sets are identically distributed; they cannot incrementally learn new categories or tasks that were absent from the original training set. Continual learning enables models to acquire new knowledge adaptively while preventing the forgetting of old tasks, but existing methods still face challenges in computational cost, storage overhead, and performance stability. Recent advances in pre-trained models have opened new research directions for continual learning that are promising for further performance improvements. This survey summarizes existing pre-training-based continual learning methods. According to their anti-forgetting mechanism, they are categorized into five types: methods based on prompt pools, methods with slow parameter updating, methods based on backbone branch extension, methods based on parameter regularization, and methods based on classifier design. These methods are additionally classified by the number of training phases, the fine-tuning approach, and the use of the language modality. The survey then analyzes the overall challenges facing continual learning, summarizes the applicable scenarios and limitations of the various methods, and outlines the main characteristics and advantages of each. Comprehensive experiments are conducted on multiple benchmarks, followed by in-depth discussion of the performance gaps among the different methods. Finally, the survey discusses research trends in pre-training-based continual learning.
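To make the first category concrete, below is a minimal PyTorch sketch of the key-query prompt selection used by prompt-pool methods such as L2P: a pool of learnable prompts, each paired with a learnable key, from which the top-k prompts most similar to a frozen-backbone feature are selected and prepended to the input sequence. The class name, pool sizes, and the form of the matching loss here are illustrative assumptions for exposition, not the exact formulation of any surveyed method.

```python
import torch
import torch.nn.functional as F

class PromptPool(torch.nn.Module):
    """Sketch of an L2P-style prompt pool: learnable prompts with matching keys.

    Sizes are hypothetical; real methods tune pool_size, prompt_len, and top_k.
    """
    def __init__(self, pool_size=10, prompt_len=5, embed_dim=768, top_k=3):
        super().__init__()
        # Pool of prompt token sequences and one matching key per prompt.
        self.prompts = torch.nn.Parameter(0.02 * torch.randn(pool_size, prompt_len, embed_dim))
        self.keys = torch.nn.Parameter(0.02 * torch.randn(pool_size, embed_dim))
        self.top_k = top_k

    def forward(self, query):
        # query: (B, D) feature from the frozen pre-trained backbone, e.g. the [CLS] token.
        sim = F.cosine_similarity(query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)  # (B, M)
        score, idx = sim.topk(self.top_k, dim=-1)   # top-k most similar keys per sample
        selected = self.prompts[idx]                # (B, k, L, D) via advanced indexing
        B, k, L, D = selected.shape
        # Matching loss pulls the chosen keys toward the query so selection stays stable;
        # the caller prepends the returned prompt tokens to the backbone's input sequence.
        match_loss = (1.0 - score).mean()
        return selected.reshape(B, k * L, D), match_loss

# Usage sketch: the prompts and keys are the only trained parameters; the
# pre-trained backbone stays frozen, which is what limits forgetting.
pool = PromptPool()
q = torch.randn(4, 768)                      # stand-in for frozen-encoder features
prompt_tokens, loss = pool(q)
print(prompt_tokens.shape, loss.item())      # torch.Size([4, 15, 768])
```

The design choice this illustrates is the one the survey's taxonomy highlights: because only the small prompt pool is updated while the pre-trained weights remain fixed, new tasks are absorbed with little interference with old ones, at the cost of the extra key-query selection step at inference time.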
| Translated title of the contribution | Survey of Pre-training-based Continual Learning Methods (Invited) |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 1-17 |
| Number of pages | 17 |
| Journal | Jisuanji Gongcheng/Computer Engineering |
| Volume | 51 |
| Issue number | 10 |
| DOIs | |
| State | Published - 15 Oct 2025 |