基于预训练的持续学习方法综述(特邀)

Translated title of the contribution: Survey of Pre-training-based Continual Learning Methods (Invited)
  • Yue Lu
  • Xiangyu Zhou
  • Shizhou Zhang
  • Guoqiang Liang
  • Yinghui Xing
  • De Cheng
  • Yanning Zhang

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Traditional machine learning algorithms perform well only when the training and testing sets are identically distributed; they cannot learn incrementally from new categories or tasks absent from the original training set. Continual learning enables models to acquire new knowledge adaptively while preventing the forgetting of old tasks, yet existing continual learning methods still face challenges in computation cost, storage overhead, and performance stability. Recent advances in pre-trained models have opened new research directions for continual learning and are promising for further performance improvements. This survey summarizes existing pre-training-based continual learning methods. According to their anti-forgetting mechanism, they are categorized into five types: methods based on prompt pools, methods with slow parameter updating, methods based on backbone branch extension, methods based on parameter regularization, and methods based on classifier design. These methods are further classified by the number of training phases, the fine-tuning approach, and the use of language modalities. The survey then analyzes the overall challenges of continual learning, summarizes the applicable scenarios and limitations of the various methods, and outlines the main characteristics and advantages of each. Comprehensive experiments are conducted on multiple benchmarks, followed by in-depth discussion of the performance gaps among the different methods. Finally, the survey discusses research trends in pre-training-based continual learning.
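To make the first category concrete, prompt-pool methods (e.g., in the style of L2P) keep the pre-trained backbone frozen and maintain a small pool of learnable prompts, each paired with a key; at inference, the prompts whose keys best match the input's backbone feature are prepended to the input tokens. The sketch below illustrates only the key-query selection step; the pool size, dimensions, and top-k value are illustrative assumptions, not values taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

POOL_SIZE, EMBED_DIM, TOP_K = 10, 8, 3  # illustrative hyperparameters

# Learnable prompt keys and prompt vectors (random stand-ins here;
# in practice both are trained while the backbone stays frozen).
prompt_keys = rng.normal(size=(POOL_SIZE, EMBED_DIM))
prompts = rng.normal(size=(POOL_SIZE, EMBED_DIM))

def select_prompts(query):
    """Pick the TOP_K prompts whose keys are most similar (by cosine
    similarity) to the query feature from the frozen backbone."""
    q = query / np.linalg.norm(query)
    k = prompt_keys / np.linalg.norm(prompt_keys, axis=1, keepdims=True)
    sims = k @ q                       # cosine similarity per key
    idx = np.argsort(sims)[-TOP_K:]    # indices of the best-matching keys
    return prompts[idx]                # these get prepended to the tokens

query = rng.normal(size=EMBED_DIM)     # stand-in for a [CLS] feature
selected = select_prompts(query)
assert selected.shape == (TOP_K, EMBED_DIM)
```

Because only the small pool (and typically a classifier head) is updated per task, this design limits both forgetting and the storage overhead noted in the abstract.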

Original language: Chinese (Traditional)
Pages (from-to): 1-17
Number of pages: 17
Journal: Jisuanji Gongcheng/Computer Engineering
Volume: 51
Issue number: 10
DOIs
State: Published - 15 Oct 2025
