TY - GEN
T1 - A Comprehensive Review of Continual Learning with Machine Learning Models
AU - Liu, Shengqiang
AU - Pan, Ting
AU - Wang, Chaoqun
AU - Ma, Xiaowen
AU - Dong, Wei
AU - Hu, Tao
AU - Zhang, Song
AU - Zhang, Yanning
AU - Yan, Qingsen
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - Machine learning models have achieved exceptional performance on a wide array of individual tasks, in some cases surpassing human-level capabilities. Nevertheless, these models face substantial challenges in learning continually from dynamically arriving data across diverse tasks. Continual learning, the ability to steadily acquire new knowledge while retaining past experience over extended periods, is a pivotal capability for machine learning systems. It is hindered, however, by catastrophic forgetting, which stems from inherent constraints of neural networks, particularly the stability-plasticity dilemma. Catastrophic forgetting is the tendency to discard previously acquired knowledge when new tasks or domains are introduced, causing a pronounced deterioration in performance on tasks or domains learned earlier. To counteract catastrophic forgetting, researchers have devised a multitude of continual learning approaches. In this paper, we provide a comprehensive introduction to the fundamentals of continual learning, present the scenarios in which continual learning is applicable, and systematically classify and critically evaluate the methods proposed in previous research.
AB - Machine learning models have achieved exceptional performance on a wide array of individual tasks, in some cases surpassing human-level capabilities. Nevertheless, these models face substantial challenges in learning continually from dynamically arriving data across diverse tasks. Continual learning, the ability to steadily acquire new knowledge while retaining past experience over extended periods, is a pivotal capability for machine learning systems. It is hindered, however, by catastrophic forgetting, which stems from inherent constraints of neural networks, particularly the stability-plasticity dilemma. Catastrophic forgetting is the tendency to discard previously acquired knowledge when new tasks or domains are introduced, causing a pronounced deterioration in performance on tasks or domains learned earlier. To counteract catastrophic forgetting, researchers have devised a multitude of continual learning approaches. In this paper, we provide a comprehensive introduction to the fundamentals of continual learning, present the scenarios in which continual learning is applicable, and systematically classify and critically evaluate the methods proposed in previous research.
KW - Catastrophic Forgetting
KW - Continual Learning
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=85187803723&partnerID=8YFLogxK
U2 - 10.1007/978-981-97-0855-0_47
DO - 10.1007/978-981-97-0855-0_47
M3 - Conference contribution
AN - SCOPUS:85187803723
SN - 9789819708543
T3 - Lecture Notes in Electrical Engineering
SP - 504
EP - 512
BT - Proceedings of International Conference on Image, Vision and Intelligent Systems, ICIVIS 2023
A2 - You, Peng
A2 - Liu, Shuaiqi
A2 - Wang, Jun
PB - Springer Science and Business Media Deutschland GmbH
T2 - International Conference on Image, Vision and Intelligent Systems, ICIVIS 2023
Y2 - 16 August 2023 through 18 August 2023
ER -