TY - GEN
T1 - Recurrent online kernel recursive least square algorithm for nonlinear modeling
AU - Fan, Haijin
AU - Song, Qing
AU - Xu, Zhao
PY - 2012
Y1 - 2012
N2 - In this paper, we propose a recurrent kernel recursive least square (RLS) algorithm for online learning. In classical kernel methods, the number of kernel functions grows as the number of training samples increases, which makes the computational cost of the algorithm very high and restricts it to offline learning. In order to make kernel methods suitable for online learning, where the system is updated whenever a new training sample is obtained, a compact dictionary (support vector set) should be chosen to represent the whole training data, which in turn reduces the number of kernel functions. For this purpose, a sparsification method based on the Hessian matrix of the loss function is applied to continuously examine the importance of each new training sample and to determine whether the dictionary is updated according to this importance measure. We show that the Hessian matrix is equivalent to the correlation matrix of the training samples in the RLS algorithm. This allows the sparsification method to be easily incorporated into the RLS algorithm and further reduces the computational cost. Simulation results show that our algorithm is an effective learning method for online chaotic signal prediction and nonlinear system identification.
AB - In this paper, we propose a recurrent kernel recursive least square (RLS) algorithm for online learning. In classical kernel methods, the number of kernel functions grows as the number of training samples increases, which makes the computational cost of the algorithm very high and restricts it to offline learning. In order to make kernel methods suitable for online learning, where the system is updated whenever a new training sample is obtained, a compact dictionary (support vector set) should be chosen to represent the whole training data, which in turn reduces the number of kernel functions. For this purpose, a sparsification method based on the Hessian matrix of the loss function is applied to continuously examine the importance of each new training sample and to determine whether the dictionary is updated according to this importance measure. We show that the Hessian matrix is equivalent to the correlation matrix of the training samples in the RLS algorithm. This allows the sparsification method to be easily incorporated into the RLS algorithm and further reduces the computational cost. Simulation results show that our algorithm is an effective learning method for online chaotic signal prediction and nonlinear system identification.
UR - http://www.scopus.com/inward/record.url?scp=84872895532&partnerID=8YFLogxK
U2 - 10.1109/IECON.2012.6388534
DO - 10.1109/IECON.2012.6388534
M3 - Conference contribution
AN - SCOPUS:84872895532
SN - 9781467324212
T3 - IECON Proceedings (Industrial Electronics Conference)
SP - 1574
EP - 1579
BT - Proceedings, IECON 2012 - 38th Annual Conference on IEEE Industrial Electronics Society
T2 - 38th Annual Conference on IEEE Industrial Electronics Society, IECON 2012
Y2 - 25 October 2012 through 28 October 2012
ER -