TY - JOUR
T1 - An efficient gradient-based model selection algorithm for multi-output least-squares support vector regression machines
AU - Zhu, Xinqi
AU - Gao, Zhenghong
N1 - Publisher Copyright:
© 2018 Elsevier B.V.
PY - 2018/8/1
Y1 - 2018/8/1
N2 - Multi-output least-squares support vector regression machines (MLS-SVR) were proposed by Xu et al. [29] to handle multi-output regression problems. However, the prohibitive cost of model selection severely hinders MLS-SVR's application. In this paper, an efficient gradient-based model selection algorithm for MLS-SVR is proposed. First, a new training algorithm for MLS-SVR is developed, which allows the solution vector for each output to be obtained independently by dealing with matrices of much lower order. Based on the new training algorithm, a new leave-one-out error estimate is derived through virtual leave-one-out cross-validation. The model selection criterion is based on this leave-one-out error estimate, and its derivatives with respect to the hyper-parameters are also derived analytically. Both the model selection criterion and its partial derivatives can be obtained immediately once a training process has ended. Finally, the hyper-parameters corresponding to the lowest model selection criterion are obtained through a gradient descent method. The effectiveness and generalization performance of the proposed algorithm are validated through experiments on several multi-output datasets. Experimental results show that the proposed algorithm can save computational time dramatically without losing accuracy.
AB - Multi-output least-squares support vector regression machines (MLS-SVR) were proposed by Xu et al. [29] to handle multi-output regression problems. However, the prohibitive cost of model selection severely hinders MLS-SVR's application. In this paper, an efficient gradient-based model selection algorithm for MLS-SVR is proposed. First, a new training algorithm for MLS-SVR is developed, which allows the solution vector for each output to be obtained independently by dealing with matrices of much lower order. Based on the new training algorithm, a new leave-one-out error estimate is derived through virtual leave-one-out cross-validation. The model selection criterion is based on this leave-one-out error estimate, and its derivatives with respect to the hyper-parameters are also derived analytically. Both the model selection criterion and its partial derivatives can be obtained immediately once a training process has ended. Finally, the hyper-parameters corresponding to the lowest model selection criterion are obtained through a gradient descent method. The effectiveness and generalization performance of the proposed algorithm are validated through experiments on several multi-output datasets. Experimental results show that the proposed algorithm can save computational time dramatically without losing accuracy.
KW - Gradient descent optimization
KW - Leave-one-out cross-validation
KW - Model selection
KW - Multi-output regression
KW - Support vector machines
UR - http://www.scopus.com/inward/record.url?scp=85045396571&partnerID=8YFLogxK
U2 - 10.1016/j.patrec.2018.01.023
DO - 10.1016/j.patrec.2018.01.023
M3 - Article
AN - SCOPUS:85045396571
SN - 0167-8655
VL - 111
SP - 16
EP - 22
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
ER -