TY - JOUR
T1 - ℓp-norm localized multiple kernel learning via semi-definite programming
AU - Han, Yina
AU - Yang, Kunde
AU - Liu, Guizhong
PY - 2012
Y1 - 2012
N2 - Our objective is to train SVM-based Localized Multiple Kernel Learning with an arbitrary ℓp-norm constraint using alternating optimization between the standard SVM solvers with the localized combination of base kernels and the associated sample-specific kernel weights. Unfortunately, the latter forms a difficult ℓp-norm constrained quadratic optimization. In this letter, by approximating the ℓp-norm using a Taylor expansion, the problem of updating the localized kernel weights is reformulated as a non-convex quadratically constrained quadratic program, which is then solved via its convex Semi-Definite Programming relaxation. Experiments on ten benchmark machine learning datasets demonstrate the advantages of our approach.
KW - Localized multiple kernel learning
KW - semi-definite programming
KW - support vector machine
UR - http://www.scopus.com/inward/record.url?scp=84865364988&partnerID=8YFLogxK
U2 - 10.1109/LSP.2012.2212431
DO - 10.1109/LSP.2012.2212431
M3 - Article
AN - SCOPUS:84865364988
SN - 1070-9908
VL - 19
SP - 688
EP - 691
JO - IEEE Signal Processing Letters
JF - IEEE Signal Processing Letters
IS - 10
M1 - 6263271
ER -