TY - JOUR
T1 - Localized Multiple Kernel Learning With Dynamical Clustering and Matrix Regularization
AU - Han, Yina
AU - Yang, Kunde
AU - Yang, Yixin
AU - Ma, Yuanliang
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2018/2
Y1 - 2018/2
AB - Localized multiple kernel learning (LMKL) is an attractive strategy for combining multiple heterogeneous features with regard to their discriminative power for each individual sample. However, the learning of numerous local solutions may not scale well even for a moderately sized training set, and the independently learned local models may suffer from overfitting. Hence, in existing local methods, the distributed samples are typically assumed to share the same weights, and various unsupervised clustering methods are applied as preprocessing. In this paper, to enable the learner to discover and benefit from the underlying local coherence and diversity of the samples, we incorporate the clustering procedure into the canonical support vector machine-based LMKL framework. Then, to explore the relatedness among different samples, which has been ignored in a vector ℓp-norm analysis, we organize the cluster-specific kernel weights into a matrix and introduce a matrix-based extension of the ℓp-norm for constraint enforcement. By casting the joint optimization problem as a problem of alternating optimization, we show how the cluster structure is gradually revealed and how the matrix-regularized kernel weights are obtained. A theoretical analysis of such a regularizer is performed using a Rademacher complexity bound, and complementary empirical experiments on real-world data sets demonstrate the effectiveness of our technique.
KW - Dynamical clustering
KW - localized multiple kernel learning (LMKL)
KW - matrix regularization
KW - support vector machine (SVM)
UR - http://www.scopus.com/inward/record.url?scp=85007347263&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2016.2635151
DO - 10.1109/TNNLS.2016.2635151
M3 - Article
C2 - 28029631
AN - SCOPUS:85007347263
SN - 2162-237X
VL - 29
SP - 486
EP - 499
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 2
M1 - 7792117
ER -