TY - JOUR
T1 - Discrete and Parameter-Free Multiple Kernel k-Means
AU - Wang, Rong
AU - Lu, Jitao
AU - Lu, Yihang
AU - Nie, Feiping
AU - Li, Xuelong
N1 - Publisher Copyright:
© 1992-2012 IEEE.
PY - 2022
Y1 - 2022
N2 - Multiple kernel $k$-means (MKKM) and its variants exploit complementary information from multiple sources, achieving better performance than kernel $k$-means (KKM). However, the optimization procedures of most previous works comprise two stages: learning a continuous relaxation matrix and then obtaining the discrete cluster assignment matrix via an extra discretization procedure. Such a two-stage strategy gives rise to a mismatch problem and severe information loss. Worse still, most existing MKKM methods overlook the correlation among the prespecified kernels, so mutually redundant kernels are fused, harming the diversity of information sources and ultimately leading to unsatisfactory results. To address these issues, we propose a novel Discrete and Parameter-free Multiple Kernel $k$-means (DPMKKM) model solved by an alternating optimization method, which directly obtains the cluster assignments without any subsequent discretization procedure. Moreover, DPMKKM measures the correlation among kernels through an implicitly introduced regularization term, which enhances kernel fusion by reducing redundancy and improving diversity. Notably, the time complexity of the optimization algorithm is reduced by exploiting the coordinate descent technique, yielding higher efficiency and broader applicability. Furthermore, the proposed model is parameter-free, avoiding intractable hyperparameter tuning and making it practical for real-world applications. Finally, extensive experiments on a number of real-world datasets demonstrate the effectiveness and superiority of the proposed DPMKKM model.
AB - Multiple kernel $k$-means (MKKM) and its variants exploit complementary information from multiple sources, achieving better performance than kernel $k$-means (KKM). However, the optimization procedures of most previous works comprise two stages: learning a continuous relaxation matrix and then obtaining the discrete cluster assignment matrix via an extra discretization procedure. Such a two-stage strategy gives rise to a mismatch problem and severe information loss. Worse still, most existing MKKM methods overlook the correlation among the prespecified kernels, so mutually redundant kernels are fused, harming the diversity of information sources and ultimately leading to unsatisfactory results. To address these issues, we propose a novel Discrete and Parameter-free Multiple Kernel $k$-means (DPMKKM) model solved by an alternating optimization method, which directly obtains the cluster assignments without any subsequent discretization procedure. Moreover, DPMKKM measures the correlation among kernels through an implicitly introduced regularization term, which enhances kernel fusion by reducing redundancy and improving diversity. Notably, the time complexity of the optimization algorithm is reduced by exploiting the coordinate descent technique, yielding higher efficiency and broader applicability. Furthermore, the proposed model is parameter-free, avoiding intractable hyperparameter tuning and making it practical for real-world applications. Finally, extensive experiments on a number of real-world datasets demonstrate the effectiveness and superiority of the proposed DPMKKM model.
KW - Kernel method
KW - coordinate descent
KW - kernel k-means
KW - multiple kernel clustering
UR - http://www.scopus.com/inward/record.url?scp=85126281432&partnerID=8YFLogxK
U2 - 10.1109/TIP.2022.3141612
DO - 10.1109/TIP.2022.3141612
M3 - Article
C2 - 35263253
AN - SCOPUS:85126281432
SN - 1057-7149
VL - 31
SP - 2796
EP - 2808
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
ER -