Self-Paced and Discrete Multiple Kernel k-Means

Yihang Lu, Xuan Zheng, Jitao Lu, Rong Wang, Feiping Nie, Xuelong Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Multiple Kernel K-means (MKKM) uses various kernels from different sources to improve clustering performance. However, most existing models are non-convex, which makes them prone to getting stuck in poor local optima, especially in the presence of noise and outliers. To address this issue, we propose a novel Self-Paced and Discrete Multiple Kernel K-Means (SPD-MKKM). It learns the MKKM model in a meaningful order, progressing from easy to complex for both samples and kernels, which helps avoid poor local optima. In addition, whereas existing methods optimize in two stages, first learning a relaxed matrix and then obtaining a discrete one through an extra discretization step, our method directly obtains the discrete cluster indicator matrix without such post-processing. Moreover, a carefully designed alternating optimization based on coordinate descent is employed to reduce the overall computational complexity. Finally, extensive experiments on real-world datasets demonstrate the effectiveness and efficiency of our method.
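As a rough illustration of the kernel-combination idea underlying MKKM (not the authors' SPD-MKKM algorithm, which additionally uses self-paced sample/kernel ordering and a discrete indicator matrix), the following Python sketch combines base kernel matrices with uniform weights and runs plain kernel k-means on the result. All function names, the uniform weighting, and the toy data are assumptions made for exposition.

```python
import numpy as np

def combine_kernels(kernels, weights=None):
    """Convex combination of base kernel matrices (uniform weights by default)."""
    kernels = np.asarray(kernels, dtype=float)        # shape (m, n, n)
    if weights is None:
        weights = np.full(len(kernels), 1.0 / len(kernels))
    return np.tensordot(weights, kernels, axes=1)     # weighted sum -> (n, n)

def kernel_kmeans(K, n_clusters, n_iter=100, seed=0):
    """Plain kernel k-means on a precomputed kernel (Gram) matrix K."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(n_clusters, size=n)         # random initial assignment
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.empty((n, n_clusters))
        for c in range(n_clusters):
            idx = np.flatnonzero(labels == c)
            if idx.size == 0:                         # re-seed an empty cluster
                idx = rng.integers(n, size=1)
            # ||phi(x_i) - mu_c||^2 = K_ii - (2/|c|) sum_j K_ij + (1/|c|^2) sum_jl K_jl
            dist[:, c] = (diag
                          - 2.0 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):        # converged
            break
        labels = new_labels
    return labels

if __name__ == "__main__":
    # Toy usage: two base kernels built as Gram matrices of random feature views.
    rng = np.random.default_rng(1)
    X1, X2 = rng.normal(size=(50, 5)), rng.normal(size=(50, 8))
    kernels = [X1 @ X1.T, X2 @ X2.T]
    print(kernel_kmeans(combine_kernels(kernels), n_clusters=3))
```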

Original language: English
Title of host publication: CIKM 2022 - Proceedings of the 31st ACM International Conference on Information and Knowledge Management
Publisher: Association for Computing Machinery
Pages: 4284-4288
Number of pages: 5
ISBN (electronic): 9781450392365
DOI
Publication status: Published - 17 Oct 2022
Event: 31st ACM International Conference on Information and Knowledge Management, CIKM 2022 - Atlanta, United States
Duration: 17 Oct 2022 - 21 Oct 2022

Publication series

Name: International Conference on Information and Knowledge Management, Proceedings

Conference

Conference: 31st ACM International Conference on Information and Knowledge Management, CIKM 2022
Country/Territory: United States
City: Atlanta
Period: 17/10/22 - 21/10/22
