Fast Clustering via Maximizing Adaptively Within-Class Similarity

Jingjing Xue, Feiping Nie, Rong Wang, Liang Zhang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Clustering aims to make data points in the same group more similar and data points in different groups less similar. Motivated by this, we propose three novel fast clustering models based on maximizing within-class similarity, which can capture a more intrinsic clustering structure of the data. Unlike traditional clustering methods, we first divide all n samples into m subclasses with a pseudo-label propagation algorithm, and then merge the m subclasses into c classes (m > c) with the three proposed co-clustering models, where c is the true number of categories. On the one hand, first dividing all samples into more subclasses preserves more local information. On the other hand, the three proposed co-clustering models are driven by maximizing the sum of within-class similarity, which exploits the dual information between rows and columns. In addition, the proposed pseudo-label propagation algorithm offers a new way to construct anchor graphs with linear time complexity. A series of experiments is conducted on both synthetic and real-world datasets, and the results demonstrate the superior performance of the three models. It is worth noting that, among the proposed models, FMAWS2 is a generalization of FMAWS1, and FMAWS3 is a generalization of the other two.
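The two-stage scheme described above (over-segment first, then merge subclasses by within-class similarity) can be illustrated with a minimal sketch. This is not the authors' FMAWS models or their pseudo-label propagation algorithm: as an assumption for illustration, the first stage is replaced by plain Lloyd's k-means, and the merge stage greedily joins the pair of subclasses whose union has the highest mean pairwise cosine similarity until c classes remain.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Stand-in for the paper's pseudo-label propagation step (an
    # assumption for illustration): plain Lloyd's k-means producing
    # m = k subclass labels. Deterministic, evenly spaced initialization.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def merge_to_c(X, labels, c):
    # Second stage: greedily merge the pair of subclasses whose union has
    # the highest mean pairwise cosine similarity, until c classes remain.
    # This greedy merge is a simplification, not the proposed co-clustering.
    groups = {g: list(np.flatnonzero(labels == g)) for g in np.unique(labels)}
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T  # cosine similarity matrix
    while len(groups) > c:
        keys = list(groups)
        best, best_pair = -np.inf, None
        for i in range(len(keys)):
            for j in range(i + 1, len(keys)):
                idx = groups[keys[i]] + groups[keys[j]]
                score = S[np.ix_(idx, idx)].mean()  # within-class similarity of the union
                if score > best:
                    best, best_pair = score, (keys[i], keys[j])
        a, b = best_pair
        groups[a] += groups.pop(b)
    out = np.empty(len(X), dtype=int)
    for lbl, idx in enumerate(groups.values()):
        out[idx] = lbl
    return out

# Usage: two well-separated point clouds, over-segmented into m = 4
# subclasses, then merged back to the true c = 2 classes.
rng = np.random.default_rng(1)
A = rng.normal([5.0, 0.0], 0.3, (20, 2))
B = rng.normal([0.0, 5.0], 0.3, (20, 2))
X = np.vstack([A, B])
final = merge_to_c(X, kmeans(X, 4), 2)
```

Over-segmenting before merging is what preserves local structure: each k-means subclass is locally coherent, and the merge step only has to compare a handful of subclasses rather than all n points.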

Original language: English
Pages (from-to): 9800-9813
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 35
Issue number: 7
DOI
Publication status: Published - 2024
