Hyperspherical Prototype Node Clustering

Jitao Lu, Danyang Wu, Feiping Nie, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

Abstract

The general workflow of deep node clustering is to encode the nodes into node embeddings via graph neural networks and uncover clustering decisions from them, so clustering performance is heavily affected by the embeddings. However, existing works only consider preserving the semantics of the graph but ignore the inter-cluster separability of the nodes, so there’s no guarantee that the embeddings can present a clear clustering structure. To remedy this deficiency, we propose Hyperspherical Prototype Node Clustering (HPNC), an end-to-end clustering paradigm that explicitly enhances the inter-cluster separability of learned node embeddings. Concretely, we constrain the embedding space to a unit-hypersphere, enabling us to scatter the cluster prototypes over the space with maximized pairwise distances. Then, we employ a graph autoencoder to map nodes onto the same hypersphere manifold. Consequently, cluster affinities can be directly retrieved from cosine similarities between node embeddings and prototypes. A clustering-oriented loss is imposed to sharpen the affinity distribution so that the learned node embeddings are encouraged to have small intra-cluster distances and large inter-cluster distances. Based on the proposed HPNC paradigm, we devise two schemes (HPNC-IM and HPNC-DEC) with distinct clustering backbones. Empirical results on popular benchmark datasets demonstrate the superiority of our method compared to other state-of-the-art clustering methods, and visualization results illustrate improved separability of the learned embeddings.
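The following is a minimal illustrative sketch (not the authors' released implementation) of the two ideas the abstract describes: scattering cluster prototypes over a unit hypersphere with maximized pairwise separation, and reading cluster affinities from cosine similarities between normalized node embeddings and the prototypes. It assumes a PyTorch setting; the function names, the min-max separation objective, and the temperature-based sharpening are illustrative stand-ins for the paper's actual formulation.

import torch
import torch.nn.functional as F

def scatter_prototypes(k, dim, steps=1000, lr=0.1):
    """Optimize k unit vectors so that the largest pairwise cosine similarity is minimized."""
    protos = torch.randn(k, dim, requires_grad=True)
    opt = torch.optim.SGD([protos], lr=lr, momentum=0.9)
    for _ in range(steps):
        p = F.normalize(protos, dim=1)              # keep prototypes on the unit hypersphere
        sim = p @ p.t() - 2.0 * torch.eye(k)        # mask out self-similarity on the diagonal
        loss = sim.max(dim=1).values.mean()         # push each prototype away from its nearest neighbor
        opt.zero_grad()
        loss.backward()
        opt.step()
    return F.normalize(protos.detach(), dim=1)

def cluster_affinities(embeddings, prototypes, temperature=0.1):
    """Soft cluster assignments from cosine similarity between node embeddings and fixed prototypes."""
    z = F.normalize(embeddings, dim=1)              # map node embeddings onto the same hypersphere
    logits = z @ prototypes.t() / temperature       # smaller temperature gives a sharper affinity distribution
    return logits.softmax(dim=1)

# Usage: 7 clusters in a 16-dimensional embedding space.
protos = scatter_prototypes(k=7, dim=16)
z = torch.randn(100, 16)                            # stand-in for graph-autoencoder node embeddings
soft_assign = cluster_affinities(z, protos)
labels = soft_assign.argmax(dim=1)

In the paper's paradigm, the clustering-oriented loss would sharpen these soft assignments during training so that embeddings move toward their assigned prototypes; here the sharpening is only suggested by the temperature parameter.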

Original language: English
Journal: Transactions on Machine Learning Research
Volume: 2024
State: Published - 2024
