Toward Balance Deep Semisupervised Clustering

Yu Duan, Zhoumin Lu, Rong Wang, Xuelong Li, Feiping Nie

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

The goal of balanced clustering is to partition data into distinct groups of equal size. Previous studies have attempted to address this problem by designing balanced regularizers or adapting conventional clustering methods. However, these approaches rely solely on classical techniques, which limits their performance, and they focus primarily on low-dimensional data. Although neural networks perform well on high-dimensional datasets, they struggle to leverage prior knowledge for clustering with a balanced tendency. To overcome these limitations, we propose deep semisupervised balanced clustering, which simultaneously learns cluster assignments and generates balance-favorable representations. Our model is based on the autoencoder paradigm and incorporates a semisupervised module. Specifically, we introduce a balance-oriented clustering loss and incorporate pairwise constraints into the penalty term as a pluggable module using the Lagrangian multiplier method. Theoretically, we show that the proposed model maintains a balanced orientation and provide a complete optimization procedure. Empirically, extensive experiments on four datasets demonstrate significant improvements in clustering performance and balance measures. Our code is available at https://github.com/DuannYu/BalancedSemi-TNNLS.
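The abstract does not spell out the form of the balance-oriented clustering loss; as a minimal illustration of the general idea, the sketch below implements one common balance penalty used in deep clustering: the KL divergence between the batch-mean soft cluster assignment and the uniform distribution. The function name and formulation are assumptions for illustration, not the paper's actual loss.

```python
import numpy as np

def balance_penalty(soft_assign, eps=1e-12):
    """KL divergence between the batch-mean cluster-assignment
    distribution and the uniform distribution over clusters.
    Zero when clusters are used equally; positive when skewed.

    soft_assign: (n_samples, n_clusters) array of soft assignments,
                 each row summing to 1.
    """
    p = soft_assign.mean(axis=0)          # average usage of each cluster
    k = p.shape[0]
    u = np.full(k, 1.0 / k)               # uniform target distribution
    return float(np.sum(p * np.log((p + eps) / u)))

# Perfectly balanced assignments: penalty is (numerically) zero.
balanced = np.array([[1., 0.], [0., 1.], [1., 0.], [0., 1.]])
print(balance_penalty(balanced))

# Skewed assignments: penalty is strictly positive.
skewed = np.array([[1., 0.], [1., 0.], [1., 0.], [0., 1.]])
print(balance_penalty(skewed))
```

Added to a reconstruction or clustering objective with a weighting coefficient, such a term steers the optimizer toward equal-size clusters, which is the "balanced tendency" the abstract refers to.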

Original language: English
Pages (from-to): 2816-2828
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 36
Issue number: 2
DOIs
State: Published - 2025

Keywords

  • Balanced clustering
  • Deep clustering
  • Lagrangian multipliers
  • Pairwise information

