Collaborative Global-Local Structure Network With Knowledge Distillation for Imbalanced Data Classification

Feiyan Wu, Zhunga Liu, Zuowei Zhang, Jiaxiang Liu, Longfei Wang

Research output: Contribution to journal › Article › peer-review

Abstract

Multi-expert networks have shown great superiority in imbalanced data classification tasks due to their complementarity and diversity. We have identified two aspects for further exploration: (1) uncontrollable results, arising from the performance differences of individual experts and variations in sample difficulty; (2) insufficient exploration of the internal data structure. These factors lead to inconsistent model performance across different data distributions, thereby impairing the model's generalization ability. To address these issues, we propose a Collaborative Global-Local Structure Network (CGL-Net) with knowledge distillation for imbalanced data classification. First, CGL-Net, as a new framework, decouples the representation learning of imbalanced data into global and local structures, enhancing the controllability of the integrated model in a hierarchical manner. Second, CGL-Net innovatively combines knowledge distillation, data augmentation, and multiple expert networks, efficiently extracting the internal structure of the data and improving robust recognition on imbalanced data. In particular, global structure learning introduces an independent student network that integrates knowledge from diverse experts, enabling the model to achieve comprehensive and balanced performance across categories in imbalanced data. Local structure learning incorporates augmented data, allowing the model to focus on discriminative regional learning of individual objects, thereby enhancing the robust representation of imbalanced data. After completing these two sequential learning stages, the model hierarchically integrates knowledge to achieve robust recognition performance on imbalanced data. Extensive experiments on six benchmark data sets demonstrate that the proposed CGL-Net significantly outperforms recent state-of-the-art methods.
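The abstract's global structure learning stage distills knowledge from multiple diverse experts into a single student network. The sketch below is not the authors' code; it is a minimal, hypothetical illustration of the general multi-teacher distillation idea the abstract invokes: averaging the experts' temperature-softened predictions into one target and measuring the student's divergence from it. The function names, the temperature value, and the toy logits are all assumptions for illustration.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multi_expert_target(expert_logits, T=2.0):
    """Average the experts' softened predictions into one teacher target."""
    probs = [softmax(z, T) for z in expert_logits]
    k = len(probs)
    return [sum(p[i] for p in probs) / k for i in range(len(probs[0]))]

def kl_divergence(p, q):
    """KL(p || q): distillation loss pulling the student q toward target p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy usage: two experts favor opposite classes; the averaged target
# balances them, which is the intended benefit under class imbalance.
experts = [[2.0, 0.5, 0.1], [0.1, 0.5, 2.0]]
target = multi_expert_target(experts)
student = softmax([1.0, 1.0, 1.0])  # untrained student: uniform output
loss = kl_divergence(target, student)
```

In this toy setting the averaged target is symmetric across the two favored classes, so neither expert's bias dominates; the loss is zero only when the student matches the averaged target exactly.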

Original language: English
Pages (from-to): 2450-2460
Number of pages: 11
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 35
Issue number: 3
State: Published - 2025

Keywords

  • Global structure learning
  • imbalanced data classification
  • knowledge distillation
  • local structure learning
