TY - JOUR
T1 - Collaborative Global-Local Structure Network With Knowledge Distillation for Imbalanced Data Classification
AU - Wu, Feiyan
AU - Liu, Zhunga
AU - Zhang, Zuowei
AU - Liu, Jiaxiang
AU - Wang, Longfei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2025
Y1 - 2025
N2 - Multi-expert networks have shown great superiority for imbalanced data classification tasks due to their complementarity and diversity. We have summarized two aspects for further exploration: (1) uncontrollable results, arising from the performance differences of individual experts and variations in sample difficulty; (2) insufficient exploration of the internal data structure. These factors result in inconsistent model performance across different data distributions, thereby impacting the model's generalization ability. To address the above issues, we propose a Collaborative Global-Local Structure Network (CGL-Net) with knowledge distillation for imbalanced data classification. Firstly, CGL-Net, as a new framework, decouples the representation learning of imbalanced data into global and local structures, enhancing the controllability of the integrated model in a hierarchical manner. Secondly, CGL-Net innovatively combines knowledge distillation, data augmentation, and multiple expert networks, efficiently extracting the internal structure of the data and improving robust recognition on imbalanced data. In particular, the global structure learning introduces an independent student network that integrates knowledge from diverse experts, enabling the model to achieve comprehensive and balanced performance across categories in imbalanced data. The local structure learning incorporates augmented data, allowing the model to focus on discriminative regional learning of individual objects, thereby enhancing the robust representation of imbalanced data. After completing these two sequential learning stages, the model hierarchically integrates knowledge to achieve robust recognition performance on imbalanced data. Extensive experiments on six benchmark data sets demonstrate that the proposed CGL-Net significantly outperforms recent state-of-the-art methods.
AB - Multi-expert networks have shown great superiority for imbalanced data classification tasks due to their complementarity and diversity. We have summarized two aspects for further exploration: (1) uncontrollable results, arising from the performance differences of individual experts and variations in sample difficulty; (2) insufficient exploration of the internal data structure. These factors result in inconsistent model performance across different data distributions, thereby impacting the model's generalization ability. To address the above issues, we propose a Collaborative Global-Local Structure Network (CGL-Net) with knowledge distillation for imbalanced data classification. Firstly, CGL-Net, as a new framework, decouples the representation learning of imbalanced data into global and local structures, enhancing the controllability of the integrated model in a hierarchical manner. Secondly, CGL-Net innovatively combines knowledge distillation, data augmentation, and multiple expert networks, efficiently extracting the internal structure of the data and improving robust recognition on imbalanced data. In particular, the global structure learning introduces an independent student network that integrates knowledge from diverse experts, enabling the model to achieve comprehensive and balanced performance across categories in imbalanced data. The local structure learning incorporates augmented data, allowing the model to focus on discriminative regional learning of individual objects, thereby enhancing the robust representation of imbalanced data. After completing these two sequential learning stages, the model hierarchically integrates knowledge to achieve robust recognition performance on imbalanced data. Extensive experiments on six benchmark data sets demonstrate that the proposed CGL-Net significantly outperforms recent state-of-the-art methods.
KW - Global structure learning
KW - imbalanced data classification
KW - knowledge distillation
KW - local structure learning
UR - http://www.scopus.com/inward/record.url?scp=86000805626&partnerID=8YFLogxK
U2 - 10.1109/TCSVT.2024.3487867
DO - 10.1109/TCSVT.2024.3487867
M3 - Article
AN - SCOPUS:86000805626
SN - 1051-8215
VL - 35
SP - 2450
EP - 2460
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 3
ER -