TY - GEN
T1 - HFL
T2 - 2025 IEEE/CIC International Conference on Communications in China, ICCC 2025
AU - Li, Lin
AU - Li, Lixin
AU - Lin, Wensheng
AU - Zhang, Kexin
AU - Han, Zhu
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - In frequency-division duplex (FDD) large-scale multiple-input multiple-output (MIMO) systems, acquiring accurate downlink channel state information (CSI) is crucial for enhancing system performance but faces challenges such as feedback overhead and channel aging. Federated learning (FL) offers a privacy-preserving collaborative CSI prediction approach; however, the inherent heterogeneity of base station environments and antenna configurations in real-world networks impedes the application of FL algorithms. To address this issue, we design a Transformer-based Heterogeneous Federated Learning (HFL) framework. This framework employs heterogeneous neural network architectures in which each client shares a powerful backbone network that learns universal channel dynamics through FL. Additionally, we propose a hierarchical attention-based dynamic weighted federated aggregation mechanism to tackle convergence challenges caused by heterogeneity. We also investigate the impact of the shared backbone network's scale on prediction performance and computational cost, validating its scalability. Simulation results demonstrate that the proposed HFL framework significantly outperforms both local-only training and standard federated averaging baseline methods in terms of CSI prediction accuracy under varying user speeds and signal-to-noise ratios (SNRs), proving the method's effectiveness in handling heterogeneity.
AB - In frequency-division duplex (FDD) large-scale multiple-input multiple-output (MIMO) systems, acquiring accurate downlink channel state information (CSI) is crucial for enhancing system performance but faces challenges such as feedback overhead and channel aging. Federated learning (FL) offers a privacy-preserving collaborative CSI prediction approach; however, the inherent heterogeneity of base station environments and antenna configurations in real-world networks impedes the application of FL algorithms. To address this issue, we design a Transformer-based Heterogeneous Federated Learning (HFL) framework. This framework employs heterogeneous neural network architectures in which each client shares a powerful backbone network that learns universal channel dynamics through FL. Additionally, we propose a hierarchical attention-based dynamic weighted federated aggregation mechanism to tackle convergence challenges caused by heterogeneity. We also investigate the impact of the shared backbone network's scale on prediction performance and computational cost, validating its scalability. Simulation results demonstrate that the proposed HFL framework significantly outperforms both local-only training and standard federated averaging baseline methods in terms of CSI prediction accuracy under varying user speeds and signal-to-noise ratios (SNRs), proving the method's effectiveness in handling heterogeneity.
KW - Attention-based Aggregation
KW - Channel Prediction
KW - Heterogeneous Federated Learning
UR - https://www.scopus.com/pages/publications/105017632102
U2 - 10.1109/ICCC65529.2025.11148877
DO - 10.1109/ICCC65529.2025.11148877
M3 - Conference contribution
AN - SCOPUS:105017632102
T3 - 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025
BT - 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 10 August 2025 through 13 August 2025
ER -