Abstract
Federated learning (FL) enables collaborative model training across distributed clients without sharing raw data, making it a key paradigm for privacy-preserving Internet of Things (IoT) applications. However, FL suffers from client drift caused by data heterogeneity, which degrades global model performance and slows convergence. Existing solutions, such as FedProx, SCAFFOLD, and FedDC, partially mitigate drift, but they either fail to explicitly correct parameter deviations (e.g., FedProx) or require the exchange of auxiliary states (e.g., SCAFFOLD and FedDC), which introduces substantial extra communication overhead. To address these limitations, we propose FedDCA, a drift-corrected FL algorithm that separately maintains local and global drift variables to capture deviations at different levels, and incorporates a global parameter correction term into the local objective function. This design not only enhances consistency between local and global updates but also eliminates the need to transmit additional variables beyond model parameters, thereby reducing communication costs. Extensive experiments on five benchmark datasets (CIFAR-10, CIFAR-100, MNIST, EMNIST-L, and EuroSAT) demonstrate that FedDCA provides consistent performance gains over baseline methods, achieving on average 3% higher accuracy while halving the number of communication rounds under heterogeneous data distributions. Moreover, FedDCA exhibits strong robustness under low client participation rates in large-scale settings, maintaining stable training behavior across a wide range of experimental configurations. This study therefore offers a valuable framework for applying FL in IoT environments, enhancing the model's ability to combat client drift in both theoretical and practical contexts.
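The abstract does not spell out the exact update rule, but the idea of a client maintaining a local drift variable and penalizing deviation from the global parameters can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the quadratic penalty form, the `alpha` weight, and the drift-update rule are all assumptions for illustration.

```python
def local_update(w_global, grad_fn, h_local, lr=0.1, alpha=0.01, steps=10):
    """Illustrative sketch of one client's drift-corrected local round.

    w_global : global model parameters received from the server (list of floats)
    grad_fn  : gradient of the client's local empirical loss
    h_local  : this client's accumulated local drift variable
    alpha    : weight of the (assumed) global parameter correction term
    """
    w = list(w_global)
    for _ in range(steps):
        g = grad_fn(w)
        # Local gradient plus a correction pulling (w + h_local) toward
        # w_global, i.e. the gradient of (alpha/2) * ||w + h_local - w_global||^2.
        w = [w[j] - lr * (g[j] + alpha * (w[j] + h_local[j] - w_global[j]))
             for j in range(len(w))]
    # Accumulate this round's parameter deviation into the local drift variable.
    h_local = [h_local[j] + (w[j] - w_global[j]) for j in range(len(w))]
    # Only w is transmitted back; h_local stays on the client, so no
    # auxiliary state is exchanged beyond the model parameters.
    return w, h_local
```

Because `h_local` never leaves the client, the per-round communication cost matches plain FedAvg, which is the property the abstract highlights relative to SCAFFOLD and FedDC.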
| Original language | English |
|---|---|
| Journal | IEEE Internet of Things Journal |
| DOIs | |
| State | Accepted/In press - 2026 |
Keywords
- Client drift
- Data heterogeneity
- Federated learning
- Internet of Things
- Model correction
Fingerprint
Dive into the research topics of 'Federated Learning with Drift Correction and Convergence Acceleration'. Together they form a unique fingerprint.