TY - JOUR
T1 - Decentralized federated learning based on bivariate controlled averaging and sharpness aware minimization
AU - Yang, Jihao
AU - Jiang, Wen
AU - Nie, Laisen
N1 - Publisher Copyright:
© 2025
PY - 2026/2/1
Y1 - 2026/2/1
N2 - To avoid the single point of failure posed by the central server in centralized federated learning (CFL) and to mitigate its communication burden, decentralized federated learning (DFL) trains models through direct peer-to-peer communication among clients. However, without the coordination and consistency enforced by a central server, DFL faces more severe data heterogeneity challenges, and the choice of communication topology further affects the performance of aggregated models. To overcome data heterogeneity in DFL, we extend the notion of client drift from CFL to DFL and propose a decentralized federated learning algorithm based on bivariate controlled averaging, which improves the model's adaptability and generalization on heterogeneous data by resolving the distributed client drift caused by data heterogeneity. Furthermore, we propose a variant based on sharpness-aware minimization that further improves model performance and accelerates convergence by optimizing gradients during local model updates. Experimental results show that our approach achieves better model performance and faster convergence across various experimental settings on multiple datasets, and exhibits strong adaptability to different topologies.
AB - To avoid the single point of failure posed by the central server in centralized federated learning (CFL) and to mitigate its communication burden, decentralized federated learning (DFL) trains models through direct peer-to-peer communication among clients. However, without the coordination and consistency enforced by a central server, DFL faces more severe data heterogeneity challenges, and the choice of communication topology further affects the performance of aggregated models. To overcome data heterogeneity in DFL, we extend the notion of client drift from CFL to DFL and propose a decentralized federated learning algorithm based on bivariate controlled averaging, which improves the model's adaptability and generalization on heterogeneous data by resolving the distributed client drift caused by data heterogeneity. Furthermore, we propose a variant based on sharpness-aware minimization that further improves model performance and accelerates convergence by optimizing gradients during local model updates. Experimental results show that our approach achieves better model performance and faster convergence across various experimental settings on multiple datasets, and exhibits strong adaptability to different topologies.
KW - Data heterogeneity
KW - Decentralized federated learning
KW - Distributed client drift
KW - Model correction
UR - https://www.scopus.com/pages/publications/105014266660
U2 - 10.1016/j.eswa.2025.129438
DO - 10.1016/j.eswa.2025.129438
M3 - Article
AN - SCOPUS:105014266660
SN - 0957-4174
VL - 297
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 129438
ER -