TY - GEN
T1 - Grouped Federated Learning
T2 - 2022 IEEE International Conference on Communications Workshops, ICC Workshops 2022
AU - Yin, Tong
AU - Li, Lixin
AU - Lin, Wensheng
AU - Ma, Donghui
AU - Han, Zhu
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - In recent years, federated learning (FL) has played an important role in data privacy-sensitive scenarios, enabling clients to perform learning tasks collaboratively without exchanging data. However, due to the centralized model aggregation over heterogeneous devices in FL, convergence is delayed by the slowest client to finish local training, which increases the economic cost and dampens clients' motivation for participating in FL. In this paper, we propose a decentralized FL framework that groups clients with similar computing and communication performance, named federated averaging-inspired group-based federated learning (FGFL). Specifically, we provide a cost function and a greedy grouping strategy that divides the clients into several groups to accelerate the convergence of the FL model. Simulation results verify the effectiveness of FGFL in accelerating the convergence of FL with heterogeneous clients. Besides the convolutional neural network (CNN) used as an example, FGFL is also applicable to other learning models.
AB - In recent years, federated learning (FL) has played an important role in data privacy-sensitive scenarios, enabling clients to perform learning tasks collaboratively without exchanging data. However, due to the centralized model aggregation over heterogeneous devices in FL, convergence is delayed by the slowest client to finish local training, which increases the economic cost and dampens clients' motivation for participating in FL. In this paper, we propose a decentralized FL framework that groups clients with similar computing and communication performance, named federated averaging-inspired group-based federated learning (FGFL). Specifically, we provide a cost function and a greedy grouping strategy that divides the clients into several groups to accelerate the convergence of the FL model. Simulation results verify the effectiveness of FGFL in accelerating the convergence of FL with heterogeneous clients. Besides the convolutional neural network (CNN) used as an example, FGFL is also applicable to other learning models.
KW - decentralized aggregation
KW - Federated learning
KW - grouped learning
UR - http://www.scopus.com/inward/record.url?scp=85134736812&partnerID=8YFLogxK
U2 - 10.1109/ICCWorkshops53468.2022.9814558
DO - 10.1109/ICCWorkshops53468.2022.9814558
M3 - Conference contribution
AN - SCOPUS:85134736812
T3 - 2022 IEEE International Conference on Communications Workshops, ICC Workshops 2022
SP - 55
EP - 60
BT - 2022 IEEE International Conference on Communications Workshops, ICC Workshops 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 May 2022 through 20 May 2022
ER -