TY - JOUR
T1 - FLIGHT
T2 - Federated Learning with IRS for Grouped Heterogeneous Training
AU - Yin, Tong
AU - Li, Lixin
AU - Ma, Donghui
AU - Lin, Wensheng
AU - Liang, Junli
AU - Han, Zhu
N1 - Publisher Copyright:
© 2022, Posts and Telecom Press Co Ltd. All rights reserved.
PY - 2022/6
Y1 - 2022/6
N2 - In recent years, federated learning (FL) has played an important role in private data-sensitive scenarios to perform learning tasks collectively without data exchange. However, due to the centralized model aggregation for heterogeneous devices in FL, the last updated model after local training delays the convergence, which increases the economic cost and dampens clients’ motivations for participating in FL. In addition, with the rapid development and application of intelligent reflecting surface (IRS) in the next-generation wireless communication, IRS has proven to be one effective way to enhance the communication quality. In this paper, we propose a framework of federated learning with IRS for grouped heterogeneous training (FLIGHT) to reduce the latency caused by the heterogeneous communication and computation of the clients. Specifically, we formulate a cost function and a greedy-based grouping strategy, which divides the clients into several groups to accelerate the convergence of the FL model. The simulation results verify the effectiveness of FLIGHT for accelerating the convergence of FL with heterogeneous clients. Besides the exemplified linear regression (LR) model and convolutional neural network (CNN), FLIGHT is also applicable to other learning models.
AB - In recent years, federated learning (FL) has played an important role in private data-sensitive scenarios to perform learning tasks collectively without data exchange. However, due to the centralized model aggregation for heterogeneous devices in FL, the last updated model after local training delays the convergence, which increases the economic cost and dampens clients’ motivations for participating in FL. In addition, with the rapid development and application of intelligent reflecting surface (IRS) in the next-generation wireless communication, IRS has proven to be one effective way to enhance the communication quality. In this paper, we propose a framework of federated learning with IRS for grouped heterogeneous training (FLIGHT) to reduce the latency caused by the heterogeneous communication and computation of the clients. Specifically, we formulate a cost function and a greedy-based grouping strategy, which divides the clients into several groups to accelerate the convergence of the FL model. The simulation results verify the effectiveness of FLIGHT for accelerating the convergence of FL with heterogeneous clients. Besides the exemplified linear regression (LR) model and convolutional neural network (CNN), FLIGHT is also applicable to other learning models.
KW - decentralized aggregation
KW - federated learning
KW - grouped learning
KW - intelligent reflecting surfaces
UR - http://www.scopus.com/inward/record.url?scp=85134168595&partnerID=8YFLogxK
U2 - 10.23919/jcin.2022.9815197
DO - 10.23919/jcin.2022.9815197
M3 - Article
AN - SCOPUS:85134168595
SN - 2096-1081
VL - 7
SP - 135
EP - 146
JO - Journal of Communications and Information Networks
JF - Journal of Communications and Information Networks
IS - 2
ER -