FLIGHT: Federated Learning with IRS for Grouped Heterogeneous Training

Tong Yin, Lixin Li, Donghui Ma, Wensheng Lin, Junli Liang, Zhu Han

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

In recent years, federated learning (FL) has played an important role in privacy-sensitive scenarios, enabling learning tasks to be performed collectively without data exchange. However, because FL aggregates models centrally across heterogeneous devices, the last model update to arrive after local training delays convergence, which increases the economic cost and dampens clients' motivation to participate in FL. In addition, with its rapid development and application in next-generation wireless communication, the intelligent reflecting surface (IRS) has proven to be an effective way to enhance communication quality. In this paper, we propose a framework of federated learning with IRS for grouped heterogeneous training (FLIGHT) to reduce the latency caused by the clients' heterogeneous communication and computation capabilities. Specifically, we formulate a cost function and a greedy-based grouping strategy that divides the clients into several groups to accelerate the convergence of the FL model. The simulation results verify the effectiveness of FLIGHT in accelerating the convergence of FL with heterogeneous clients. Besides the exemplified linear regression (LR) model and convolutional neural network (CNN), FLIGHT is also applicable to other learning models.
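The abstract mentions a cost function and a greedy-based grouping strategy but does not spell out the algorithm. As a rough illustration of the general idea, the sketch below greedily balances clients across groups by an assumed per-client cost (e.g., estimated per-round latency); the function name `greedy_group`, the cost values, and the balancing criterion are all hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of greedy, cost-based client grouping. This is a
# generic load-balancing heuristic, NOT the paper's actual algorithm:
# the paper's cost function and grouping criterion are not reproduced here.

def greedy_group(costs, num_groups):
    """Assign each client to the group with the smallest running total cost,
    processing clients in descending cost order (classic greedy balancing).
    Returns a list of groups, each a list of client indices."""
    groups = [[] for _ in range(num_groups)]
    totals = [0.0] * num_groups
    # Place the most expensive clients first so the cheap ones can fill gaps.
    for idx in sorted(range(len(costs)), key=lambda i: -costs[i]):
        g = totals.index(min(totals))  # least-loaded group so far
        groups[g].append(idx)
        totals[g] += costs[idx]
    return groups

# Example: six clients with assumed per-round costs, split into two groups.
clients = [5.0, 3.0, 8.0, 2.0, 7.0, 4.0]
print(greedy_group(clients, 2))  # → [[2, 5, 1], [4, 0, 3]]
```

Grouping by cost in this fashion keeps slow and fast clients from being forced to synchronize in the same round, which is the latency problem the abstract describes.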

Original language: English
Pages (from-to): 135-146
Number of pages: 12
Journal: Journal of Communications and Information Networks
Volume: 7
Issue number: 2
DOIs
State: Published - Jun 2022

Keywords

  • decentralized aggregation
  • federated learning
  • grouped learning
  • intelligent reflecting surfaces
