TY - GEN
T1 - FedAux
T2 - 2022 IEEE International Conference on Communications, ICC 2022
AU - Gu, Hang
AU - Guo, Bin
AU - Wang, Jiangtao
AU - Sun, Wen
AU - Liu, Jiaqi
AU - Liu, Sicong
AU - Yu, Zhiwen
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - As an enabler of sixth-generation communication technology (6G), Federated Learning (FL) triggers a paradigm shift from "connected things" to "connected intelligence". FL implements on-device learning, where massive end devices jointly train a model locally without leaking private data. However, FL suffers from low accuracy and a slow convergence rate when no data is shared with the central server and the data distribution is non-IID. In recent years, attempts have been made at hybrid FL, where very small amounts of data (e.g., less than 1%) are shared by the participants. Given the opportunities brought by shared data, we observe that the server can receive these data to assist the FL process and mitigate the non-IID challenge. Notably, existing hybrid FL applies only the model-level techniques of traditional FL and does not fully exploit the characteristics of the shared data to make targeted improvements. In this paper, we propose FedAux, a novel knowledge-level hybrid FL method that uses shared data to construct an auxiliary model and then transfers general knowledge to the traditional aggregated model or client models, enhancing the accuracy of the global model and speeding up its convergence. We also propose two specific knowledge transfer strategies, named c-transfer and i-transfer. We conduct extensive analysis and evaluation of our methods against the well-known FL methods FedAvg and the Hybrid-FL protocol. The results indicate that FedAux achieves higher accuracy (by 10.89%) and a faster convergence rate than the other methods.
AB - As an enabler of sixth-generation communication technology (6G), Federated Learning (FL) triggers a paradigm shift from "connected things" to "connected intelligence". FL implements on-device learning, where massive end devices jointly train a model locally without leaking private data. However, FL suffers from low accuracy and a slow convergence rate when no data is shared with the central server and the data distribution is non-IID. In recent years, attempts have been made at hybrid FL, where very small amounts of data (e.g., less than 1%) are shared by the participants. Given the opportunities brought by shared data, we observe that the server can receive these data to assist the FL process and mitigate the non-IID challenge. Notably, existing hybrid FL applies only the model-level techniques of traditional FL and does not fully exploit the characteristics of the shared data to make targeted improvements. In this paper, we propose FedAux, a novel knowledge-level hybrid FL method that uses shared data to construct an auxiliary model and then transfers general knowledge to the traditional aggregated model or client models, enhancing the accuracy of the global model and speeding up its convergence. We also propose two specific knowledge transfer strategies, named c-transfer and i-transfer. We conduct extensive analysis and evaluation of our methods against the well-known FL methods FedAvg and the Hybrid-FL protocol. The results indicate that FedAux achieves higher accuracy (by 10.89%) and a faster convergence rate than the other methods.
KW - 6G networks
KW - Feature transferable theory
KW - Federated learning
KW - Non-IID data
UR - http://www.scopus.com/inward/record.url?scp=85137267190&partnerID=8YFLogxK
U2 - 10.1109/ICC45855.2022.9839129
DO - 10.1109/ICC45855.2022.9839129
M3 - Conference contribution
AN - SCOPUS:85137267190
T3 - IEEE International Conference on Communications
SP - 195
EP - 200
BT - ICC 2022 - IEEE International Conference on Communications
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 16 May 2022 through 20 May 2022
ER -