TY - JOUR
T1 - Towards Robust Differential Privacy in Adaptive Federated Learning Architectures
AU - Jin, Zengwang
AU - Xu, Chenhao
AU - Wang, Zhen
AU - Sun, Changyin
N1 - Publisher Copyright:
© 1975-2011 IEEE.
PY - 2025
Y1 - 2025
N2 - The essential issues of data silos and user privacy leakage can be substantially mitigated by the federated learning (FL) architecture. In collaborative multi-user modeling settings, however, malicious attackers can still exploit user gradient information to infer private user data. To mitigate this privacy leakage, a differential privacy (DP) mechanism is integrated into the federated learning framework to assess privacy loss and introduce noise into users' local model parameters. In addition, to minimize information leakage and provide better noise rejection, Rényi differential privacy (RDP) is introduced as a privacy metric, improving the balance between model privacy and utility. Owing to the unknown target model and limited communication resources, a client-based adaptive learning algorithm is developed in which each local model parameter is adaptively updated during local iterations to accelerate model convergence and avoid overfitting. The experimental results reveal that the client-based adaptive federated learning model in this paper outperforms the classic model at a fixed communication cost, is more robust to noise and to variable hyperparameter settings, and provides more accurate privacy protection during transmission.
AB - The essential issues of data silos and user privacy leakage can be substantially mitigated by the federated learning (FL) architecture. In collaborative multi-user modeling settings, however, malicious attackers can still exploit user gradient information to infer private user data. To mitigate this privacy leakage, a differential privacy (DP) mechanism is integrated into the federated learning framework to assess privacy loss and introduce noise into users' local model parameters. In addition, to minimize information leakage and provide better noise rejection, Rényi differential privacy (RDP) is introduced as a privacy metric, improving the balance between model privacy and utility. Owing to the unknown target model and limited communication resources, a client-based adaptive learning algorithm is developed in which each local model parameter is adaptively updated during local iterations to accelerate model convergence and avoid overfitting. The experimental results reveal that the client-based adaptive federated learning model in this paper outperforms the classic model at a fixed communication cost, is more robust to noise and to variable hyperparameter settings, and provides more accurate privacy protection during transmission.
KW - Adaptive gradient descent
KW - Differential privacy
KW - Federated learning
KW - Privacy computing
UR - http://www.scopus.com/inward/record.url?scp=85215598143&partnerID=8YFLogxK
U2 - 10.1109/TCE.2024.3525084
DO - 10.1109/TCE.2024.3525084
M3 - Article
AN - SCOPUS:85215598143
SN - 0098-3063
JO - IEEE Transactions on Consumer Electronics
JF - IEEE Transactions on Consumer Electronics
ER -