TY - JOUR
T1 - ClassTer: Mobile Shift-Robust Personalized Federated Learning via Class-Wise Clustering
T2 - IEEE Transactions on Mobile Computing
AU - Li, Xiaochen
AU - Liu, Sicong
AU - Zhou, Zimu
AU - Xu, Yuan
AU - Guo, Bin
AU - Yu, Zhiwen
N1 - Publisher Copyright:
© 2002-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - The rise of mobile devices with abundant sensor data and computing power has driven the trend of federated learning (FL) on them. Personalized FL (PFL) aims to train tailored models for each device, addressing data heterogeneity from diverse user behaviors and preferences. However, due to dynamic mobile environments, PFL faces challenges from test-time data shifts, i.e., variations between training and testing. While this issue is well studied in generic deep learning through model generalization or adaptation, it remains less explored in PFL, where models often overfit local data. To address this, we introduce ClassTer, a shift-robust PFL framework. We observe that class-wise clustering of clients in cluster-based PFL (CFL) can avoid class-specific biases by decoupling the training of classes. Thus, we propose a paradigm shift from traditional client-wise clustering to class-wise clustering, which allows effective aggregation of cluster models into a generalized one via knowledge distillation. Additionally, we extend ClassTer to asynchronous mobile clients to optimize wall-clock time by leveraging critical learning periods and both intra- and inter-device scheduling. Experiments show that, compared to status quo approaches, ClassTer achieves a reduction of up to 91% in convergence time and an improvement of up to 50.45% in accuracy.
AB - The rise of mobile devices with abundant sensor data and computing power has driven the trend of federated learning (FL) on them. Personalized FL (PFL) aims to train tailored models for each device, addressing data heterogeneity from diverse user behaviors and preferences. However, due to dynamic mobile environments, PFL faces challenges from test-time data shifts, i.e., variations between training and testing. While this issue is well studied in generic deep learning through model generalization or adaptation, it remains less explored in PFL, where models often overfit local data. To address this, we introduce ClassTer, a shift-robust PFL framework. We observe that class-wise clustering of clients in cluster-based PFL (CFL) can avoid class-specific biases by decoupling the training of classes. Thus, we propose a paradigm shift from traditional client-wise clustering to class-wise clustering, which allows effective aggregation of cluster models into a generalized one via knowledge distillation. Additionally, we extend ClassTer to asynchronous mobile clients to optimize wall-clock time by leveraging critical learning periods and both intra- and inter-device scheduling. Experiments show that, compared to status quo approaches, ClassTer achieves a reduction of up to 91% in convergence time and an improvement of up to 50.45% in accuracy.
KW - Asynchronous mobile devices
KW - personalized federated learning
KW - shift-robust
UR - http://www.scopus.com/inward/record.url?scp=85208097294&partnerID=8YFLogxK
U2 - 10.1109/TMC.2024.3487294
DO - 10.1109/TMC.2024.3487294
M3 - Article
AN - SCOPUS:85208097294
SN - 1536-1233
VL - 24
SP - 2014
EP - 2028
JO - IEEE Transactions on Mobile Computing
JF - IEEE Transactions on Mobile Computing
IS - 3
ER -