TY - JOUR
T1 - Doubly contrastive representation learning for federated image recognition
AU - Zhang, Yupei
AU - Xu, Yunan
AU - Wei, Shuangshuang
AU - Wang, Yifei
AU - Li, Yuxin
AU - Shang, Xuequn
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/7
Y1 - 2023/7
N2 - This paper addresses personalized federated learning (FL) under the schema of contrastive learning (CL), which implements collaborative pattern classification across many clients. Traditional FL frameworks mostly encourage the server's global model and the clients' local models to be similar, often ignoring the data heterogeneity across clients. Aiming at better performance on the clients, this study introduces a personalized federated contrastive learning model, dubbed PerFCL, by proposing a new approach to doubly contrastive representation learning (DCL). Concretely, PerFCL adopts a DCL scheme in which one CL loss compares the shared parts of the local models with the global model, while the other CL loss compares the personalized parts of the local models with the global model. To encourage the difference between the two parts, we formulate a double optimization problem that maximizes the comparison agreement for the former and minimizes it for the latter. We evaluated the proposed model on three publicly available datasets for federated image classification. Experimental results show that PerFCL benefits from the proposed DCL strategy and outperforms state-of-the-art federated learning models.
AB - This paper addresses personalized federated learning (FL) under the schema of contrastive learning (CL), which implements collaborative pattern classification across many clients. Traditional FL frameworks mostly encourage the server's global model and the clients' local models to be similar, often ignoring the data heterogeneity across clients. Aiming at better performance on the clients, this study introduces a personalized federated contrastive learning model, dubbed PerFCL, by proposing a new approach to doubly contrastive representation learning (DCL). Concretely, PerFCL adopts a DCL scheme in which one CL loss compares the shared parts of the local models with the global model, while the other CL loss compares the personalized parts of the local models with the global model. To encourage the difference between the two parts, we formulate a double optimization problem that maximizes the comparison agreement for the former and minimizes it for the latter. We evaluated the proposed model on three publicly available datasets for federated image classification. Experimental results show that PerFCL benefits from the proposed DCL strategy and outperforms state-of-the-art federated learning models.
KW - Doubly contrastive learning
KW - Federated machine learning
KW - Image recognition
KW - Non-IID data classification
KW - Representation learning
UR - http://www.scopus.com/inward/record.url?scp=85149795622&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2023.109507
DO - 10.1016/j.patcog.2023.109507
M3 - Article
AN - SCOPUS:85149795622
SN - 0031-3203
VL - 139
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 109507
ER -