TY - GEN
T1 - Online Parameter Estimation Over Distributed Multitask Networks With A Rank-one Model
AU - Chen, Yitong
AU - Jin, Danqi
AU - Chen, Jie
AU - Richard, Cédric
AU - Zhang, Wen
AU - Huang, Gongping
AU - Chen, Jingdong
N1 - Publisher Copyright:
© 2024 European Signal Processing Conference, EUSIPCO. All rights reserved.
PY - 2024
Y1 - 2024
N2 - In recent years, modeling multitask relations in distributed networks has garnered considerable attention. Motivated by various practical applications, we propose a novel distributed multitask network model, termed the rank-one model, where each optimal vector to be estimated is a scaled version of the others. To address the optimization problem with the rank-one constraint in a distributed manner, it is crucial to decouple the variables within the constraint, which is achieved by locally relaxing it at each node. The resulting local constrained distributed optimization problems are then solved via projected gradient descent, with the added challenge of projecting onto a non-convex rank-one space. This projection is evaluated efficiently using the power method. Additionally, theoretical analyses of the proposed algorithm are performed, focusing on the special case of star topologies, with conditions provided that ensure stability in both the mean and mean-square senses. Finally, simulation results are presented to demonstrate the effectiveness of the proposed algorithm.
AB - In recent years, modeling multitask relations in distributed networks has garnered considerable attention. Motivated by various practical applications, we propose a novel distributed multitask network model, termed the rank-one model, where each optimal vector to be estimated is a scaled version of the others. To address the optimization problem with the rank-one constraint in a distributed manner, it is crucial to decouple the variables within the constraint, which is achieved by locally relaxing it at each node. The resulting local constrained distributed optimization problems are then solved via projected gradient descent, with the added challenge of projecting onto a non-convex rank-one space. This projection is evaluated efficiently using the power method. Additionally, theoretical analyses of the proposed algorithm are performed, focusing on the special case of star topologies, with conditions provided that ensure stability in both the mean and mean-square senses. Finally, simulation results are presented to demonstrate the effectiveness of the proposed algorithm.
KW - diffusion strategy
KW - Distributed optimization
KW - multitask
KW - power method
KW - rank-one model
UR - http://www.scopus.com/inward/record.url?scp=85208425360&partnerID=8YFLogxK
U2 - 10.23919/eusipco63174.2024.10715251
DO - 10.23919/eusipco63174.2024.10715251
M3 - Conference contribution
AN - SCOPUS:85208425360
T3 - European Signal Processing Conference
SP - 1042
EP - 1046
BT - 32nd European Signal Processing Conference, EUSIPCO 2024 - Proceedings
PB - European Signal Processing Conference, EUSIPCO
T2 - 32nd European Signal Processing Conference, EUSIPCO 2024
Y2 - 26 August 2024 through 30 August 2024
ER -